Hello,
I would like to know the difference between the #epochs and #iterations in the word2vec code. Is there any relation between #epochs, #iterations, and the batch size?
Does one iteration mean training over the whole 17 million words of the text8 corpus?
I would also like to know what batch size is being used in the code.
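
For context, here is my current understanding of the usual relationship in mini-batch training. This is only a rough sketch with made-up numbers, not necessarily what this code does; corpus_size, batch_size, and num_epochs are hypothetical names I chose for illustration:

    import math

    # Illustrative numbers only, assuming the common convention that one
    # "iteration" processes one batch and one "epoch" is one full pass
    # over the corpus.
    corpus_size = 17_000_000  # roughly the word count of text8
    batch_size = 128          # assumed; I don't know what the code uses
    num_epochs = 15           # assumed

    iterations_per_epoch = math.ceil(corpus_size / batch_size)  # batches per pass
    total_iterations = num_epochs * iterations_per_epoch        # total updates
    print(iterations_per_epoch, total_iterations)

Is that roughly how the code defines these terms, or does it use "iteration" differently?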
Please let me know.
Thanks.