Are there any survey papers on word embeddings in NLP that cover the full history of the field, from simple techniques like one-hot encoding up to more complex models like word2vec?