For various text classification models I have been using one-hot encoding of words and tf-idf vectorizers to generate integer features to feed into Embedding layers. Could you suggest some better encoding mechanisms?
There have been many advances in this area using word embeddings and deep neural networks. One of the most famous is FastText, developed by Facebook's AI research group; it has many extensions and also handles unseen words well, because it builds word vectors from character n-grams rather than treating each word as an atomic token.
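To make the unseen-word point concrete, here is a minimal, self-contained sketch of FastText's subword idea: a word's vector is the average of its character n-gram vectors, so a word never seen in training still gets a representation from its n-grams. The hash-derived "embeddings" below are a toy stand-in for learned parameters; the bucket count, dimension, and function names are illustrative, not FastText's API.

```python
import zlib

N_BUCKETS = 1000   # size of the hashed n-gram vocabulary (illustrative)
DIM = 8            # embedding dimension (toy value)

def char_ngrams(word, n_min=3, n_max=5):
    """Extract character n-grams with boundary markers, as FastText does."""
    w = f"<{word}>"
    return [w[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(w) - n + 1)]

def ngram_vector(ngram):
    """Toy deterministic vector derived from a hash of the n-gram.
    In a real model, these vectors are learned during training."""
    bucket = zlib.crc32(ngram.encode()) % N_BUCKETS
    return [((bucket * (j + 1)) % 97) / 97.0 for j in range(DIM)]

def word_vector(word):
    """Average the n-gram vectors: unseen words still get a vector."""
    grams = char_ngrams(word)
    vec = [0.0] * DIM
    for g in grams:
        vec = [a + b for a, b in zip(vec, ngram_vector(g))]
    return [a / len(grams) for a in vec]

# Even a word absent from any training corpus gets a representation:
print(word_vector("unseenword"))
```

In the real library (the `fasttext` Python package), you would instead call `fasttext.train_supervised(input="train.txt")` on a labelled corpus and let it learn both the n-gram vectors and the classifier jointly.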
The keywords you are looking for are "text classification", combined with "word embeddings" or "deep neural networks".