Prakhar Mishra

Posts

Word Sequence Decoding in Seq2Seq Architectures

Natural Language Generation (NLG) is the task of producing text. NLG tasks such as Machine Translation, Summarization, and Dialogue Systems share a core component: generating a sequence of words as output, conditioned on a given input. For example, a machine translation system, given an input sentence in English, must generate its French translation. Today most such systems are built on the Encoder-Decoder architecture and its variants.
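The simplest way to decode such a conditional word sequence is greedy decoding: at every step, pick the single most probable next word. A minimal sketch, where the hypothetical `toy_step` function stands in for a real decoder's next-token distribution (which would be computed from the encoder state and the words emitted so far):

```python
# Hedged sketch of greedy decoding for an encoder-decoder model.
# `step_fn` is a stand-in for the decoder: given the tokens emitted so
# far, it returns a probability over candidate next tokens.

def greedy_decode(step_fn, end_token, max_len=10):
    """At each step, emit the single most probable next token."""
    output = []
    for _ in range(max_len):
        probs = step_fn(output)             # next-token distribution
        token = max(probs, key=probs.get)   # greedy choice
        if token == end_token:
            break
        output.append(token)
    return output

# Toy "decoder": deterministically favors a fixed French translation.
TARGET = ["le", "chat", "dort", "<eos>"]

def toy_step(prefix):
    nxt = TARGET[len(prefix)]
    return {tok: (0.9 if tok == nxt else 0.05) for tok in TARGET}

print(greedy_decode(toy_step, "<eos>"))  # ['le', 'chat', 'dort']
```

Greedy decoding is fast but can miss globally better sequences; that is why beam search, which keeps the top-k partial hypotheses at each step, is the common alternative.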

Sentiment Classification with BERT Embeddings


Sentiment Classification is one of the oldest and most important problems in the field of Natural Language Processing (NLP). It is the task of determining whether someone likes or dislikes the particular thing they are talking about. Getting domain-specific annotated training data is often a challenge, but with the help of word embeddings we can build good sentiment classifiers even with modest-sized labeled training sets.
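A minimal sketch of the embedding-based approach: represent a sentence as the average of its word vectors and train a linear classifier on top. The tiny hand-made `EMB` table below is a stand-in for real BERT embeddings, which would come from a pretrained model rather than being written by hand:

```python
# Hedged sketch: averaged word vectors + a linear classifier.
# EMB is a toy stand-in for pretrained (e.g. BERT) embeddings.
import numpy as np
from sklearn.linear_model import LogisticRegression

EMB = {  # 2-d toy embeddings: first dim loosely encodes polarity
    "great": [1.0, 0.1], "love": [0.9, 0.2], "awful": [-1.0, 0.1],
    "hate": [-0.9, 0.3], "movie": [0.0, 1.0], "the": [0.0, 0.5],
}

def embed(sentence):
    """Average the embeddings of the known words in a sentence."""
    vecs = [EMB[w] for w in sentence.lower().split() if w in EMB]
    return np.mean(vecs, axis=0)

train = [("love the movie", 1), ("great movie", 1),
         ("hate the movie", 0), ("awful movie", 0)]
X = np.array([embed(s) for s, _ in train])
y = np.array([label for _, label in train])

clf = LogisticRegression().fit(X, y)
print(clf.predict([embed("the movie great")]))
```

Because the embeddings already encode semantic similarity, even a handful of labeled examples can position the decision boundary sensibly; with contextual BERT embeddings, the sentence vector would additionally reflect word order and context.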