Pre-Trained Word Embedding in NLP

Word embeddings are an important concept in Natural Language Processing and a significant breakthrough in deep learning that solved many problems. In this article, we look at what pre-trained word embeddings in NLP are.

Table of Contents

- Word Embeddings
- Challenges in building word embedding from scratch
- Pre-Trained Word Embeddings
- Word2Vec
- GloVe
- BERT Embeddings
Challenges in building word embedding from scratch

Training word embeddings from scratch is possible but quite challenging: these models have a large number of trainable parameters, and the training data is often sparse. They need to be trained on large corpora with rich vocabularies, and the sheer number of parameters makes training slow. This is why training a word embedding model from scratch is rarely practical at an individual level.
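To get a feel for the scale involved, here is a minimal sketch that counts the trainable parameters of a randomly initialized embedding layer in PyTorch. The vocabulary size and embedding width below are hypothetical values chosen for illustration (300 dimensions is a common choice, e.g. for GloVe vectors), not fixed requirements.

```python
import torch.nn as nn

# Hypothetical sizes chosen for illustration; real vocabularies and
# embedding widths vary by corpus and model.
vocab_size = 100_000   # number of unique tokens in the vocabulary
embedding_dim = 300    # width of each word vector (e.g., GloVe uses 300)

# A randomly initialized embedding layer: one trainable vector per token.
embedding = nn.Embedding(num_embeddings=vocab_size, embedding_dim=embedding_dim)

# Total trainable parameters = vocab_size * embedding_dim
num_params = sum(p.numel() for p in embedding.parameters())
print(f"Trainable parameters: {num_params:,}")  # 30,000,000
```

Thirty million parameters just for the lookup table, before any downstream model is attached, illustrates why practitioners usually load pre-trained vectors instead of learning them from scratch.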