Word Embedding
Word embedding is a collective term for a set of language modelling and representation learning techniques in natural language processing. Conceptually, it means embedding words from a high-dimensional, sparse space (with one dimension per vocabulary word) into a continuous vector space of much lower dimension, so that each word or phrase is mapped to a vector of real numbers.
Current word embedding methods include artificial neural networks, dimensionality reduction on the word co-occurrence matrix, probabilistic models, and explicit representations of the contexts in which words appear. When used as the underlying input representation, word embeddings of words and phrases have been shown to improve the performance of NLP tasks such as syntactic parsing and text sentiment analysis.
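The core idea, a mapping from discrete words to dense real-valued vectors in which nearby vectors correspond to related words, can be illustrated with a small sketch. The vectors and words below are made up purely for demonstration; real embeddings are learned from a corpus.

```python
import numpy as np

# Toy word-to-vector mapping: each word is a dense, low-dimensional real vector.
# These values are illustrative only; in practice they come from training.
embeddings = {
    "king":  np.array([0.50, 0.68, 0.12]),
    "queen": np.array([0.52, 0.71, 0.15]),
    "apple": np.array([0.90, 0.05, 0.33]),
}

def cosine_similarity(u, v):
    """Cosine similarity: values near 1 mean the words are close in the space."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower: unrelated words
```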
Word Embedding Algorithms
- Embedding layer: a word embedding that is learned jointly with a neural network model on a specific natural language processing task (a minimal sketch follows this list).
- Word2Vec: a method for efficiently learning a standalone word embedding from a text corpus (a sketch also follows this list).
- GloVe (Global Vectors): a method that extends the ideas behind Word2Vec by combining local context-window learning with global word co-occurrence statistics to efficiently learn word vectors.
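A minimal sketch of the embedding-layer approach, assuming TensorFlow/Keras and a hypothetical binary classification task (e.g. sentiment). The vocabulary size, dimensions, and dummy data are assumptions for illustration; the point is that the embedding weights are trained jointly with the rest of the network.

```python
import numpy as np
import tensorflow as tf

# Assumed setup: 1000-token vocabulary embedded into 16 dimensions,
# feeding a small classifier over sequences of 10 token ids.
vocab_size, embedding_dim, sequence_length = 1000, 16, 10

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=embedding_dim),
    tf.keras.layers.GlobalAveragePooling1D(),        # average the word vectors in each sequence
    tf.keras.layers.Dense(1, activation="sigmoid"),  # e.g. binary sentiment label
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy integer-encoded sentences and labels, just to show the shapes involved.
x = np.random.randint(0, vocab_size, size=(32, sequence_length))
y = np.random.randint(0, 2, size=(32,))
model.fit(x, y, epochs=1, verbose=0)
```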
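For Word2Vec, a common way to learn embeddings directly from a corpus is the gensim library. The tiny corpus below is made up for illustration, and the parameter names follow gensim 4.x (`vector_size` was called `size` in gensim 3.x).

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus; a real corpus would contain many more sentences.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "log"],
    ["cats", "and", "dogs", "are", "animals"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=20)

vector = model.wv["cat"]                      # the learned 50-dimensional vector for "cat"
print(model.wv.most_similar("cat", topn=3))   # nearest neighbours in the embedding space
```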