Global Vectors for Word Representation (GloVe)
Original GloVe paper: GloVe: Global Vectors for Word Representation (Pennington, Socher, and Manning, EMNLP 2014)
GloVe is a weighted least squares model trained on global word-word co-occurrence counts, so it makes efficient use of corpus-wide statistics. The resulting word vector space has meaningful sub-structure, as shown by its performance on word-analogy tasks.
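Concretely, the objective is a weighted sum of squared errors between w_i · w̃_j + b_i + b̃_j and log X_ij over all nonzero co-occurrence counts X_ij, where the weighting function f caps the influence of very frequent pairs. Below is a minimal NumPy sketch of that loss; the function names and the dense-matrix layout are illustrative assumptions, while x_max = 100 and α = 3/4 are the values reported in the paper.

```python
import numpy as np

def glove_weight(x, x_max=100.0, alpha=0.75):
    """Weighting function f: down-weights rare pairs, caps frequent ones."""
    return (x / x_max) ** alpha if x < x_max else 1.0

def glove_loss(X, W, W_tilde, b, b_tilde):
    """Weighted least-squares objective over nonzero co-occurrence counts.

    X          : (V, V) co-occurrence count matrix
    W, W_tilde : (V, d) word and context embedding matrices
    b, b_tilde : (V,)   word and context biases
    """
    loss = 0.0
    rows, cols = X.nonzero()  # only nonzero counts contribute
    for i, j in zip(rows, cols):
        diff = W[i] @ W_tilde[j] + b[i] + b_tilde[j] - np.log(X[i, j])
        loss += glove_weight(X[i, j]) * diff ** 2
    return loss
```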
Crucial Insight: Ratios of co-occurrence probabilities can encode meaning components.
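For example, following the ice/steam illustration from the paper (the counts below are made up for demonstration, not corpus statistics), the ratio P(k | ice) / P(k | steam) is much greater than 1 for k = solid, much less than 1 for k = gas, and close to 1 for words related to both (water) or neither (fashion), so the ratio isolates exactly the meaning component that distinguishes the two words:

```python
import numpy as np

# Toy co-occurrence counts (illustrative values only).
# Rows: target words; columns: probe words k.
words = ["ice", "steam"]
probes = ["solid", "gas", "water", "fashion"]
counts = np.array([
    [80.0,  2.0, 300.0, 1.0],   # co-occurrences with "ice"
    [ 3.0, 60.0, 280.0, 1.0],   # co-occurrences with "steam"
])

P = counts / counts.sum(axis=1, keepdims=True)  # P(k | word)
ratios = P[0] / P[1]                            # P(k | ice) / P(k | steam)
for k, r in zip(probes, ratios):
    print(f"P({k}|ice) / P({k}|steam) = {r:.2f}")
# Large for "solid", small for "gas", near 1 for "water" and "fashion":
# only the discriminative meaning component stands out in the ratio.
```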