Background

Word2Vec


What is Word2Vec?

First published in 2013, the word2vec approach draws on ideas from distributional semantics. Using neural networks, the approach learns vectors for individual words based on associations in a large corpus of text. Semantically related words are located closer together within the vector space. To use a spatial analogy, the vectors for ‘red’ and ‘blue’ will point to the same street, while the vectors for ‘red’ and ‘school’ will point to streets in different towns.
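To make this concrete, the following is a minimal sketch of training word vectors and comparing them, assuming the gensim library (4.x API) is installed; the toy corpus and the parameter values are purely illustrative.

```python
# Minimal word2vec training sketch using gensim (assumed 4.x API).
from gensim.models import Word2Vec

# Each "sentence" is a list of tokens; a real model is trained on a large
# corpus, so similarity scores from this toy corpus will be noisy.
corpus = [
    ["the", "red", "car", "drove", "past", "the", "school"],
    ["a", "blue", "car", "parked", "near", "the", "school"],
    ["the", "red", "and", "blue", "flags", "waved"],
    ["children", "walked", "to", "school", "in", "the", "morning"],
]

# vector_size sets the dimensionality of the learned word vectors;
# window controls how many neighbouring words count as context.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, seed=42)

# Cosine similarity between learned vectors: with enough training data,
# related words such as 'red' and 'blue' score higher than unrelated ones.
print(model.wv.similarity("red", "blue"))
print(model.wv.similarity("red", "school"))
```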

Words are linguistic objects while vectors are geometric objects. With words now represented as vectors, these linguistic objects can be explored and analysed using the toolbox of geometry and linear algebra. One of these tools is vector composition: two vectors can be added to or subtracted from each other to obtain a new vector. One of the striking results of the word2vec model was the following example of composition: ‘king’ - ‘man’ + ‘woman’ = ‘queen’. The model doesn’t just learn vectors for individual words; it also learns patterns and relations between words, such as gender and tense.
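As a sketch of how this composition is typically computed in practice, the snippet below uses pretrained vectors fetched through gensim's downloader; the dataset name 'glove-wiki-gigaword-50' is one small publicly distributed embedding set (the larger 'word2vec-google-news-300' vectors work the same way), and exact results depend on the vectors used.

```python
# Sketch of the 'king' - 'man' + 'woman' composition with pretrained vectors.
import gensim.downloader as api

# Load a small set of pretrained word vectors; any KeyedVectors-style model
# exposing most_similar() works the same way.
vectors = api.load("glove-wiki-gigaword-50")

# positive vectors are added, negative vectors are subtracted, and the
# nearest word to the resulting vector (excluding the inputs) is returned.
result = vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=1)
print(result)  # typically [('queen', <similarity score>)]
```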

Related Terms