
Word2Vec Word Vector Math

Word2Vec is a neural network model designed to learn word associations from a large corpus of text. It maps words to vectors in a high-dimensional space where similar words land closer together; try vector math on these word vectors here.

Google's Word2Vec is a popular machine learning model used to generate word embeddings, which are vector representations of words that capture semantic meanings based on their usage in text. The approach uses neural networks to learn word associations from large corpora of text. Here’s a more detailed breakdown of how Word2Vec works:

1. Model Architecture

Word2Vec offers two model architectures:

  • CBOW (Continuous Bag of Words): In this architecture, the model predicts the current word based on its context (surrounding words). This means the input to the model is the context words, and the output is the target word.
  • Skip-gram: The skip-gram model works the other way around; it uses the current word to predict the surrounding context words. This is generally more effective for less frequent words. A brief training sketch contrasting the two architectures follows this list.
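
Most implementations select between the two architectures with a single flag. As a rough sketch, the popular gensim library chooses between them with its sg parameter; the toy corpus and hyperparameter values below are placeholders for illustration, not part of this page's model.

    from gensim.models import Word2Vec

    # Toy corpus: each sentence is a list of tokens (placeholder data).
    sentences = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "beside", "the", "king"],
        ["the", "princess", "is", "the", "daughter", "of", "the", "queen"],
    ]

    # sg=0 selects CBOW (predict the current word from its context);
    # sg=1 selects skip-gram (predict the context from the current word).
    cbow_model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=0)
    skipgram_model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

    print(cbow_model.wv["king"][:5])  # first few dimensions of the "king" vector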

2. Learning Process

Both architectures use a shallow neural network (usually just one hidden layer) to learn the weights, which eventually become the "embeddings" or vector representations. During training, Word2Vec uses:

  • A sliding window that moves across the text.
  • Within each window, the context words are used as inputs to predict the center word (CBOW), or the center word is used to predict its context words (skip-gram); see the sketch after this list.
  • An objective that adjusts the vector representations to maximize the likelihood of the actual words appearing in each context.
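
To make the sliding window concrete, here is a minimal sketch in plain Python (the helper name is hypothetical) that produces the (context, center) pairs CBOW trains on; skip-gram would instead emit one (center, context word) pair per context word.

    def cbow_pairs(tokens, window=2):
        """Return (context_words, center_word) pairs from a sliding window."""
        pairs = []
        for i, center in enumerate(tokens):
            # Collect up to `window` words on each side of the center word.
            context = [tokens[j]
                       for j in range(max(0, i - window), min(len(tokens), i + window + 1))
                       if j != i]
            pairs.append((context, center))
        return pairs

    tokens = "the queen rules beside the king".split()
    for context, center in cbow_pairs(tokens, window=2):
        print(context, "->", center)
    # e.g. ['the', 'rules', 'beside'] -> queen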

3. Vector Operations

Once trained, each word in the vocabulary is associated with a vector. Word2Vec allows performing interesting algebraic operations on these vectors, such as:

  • Similarity: Cosine similarity between vectors is typically used to find words with similar meanings, though this page uses squared Euclidean distance because it is less computationally intensive; both measures are sketched after this list.
  • Analogy: For example, "king + queen = princess" shows how relationships and analogies can be captured by vector arithmetic.
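
Here is a minimal sketch of the two similarity measures mentioned above, using NumPy; the short vectors are placeholders rather than real embeddings.

    import numpy as np

    def cosine_similarity(a, b):
        # Higher means more similar; 1.0 means the vectors point the same way.
        return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

    def squared_euclidean(a, b):
        # Lower means more similar; skipping the square root saves work
        # and does not change which neighbor is nearest.
        d = a - b
        return np.dot(d, d)

    a = np.array([0.2, 0.7, -0.1])
    b = np.array([0.25, 0.6, -0.05])
    print(cosine_similarity(a, b), squared_euclidean(a, b))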

Word embeddings learned by Word2Vec can be manipulated algebraically to discover semantic relationships between words, as in the "king + queen = princess" example above.
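
As a rough illustration of what the playground below does, this sketch adds two word vectors and returns the nearest remaining vocabulary word by squared Euclidean distance. The tiny three-dimensional "embeddings" are made up for demonstration; with a real trained model the result depends entirely on the learned vectors.

    import numpy as np

    # Hypothetical 3-dimensional embeddings (real Word2Vec vectors have hundreds of dimensions).
    embeddings = {
        "king":     np.array([0.8, 0.6, 0.1]),
        "queen":    np.array([0.7, 0.7, 0.3]),
        "princess": np.array([1.4, 1.2, 0.5]),
        "car":      np.array([-0.9, 0.2, 0.8]),
    }

    def nearest_word(vector, exclude=()):
        # Return the vocabulary word whose embedding is closest by squared Euclidean distance.
        best_word, best_dist = None, float("inf")
        for word, emb in embeddings.items():
            if word in exclude:
                continue
            d = vector - emb
            dist = np.dot(d, d)
            if dist < best_dist:
                best_word, best_dist = word, dist
        return best_word

    result = nearest_word(embeddings["king"] + embeddings["queen"], exclude={"king", "queen"})
    print(result)  # "princess" with these made-up vectors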


Continuous Bag of Words

Add and subtract words, and find the nearest word.

