
Word Embeddings Workbench

Explore vector arithmetic, semantic relationships, and bias in word embeddings.

Word Vector Arithmetic

Enter words to explore semantic relationships through vector operations. For example, King − Man + Woman ≈ Queen.
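The analogy above can be sketched in a few lines of numpy. The vectors here are made-up 3-dimensional toys chosen so the arithmetic works out exactly; real GloVe vectors are 50-dimensional and learned from corpus co-occurrence statistics.

```python
import numpy as np

# Hypothetical toy vectors for illustration only; real embeddings
# are learned, not hand-picked, and live in 50+ dimensions.
embeddings = {
    "king":  np.array([0.8, 0.9, 0.1]),
    "man":   np.array([0.7, 0.1, 0.1]),
    "woman": np.array([0.7, 0.1, 0.9]),
    "queen": np.array([0.8, 0.9, 0.9]),
}

# King − Man + Woman: remove the "male" component, add the "female" one.
result = embeddings["king"] - embeddings["man"] + embeddings["woman"]
print(result)  # lands on (or near) the "queen" vector
```

With real embeddings the result rarely equals any word vector exactly, which is why the workbench follows the arithmetic with a nearest-neighbor search.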


How It Works

1. Vector Representation: Each word is mapped to a high-dimensional vector that captures its semantic meaning.

2. Arithmetic Operations: Subtracting and adding vectors reveals semantic relationships (e.g., royalty, gender).

3. Nearest Neighbor Search: We find words whose vectors are closest (by cosine similarity) to the result vector.

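The cosine-similarity ranking from step 3 can be sketched as follows. The toy vectors are hypothetical stand-ins for the real 50-dimensional GloVe vectors the workbench loads.

```python
import numpy as np

# Hypothetical toy vectors for illustration.
embeddings = {
    "king":  np.array([0.8, 0.9, 0.1]),
    "man":   np.array([0.7, 0.1, 0.1]),
    "woman": np.array([0.7, 0.1, 0.9]),
    "queen": np.array([0.8, 0.9, 0.9]),
}

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|); 1.0 means identical direction.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest_neighbors(query, embeddings, exclude=(), k=3):
    # Rank every vocabulary word by cosine similarity to the query vector,
    # skipping the input words themselves (standard for analogy tasks).
    scored = [(word, cosine_similarity(query, vec))
              for word, vec in embeddings.items() if word not in exclude]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:k]

query = embeddings["king"] - embeddings["man"] + embeddings["woman"]
print(nearest_neighbors(query, embeddings, exclude={"king", "man", "woman"}))
```

Excluding the input words matters: in practice the nearest neighbor of King − Man + Woman is often King itself, so analogy benchmarks conventionally drop the query words before ranking.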

Try These Examples


Loading GloVe Embeddings

Real Pre-trained Vectors • 50 Dimensions


Loading GloVe 6B pre-trained word vectors (50d, ~5,000 words). All computation runs client-side; no data is sent to any server.

Model: GloVe 6B
Dimension: 50d
Vocabulary: ~5,000 words
File Size: ~2.1 MB
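GloVe ships as a plain text file, one word per line followed by its vector components. A minimal loader might look like this; the inline sample uses illustrative values truncated to 3 dimensions in place of the real glove.6B.50d.txt file.

```python
import numpy as np

# Illustrative sample in GloVe's text format: "word c1 c2 c3 ...".
# Values and dimensionality are made up for this sketch.
sample = """\
the 0.418 0.250 -0.412
king 0.505 0.686 -0.595
queen 0.379 1.823 -1.265"""

def load_glove(lines):
    # Parse each line into a word and a float32 vector.
    embeddings = {}
    for line in lines:
        word, *values = line.rstrip().split(" ")
        embeddings[word] = np.array(values, dtype=np.float32)
    return embeddings

vectors = load_glove(sample.splitlines())
print(len(vectors), vectors["king"].shape)
```

In a browser the same parse runs over the fetched file's text, which is why the whole workbench can stay client-side.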