
Links conquer the universe


But what does “distance” mean in this context? Surprisingly, it’s quite literal: It refers to how closely related words like “king” and “queen” are, based on their co-occurrence and contextual similarity in the sentences the LLM encountered during training. The model also learns the “distance” between “king” and other words, such as “crab,” “cobra,” and “Elvis.” For instance, every time the model encounters the phrase “king cobra” during training, the vectors of those two words are adjusted to bring them slightly closer together.
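To make that nudging concrete, here is a toy sketch in Python. The four-dimensional vectors, the learning rate, and the update rule are all made-up simplifications for illustration; real models use thousands of dimensions and far subtler training procedures.

```python
import numpy as np

# Toy, made-up 4-dimensional vectors -- real models use thousands of dimensions.
vectors = {
    "king":  np.array([0.9, 0.1, 0.3, 0.2]),
    "cobra": np.array([0.1, 0.8, 0.2, 0.7]),
}

def cosine_similarity(a, b):
    """How 'close' two word vectors are: 1.0 means the same direction, near 0.0 means unrelated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def nudge_closer(w1, w2, learning_rate=0.05):
    """Move each word's vector a small step toward the other, a vastly
    simplified stand-in for what happens when the words co-occur in training text."""
    v1, v2 = vectors[w1], vectors[w2]
    vectors[w1] = v1 + learning_rate * (v2 - v1)
    vectors[w2] = v2 + learning_rate * (v1 - v2)

print("before:", round(cosine_similarity(vectors["king"], vectors["cobra"]), 3))
nudge_closer("king", "cobra")   # one sighting of "king cobra"
print("after: ", round(cosine_similarity(vectors["king"], vectors["cobra"]), 3))
```

Run it and the similarity score ticks up a little after each “sighting,” which is the whole idea in miniature: repeated co-occurrence gradually pulls the two words closer together in the model’s space.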

It seems impossible that, with this information, GenAI models can answer questions, be they simple or complex, and respond to our queries with a reasonable approximation of the truth. It seems impossible because our wee human brains can’t imagine, much less deal with, scale. And the scale of these systems is brain-busting. While the companies generally don’t like to say too much about the details, it seems that LLMs have vocabularies of somewhere between 50,000 and 200,000 tokens and hundreds of billions of weighted relationships (“parameters”) among them.
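The arithmetic of that scale is easy to sketch even if the result is hard to picture. The figures below are illustrative assumptions, not the published specifications of any particular model:

```python
# Illustrative, assumed numbers -- vendors rarely publish exact figures.
vocabulary_size = 100_000            # tokens the model knows (assumed, within the 50k-200k range)
embedding_dimensions = 12_288        # numbers per token vector (assumed)
total_parameters = 175_000_000_000   # weighted relationships across the whole network (assumed)

embedding_parameters = vocabulary_size * embedding_dimensions
print(f"Just the token vectors: {embedding_parameters:,} numbers")                 # about 1.2 billion
print(f"Share of all parameters: {embedding_parameters / total_parameters:.1%}")   # under 1%
```

Even with these assumed numbers, the table of token vectors alone runs to more than a billion values, and that is still a small fraction of the relationships the full model holds.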

Now let’s drop back down to look at a single token. What do you see? The word “king”? No, you see an arbitrary number. Do you see a definition? Definitely not. Do you see its connection to all other words? Not exactly. You’d see its vector, which is a set of numbers showing where the token is positioned in each of the model’s dimensions depending on its context: This is why “king” is near “queen” in one context and near “Kong” in another.
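Here is a toy sketch of what “looking at a single token” amounts to: an arbitrary ID pointing at a row of numbers. The vocabulary, IDs, and vectors are invented for illustration, and this shows only the static lookup; the contextual shifts that move “king” toward “queen” in one sentence and toward “Kong” in another happen in the model’s later layers.

```python
import numpy as np

# A made-up vocabulary: each token is just an arbitrary integer ID.
token_ids = {"king": 3021, "queen": 4577, "Kong": 8190}

# The embedding table: one row of numbers per token (4 dimensions here; real models use thousands).
rng = np.random.default_rng(seed=42)
embedding_table = {tid: rng.normal(size=4) for tid in token_ids.values()}

tid = token_ids["king"]
print("What the model 'sees' for 'king':", tid)          # a bare number, not a word
print("Its vector:", np.round(embedding_table[tid], 2))  # its position in each dimension
```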

We can talk about these relationships as links. They’re not expressed in blue underlined text, and you can’t click on them. But they are the relationships among words that matter in any particular circumstance. They are the relationships that give words meaning. And as in life, those meanings are multiple and contextual. Without those relationships, there is no language.

LLMs challenge our naive belief that a word is a tag attached to a thing. A word is its relationship to other words. Words don’t label the world so much as reveal it, in all the complexity and potentiality of the relationships among words taken together.

In fact, not only do we see links instead of definitions, but those links are more complicated than we can imagine. They are an unrealized and unrealizable potential, which is why words can still surprise us.

Obviously, this isn’t how hyperlinks work. Those blue underlinings are created by us when we see a relationship that enriches the idea we’re expressing. Such relationships are far up the ladder of thought, and such thought is literally unthinkable without the conceptual links hyperlinks enhance. What’s true of words is also true of the things we encounter: They have meaning only in their dynamic relationship to everything else. Life is lived in links and perhaps in nothing but links.
