LLMs have no access to truth or to reality. An LLM’s map of the world, so to speak, is a map of how we’ve put words together. In fact, it doesn’t even have words, just tokens: arbitrary numbers assigned to each distinct word (or part of a word). The map is like a table of relationships among those tokens. There are no sentences, true or false, to be found. And it doesn’t even always give the most likely response to an input; it samples from among the likely ones, so that it sounds more like a human, since we humans are full of small surprises.
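A minimal Python sketch can make both points concrete. The five-word vocabulary and the next-token scores below are invented for illustration; no real model, tokenizer, or library API is involved. It shows that token IDs are just arbitrary integers, and that sampling from a probability distribution, rather than always taking the top choice, is what produces those small surprises.

```python
import math
import random

# A toy vocabulary: each token is an arbitrary integer ID, nothing more.
# The numbers carry no meaning; any other assignment would work as well.
vocab = {"the": 0, "cat": 1, "sat": 2, "mat": 3, "hat": 4}

# Hypothetical scores ("logits") a model might assign to each candidate
# next token after some input. Higher means more likely. These values
# are made up for the sketch.
logits = {"mat": 2.0, "hat": 1.2, "cat": 0.1, "sat": -0.5, "the": -1.0}

def sample_next_token(logits, temperature=0.8):
    """Convert scores to probabilities (softmax), then sample one token.

    The most likely token usually wins, but not always; that occasional
    deviation from the top choice is what keeps output from sounding
    mechanical.
    """
    scaled = {tok: score / temperature for tok, score in logits.items()}
    total = sum(math.exp(s) for s in scaled.values())
    probs = {tok: math.exp(s) / total for tok, s in scaled.items()}
    r = random.random()
    cumulative = 0.0
    for tok, p in probs.items():
        cumulative += p
        if r < cumulative:
            return tok
    return tok  # fallback guard against floating-point rounding

# Run it a few times: "mat" dominates, but the surprises show up.
print([sample_next_token(logits) for _ in range(10)])
```

Lowering the temperature pushes the sampler toward always picking the top-scored token; raising it flattens the distribution and makes the surprises more frequent.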