
Are dreams event sequence predictions? Are brains LLMs for events rather than words?
LLMs (large language models) are changing the world. The widely used LLMs were built by training algorithms to predict the next word given the preceding sequence of words. (Technically, LLMs operate on “tokens,” not words. Tokens and words are related but distinct: a typical LLM would probably represent the contraction “isn’t” as two ordered tokens, roughly one for “is” and one for “n’t,” since whether someone writes “isn’t” or “is not” is basically irrelevant to the meaning.) ...
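To make the word/token distinction concrete, here is a minimal sketch of how a tokenizer might split text, assuming a tiny hand-made vocabulary (`TOY_VOCAB` and `toy_tokenize` are hypothetical names invented for illustration; real tokenizers learn their vocabularies from data rather than using a fixed table like this):

```python
# Toy illustration (not a real LLM tokenizer): a tiny fixed vocabulary
# mapping text pieces to integer token IDs, loosely mimicking how
# byte-pair-style tokenizers map frequent character sequences to tokens.
TOY_VOCAB = {"is": 0, "n't": 1, "not": 2, " ": 3}

def toy_tokenize(text: str) -> list[int]:
    """Greedily match the longest vocabulary piece at each position."""
    ids, i = [], 0
    while i < len(text):
        for piece in sorted(TOY_VOCAB, key=len, reverse=True):
            if text.startswith(piece, i):
                ids.append(TOY_VOCAB[piece])
                i += len(piece)
                break
        else:
            raise ValueError(f"no token for text at position {i}")
    return ids

print(toy_tokenize("isn't"))   # two tokens: "is" + "n't"
print(toy_tokenize("is not"))  # three tokens: "is", " ", "not"
```

The point is that the model sees the same two meaningful pieces either way; the contraction and the spelled-out phrase differ only in a low-information surface token.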