LLMs and Beyond: All Roads Lead to Latent Space

Eric Drexler at AI Prospects: Toward Global Goal Alignment:

Today’s AI technologies are based on deep learning, yet “AI” is commonly equated with large language models, and the fundamental nature of deep learning and LLMs is obscured by talk of using “statistical patterns” to “predict tokens”. This token-focused framing has been a remarkably effective way to misunderstand AI.

The key concept to grasp is “latent space” (not statistics, or tokens, or algorithms). It’s representation and processing in “latent space” that are fundamental to today’s AI and to understanding future prospects. This article offers some orientation and perhaps some new perspectives.

More here.
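The point about latent space can be made concrete with a minimal sketch (all sizes and values below are illustrative, not drawn from any real model): token IDs appear only at the entry and exit, while every intermediate step operates on continuous vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not any particular model's
vocab_size, d_model = 50, 8

# Tokens are discrete symbols...
token_ids = np.array([3, 17, 42])

# ...but the first step maps them into latent space:
embedding = rng.normal(size=(vocab_size, d_model))
x = embedding[token_ids]                 # shape (3, 8): continuous vectors

# All intermediate processing acts on these vectors,
# e.g. one simplified self-attention step:
scores = x @ x.T / np.sqrt(d_model)      # pairwise similarity in latent space
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
x = weights @ x                          # mix latent vectors by similarity

# Only at the very end are latent vectors projected back to token scores,
# which is where "predicting the next token" happens:
logits = x @ embedding.T                 # shape (3, 50)
next_token = int(logits[-1].argmax())
```

In this framing, "token prediction" describes only the final readout; the work the model does happens in the vector-valued latent space in between.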