What this means (quoting a redditor):
This is pretty wild.
They basically figured out how to give AI both short-term and long-term memory that actually works. Like, imagine your brain being able to remember an entire book while still processing new info efficiently.
The whole test-time learning thing is starting to look more and more like what Sutton was talking about.
This thing can handle 2M+ tokens while being faster than regular transformers. That’s like going from a USB stick to a whole SSD of memory, but for AI.
This is a dope step forward. 2025’s starting strong ngl.
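For the curious, here's a minimal sketch of what "learning at test time" can look like: a small neural memory whose weights get gradient-updated as tokens stream in, paired with ordinary attention over the current chunk as the short-term memory. Everything here (the `NeuralMemory` class, the surprise loss, `process_stream`) is my own illustration of the general idea, not the paper's actual architecture.

```python
# Toy sketch of test-time-learned long-term memory + short-term attention.
# Names and design are illustrative assumptions, not the paper's method.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NeuralMemory(nn.Module):
    """A small MLP acting as an associative memory (key -> value).
    Its weights are updated by gradient steps *during inference*."""
    def __init__(self, dim: int, hidden: int = 256, lr: float = 0.01):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim)
        )
        self.lr = lr

    def write(self, keys: torch.Tensor, values: torch.Tensor) -> None:
        # One gradient step on a "surprise" (reconstruction) loss at test time:
        # the more the memory mispredicts, the bigger the update it takes.
        loss = F.mse_loss(self.net(keys), values)
        grads = torch.autograd.grad(loss, list(self.net.parameters()))
        with torch.no_grad():
            for p, g in zip(self.net.parameters(), grads):
                p -= self.lr * g  # in-place SGD step, optimizer-style

    def read(self, queries: torch.Tensor) -> torch.Tensor:
        return self.net(queries)

def process_stream(chunks, dim=64):
    """Process an arbitrarily long token stream chunk by chunk:
    short-term = exact attention within the current chunk,
    long-term  = the continually updated NeuralMemory."""
    memory = NeuralMemory(dim)
    attn = nn.MultiheadAttention(dim, num_heads=4, batch_first=True)
    outputs = []
    for chunk in chunks:                      # chunk: (1, seq_len, dim)
        recalled = memory.read(chunk)         # long-term recall
        short, _ = attn(chunk, chunk, chunk)  # short-term context
        outputs.append(short + recalled)      # naive combination for the demo
        # Memorize the chunk (keys == values here for brevity; a real
        # design would use learned key/value projections).
        memory.write(chunk, chunk)
    return torch.cat(outputs, dim=1)

if __name__ == "__main__":
    stream = [torch.randn(1, 128, 64) for _ in range(4)]  # 4 chunks, 128 tokens each
    out = process_stream(stream)
    print(out.shape)  # torch.Size([1, 512, 64])
```

The rough intuition for the context-length claim: unlike a KV cache, which grows with every token (and makes full attention quadratic), the memory here has a fixed number of weights, so per-token cost stays flat no matter how long the stream gets; the model pays with lossy compression instead of unbounded storage.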
NotebookLM explaining why we're back