Physical Intelligence Presents: Giving Robots a 15-Minute Short-Term Memory Context Window

Mar 6, 2026 5:32 PM

TheGreatMani

Views 23200 | Likes 18 | Dislikes 11

artificial_intelligence

Got any more of that short term memory?

1 month ago | Likes 7 Dislikes 0

I love how this post frames it as if it's some big revelation, when this is how all algorithms and robots have always worked before LLMs came on the scene.

1 month ago | Likes 6 Dislikes 1

This just isn't true. Robotics barely worked before LLMs, and the same goes for old symbolic approaches to computer vision. You're pretending this is old tech to downplay recent advances.

1 month ago | Likes 1 Dislikes 3

The current gen of AI, from any company, goes insane when the context size is too large.

1 month ago | Likes 3 Dislikes 2

Not this, obviously. Also, there are infinite-token context windows coming down the pipeline in frontier research. Look up continual-learning models like Sakana AI's Darwin Gödel Machine.

1 month ago | Likes 1 Dislikes 2

Granted, there are limited workarounds that improve the performance of existing models. However, it's the model itself, not the method, that has to remember the past, and the model becomes unstable in high-context scenarios. I'm certain this issue will be solved, but it's nowhere near solved at the moment: an architecture change needs to happen before it can be.
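
The "limited workarounds" mentioned above usually amount to managing the context from outside the model rather than changing the model itself. A minimal sketch of one such workaround, a sliding-window buffer that evicts the oldest observations once a token budget is exceeded (all names and the crude whitespace token count here are hypothetical, not from the post):

```python
from collections import deque

class ContextBuffer:
    """Sliding-window context: keep recent entries, drop the oldest
    once the token budget is exceeded. A sketch, not any vendor's API."""

    def __init__(self, max_tokens: int):
        self.max_tokens = max_tokens
        self.entries = deque()  # (text, token_count) pairs, oldest first
        self.total = 0

    def add(self, text: str) -> None:
        tokens = len(text.split())  # crude token estimate for illustration
        self.entries.append((text, tokens))
        self.total += tokens
        # Evict from the front until we fit the budget again
        while self.total > self.max_tokens and self.entries:
            _, dropped = self.entries.popleft()
            self.total -= dropped

    def window(self) -> str:
        return " ".join(text for text, _ in self.entries)

buf = ContextBuffer(max_tokens=5)
for obs in ["open gripper", "move to bin", "grasp cup", "lift cup"]:
    buf.add(obs)
print(buf.window())  # → grasp cup lift cup
```

This illustrates the commenter's point: the method decides what the model sees, but nothing in the model's weights actually retains the evicted history.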

1 month ago | Likes 2 Dislikes 1

I'm letting you know that the architectural change you're alluding to has arrived. Please read this: https://research.google/blog/introducing-nested-learning-a-new-ml-paradigm-for-continual-learning/ This won best paper at this year's NeurIPS conference. It's a legitimate step-function change in the narrative. As methods like this get implemented, we're going to see an insanely fast pace of improvement this year as systems become more capable of performing long-term research on themselves.

1 month ago | Likes 1 Dislikes 2