EM-LLM: Human-Inspired Episodic Memory for Infinite Context LLMs (github.com)
37 points by jbotz 4 days ago
mountainriver 3 days ago TTT, cannon layers, and titans seem like a stronger approach IMO. Information needs to be compressed into latent space or it becomes computationally intractable.

searchguy 5 hours ago do you have references to
> TTT, cannon layers, and titans
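A minimal sketch of the "compressed into latent space" idea mountainriver alludes to, in the spirit of TTT/Titans-style fixed-size memories: the history is folded into a state that never grows with sequence length, so per-token cost stays constant. The class name, projections, and outer-product update rule here are illustrative assumptions, not taken from those papers or from the EM-LLM repo.

```python
# Sketch: fold the token stream into a fixed-size latent memory so that
# reading/writing costs O(d^2) per token, independent of context length.
import torch

class LatentMemory(torch.nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        self.key_proj = torch.nn.Linear(dim, dim, bias=False)
        self.value_proj = torch.nn.Linear(dim, dim, bias=False)
        self.query_proj = torch.nn.Linear(dim, dim, bias=False)
        # Fixed-size memory matrix: the "latent space" the comment refers to.
        self.register_buffer("memory", torch.zeros(dim, dim))

    def write(self, x: torch.Tensor) -> None:
        # Fold one token representation into the state with an
        # outer-product (fast-weight / linear-attention style) update.
        k = self.key_proj(x)
        v = self.value_proj(x)
        self.memory += torch.outer(k, v)

    def read(self, x: torch.Tensor) -> torch.Tensor:
        # Query the compressed state instead of attending over all past tokens.
        q = self.query_proj(x)
        return q @ self.memory
```

The trade-off this sketch makes explicit: the full history is never kept around, so nothing can be re-inspected exactly later, which is the lossy-compression concern that retrieval-based schemes like EM-LLM avoid.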
MacsHeadroom 3 days ago So, infinite context length by making it compute bound instead of memory bound. Curious how much longer this takes to run and when it makes sense to use vs RAG.
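To make the compute-bound vs. memory-bound trade concrete, here is a rough sketch of event-level retrieval over a cached context, in the spirit of what the EM-LLM write-up describes (segment the past into events, then re-attend only to the most relevant ones per query). The function names, the mean-pooled event summaries, and the boundary values are illustrative assumptions, not the repo's API; in RAG the retrieval would instead run over external text chunks.

```python
# Sketch: keep the whole history cached (cheap memory, e.g. off-GPU), and
# spend compute per query to find and re-use only the top-k relevant events.
import numpy as np

def split_into_events(token_reprs: np.ndarray, boundaries: list[int]) -> list[np.ndarray]:
    """Group per-token representations into contiguous events.
    `boundaries` would come from a segmentation step (e.g. surprise-based)."""
    return [token_reprs[s:e] for s, e in zip([0] + boundaries, boundaries + [len(token_reprs)])]

def retrieve_events(query: np.ndarray, events: list[np.ndarray], k: int = 4) -> list[int]:
    """Score each event by similarity of its mean representation to the query
    and return the indices of the top-k events to re-attend to."""
    summaries = np.stack([e.mean(axis=0) for e in events])
    scores = summaries @ query
    return list(np.argsort(scores)[::-1][:k])

# Toy usage: 1,000 cached token representations split at assumed boundaries.
reprs = np.random.randn(1000, 64).astype(np.float32)
events = split_into_events(reprs, boundaries=[120, 400, 730])
query = np.random.randn(64).astype(np.float32)
print(retrieve_events(query, events, k=2))
```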