
Researchers Recreate Human Episodic Memory to Give LLMs Infinite Context

EM-LLM integrates important aspects of episodic memory and event cognition, used by the human brain, into LLMs.

To address the limited context windows of large language models (LLMs), researchers have developed EM-LLM, an approach that emulates human episodic memory.

The paper, titled “Human-like Episodic Memory for Infinite Context LLMs”, was released by researchers from Huawei and University College London.

EM-LLM (where EM stands for episodic memory) integrates key aspects of episodic memory and event cognition, as used by the human brain, into LLMs. Through this, the researchers suggest, LLMs using EM-LLM can potentially handle effectively infinite context lengths while maintaining their regular functioning.

This is particularly interesting because the method scales without increasing the computing power the LLM needs to function. “By bridging insights from cognitive science with machine learning, our approach not only enhances the performance of LLMs on long-context tasks but also provides a scalable computational framework for testing hypotheses about human memory,” the researchers said.

EM-LLM recreates human episodic memory by organising token sequences into episodic events, using Bayesian surprise and graph-theoretic boundary refinement. It then employs a two-stage retrieval process, combining similarity-based and temporally contiguous retrieval, to allow for human-like access to stored information.
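The surprise-based segmentation step can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it assumes that "Bayesian surprise" for a token is its negative log-likelihood under the model, and that an event boundary is placed wherever surprise exceeds a moving threshold (mean plus a multiple of the standard deviation over a recent window). The function name, the `gamma` and `window` parameters, and the thresholding details are illustrative assumptions; the graph-theoretic refinement step is omitted.

```python
import math

def segment_by_surprise(token_logprobs, gamma=1.0, window=64):
    """Split a token stream into episodic events at high-surprise boundaries.

    Surprise for token t is taken as its negative log-likelihood,
    -log p(x_t | x_<t). A boundary is placed where surprise exceeds
    mu + gamma * sigma, computed over the preceding `window` tokens.
    """
    surprises = [-lp for lp in token_logprobs]
    boundaries = [0]
    for t in range(1, len(surprises)):
        ctx = surprises[max(0, t - window):t]
        mu = sum(ctx) / len(ctx)
        sigma = math.sqrt(sum((s - mu) ** 2 for s in ctx) / len(ctx))
        if surprises[t] > mu + gamma * sigma:
            boundaries.append(t)
    # Each event is a half-open token range [start, end)
    ends = boundaries[1:] + [len(surprises)]
    return list(zip(boundaries, ends))
```

For example, a stream of predictable tokens interrupted by one highly surprising token would be split into two events at the surprise spike, which is the intuition behind using surprise as an event-boundary signal.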

Interestingly, when compared with similar proposals for extending LLM context windows, such as InfLLM, EM-LLM excelled, with an overall improvement of 4.3%. On the PassageRetrieval task, EM-LLM showed a 33% improvement over InfLLM.

“Our analysis reveals strong correlations between EM-LLM’s event segmentation and human-perceived events, suggesting a bridge between this artificial system and its biological counterpart,” the researchers stated.

This is yet another step towards tackling the problem of limited context lengths when interacting with LLMs. Research into extending context lengths is ongoing, with major companies like Google and Meta releasing their own papers pursuing infinite context.

Donna Eva

Donna is a technology journalist at AIM, hoping to explore AI and its implications in local communities, as well as its intersections with the space, defence, education and civil sectors.