Poster Session A: Tuesday, August 12, 1:30 – 4:30 pm, de Brug & E‑Hall
Evaluating Models of Naturalistic Episodic Memory
Mathis Pink¹, Shashwat Saxena², Mariya Toneva³; ¹MPI-SWS, ²Indian Institute of Technology Delhi, ³Max Planck Institute for Software Systems
Presenter: Mathis Pink
Recent advances in large language models (LLMs) have enabled them to process extended naturalistic inputs, making them promising candidates for modeling human episodic memory. However, standard transformer-based LLMs rely on full self-attention and positional encoding, which diverge from human episodic memory by supporting soft, parallel attention over complete input sequences. EM-LLM, a recent modification inspired by episodic memory, replaces full attention with episodic retrieval from a non-parametric memory filled with discrete past episodes segmented via surprisal. Here, we evaluate whether EM-LLM captures a core property of episodic memory: the ability to recall the temporal order of events. Using a recency judgment task on segments from a full-length novel and comparing to human behavioral data, we find that a standard full-attention LLM aligns with human performance, while EM-LLM fails to recover temporal order across long sequences. These findings reveal a key limitation in EM-LLM's current design and suggest that temporal organization may require either additional architectural biases or learned representations, highlighting new directions for modeling episodic memory in naturalistic contexts.
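To make the surprisal-based segmentation mentioned in the abstract concrete, the sketch below splits a token stream into discrete "episodes" wherever a token's surprisal exceeds a running threshold. This is a minimal illustration of the general idea, not EM-LLM's exact procedure: the thresholding rule, the `gamma` and `window` parameters, and the function name are assumptions for the example, and in practice the surprisal values would come from an LLM's next-token log-probabilities.

```python
import numpy as np

def segment_by_surprisal(surprisals, gamma=1.0, window=64):
    """Split a token stream into episodes at high-surprisal tokens.

    A boundary is placed wherever a token's surprisal exceeds the mean plus
    `gamma` standard deviations of the preceding `window` tokens. Illustrative
    only; EM-LLM's actual segmentation and refinement steps may differ.
    """
    boundaries = [0]
    for t in range(1, len(surprisals)):
        recent = surprisals[max(0, t - window):t]
        threshold = np.mean(recent) + gamma * np.std(recent)
        if surprisals[t] > threshold:
            boundaries.append(t)
    boundaries.append(len(surprisals))
    # Each episode is a contiguous span of token indices.
    return [(boundaries[i], boundaries[i + 1]) for i in range(len(boundaries) - 1)]

# Toy usage with synthetic surprisal values standing in for model outputs.
toy_surprisals = np.random.gamma(shape=2.0, scale=1.5, size=500)
episodes = segment_by_surprisal(toy_surprisals, gamma=1.5)
print(f"{len(episodes)} episodes from {len(toy_surprisals)} tokens")
```

The retrieved episodes would then serve as the units stored in the non-parametric memory that EM-LLM queries in place of full self-attention over the entire context.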
Topic Area: Memory, Spatial Cognition & Skill Learning
Extended Abstract: Full Text PDF