Decoding Sequential Information: the Language of Thought for Human Cognitive Processing of Temporal Structure

Abstract

Sequential information is encoded through various systems, including chunking, rule recognition, and nested tree structures. However, the computational and neural mechanisms connecting these systems remain largely unknown. Dehaene et al. (2022) propose that humans possess internal languages governed by symbolic rules, termed the Language of Thought (LoT). Based on this assumption, we developed a LoT algorithm that processes sequences and produces their descriptions as minimal programs. In an online experiment, participants reproduced spatial sequences. Structured sequences, defined by temporal regularities, were reproduced notably better than control sequences lacking such temporal structure. Participants demonstrated the ability to compress structured sequences in working memory. Response times and performance suggested chunking around a repetition rule. Further analysis suggested that these chunks are organized hierarchically, following a syntactic rule of recursive repetition. LoT-complexity, defined as the minimal description length (MDL) of a sequence in our LoT, aligned with the data better than other information-theoretic models.
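
To make the compression idea concrete, the following is a minimal, hypothetical Python sketch of how a repetition-rule compressor and a description-length score could work for spatial sequences. The function names (compress_repeats, description_length), the greedy chunking scheme, and the bit-cost formula are illustrative assumptions, not the LoT algorithm developed in the paper; in particular, this toy version handles only flat repetition, not the recursive (nested) repetition discussed in the abstract.

```python
# Illustrative sketch only: a toy "repetition rule" compressor for spatial
# sequences, loosely inspired by the Language-of-Thought idea in the abstract.
# The grammar and the description-length formula are assumptions for
# illustration, not the authors' actual LoT algorithm.

from math import log2

def compress_repeats(seq):
    """Greedily rewrite a sequence as (chunk, n_repeats) pairs.

    Scans for the shortest chunk that, repeated at least twice, covers a
    prefix of the remaining sequence, e.g. [1,2,1,2,1,2,3] -> [([1,2], 3), ([3], 1)].
    """
    out = []
    i = 0
    while i < len(seq):
        best = (seq[i:i + 1], 1)  # fallback: a single-item chunk
        for size in range(1, (len(seq) - i) // 2 + 1):
            chunk = seq[i:i + size]
            reps = 1
            while seq[i + reps * size : i + (reps + 1) * size] == chunk:
                reps += 1
            if reps > 1:  # take the first (shortest) chunk that repeats
                best = (chunk, reps)
                break
        out.append(best)
        i += len(best[0]) * best[1]
    return out

def description_length(program, alphabet_size):
    """Toy description length in bits: each symbol in a chunk costs
    log2(alphabet_size); each repetition count costs log2(count + 1)."""
    bits = 0.0
    for chunk, reps in program:
        bits += len(chunk) * log2(alphabet_size)
        bits += log2(reps + 1)
    return bits

if __name__ == "__main__":
    structured = [1, 2, 1, 2, 1, 2, 3, 4, 3, 4]   # temporal regularities
    random_like = [1, 4, 2, 6, 3, 5, 1, 6, 2, 4]  # no obvious structure
    for name, s in [("structured", structured), ("random-like", random_like)]:
        prog = compress_repeats(s)
        print(name, prog, round(description_length(prog, alphabet_size=6), 1))
```

In this sketch, a structured sequence such as 1-2-1-2-1-2-3-4-3-4 collapses to a short program with a small description length, whereas an irregular sequence stays close to its raw length. This is the qualitative contrast the abstract reports between structured sequences and controls, with LoT-complexity playing the role of the description-length score.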
