
Local prediction-learning in high-dimensional spaces enables neural networks to plan

Authors :
Christoph Stöckl
Yukun Yang
Wolfgang Maass
Source :
Nature Communications, Vol 15, Iss 1, Pp 1-16 (2024)
Publication Year :
2024
Publisher :
Nature Portfolio, 2024.

Abstract

Planning and problem solving are cornerstones of higher brain function, but we do not know how the brain accomplishes them. We show that learning a suitable cognitive map of the problem space suffices. Furthermore, this can be reduced to learning to predict the next observation through local synaptic plasticity. Importantly, the resulting cognitive map encodes relations between actions and observations, and its emergent high-dimensional geometry provides a sense of direction for reaching distant goals. This quasi-Euclidean sense of direction yields a simple heuristic for online planning that works almost as well as the best offline planning algorithms from AI. If the problem space is a physical space, this method automatically extracts structural regularities from the sequence of observations it receives, so that it can generalize to unseen parts. This speeds up learning of navigation in 2D mazes and of locomotion with complex actuator systems, such as legged bodies. Like self-attention networks (Transformers), the cognitive map learner that we propose does not require a teacher. But in contrast to Transformers, it does not require backpropagation of errors or very large datasets for learning. Hence it provides a blueprint for future energy-efficient neuromorphic hardware that acquires advanced cognitive capabilities through autonomous on-chip learning.
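The two ingredients the abstract names, local prediction learning and a quasi-Euclidean greedy heuristic, can be illustrated in a toy sketch. The following is not the paper's cognitive map learner: it uses a small grid world, a hand-built observation code whose geometry mirrors the maze (in the paper this geometry emerges from learning), and a conjunctive state-action code, so all names and parameters here are illustrative assumptions. It keeps the two hallmarks from the abstract: a purely local delta rule that learns to predict the next observation, and goal-directed planning by greedily minimizing Euclidean distance between the predicted next observation and the goal's observation code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem space: a 5x5 grid without internal walls.
N = 5
n_states, n_actions = N * N, 4
moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]  # up, down, left, right

def step(s, a):
    """Environment dynamics: attempted off-grid moves leave the agent in place."""
    r, c = divmod(s, N)
    dr, dc = moves[a]
    return min(max(r + dr, 0), N - 1) * N + min(max(c + dc, 0), N - 1)

# High-dimensional observation code with grid-like geometry: project the
# (row, col) coordinates through an orthonormal random map. In the paper this
# geometry emerges from learning; here it is built in to keep the sketch short.
d = 64
Q, _ = np.linalg.qr(rng.standard_normal((d, 2)))
obs = np.array([Q @ np.array(divmod(s, N), dtype=float) for s in range(n_states)])

# Local prediction learning: a delta rule adjusts one synaptic column per
# observed (state, action) pair so that W predicts the next observation code.
W = np.zeros((d, n_states * n_actions))
lr = 0.5
for _ in range(20000):
    s = int(rng.integers(n_states))
    a = int(rng.integers(n_actions))
    idx = s * n_actions + a            # conjunctive (state, action) code
    err = obs[step(s, a)] - W[:, idx]  # prediction error for this pair only
    W[:, idx] += lr * err              # purely local update, no backprop

# Online planning heuristic: greedily take the action whose *predicted* next
# observation lies closest (Euclidean) to the goal's observation code.
def plan(start, goal, max_steps=50):
    s, path = start, [start]
    while s != goal and len(path) <= max_steps:
        dists = [np.linalg.norm(W[:, s * n_actions + a] - obs[goal])
                 for a in range(n_actions)]
        s = step(s, int(np.argmin(dists)))
        path.append(s)
    return path

path = plan(0, n_states - 1)  # top-left corner to bottom-right corner
```

The greedy planner builds no search tree; each step needs only four predicted observations and four distance comparisons. It succeeds here because distances in the observation code mirror distances in the maze, which is exactly the role the abstract attributes to the emergent high-dimensional geometry of the learned cognitive map.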

Subjects

Subjects :
Science

Details

Language :
English
ISSN :
2041-1723
Volume :
15
Issue :
1
Database :
Directory of Open Access Journals
Journal :
Nature Communications
Publication Type :
Academic Journal
Accession number :
edsdoj.1e3ba5ca598e4e0c81399d6f2cf0d6ba
Document Type :
article
Full Text :
https://doi.org/10.1038/s41467-024-46586-0