
Revealing the mechanisms of semantic satiation with deep learning models

Authors :
Xinyu Zhang
Jing Lian
Zhaofei Yu
Huajin Tang
Dong Liang
Jizhao Liu
Jian K. Liu
Source :
Communications Biology, Vol 7, Iss 1, Pp 1-12 (2024)
Publication Year :
2024
Publisher :
Nature Portfolio, 2024.

Abstract

The phenomenon of semantic satiation, the loss of meaning of a word or phrase after it is repeated many times, is well known in psychology. However, the microscopic neural computational principles underlying it remain unknown. In this study, we use a deep learning model of continuous coupled neural networks to investigate the mechanism of semantic satiation and describe the process precisely in terms of neuronal components. Our results suggest that, from a mesoscopic perspective, semantic satiation may be a bottom-up process. This contrasts with existing macroscopic psychological studies, which treat semantic satiation as a top-down process, even though our simulations follow an experimental paradigm similar to that of classical psychology experiments and observe similar results. Satiation of semantic objects, like the learning process of our network model for object recognition, relies on continuous learning and on switching between objects. The underlying neural coupling strengthens or weakens satiation. Taken together, both neural and network mechanisms play a role in controlling semantic satiation.
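
The record does not reproduce the model equations, but the abstract refers to a continuous coupled neural network (CCNN). The sketch below is a hedged illustration based on the commonly used CCNN formulation (a pulse-coupled neural network whose hard spiking threshold is replaced by a sigmoid output), not the authors' released code; the parameter values and the four-neighbour coupling kernel are assumptions chosen only to make a satiation-like decay of the response visible.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ccnn_step(S, F, L, E, Y, alpha_F=1.0, alpha_L=1.0, alpha_E=0.05,
              V_F=0.1, V_L=0.2, V_E=1.0, beta=0.2):
    """One discrete-time update of a 2-D grid of coupled neurons.
    S: external stimulus; F, L, E, Y: feeding, linking, threshold, output arrays."""
    # Coupling term: each neuron receives the summed output of its four
    # neighbours (an assumed kernel, used here purely for illustration).
    link = (np.roll(Y, 1, axis=0) + np.roll(Y, -1, axis=0) +
            np.roll(Y, 1, axis=1) + np.roll(Y, -1, axis=1))
    F = np.exp(-alpha_F) * F + V_F * link + S     # feeding input driven by the stimulus
    L = np.exp(-alpha_L) * L + V_L * link         # linking input driven by neighbours
    U = F * (1.0 + beta * L)                      # internal activity (modulatory coupling)
    Y = sigmoid(U - E)                            # continuous output instead of a hard spike
    E = np.exp(-alpha_E) * E + V_E * Y            # dynamic threshold rises with activity, so a
                                                  # repeated stimulus evokes weaker responses
    return F, L, E, Y

# Toy "satiation" run: present the same stimulus repeatedly and watch the
# mean response decay as the dynamic threshold accumulates.
rng = np.random.default_rng(0)
S = rng.random((32, 32))                          # a fixed "word-like" stimulus
F, L, E, Y = (np.zeros_like(S) for _ in range(4))
for t in range(30):
    F, L, E, Y = ccnn_step(S, F, L, E, Y)
    print(f"step {t:2d}  mean response {Y.mean():.3f}")

Consistent with the abstract's statement that the underlying neural coupling strengthens or weakens satiation, the coupling weights V_F and V_L are the natural parameters to vary in such a sketch; they control how strongly neighbouring units drive one another and hence how quickly the graded response declines under repetition.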

Subjects

Biology (General)
QH301-705.5

Details

Language :
English
ISSN :
2399-3642
Volume :
7
Issue :
1
Database :
Directory of Open Access Journals
Journal :
Communications Biology
Publication Type :
Academic Journal
Accession number :
edsdoj.7c462d45cc1462a8b38d9dbec8943f8
Document Type :
article
Full Text :
https://doi.org/10.1038/s42003-024-06162-0