Directional information flow analysis in memory retrieval: a comparison between exaggerated and normal pictures.
- Author
- Zanjani, Mani Farajzadeh and Ghoshuni, Majid
- Abstract
Working memory plays an important role in cognitive science and is a basic process for learning. Because working memory is limited in capacity and duration, various cognitive tasks have been designed to overcome these limitations. This study investigated information flow during a novel visual working memory task in which participants responded to exaggerated and normal pictures. Ten healthy men (mean age 28.5 ± 4.57 years) participated in two stages, an encoding task and a retrieval task, during which electroencephalogram (EEG) signals were recorded. The adaptive directed transfer function (ADTF) method was applied to the event-related potentials (ERPs) extracted from the EEG signals as a computational tool for investigating the dynamic process of visual working memory retrieval. Network connectivity and P300 sub-components (P3a, P3b, and LPC) were also extracted during retrieval. The nonparametric Wilcoxon test and five classifiers were then applied to the network properties for feature selection and for classification between exaggerated-old and normal-old pictures. The Z-values of Ge were more distinctive than those of the other network properties. In the machine learning analysis, the accuracy, F1-score, and specificity of the k-nearest neighbors (KNN) classifier were 81%, 77%, and 81%, respectively, ranking it first among the five classifiers. Furthermore, the in-degree/out-degree matrices showed that information flowed continuously in the right hemisphere during the retrieval of exaggerated pictures, from P3a to P3b. During visual working memory retrieval, the networks associated with attentional processes showed greater activation for exaggerated pictures than for normal pictures, suggesting that the exaggerated pictures captured more attention and thus required greater cognitive resources for retrieval. [ABSTRACT FROM AUTHOR]
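The classification pipeline described in the abstract (Wilcoxon-based feature selection over network properties, followed by KNN classification of exaggerated-old vs. normal-old pictures) can be sketched as below. This is an illustrative reconstruction, not the authors' code: the synthetic data, feature dimensions, significance threshold, and k value are all assumptions for demonstration only.

```python
# Hypothetical sketch of the abstract's analysis: paired Wilcoxon feature
# selection followed by KNN classification. All data here are synthetic.
import numpy as np
from scipy.stats import wilcoxon
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score, f1_score, recall_score

rng = np.random.default_rng(0)
n_trials, n_features = 100, 20  # assumed shapes, not from the study

# Synthetic paired network-property features for the two conditions
exaggerated = rng.normal(0.5, 1.0, (n_trials, n_features))
normal = rng.normal(0.0, 1.0, (n_trials, n_features))

# Feature selection: keep features with a significant paired Wilcoxon test
p_values = np.array([wilcoxon(exaggerated[:, j], normal[:, j]).pvalue
                     for j in range(n_features)])
selected = p_values < 0.05  # assumed significance threshold

X = np.vstack([exaggerated[:, selected], normal[:, selected]])
y = np.array([1] * n_trials + [0] * n_trials)  # 1 = exaggerated-old

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
pred = knn.predict(X_te)

acc = accuracy_score(y_te, pred)
f1 = f1_score(y_te, pred)
specificity = recall_score(y_te, pred, pos_label=0)  # true-negative rate
print(f"accuracy={acc:.2f} f1={f1:.2f} specificity={specificity:.2f}")
```

Specificity is computed here as recall on the negative (normal-old) class, which is one common convention; the study's exact metric definitions and classifier settings are not given in the abstract.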
- Published
- 2025