
Brain-inspired Distributed Memorization Learning for Efficient Feature-free Unsupervised Domain Adaptation

Authors:
Lv, Jianming
Liang, Depin
Liang, Zequan
Zhang, Yaobin
Xia, Sijun
Publication Year:
2024

Abstract

Compared with gradient-based artificial neural networks, biological neural networks usually show a more powerful generalization ability, quickly adapting to unknown environments without any gradient back-propagation procedure. Inspired by the distributed memory mechanism of the human brain, we propose a novel gradient-free Distributed Memorization Learning mechanism, namely DML, to support quick domain adaptation of transferred models. In particular, DML adopts randomly connected neurons to memorize the associations of input signals, which are propagated as impulses, and makes the final decision by combining the distributed memories according to their confidence. More importantly, DML can perform reinforced memorization on unlabeled data to quickly adapt to a new domain without heavy fine-tuning of deep features, which makes it well suited to deployment on edge devices. Experiments on four cross-domain real-world datasets show that DML achieves superior real-time domain adaptation performance compared with a traditional gradient-based MLP, with more than a 10% improvement in accuracy while reducing the optimization time cost by 87%.

Comment: 15 pages, 15 figures
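The sketch below is a minimal, hypothetical illustration of the kind of gradient-free distributed memorization the abstract describes: fixed, randomly connected memory cells accumulate class associations without back-propagation, prediction is a confidence-weighted vote over the cells, and adaptation re-memorizes confidently pseudo-labeled unlabeled samples. The class names, the random-projection encoding, and the confidence heuristic are assumptions made for illustration, not the authors' exact formulation.

```python
import numpy as np

class MemoryCell:
    """One distributed memory unit with fixed random connections (illustrative)."""
    def __init__(self, in_dim, proj_dim, n_classes, rng):
        self.W = rng.standard_normal((in_dim, proj_dim))  # fixed random connections
        self.proto = np.zeros((n_classes, proj_dim))      # memorized class associations

    def encode(self, x):
        return np.tanh(x @ self.W)  # impulse-like nonlinear response

    def memorize(self, x, y):
        self.proto[y] += self.encode(x)  # accumulate association, no gradients

    def predict(self, x):
        h = self.encode(x)
        norms = np.linalg.norm(self.proto, axis=1) + 1e-8
        scores = (self.proto @ h) / norms          # similarity to each memorized class
        conf = max(scores.max() - np.median(scores), 0.0)  # crude cell confidence
        return scores, conf

class DistributedMemorizer:
    """Confidence-weighted ensemble of memory cells; training and adaptation are gradient-free."""
    def __init__(self, in_dim, n_classes, n_cells=16, proj_dim=64, seed=0):
        rng = np.random.default_rng(seed)
        self.cells = [MemoryCell(in_dim, proj_dim, n_classes, rng)
                      for _ in range(n_cells)]

    def fit(self, X, y):
        for xi, yi in zip(X, y):
            for c in self.cells:
                c.memorize(xi, yi)

    def predict(self, X):
        preds = []
        for xi in X:
            votes = [c.predict(xi) for c in self.cells]
            total = sum(conf * scores for scores, conf in votes)
            preds.append(int(np.argmax(total)))
        return np.array(preds)

    def adapt(self, X_unlabeled, threshold=0.5):
        # Reinforced memorization on unlabeled target-domain data:
        # re-memorize confidently pseudo-labeled samples, still without back-propagation.
        for xi in X_unlabeled:
            votes = [c.predict(xi) for c in self.cells]
            total = sum(conf * scores for scores, conf in votes)
            if sum(conf for _, conf in votes) > threshold:
                pseudo = int(np.argmax(total))
                for c in self.cells:
                    c.memorize(xi, pseudo)
```

Since memorization only accumulates associations, both fitting on the source domain and adapting to the target domain amount to simple additive updates, which is what makes such a scheme cheap enough for edge deployment in the sense the abstract claims.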

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2402.14598
Document Type: Working Paper