1. Towards Biologically Plausible Computing: A Comprehensive Comparison
- Authors
Lv, Changze; Gu, Yufei; Guo, Zhengkang; Xu, Zhibo; Wu, Yixin; Zhang, Feiran; Shi, Tianyuan; Wang, Zhenghua; Yin, Ruicheng; Shang, Yu; Zhong, Siqi; Wang, Xiaohua; Wu, Muling; Liu, Wenhao; Li, Tianlong; Zhu, Jianhao; Zhang, Cenyuan; Ling, Zixuan; and Zheng, Xiaoqing
- Subjects
Computer Science - Neural and Evolutionary Computing
- Abstract
Backpropagation is a cornerstone algorithm for training neural networks in supervised learning: it uses gradient descent to update network weights by minimizing the discrepancy between actual and desired outputs. Despite its pivotal role in advancing deep learning, the biological plausibility of backpropagation is questioned because it requires weight symmetry, global error computation, and dual-phase training. To address this long-standing challenge, many studies have endeavored to devise biologically plausible training algorithms. However, a fully biologically plausible algorithm for training multilayer neural networks remains elusive, and interpretations of biological plausibility vary among researchers. In this study, we establish criteria for biological plausibility that a desirable learning algorithm should meet. Using these criteria, we evaluate a range of existing algorithms considered to be biologically plausible, including Hebbian learning, spike-timing-dependent plasticity, feedback alignment, target propagation, predictive coding, the forward-forward algorithm, perturbation learning, local losses, and energy-based learning. Additionally, we empirically evaluate these algorithms across diverse network architectures and datasets. We compare the feature representations learned by these algorithms with brain activity recorded by non-invasive devices under identical stimuli, aiming to identify which algorithm most accurately replicates brain activity patterns. We hope that this study will inspire the development of new biologically plausible algorithms for training multilayer networks, thereby fostering progress in both neuroscience and machine learning.
- Published
2024
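The abstract contrasts backpropagation's weight-symmetry requirement with biologically plausible alternatives such as feedback alignment. The minimal NumPy sketch below (not the paper's code; the two-layer network, learning rate, and random seed are illustrative assumptions) shows the core idea of feedback alignment: the output error is projected back through a fixed random matrix `B` instead of the transpose of the forward weights `W2`, which backpropagation would require.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy network: 4 inputs, 8 tanh hidden units, 2 linear outputs.
n_in, n_hid, n_out = 4, 8, 2
W1 = rng.normal(0.0, 0.5, (n_hid, n_in))
W2 = rng.normal(0.0, 0.5, (n_out, n_hid))
B = rng.normal(0.0, 0.5, (n_out, n_hid))  # fixed random feedback matrix

x = rng.normal(size=n_in)       # one arbitrary input pattern
y = np.array([1.0, -1.0])       # one arbitrary target

lr = 0.05
for _ in range(200):
    h = np.tanh(W1 @ x)         # hidden activity
    y_hat = W2 @ h              # linear readout
    e = y_hat - y               # output error

    # Feedback alignment: send the error back through the fixed random
    # matrix B rather than W2.T, removing the weight-symmetry requirement.
    dh = (B.T @ e) * (1.0 - h ** 2)

    W2 -= lr * np.outer(e, h)   # delta rule at the output layer
    W1 -= lr * np.outer(dh, x)  # hidden update driven by the random feedback

loss = float(np.sum((W2 @ np.tanh(W1 @ x) - y) ** 2))
print(loss)  # should be small after training on this single pattern
```

Replacing `B.T` with `W2.T` in the backward step recovers exact backpropagation for this network, which makes the sketch a convenient baseline for the kind of side-by-side comparison the survey performs.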