
FFCL: Forward-Forward Net with Cortical Loops, Training and Inference on Edge Without Backpropagation

Authors:
Karkehabadi, Ali
Homayoun, Houman
Sasan, Avesta
Publication Year:
2024

Abstract

The Forward-Forward Learning (FFL) algorithm is a recently proposed method for training neural networks without memory-intensive backpropagation. During training, labels accompany the input data, marking each sample as a positive or a negative input, and each layer learns its response to these inputs independently. In this study, we enhance FFL with the following contributions: 1) we optimize label processing by segregating label and feature forwarding between layers, improving learning performance; 2) by revising label integration, we streamline the inference process, reduce computational complexity, and improve performance; 3) we introduce feedback loops akin to the cortical loops in the brain, in which information cycles through the network and returns to earlier neurons, enabling layers to combine complex features fed back from later layers with their own lower-level features and boosting learning efficiency.

Comment: Accepted at the Great Lakes Symposium on VLSI 2024
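For context, the sketch below illustrates the layer-wise Forward-Forward training scheme the abstract builds on, in the spirit of Hinton's original FFL formulation. It is an assumption-laden illustration, not the authors' FFCL implementation: the class name FFLayer, the goodness threshold, and the learning rate are all hypothetical choices.

```python
# Illustrative sketch of generic Forward-Forward layer training
# (not the paper's FFCL code; threshold and lr values are assumptions).
import torch
import torch.nn as nn

class FFLayer(nn.Module):
    def __init__(self, d_in, d_out, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(d_in, d_out)
        self.relu = nn.ReLU()
        self.threshold = threshold  # goodness threshold (assumed value)
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalize so only the direction of the input carries information
        # to this layer, not the goodness accumulated by earlier layers.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return self.relu(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # "Goodness" is the mean squared activation of the layer.
        g_pos = self.forward(x_pos).pow(2).mean(dim=1)
        g_neg = self.forward(x_neg).pow(2).mean(dim=1)
        # Push goodness above the threshold for positive (correctly
        # labeled) inputs and below it for negative ones.
        loss = torch.log1p(torch.exp(torch.cat([
            self.threshold - g_pos,
            g_neg - self.threshold,
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()  # local gradient only; nothing crosses layers
        self.opt.step()
        # Detach outputs so the next layer trains independently.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()
```

Layers are trained sequentially on the detached outputs of their predecessors, so no gradient ever flows between layers; this locality is what removes the memory cost of full backpropagation that the abstract highlights, and it is the baseline that the paper's label segregation and cortical-style feedback loops then extend.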

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2405.12443
Document Type:
Working Paper