
A comprehensive review of Binary Neural Network.

Authors :
Yuan, Chunyu
Agaian, Sos S.
Source :
Artificial Intelligence Review; Nov2023, Vol. 56 Issue 11, p12949-13013, 65p
Publication Year :
2023

Abstract

Deep learning (DL) has recently changed the development of intelligent systems and is widely adopted in many real-life applications. Despite its benefits and potential, there is high demand for DL processing on computationally limited and energy-constrained devices. It is therefore natural to study game-changing technologies such as Binary Neural Networks (BNN) to increase DL capabilities. Remarkable progress has recently been made in BNN, since they can be implemented and embedded on tiny restricted devices, saving a significant amount of storage, computation cost, and energy consumption. However, nearly all BNN methods trade extra memory and computation cost for higher performance. This article provides a complete overview of recent developments in BNN. Contrary to previous surveys, in which low-bit works are mixed in, it focuses exclusively on convolution networks with 1-bit activations and weights. It conducts a complete investigation of BNN development, from their predecessors to the latest BNN algorithms and techniques, presenting a broad design pipeline and discussing each module's variants. Along the way, it examines BNN (a) purpose: their early successes and challenges; (b) optimization: selected representative works that contain essential optimization techniques; (c) deployment: open-source frameworks for BNN modeling and development; (d) terminal: efficient computing architectures and devices for BNN; and (e) applications: diverse applications with BNN. Moreover, the paper discusses potential directions and future research opportunities in each section. [ABSTRACT FROM AUTHOR]
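The core idea the abstract refers to, constraining both weights and activations to 1 bit so that multiply-accumulate operations reduce to XNOR and popcount, can be illustrated with a minimal sketch. The function names and the sign-based binarization below are illustrative assumptions, not the specific algorithms surveyed in the article:

```python
import numpy as np

def binarize(x):
    """Map real values to {-1, +1} via the sign function (0 maps to +1)."""
    return np.where(x >= 0, 1, -1).astype(np.int8)

def xnor_popcount_dot(a_bits, w_bits):
    """Dot product of two {-1, +1} vectors using bitwise logic only.

    Encoding +1 as bit 1 and -1 as bit 0, XNOR counts positions where
    the vectors agree; the dot product is then 2 * agreements - n.
    """
    a = a_bits > 0
    w = w_bits > 0
    agreements = int(np.sum(~(a ^ w)))  # XNOR, then popcount
    n = a_bits.size
    return 2 * agreements - n

# The bitwise result matches an ordinary dot product of the binarized vectors.
rng = np.random.default_rng(0)
x, w = rng.standard_normal(8), rng.standard_normal(8)
xb, wb = binarize(x), binarize(w)
assert xnor_popcount_dot(xb, wb) == int(np.dot(xb, wb))
```

This is why BNN inference can run on tiny restricted devices: each weight or activation occupies a single bit instead of 32, and the inner products that dominate convolution cost become cheap bitwise instructions.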

Details

Language :
English
ISSN :
0269-2821
Volume :
56
Issue :
11
Database :
Complementary Index
Journal :
Artificial Intelligence Review
Publication Type :
Academic Journal
Accession number :
172313620
Full Text :
https://doi.org/10.1007/s10462-023-10464-w