
An Efficient Hardware Architecture for a Neural Network Activation Function Generator.

Authors :
Larkin, Daniel
Kinane, Andrew
Muresan, Valentin
O'Connor, Noel
Editors :
Wang, Jun
Yi, Zhang
Zurada, Jacek M.
Lu, Bao-Liang
Yin, Hujun
Source :
Advances in Neural Networks - ISNN 2006 (9783540344827); 2006, p1319-1327, 9p
Publication Year :
2006

Abstract

This paper proposes an efficient hardware architecture for a function generator suitable for an artificial neural network (ANN). A spline-based approximation function is designed that provides a good trade-off between accuracy and silicon area, whilst also being inherently scalable and adaptable for numerous activation functions. This has been achieved by using a minimax polynomial and through optimal placement of the approximating polynomials based on the results of a genetic algorithm. The approximation error of the proposed method compares favourably to all related research in this field. Efficient hardware multiplication circuitry is used in the implementation, which reduces the area overhead and increases the throughput. [ABSTRACT FROM AUTHOR]
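The sketch below (not the authors' implementation) illustrates the underlying idea of the abstract: approximating an activation function with low-degree polynomials fitted over a small number of segments, then measuring the worst-case error. The uniform breakpoints, polynomial degree, and least-squares fit are assumptions for illustration; the paper instead uses minimax polynomials and places the segment boundaries with a genetic algorithm.

```python
# Minimal sketch: piecewise polynomial (spline-like) approximation of the
# sigmoid activation, showing the accuracy vs. segment-count trade-off.
# Assumptions: uniform breakpoints and a least-squares fit, standing in for
# the paper's genetically optimised breakpoints and minimax polynomials.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def fit_segments(breakpoints, degree=2, samples=1000):
    """Fit one low-degree polynomial per segment."""
    coeffs = []
    for lo, hi in zip(breakpoints[:-1], breakpoints[1:]):
        x = np.linspace(lo, hi, samples)
        coeffs.append(np.polyfit(x, sigmoid(x), degree))
    return coeffs

def evaluate(x, breakpoints, coeffs):
    """Evaluate the piecewise approximation at the points x."""
    idx = np.clip(np.searchsorted(breakpoints, x, side="right") - 1,
                  0, len(coeffs) - 1)
    return np.array([np.polyval(coeffs[i], xi) for i, xi in zip(idx, x)])

if __name__ == "__main__":
    bp = np.linspace(-8.0, 8.0, 9)       # 8 uniform segments (assumption)
    cs = fit_segments(bp, degree=2)
    xs = np.linspace(-8.0, 8.0, 10000)
    err = np.max(np.abs(evaluate(xs, bp, cs) - sigmoid(xs)))
    print(f"max abs error over [-8, 8]: {err:.2e}")
```

In a hardware realisation, each segment's coefficients would be stored in a small lookup table and the polynomial evaluated with shared multiply-accumulate circuitry, which is where the paper's efficient multiplication scheme reduces area and raises throughput.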

Details

Language :
English
ISBNs :
9783540344827
Database :
Supplemental Index
Journal :
Advances in Neural Networks - ISNN 2006 (9783540344827)
Publication Type :
Book
Accession number :
32862563
Full Text :
https://doi.org/10.1007/11760191_192