
A high bit resolution FPGA implementation of a FNN with a new algorithm for the activation function

Authors :
Ferreira, Pedro
Ribeiro, Pedro
Antunes, Ana
Dias, Fernando Morgado
Source :
Neurocomputing. Dec 2007, Vol. 71, Issue 1-3, p71-77. 7p.
Publication Year :
2007

Abstract

Several implementations of Feedforward Neural Networks have been reported in scientific papers. These implementations do not allow the direct use of off-line trained networks, usually because of lower precision (compared to the software used for training) or modifications to the activation function. In the present work, a hardware solution called the Artificial Neural Network Processor, implemented on an FPGA, meets the requirements for a direct implementation of Feedforward Neural Networks because of the high precision and accurate activation function that were obtained. The resulting hardware solution is tested with data from a real system to confirm that it correctly implements the models prepared off-line with MATLAB. [Copyright © Elsevier]
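The precision gap the abstract refers to can be illustrated with a minimal Python sketch: evaluating a sigmoid activation through a fixed-point format (as a hardware datapath might) and comparing it with the double-precision value an off-line training tool would use. The bit width and helper names below are assumptions for illustration, not details taken from the paper.

```python
import math

FRAC_BITS = 16          # assumed fractional bits of the fixed-point format
SCALE = 1 << FRAC_BITS

def to_fixed(x: float) -> int:
    """Quantize a real number to a fixed-point integer with FRAC_BITS fraction bits."""
    return round(x * SCALE)

def from_fixed(q: int) -> float:
    """Convert the fixed-point integer back to a float."""
    return q / SCALE

def sigmoid(x: float) -> float:
    """Double-precision sigmoid, as used during off-line training."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_fixed(x: float) -> float:
    """Sigmoid with both input and output quantized to the fixed-point format."""
    xq = from_fixed(to_fixed(x))              # quantize the input
    return from_fixed(to_fixed(sigmoid(xq)))  # quantize the output

# Worst-case deviation from the double-precision sigmoid over [-8, 8]:
# input and output quantization each contribute at most half an LSB,
# so the total error stays within a couple of LSBs of the format.
max_err = max(abs(sigmoid_fixed(x / 100) - sigmoid(x / 100))
              for x in range(-800, 801))
print(max_err < 2 / SCALE)
```

With enough fractional bits this error is negligible, which is the condition the paper's high-resolution implementation aims to satisfy so that networks trained off-line can be loaded directly.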

Details

Language :
English
ISSN :
0925-2312
Volume :
71
Issue :
1-3
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
27354147
Full Text :
https://doi.org/10.1016/j.neucom.2006.11.028