Topology Variations of an Amplifier-based MOS Analog Neural Network Implementation and Weights Optimization
- Author
Fabian L. Cabrera, Tiago Weber, and Diogo da Silva Labres
- Subjects
Quantitative Biology::Neurons and Cognition, Artificial neural network, Computer science, Amplifier, Activation function, Topology (electrical circuits), Network topology, CMOS, Hardware and Architecture, Signal Processing, Netlist, Resistor
- Abstract
Neural networks achieve state-of-the-art performance in many applications, from speech recognition to computer vision. A neuron in a multi-layer network must multiply each input by its weight, sum the results, and apply an activation function. This paper is an extended version of a conference article in which we presented an implementation of an amplifier-based MOS analog neuron and the optimization of its synaptic weights using in-loop circuit simulations. In addition to the base topology of the original conference paper, we present two variations that reduce area and power. MOS transistors operating in the triode region serve as variable resistors, converting the input and weight voltages into a proportional input current. To test the analog neuron in full networks, an automatic generator produces a netlist from the number of neurons in each layer, the inputs, and the weights. Simulation results in a 180 nm CMOS technology demonstrate, for all topologies, the neuron's proper transfer function and its functionality when trained on test datasets.
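The behavioral idea behind the abstract — a triode-region transistor whose current is roughly proportional to the product of the weight and input voltages, followed by current summation and an activation — can be sketched as below. This is a minimal idealized model, not the paper's circuit: the transconductance factor `k`, the threshold voltage `v_th`, the transimpedance gain, and the use of `tanh` as the activation are all illustrative assumptions.

```python
import math

def triode_current(v_in, v_w, k=1e-4, v_th=0.5):
    """Idealized triode-region drain current, I_D ~ k * (V_w - V_th) * V_in.

    The weight voltage v_w sets the gate overdrive (i.e. the channel
    conductance of the 'variable resistor'), and the input voltage v_in is
    the drain-source drop, so the current is approximately proportional to
    the product of the two. k and v_th are illustrative values.
    """
    return k * (v_w - v_th) * v_in

def neuron(inputs, weights, activation=math.tanh):
    """Multiply-sum-activate behavior of one analog neuron.

    Each input/weight pair contributes a triode current; the currents sum
    on a common node, and an amplifier stage (modeled here as an assumed
    transimpedance gain of 1e4 V/A) drives the activation function.
    """
    i_sum = sum(triode_current(x, w) for x, w in zip(inputs, weights))
    return activation(1e4 * i_sum)
```

With zero input voltage no current flows, so the neuron sits at the activation's zero point, matching the intuition that the triode device acts as a passive conductance rather than a current source.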
- Published
- 2021