Multi-Activation Hidden Units for Neural Networks with Random Weights
- Author
Patrikar, Ajay M.
- Subjects
Computer Science - Neural and Evolutionary Computing; Computer Science - Machine Learning
- Abstract
Single layer feedforward networks with random weights are successful in a variety of classification and regression problems. These networks are known for their non-iterative and fast training algorithms. A major drawback of these networks is that they require a large number of hidden units. In this paper, we propose the use of multi-activation hidden units. Such units increase the number of tunable parameters and enable the formation of complex decision surfaces without increasing the number of hidden units. We experimentally show that multi-activation hidden units can be used either to improve the classification accuracy or to reduce computations.
- Comment
4 pages, 4 figures. arXiv admin note: substantial text overlap with arXiv:2008.10425
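The abstract only names the idea, so the following is a minimal NumPy sketch of a single layer random-weight network with multi-activation hidden units, under one plausible reading of the proposal: each hidden unit's fixed random pre-activation is passed through several activation functions, and each activation receives its own output weight solved non-iteratively by least squares, so the number of tunable parameters grows without adding hidden units. The function names and the particular set of activations are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fit_multi_activation_rwnn(X, y, n_hidden=50, seed=0):
    """Random-weight single layer network with multi-activation hidden units.

    Input weights and biases are random and never trained; each hidden
    unit's pre-activation is expanded through several activation functions,
    and the output weights are found in closed form by least squares.
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]

    # Fixed random input weights and biases.
    W = rng.standard_normal((n_features, n_hidden))
    b = rng.standard_normal(n_hidden)
    Z = X @ W + b  # pre-activations, shape (n_samples, n_hidden)

    # Multi-activation: the same pre-activation feeds several activations,
    # each contributing its own column block (and its own output weights).
    activations = [
        np.tanh,
        lambda z: np.maximum(z, 0.0),          # ReLU
        lambda z: 1.0 / (1.0 + np.exp(-z)),    # logistic sigmoid
    ]
    H = np.hstack([f(Z) for f in activations])  # (n_samples, n_hidden * 3)

    # Non-iterative training: least-squares solve for the output weights.
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, activations, beta

def predict(X, W, b, activations, beta):
    Z = X @ W + b
    H = np.hstack([f(Z) for f in activations])
    return H @ beta
```

With three activations per unit, the readout layer has three times as many tunable weights as a conventional random-weight network of the same hidden-layer width, which is the trade-off the abstract highlights.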
- Published
2020