
Evolution of Activation Functions for Deep Learning-Based Image Classification

Authors:
Lapid, Raz
Sipper, Moshe
Source:
Proceedings of the 2022 Genetic and Evolutionary Computation Conference
Publication Year:
2022

Abstract

Activation functions (AFs) play a pivotal role in the performance of neural networks. The Rectified Linear Unit (ReLU) is currently the most commonly used AF. Several replacements for ReLU have been suggested, but improvements have proven inconsistent. Some AFs exhibit better performance for specific tasks, yet it is hard to know a priori how to select the appropriate one(s). Studying both standard fully connected neural networks (FCNs) and convolutional neural networks (CNNs), we propose a novel, three-population, coevolutionary algorithm to evolve AFs, and compare it to four other methods, both evolutionary and non-evolutionary. Tested on four datasets -- MNIST, FashionMNIST, KMNIST, and USPS -- coevolution proves to be a performant algorithm for finding good AFs and AF architectures.
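To make the idea of evolving activation functions concrete, here is a minimal, hedged sketch of a single-population evolutionary loop that searches over compositions of standard unary primitives. This is *not* the authors' three-population coevolutionary algorithm; the primitive set, genome encoding, and toy fitness function (matching a target activation by mean squared error rather than training a network) are illustrative assumptions for exposition only.

```python
# Illustrative sketch only: a simple (mu + lambda)-style evolutionary
# search over activation functions built by composing unary primitives.
# The real paper uses a three-population coevolutionary algorithm with
# network-training fitness; everything below is a simplified stand-in.
import math
import random

# Candidate AFs are compositions of these unary primitives (assumed set).
PRIMITIVES = {
    "relu": lambda x: max(0.0, x),
    "tanh": math.tanh,
    "sigmoid": lambda x: 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, x)))),
    "identity": lambda x: x,
}

def make_af(genome):
    """Compose the named primitives (applied left to right) into one AF."""
    funcs = [PRIMITIVES[name] for name in genome]
    def af(x):
        for f in funcs:
            x = f(x)
        return x
    return af

def fitness(genome, xs, target):
    """Toy fitness: mean squared error against a target AF (lower is better)."""
    af = make_af(genome)
    return sum((af(x) - target(x)) ** 2 for x in xs) / len(xs)

def evolve(target, generations=40, pop_size=20, genome_len=3, seed=0):
    """Elitist evolutionary loop: keep the better half, mutate one gene each."""
    rng = random.Random(seed)
    names = list(PRIMITIVES)
    xs = [i / 10.0 for i in range(-30, 31)]  # sample points in [-3, 3]
    pop = [[rng.choice(names) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(g, xs, target))
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(genome_len)] = rng.choice(names)  # point mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda g: fitness(g, xs, target))

if __name__ == "__main__":
    # Toy target: a tanh-of-ReLU composition, recoverable from the primitives.
    best = evolve(lambda x: math.tanh(max(0.0, x)))
    print(best)
```

In the paper's actual setting, fitness would come from training and validating a network that uses the candidate AF, and the three coevolving populations interact during evaluation; the loop above only illustrates the generic evolve-evaluate-select cycle.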

Details

Database:
arXiv
Journal:
Proceedings of the 2022 Genetic and Evolutionary Computation Conference
Publication Type:
Report
Accession Number:
edsarx.2206.12089
Document Type:
Working Paper
Full Text:
https://doi.org/10.1145/3520304.3533949