Rolling the dice for better deep learning performance: A study of randomness techniques in deep neural networks
- Source: Information Sciences, p. 120500 (2024)
- Publication Year: 2024
Abstract
- This paper investigates how various randomization techniques impact Deep Neural Networks (DNNs). Randomization techniques such as weight noise and dropout help reduce overfitting and enhance generalization, but their interactions are poorly understood. The study categorizes randomness techniques into four types and proposes two new methods: adding noise to the loss function and randomly masking gradient updates. Using a Particle Swarm Optimizer (PSO) for hyperparameter optimization, the study explores optimal configurations across the MNIST, FASHION-MNIST, CIFAR10, and CIFAR100 datasets. Over 30,000 configurations are evaluated, revealing randomness in data augmentation and in weight initialization as the main contributors to performance. Correlation analysis shows that different optimizers prefer distinct types of randomization. The complete implementation and dataset are available on GitHub.
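- The two techniques proposed in the abstract lend themselves to a brief illustration. The PyTorch sketch below is not the paper's implementation (that is on the authors' GitHub); the form of the loss noise, the noise scale `sigma`, the masking probability `p_mask`, and the toy model are all illustrative assumptions. In particular, because an additive constant detached from the graph would leave gradients unchanged, the sketch scales the loss by a random factor so the perturbation actually reaches the backward pass.

```python
# Minimal sketch (not the paper's implementation) of the two proposed
# randomness techniques: noise injected into the loss and random masking
# of gradient updates. Hyperparameters and the toy model are assumptions.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))  # toy MNIST-sized net
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()

sigma = 0.01   # assumed scale of the loss noise
p_mask = 0.2   # assumed probability of dropping each gradient entry

def train_step(x, y):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    # Loss noise: scale the loss by a random factor so the perturbation
    # propagates into the gradients (a detached additive constant would
    # not change them). The exact form used in the paper is not given
    # in the abstract.
    noise_factor = 1.0 + sigma * torch.randn(()).item()
    (loss * noise_factor).backward()
    # Gradient masking: zero out a random subset of each gradient tensor
    # before the optimizer applies its update.
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is not None:
                p.grad.mul_((torch.rand_like(p.grad) >= p_mask).float())
    optimizer.step()
    return loss.item()

# Usage: one step on a random MNIST-shaped batch.
x, y = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))
print(train_step(x, y))
```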
Details
- Database: arXiv
- Journal: Information Sciences, p. 120500 (2024)
- Publication Type: Report
- Accession number: edsarx.2404.03992
- Document Type: Working Paper
- Full Text: https://doi.org/10.1016/j.ins.2024.120500