Activation functions performance in multilayer perceptron for time series forecasting.
- Source :
- AIP Conference Proceedings. 2024, Vol. 3123, Issue 1, p1-10. 10p.
- Publication Year :
- 2024
Abstract
- Activation functions are important hyperparameters in neural networks: they are applied to the weighted sum of inputs and biases to determine whether, and how strongly, a neuron activates. Choosing the most suitable activation function can help a neural network train faster without sacrificing accuracy. This study evaluates the performance of three activation functions, Sigmoid, Hyperbolic Tangent (Tanh), and Rectified Linear Unit (ReLU), in the hidden layer of a Multilayer Perceptron (MLP) for time series forecasting. To evaluate the activation functions, three simulated non-linear time series were generated using the Threshold Autoregressive (TAR) model, and two real datasets, the Canadian Lynx series and Wolf's Sunspot data, were employed. The Mean Square Error (MSE) and Mean Absolute Error (MAE) were computed to measure forecasting accuracy. The analysis of the real data revealed that the Tanh function exhibited the lowest MSE and MAE, with values of 1.345 and 0.945, respectively. The Sigmoid function yielded MSE and MAE values of 1.520 and 1.005, while the ReLU function resulted in values of 1.562 and 1.018. These findings align with the simulation results, confirming that the Tanh function is the most effective of the three for time series forecasting. Therefore, it is recommended to replace the commonly used Sigmoid function with Tanh for a more accurate forecast. [ABSTRACT FROM AUTHOR]
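The experimental setup the abstract describes can be sketched in a few lines of Python: simulate a non-linear series from a TAR model, fit a one-hidden-layer MLP with each activation (Sigmoid, Tanh, ReLU) on lagged values, and compare test-set MSE and MAE. The TAR regime coefficients, network size, and learning rate below are illustrative assumptions, not the paper's settings; this is a minimal numpy-only sketch of the comparison, not a reproduction of the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Simulate a 2-regime TAR(1) series (coefficients are illustrative) ---
def simulate_tar(n=500):
    y = np.zeros(n)
    for t in range(1, n):
        e = rng.normal(scale=0.5)
        y[t] = (0.6 if y[t - 1] <= 0.0 else -0.5) * y[t - 1] + e
    return y

# --- Lagged design matrix: predict y_t from the previous p values ---
def make_lags(y, p=3):
    X = np.column_stack([y[i:len(y) - p + i] for i in range(p)])
    return X, y[p:]

# Each activation with its derivative expressed via the activation value a
ACTS = {
    "sigmoid": (lambda z: 1 / (1 + np.exp(-z)), lambda a: a * (1 - a)),
    "tanh":    (np.tanh,                        lambda a: 1 - a ** 2),
    "relu":    (lambda z: np.maximum(z, 0.0),   lambda a: (a > 0).astype(float)),
}

def train_mlp(X, y, act, h=8, lr=0.05, epochs=2000):
    """One hidden layer, linear output, full-batch gradient descent on MSE."""
    f, df = ACTS[act]
    n, d = X.shape
    W1 = rng.normal(scale=0.5, size=(d, h)); b1 = np.zeros(h)
    W2 = rng.normal(scale=0.5, size=h);      b2 = 0.0
    for _ in range(epochs):
        A = f(X @ W1 + b1)             # hidden activations
        yhat = A @ W2 + b2             # linear output layer
        g = (yhat - y) / n             # gradient of (1/2n)·sum of squared errors
        gh = np.outer(g, W2) * df(A)   # backprop through the hidden activation
        W2 -= lr * (A.T @ g); b2 -= lr * g.sum()
        W1 -= lr * (X.T @ gh); b1 -= lr * gh.sum(axis=0)
    return lambda Xn: f(Xn @ W1 + b1) @ W2 + b2

y = simulate_tar()
X, t = make_lags(y)
split = int(0.8 * len(t))
for name in ACTS:
    model = train_mlp(X[:split], t[:split], name)
    err = model(X[split:]) - t[split:]
    print(f"{name:8s} MSE={np.mean(err ** 2):.3f}  MAE={np.mean(np.abs(err)):.3f}")
```

With different seeds, regimes, or network sizes the ranking can shift, which is why the paper averages its comparison over three simulated TAR series plus two real datasets before concluding in favour of Tanh.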
- Subjects :
- *LYNX
*TIME series analysis
*FORECASTING
*SUNSPOTS
*DATA analysis
Details
- Language :
- English
- ISSN :
- 0094-243X
- Volume :
- 3123
- Issue :
- 1
- Database :
- Academic Search Index
- Journal :
- AIP Conference Proceedings
- Publication Type :
- Conference
- Accession number :
- 179273861
- Full Text :
- https://doi.org/10.1063/5.0223864