Activation functions of deep neural networks for polar decoding applications
- Source :
- PIMRC
- Publication Year :
- 2017
- Publisher :
- IEEE, 2017.
Abstract
- Among various deep neural network (DNN) components, this paper studies activation functions, especially for deep feed-forward networks applied to channel decoding of polar codes. In line with our previous study, this paper considers the ReLU (Rectified Linear Unit) and its variants as activation functions of the DNN. We devise a new ReLU variant, called Sloped ReLU, by varying the slope of the ReLU over the positive domain. This is motivated by an analogy between the tree architecture of the likelihood function in successive decoding of channel codes and the activation function in the DNN. Our numerical results show that polar decoding performance with the Sloped ReLU improves as the slope increases, up to a certain level. We believe that the idea of exploiting this analogy to determine activation functions of DNN can be applied to other decoding problems as well, which remains as future work.
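- For illustration, a minimal sketch of the Sloped ReLU as described in the abstract, assuming the simplest reading f(x) = a·x for x > 0 and 0 otherwise; the function name and `slope` parameter are our own labels, not taken from the paper:

```python
import numpy as np

def sloped_relu(x, slope=1.0):
    """Sloped ReLU: scale the positive part of the input by `slope`.

    With slope = 1.0 this reduces to the standard ReLU, max(0, x);
    larger slopes steepen the response over the positive domain,
    which the abstract reports improves polar decoding performance
    up to a certain level.
    """
    return np.where(x > 0, slope * x, 0.0)

# Example: compare the standard ReLU with a steeper Sloped ReLU.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(sloped_relu(x))             # [0.  0.  0.  0.5 2. ]
print(sloped_relu(x, slope=2.0))  # [0.  0.  0.  1.  4. ]
```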
- Subjects :
- Artificial neural network
- Computer science
- Polar code
- Activation function
- 020206 networking & telecommunications
- 020302 automobile design & engineering
- 02 engineering and technology
- Tree (data structure)
- Range (mathematics)
- 0203 mechanical engineering
- 0202 electrical engineering, electronic engineering, information engineering
- Likelihood function
- Algorithm
- Decoding methods
- Communication channel
Details
- Database :
- OpenAIRE
- Journal :
- 2017 IEEE 28th Annual International Symposium on Personal, Indoor, and Mobile Radio Communications (PIMRC)
- Accession number :
- edsair.doi...........c9364d01378bb0bedeb110003ab6f5da
- Full Text :
- https://doi.org/10.1109/pimrc.2017.8292678