
Artificial to Spiking Neural Networks Conversion for Scientific Machine Learning

Authors :
Zhang, Qian
Wu, Chenxi
Kahana, Adar
Kim, Youngeun
Li, Yuhang
Karniadakis, George Em
Panda, Priyadarshini
Publication Year :
2023

Abstract

We introduce a method to convert Physics-Informed Neural Networks (PINNs), commonly used in scientific machine learning, to Spiking Neural Networks (SNNs), which are expected to offer higher energy efficiency than traditional Artificial Neural Networks (ANNs). We first extend the calibration technique of SNNs to arbitrary activation functions beyond ReLU, making it more versatile, and we prove a theorem that ensures the effectiveness of the calibration. We successfully convert PINNs to SNNs, enabling computational efficiency for diverse regression tasks in solving multiple differential equations, including the unsteady Navier-Stokes equations. We demonstrate substantial gains in overall efficiency, including when the approach is combined with Separable PINNs (SPINNs), which accelerate the training process. Overall, this is the first work of this kind, and the proposed method achieves relatively good accuracy with low spike rates.
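
For context, the sketch below illustrates the standard rate-coding idea that underlies ANN-to-SNN conversion in general; it is not taken from the paper, and the names (if_neuron_rate, v_th, T) are illustrative only. The average firing rate of an integrate-and-fire neuron approximates a clipped ReLU of its constant input, which is why plain conversion handles ReLU networks and why extending conversion to the arbitrary activations used in PINNs requires the calibration step the authors describe.

```python
import numpy as np

def if_neuron_rate(current, v_th=1.0, T=200):
    """Average spike rate of an integrate-and-fire neuron driven by a constant current.

    The membrane potential integrates the input each timestep; when it crosses
    the threshold v_th, a spike is emitted and v_th is subtracted (soft reset).
    """
    v = 0.0
    spikes = 0
    for _ in range(T):
        v += current          # integrate the constant input current
        if v >= v_th:         # fire and reset by subtraction
            spikes += 1
            v -= v_th
    return spikes / T

# The spike rate closely tracks a ReLU clipped to [0, 1] of the input,
# which is the approximation exploited by rate-based ANN-to-SNN conversion.
for x in [-0.5, 0.0, 0.3, 0.7, 1.0]:
    print(f"input={x:+.1f}  rate={if_neuron_rate(x):.2f}  clipped relu={np.clip(x, 0.0, 1.0):.2f}")
```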

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2308.16372
Document Type :
Working Paper