1. A Sample-Efficient OPF Learning Method Based on Annealing Knowledge Distillation
- Author
Ziheng Dong, Kai Hou, Zeyu Liu, Xiaodan Yu, Hongjie Jia, and Chi Zhang
- Subjects
Optimal power flow, sample efficiency, annealing knowledge distillation, focal loss function, stacked denoising autoencoder, deep learning, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
To respond quickly to variations in network load demand, data-driven techniques for predicting optimal power flow (OPF) solutions have emerged in recent years. However, most existing methods depend heavily on large data volumes, which limits their application to newly established or expanded systems. In this regard, this work proposes a sample-efficient OPF learning method that maximizes the utilization of limited samples. The OPF task is decomposed before knowledge distillation, which reduces the complexity of deep learning; knowledge distillation is then used to integrate the decoupled tasks and improve accuracy in low-data settings. Unsupervised pre-training is introduced to reduce the demand for labeled data. Additionally, a focal loss function and a teacher annealing strategy are adopted to achieve higher accuracy without extra samples. Numerical tests on different systems confirm improved accuracy and training speed over other training methods, especially when samples are scarce.
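The abstract combines teacher annealing and focal weighting in one distillation objective, which is easiest to see in code. Below is a minimal sketch, assuming a PyTorch-style regression setup; the linear annealing schedule, the error-based focal weighting for regression, the function name, and the tensor shapes are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def annealed_distillation_loss(student_out, teacher_out, target,
                               step, total_steps, gamma=2.0):
    """Sketch of a distillation loss with teacher annealing and focal-style weighting.

    The training target is a mixture of the ground-truth OPF solution and the
    teacher's prediction; lambda ramps from 0 to 1 so training shifts from
    teacher-dominated to label-dominated targets.
    """
    lam = min(1.0, step / total_steps)                       # assumed linear annealing schedule
    mixed = lam * target + (1.0 - lam) * teacher_out.detach()
    err = student_out - mixed
    # Focal-style weighting adapted to regression: up-weight samples with large error.
    weight = (err.abs() / (err.abs().max() + 1e-8)).detach() ** gamma
    return (weight * err.pow(2)).mean()

# Toy usage with random tensors standing in for OPF dispatch targets.
student_out = torch.randn(32, 10, requires_grad=True)
teacher_out = torch.randn(32, 10)
target = torch.randn(32, 10)
loss = annealed_distillation_loss(student_out, teacher_out, target,
                                  step=100, total_steps=1000)
loss.backward()
```

Detaching the teacher output and the weight keeps gradients flowing only through the student prediction, as in a standard distillation setup.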
- Published
- 2022