
Suppressing simulation bias using multi-modal data

Authors:
Kustowski, Bogdan
Gaffney, Jim A.
Spears, Brian K.
Anderson, Gemma J.
Anirudh, Rushil
Bremer, Peer-Timo
Thiagarajan, Jayaraman J.
Kruse, Michael K. G.
Nora, Ryan C.
Publication Year:
2021

Abstract

Many problems in science and engineering require making predictions based on only a few observations. To build a robust predictive model, these sparse data may need to be augmented with simulated data, especially when the design space is multi-dimensional. Simulations, however, often suffer from an inherent bias. Estimation of this bias may be poorly constrained not only because of data sparsity, but also because traditional predictive models fit only one type of observed output, such as scalars or images, rather than all available output data modalities, which may have been acquired and simulated at great cost. To remove this limitation and open the path to multi-modal calibration, we propose to combine a novel transfer learning technique for suppressing the bias with recent developments in deep learning that allow building predictive models with multi-modal outputs. First, we train an initial neural network model on simulated data to learn important correlations between different output modalities and between simulation inputs and outputs. Then, the model is partially retrained, or transfer learned, to fit the experiments, an approach that has not previously been implemented in this type of architecture. Using fewer than 10 inertial confinement fusion experiments for training, transfer learning systematically improves the simulation predictions, while a simple output calibration, which we design as a baseline, makes the predictions worse. We also offer extensive cross-validation with real and carefully designed synthetic data. The method described in this paper can be applied to a wide range of problems that require transferring knowledge from simulations to the domain of experiments.
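
The two-stage procedure described in the abstract (pretrain a surrogate network on plentiful simulations, then partially retrain it on a handful of experiments) can be illustrated with a minimal PyTorch sketch. The architecture, layer sizes, toy "simulator", synthetic bias, and the choice to retrain only the final layer are illustrative assumptions, not the authors' actual multi-modal model or data.

```python
# Minimal sketch of "pretrain on simulations, transfer learn on experiments".
# All shapes, the toy simulator, and the frozen-layer choice are assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

# --- Stage 1: pretrain a surrogate on plentiful simulated data -------------
n_sim, n_inputs, n_outputs = 5000, 8, 4
W = torch.randn(n_inputs, n_outputs)          # toy "true" input-output map
x_sim = torch.rand(n_sim, n_inputs)           # simulation design parameters
y_sim = torch.sin(x_sim @ W)                  # toy simulated observables

model = nn.Sequential(
    nn.Linear(n_inputs, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, n_outputs),
)

loss_fn = nn.MSELoss()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    loss_fn(model(x_sim), y_sim).backward()
    opt.step()

# --- Stage 2: transfer learn on a handful of "experiments" -----------------
# The experiments follow the same map plus a systematic offset, standing in
# for the simulation bias; only ~10 points are available.
n_exp = 10
bias = 0.1 * torch.randn(n_outputs)
x_exp = torch.rand(n_exp, n_inputs)
y_exp = torch.sin(x_exp @ W) + bias

# Freeze the early layers (correlations learned from simulations) and
# retrain only the final layer to absorb the bias.
for layer in list(model.children())[:-1]:
    for p in layer.parameters():
        p.requires_grad = False

opt_ft = torch.optim.Adam(model[-1].parameters(), lr=1e-3)
for _ in range(200):
    opt_ft.zero_grad()
    loss_fn(model(x_exp), y_exp).backward()
    opt_ft.step()

print("post-transfer loss on experiments:",
      loss_fn(model(x_exp), y_exp).item())
```

In the paper's setting the pretrained network predicts multiple output modalities (e.g. scalars and images) jointly, so the retraining step can exploit correlations between modalities; the sketch above reduces both stages to scalar outputs for brevity.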

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2104.09684
Document Type:
Working Paper