1. Using available and incoming data for reducing and updating seismic source ensembles for probabilistic tsunami forecasting (PTF) in early-warning and urgent computing
- Authors
- Louise Cordrie, Jacopo Selva, Fabrizio Bernardi, and Roberto Tonini
- Abstract
Tsunami urgent computing procedures quantify the potential hazard due to an earthquake right after its occurrence, that is, within a few hours. The hazard is quantified by simulating the propagation of the tsunami waves in the sea, accounting for the uncertainty due to the scarce knowledge of the source parameters and for wave-modelling uncertainty.

In the context of the European project eFlows4HPC, a workflow for tsunami-hazard urgent computing is currently in development, consisting of the following steps: 1) retrieval of information about the tsunamigenic seismic event (magnitude, hypocentre, and their uncertainties); 2) definition of an ensemble of seismic sources; 3) simulation of seismic/tsunami wave propagation for each scenario in the ensemble; 4) aggregation of the results to produce an estimate of the seismic and tsunami hazard, which also incorporates a basic treatment of modelling uncertainty. The ensembles cover the uncertainty on the source characteristics and may consequently be very large (generally 10,000 to 100,000 scenarios; Selva et al., Nat. Comm.), requiring very large computational resources in the urgent-computing context. It is thus necessary to reduce the size of these ensembles to limit the number of simulations and to converge faster towards stable hazard estimates.

We developed and tested several sampling procedures aiming to reduce the number of scenarios in the ensemble and, at the same time, to integrate new incoming information as it becomes available (e.g. focal-mechanism solutions, seismic or tsunami records). When applied to several past earthquakes and tsunamis (e.g., the 2003 Boumerdes and the 2017 Kos-Bodrum earthquakes), our novel sampling strategies reduced the ensemble size by one to two orders of magnitude, allowing a drastic reduction of the computational effort.
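The abstract does not specify the sampling algorithm; as a minimal illustrative sketch (not the authors' method), one way to shrink an ensemble is to resample scenarios with probability proportional to their prior weights, collapsing duplicates into counts that become the weights used in the hazard aggregation. All numbers and names below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical full ensemble: 50,000 scenarios, each with a prior
# probability reflecting magnitude/hypocentre uncertainty.
n_full = 50_000
prior = rng.gamma(shape=2.0, size=n_full)
prior /= prior.sum()

# Draw a reduced ensemble of 500 scenarios, with probability
# proportional to the prior; repeated draws of the same scenario
# collapse into integer counts, i.e. the reduced-ensemble weights.
n_reduced = 500
draws = rng.choice(n_full, size=n_reduced, p=prior)
idx, counts = np.unique(draws, return_counts=True)
weights = counts / n_reduced

print(len(idx), "distinct scenarios to simulate instead of", n_full)
```

Only the distinct scenarios in `idx` need tsunami simulations, which is where the computational saving comes from.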
Moreover, updating the ensemble as new data arrive, which strongly reduces the uncertainty, yields an updated probabilistic forecast without compromising its accuracy. This may prove very important for mitigating the risk far from the seismic source, as well as for improving risk management by better informing decision-making under time urgency.
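How incoming data sharpen the forecast can be sketched with a simple Bayesian reweighting, assuming (hypothetically) a Gaussian likelihood for a refined magnitude estimate; this is an illustration of the general idea, not the workflow's actual update rule:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: each scenario carries a magnitude; the prior
# weights are uniform for simplicity.
mags = rng.uniform(6.0, 8.0, size=2_000)
weights = np.full(mags.size, 1.0 / mags.size)

def update_weights(weights, predicted, observed, sigma):
    """Reweight scenarios by a Gaussian likelihood of a new observation."""
    like = np.exp(-0.5 * ((predicted - observed) / sigma) ** 2)
    post = weights * like
    return post / post.sum()

# A new datum arrives, e.g. a refined magnitude estimate of 7.2 +/- 0.1.
weights = update_weights(weights, mags, observed=7.2, sigma=0.1)

# The effective sample size shows how strongly the datum concentrated
# probability on a small subset of scenarios.
ess = 1.0 / np.sum(weights ** 2)
print(f"effective ensemble size: {ess:.0f} of {mags.size}")
```

Scenarios whose weight falls near zero can be dropped, so the same mechanism both updates the forecast and keeps the ensemble small.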
- Published
- 2023