Data assimilation (DA) is a powerful tool to optimally combine uncertain model simulations and observations. Among DA techniques, the particle filter (PF) has gained attention for its capacity to deal with nonlinear systems and for its relaxation of the Gaussian assumption. However, the PF may suffer from degeneracy and sample impoverishment. In this study, we propose an innovative approach based on a tempered particle filter (TPF), aiming at mitigating these PF issues and thus extending the assimilation benefits over time. Probabilistic flood maps derived from synthetic aperture radar data are assimilated into a flood forecasting model through an iterative process that includes a particle mutation in order to maintain diversity within the ensemble. Results show an improvement in forecast accuracy with respect to the Open Loop: on average, the root mean square error (RMSE) of water levels decreases by 80% at the assimilation time and by 60% 2 days after the assimilation. A comparison with the Sequential Importance Sampling (SIS) filter shows that, although SIS performance is generally comparable to that of the TPF at the assimilation time, it tends to degrade more quickly; for instance, on average, TPF-based RMSE values are 20% lower than the SIS-based ones 2 days after the assimilation. The application of the TPF also yields higher critical success index values than the SIS. On average, the performance gain persists for almost 3 days after the assimilation. Our study provides evidence that the proposed TPF variant enables more persistent benefits than the SIS.
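The abstract summarizes the core mechanism of the method: an iterative, tempered reweighting of the particle ensemble with a mutation step to preserve diversity. The following is a minimal sketch of such a tempered particle-filter update, assuming a Gaussian observation error; the function name, parameter values, and the simple Gaussian-jitter mutation are illustrative assumptions and do not reproduce the paper's actual implementation.

```python
import numpy as np

def tpf_update(particles, observation, obs_std=0.25,
               n_stages=4, jitter_std=0.05, rng=None):
    """One tempered particle filter analysis step.

    The likelihood is raised to fractional (tempering) exponents that sum
    to 1; after each stage, particles are resampled according to the
    tempered weights and mutated (jittered) to avoid sample impoverishment.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(particles)
    phi = 1.0 / n_stages  # equal tempering exponents (illustrative choice)
    for _ in range(n_stages):
        # Tempered (fractional) Gaussian log-likelihood for each particle
        log_w = -0.5 * phi * ((particles - observation) / obs_std) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # Systematic resampling according to the tempered weights
        cdf = np.cumsum(w)
        cdf[-1] = 1.0  # guard against floating-point round-off
        positions = (rng.random() + np.arange(n)) / n
        idx = np.searchsorted(cdf, positions)
        particles = particles[idx]
        # Mutation: small perturbation to keep diversity within the ensemble
        particles = particles + rng.normal(0.0, jitter_std, size=n)
    return particles

# Illustrative use: assimilate a synthetic water-level "observation" of 2.0 m
ensemble = np.random.default_rng(0).normal(1.0, 0.5, size=100)
analysis = tpf_update(ensemble, observation=2.0)
```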
Keywords: data assimilation; degeneracy; flood extent map; flood model; particle filter; tempering.
© 2022. The Authors.