Explainable Graph Neural Networks for Observation Impact Analysis in Atmospheric State Estimation
This paper investigates the impact of observations on atmospheric state estimation in weather forecasting systems using graph neural networks (GNNs) and explainability methods. We integrate observation and Numerical Weather Prediction (NWP) points into a meteorological graph, extracting $ k $-hop subgraphs centered on NWP points. Self-supervised GNNs are employed to estimate the atmospheric state by aggregating data within these $ k $-hop radii. The study applies gradient-based explainability methods to quantify the significance of different observations in the estimation process. Evaluated with data from 11 satellite and land-based observation sources, the results highlight the effectiveness of visualizing the importance of observation types, enhancing the understanding and optimization of observational data in weather forecasting.
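The two core steps the abstract describes — extracting a $ k $-hop subgraph around an NWP point and attributing the estimate back to individual observations with a gradient-based method — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the graph, weights, and features are made up, and a linear readout stands in for the trained GNN (for a linear model the gradient with respect to the inputs is simply the weight vector, which makes gradient×input saliency easy to show).

```python
from collections import deque
import numpy as np

def k_hop_subgraph(center, k, adj):
    """Return the sorted nodes within k hops of `center`.
    `adj` maps each node to its neighbor list (undirected graph)."""
    seen, frontier = {center}, deque([(center, 0)])
    while frontier:
        node, depth = frontier.popleft()
        if depth == k:          # stop expanding at the k-hop boundary
            continue
        for nb in adj.get(node, []):
            if nb not in seen:
                seen.add(nb)
                frontier.append((nb, depth + 1))
    return sorted(seen)

# Toy meteorological graph: node 0 is an NWP point, the rest are observations.
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1, 4], 4: [3]}
nodes = k_hop_subgraph(0, k=2, adj=adj)   # node 4 is 3 hops away, so excluded

# Stand-in for the GNN readout: a linear aggregation of the subgraph's
# observation features (weights and feature values are illustrative only).
w = np.array([0.5, -1.2, 0.3, 0.8])       # one weight per node in `nodes`
x = np.array([1.0, 2.0, 0.5, 4.0])        # observation features
grad = w                                  # d(w @ x)/dx of a linear readout
importance = np.abs(grad * x)             # gradient*input saliency per node
```

In the paper's setting the gradient would instead be obtained by backpropagating the GNN's state estimate through the subgraph, and the per-node scores aggregated by observation type to produce the importance visualizations.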
arxiv.org