Publication Date:
2023-08-08
Description:
Statistical earthquake forecast models are a critical component of several products released by the U.S. Geological Survey as part of its mission to deliver actionable information to decision makers in the US. These include the National Seismic Hazard Model (NSHM) and Operational Aftershock Forecasts (OAF), the latter of which can include swarm forecasts. Both products require earthquake forecast models that are applicable on different timescales, ranging from days to years (OAF) up to decades (NSHM). OAF uses short-term forecast models that are based primarily on catalog statistics and include clustering behavior, such as those encompassed in Reasenberg and Jones (1989), the ETAS model (Ogata, 1988), and swarm models (Llenos and van der Elst, 2019). The NSHM uses a long-term (50-year) earthquake forecast, one component of which is a smoothed seismicity model based on an earthquake catalog; clusters are removed when forecasting the location of future seismicity but retained when estimating the future earthquake rate. Both OAF and the NSHM face challenges from catalog incompleteness, in both the short term and the long term, and both may be affected by anomalous earthquake rate changes, such as natural earthquake swarms, human-induced seismicity, or volcanic events. In this presentation, I will give an overview of the statistical models used for these two products, updates that have been made to address some of these challenges, future updates that could be considered (such as the use of machine learning methods), and how the forecasts are used and presented to the public.
Language:
English
Type:
info:eu-repo/semantics/conferenceObject