ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Publication Date: 2022-03-16
    Description: Modelling atmospheric dispersal of volcanic ash and aerosols is becoming increasingly valuable for assessing the potential impacts of explosive volcanic eruptions on infrastructures, air quality, and aviation. Management of volcanic risk and reduction of aviation impacts can strongly benefit from quantitative forecasting of volcanic ash. However, an accurate prediction of volcanic aerosol concentrations using numerical modelling relies on proper estimations of multiple model parameters which are prone to errors. Uncertainties in key parameters, such as eruption column height, the physical properties of particles, or the meteorological fields, represent a major source of error affecting forecast quality. The availability of near-real-time geostationary satellite observations with high spatial and temporal resolutions provides the opportunity to improve forecasts in an operational context by incorporating observations into numerical models. Specifically, ensemble-based filters aim to convert a prior ensemble of system states into an analysis ensemble by assimilating a set of noisy observations. Previous studies dealing with volcanic ash transport have demonstrated that a significant improvement of forecast skill can be achieved by this approach. In this work, we present a new implementation of an ensemble-based Data Assimilation (DA) method coupling the FALL3D dispersal model and the Parallel Data Assimilation Framework (PDAF). The FALL3D+PDAF system runs in parallel, supports online-coupled DA and can be efficiently integrated into operational workflows by exploiting high-performance computing (HPC) resources. Two numerical experiments are considered: (i) a twin experiment using an incomplete dataset of synthetic observations of volcanic ash, and (ii) an experiment based on the 2019 Raikoke eruption using real observations of SO2 mass loading. An ensemble-based Kalman filtering technique based on the Local Ensemble Transform Kalman Filter (LETKF) is used to assimilate satellite-retrieved data of column mass loading. We show that this procedure may lead to nonphysical solutions and, consequently, conclude that LETKF is not the best approach for the assimilation of volcanic aerosols. However, we find that a truncated state constructed from the LETKF solution approaches the real solution after a few assimilation cycles, yielding a dramatic improvement of forecast quality when compared to simulations without assimilation. (A minimal, illustrative sketch of the ensemble analysis step follows this record.)
    Description: Published
    Description: 1773–1792
    Description: 5V. Eruptive and post-eruptive processes
    Description: JCR Journal
    Keywords: Explosive volcanic eruptions ; Data assimilation ; Satellite data
    Repository Name: Istituto Nazionale di Geofisica e Vulcanologia (INGV)
    Type: article
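    The abstract above centres on an ensemble-based analysis step: an LETKF assimilates satellite-retrieved column mass loadings, and a truncated state built from the LETKF solution removes nonphysical values. As a rough illustration only, the sketch below implements a plain, non-localized ensemble transform Kalman filter update in NumPy and clips negative values as a crude stand-in for the truncation; all names are invented, localization and the FALL3D+PDAF coupling are omitted, and none of this is the paper's actual code.

        # Minimal, non-localized ensemble transform Kalman filter (ETKF) analysis step.
        # Illustrative sketch only; hypothetical names, diagonal observation errors assumed.
        import numpy as np

        def etkf_analysis(X_f, y, H, R_diag, inflation=1.0):
            """X_f: (n_state, n_ens) forecast ensemble; y: (n_obs,) observations;
            H: (n_obs, n_state) linear observation operator (e.g. column mass loading);
            R_diag: (n_obs,) observation error variances."""
            n_state, n_ens = X_f.shape
            x_mean = X_f.mean(axis=1)
            X_pert = X_f - x_mean[:, None]              # forecast perturbations

            Y = H @ X_f                                 # ensemble mapped to observation space
            y_mean = Y.mean(axis=1)
            Y_pert = Y - y_mean[:, None]

            C = Y_pert.T / R_diag                       # Y'^T R^{-1} for diagonal R
            P_tilde = np.linalg.inv((n_ens - 1) / inflation * np.eye(n_ens) + C @ Y_pert)
            w_mean = P_tilde @ C @ (y - y_mean)         # mean weights in ensemble space

            # symmetric square root gives the analysis perturbation weights
            evals, evecs = np.linalg.eigh((n_ens - 1) * P_tilde)
            W = evecs @ np.diag(np.sqrt(evals)) @ evecs.T

            X_a = x_mean[:, None] + X_pert @ (w_mean[:, None] + W)
            # crude stand-in for the "truncated state": remove negative, nonphysical values
            return np.clip(X_a, 0.0, None)

        # example call with synthetic numbers, purely to exercise the function
        rng = np.random.default_rng(0)
        X_f = np.abs(rng.normal(1.0, 0.3, size=(50, 20)))   # 50 grid cells, 20 members
        H = np.eye(5, 50)                                    # observe the first 5 cells
        y = np.full(5, 1.2)
        X_a = etkf_analysis(X_f, y, H, R_diag=np.full(5, 0.05))

    The update follows the standard weight-space formulation: weights are computed in the ensemble subspace and then mapped back through the forecast perturbations, which is what makes the filter cheap enough to run per grid column in a localized, parallel setting.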
  • 2
    Publication Date: 2023-03-20
    Description: The evolution of High-Performance Computing (HPC) platforms enables the design and execution of progressively larger and more complex workflow applications in these systems. The complexity comes not only from the number of elements that compose the workflows but also from the type of computations they perform. While traditional HPC workflows target simulations and modelling of physical phenomena, current needs additionally require data analytics (DA) and artificial intelligence (AI) tasks. However, the development of these workflows is hampered by the lack of proper programming models and environments that support the integration of HPC, DA, and AI, as well as the lack of tools to easily deploy and execute the workflows in HPC systems. To progress in this direction, this paper presents use cases where complex workflows are required and investigates the main issues to be addressed for HPC/DA/AI convergence. Based on this study, the paper identifies the challenges of a new workflow platform to manage complex workflows. Finally, it proposes a development approach for such a workflow platform, addressing these challenges in two directions: first, by defining a software stack that provides the functionalities to manage these complex workflows; and second, by proposing the HPC Workflow as a Service (HPCWaaS) paradigm, which leverages the software stack to facilitate the reusability of complex workflows in federated HPC infrastructures. The proposals presented in this work are being studied and developed as part of the EuroHPC eFlows4HPC project. (A toy sketch of such a composed workflow follows this record.)
    Description: Published
    Description: 414–429
    Description: 6T. Seismic and tsunami hazard studies
    Description: 8T. Real-time seismology and seismic and tsunami early warning
    Description: 4V. Pre-eruptive processes
    Description: 6V. Volcanic hazard and contributions to risk estimation
    Description: 3IT. Scientific computing
    Description: JCR Journal
    Keywords: High performance computing ; Distributed computing ; Parallel programming ; HPC-DA-AI convergence ; Workflow development ; Workflow orchestration
    Repository Name: Istituto Nazionale di Geofisica e Vulcanologia (INGV)
    Type: article
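    The paper summarised above argues for a software stack and an HPC Workflow as a Service (HPCWaaS) paradigm for workflows that combine HPC simulation, data analytics, and AI. Purely to make the composition idea concrete, the toy Python sketch below chains a stand-in simulation task, analytics task, and AI task, and exposes the composed workflow through a tiny name-based registry; every class, function, and interface here is hypothetical and does not correspond to the eFlows4HPC software stack or any real API.

        # Toy HPC + DA + AI pipeline behind a minimal "workflow as a service"-style
        # registry. All names are invented for illustration.
        from dataclasses import dataclass
        from typing import Callable, Dict, List

        def simulate(params: dict) -> List[float]:
            # stand-in for an HPC simulation task (e.g. a numerical model run)
            return [params["x0"] + 0.1 * i for i in range(10)]

        def analyse(series: List[float]) -> dict:
            # stand-in for a data-analytics task
            return {"mean": sum(series) / len(series), "last": series[-1]}

        def train_surrogate(features: dict) -> Callable[[float], float]:
            # stand-in for an AI task: fit a trivial "surrogate" from the analytics output
            slope = features["last"] - features["mean"]
            return lambda x: features["mean"] + slope * x

        @dataclass
        class WorkflowService:
            # minimal registry so a registered workflow can be invoked by name,
            # loosely mimicking the reuse idea behind "Workflow as a Service"
            registry: Dict[str, Callable[[dict], object]]

            def register(self, name: str, workflow: Callable[[dict], object]) -> None:
                self.registry[name] = workflow

            def invoke(self, name: str, params: dict) -> object:
                return self.registry[name](params)

        def hpc_da_ai_workflow(params: dict) -> float:
            series = simulate(params)        # simulation stage
            stats = analyse(series)          # analytics stage
            surrogate = train_surrogate(stats)  # AI stage
            return surrogate(2.0)

        service = WorkflowService(registry={})
        service.register("hpc-da-ai-demo", hpc_da_ai_workflow)
        print(service.invoke("hpc-da-ai-demo", {"x0": 1.0}))

    In a real deployment the registry role would be played by the platform's own services, and the individual stages would be scheduled on (possibly federated) HPC resources rather than run in-process as plain functions.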