ALBERT

All Library Books, Journals and Electronic Records Telegrafenberg

Filter
Collection
  • Books  (670)
  • Other Sources  (18)
Source
Keywords
  • machine learning
Language
  • English  (18)
  • German
Years
  • 1
    Publication Date: 2024-05-22
    Description: Abstract: Mineral dust is one of the most abundant atmospheric aerosol species and has various far-reaching effects on the climate system and adverse impacts on air quality. Satellite observations can provide spatio-temporal information on dust emission and transport pathways. However, satellite observations of dust plumes are frequently obscured by clouds. We use a method based on established, machine-learning-based image in-painting techniques to restore the spatial extent of dust plumes for the first time. We train an artificial neural net (ANN) on modern reanalysis data paired with satellite-derived cloud masks. The trained ANN is applied to cloud-masked, gray-scaled images, which were derived from false color images indicating elevated dust plumes in bright magenta. The images were obtained from the Spinning Enhanced Visible and Infrared Imager instrument onboard the Meteosat Second Generation satellite. We find that up to 15% of summertime satellite observations in West Africa and 10% in Nubia miss dust plumes due to cloud cover. We use the new dust-plume data to demonstrate a novel approach for validating spatial patterns of the operational forecasts provided by the World Meteorological Organization Dust Regional Center in Barcelona. The comparison reveals often similar dust-plume patterns in the forecasts and the satellite-based reconstruction; once trained, the reconstruction is computationally inexpensive. Our proposed reconstruction provides a new opportunity for validating dust aerosol transport in numerical weather models and Earth system models. It can be adapted to other aerosol species and trace gases.
    Description: Plain Language Summary: Most dust and sand particles in the atmosphere originate from North Africa. Since ground-based observations of dust plumes in North Africa are sparse, investigations often rely on satellite observations. Dust plumes are frequently obscured by clouds, making it difficult to study their full extent. We use machine-learning methods to restore information about the extent of dust plumes beneath clouds in 2021 and 2022 at 9, 12, and 15 UTC. We use the reconstructed dust patterns to demonstrate a new way to validate the dust forecast ensemble provided by the World Meteorological Organization Dust Regional Center in Barcelona, Spain. Our proposed method is computationally inexpensive and provides new opportunities for assessing the quality of dust transport simulations. The method can be transferred to reconstruct other aerosol and trace gas plumes.
    Description: Key Points:
      • We present the first fast reconstruction of cloud-obscured Saharan dust plumes through novel machine learning applied to satellite images
      • The reconstruction algorithm utilizes partial convolutions to restore cloud-induced gaps in gray-scaled Meteosat Second Generation-Spinning Enhanced Visible and Infrared Imager Dust RGB images
      • World Meteorological Organization dust forecasts for North Africa mostly agree with the satellite-based reconstruction of the dust plume extent
    Description: GEOMAR Helmholtz Centre for Ocean Research Kiel
    Description: University of Cologne
    Description: https://doi.org/10.5281/zenodo.6475858
    Description: https://github.com/tobihose/Masterarbeit
    Description: https://dust.aemet.es/
    Description: https://ads.atmosphere.copernicus.eu/cdsapp#!/dataset/cams-global-reanalysis-eac4?tab=overview
    Description: https://navigator.eumetsat.int/product/EO:EUM:DAT:MSG:DUST
    Description: https://navigator.eumetsat.int/product/EO:EUM:DAT:MSG:CLM
    Description: https://doi.org/10.5067/KLICLTZ8EM9D
    Description: https://disc.gsfc.nasa.gov/datasets?project=MERRA-2
    Description: https://doi.org/10.5067/MODIS/MOD08_D3.061
    Description: https://doi.org/10.5067/MODIS/MYD08_D3.061
    Description: https://doi.org/10.5281/ZENODO.8278518
    Keywords: ddc:551.5 ; mineral dust ; North Africa ; MSG SEVIRI ; machine learning ; cloud removal ; satellite remote sensing
    Language: English
    Type: doc-type:article
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 2
    Publication Date: 2024-02-15
    Description: Abstract: Machine learning (ML) has been increasingly applied to space weather and ionosphere problems in recent years, with the goal of improving modeling and forecasting capabilities through a data-driven modeling approach of nonlinear relationships. However, little work has been done to quantify the uncertainty of the results, lacking an indication of how confident and reliable the results of an ML system are. In this paper, we implement and analyze several uncertainty quantification approaches for an ML-based model to forecast Vertical Total Electron Content (VTEC) 1-day ahead and corresponding uncertainties with 95% confidence intervals (CI): (a) Super-Ensemble of ML-based VTEC models (SE), (b) Gradient Tree Boosting with quantile loss function (Quantile Gradient Boosting, QGB), (c) Bayesian neural network (BNN), and (d) BNN including data uncertainty (BNN + D). Techniques that consider only model parameter uncertainties (a and c) predict narrow CI and over-optimistic results, whereas accounting for both model parameter and data uncertainties with the BNN + D approach leads to a wider CI and the most realistic uncertainty quantification of the VTEC forecast. However, the BNN + D approach suffers from a high computational burden, while the QGB approach is the most computationally efficient solution with slightly less realistic uncertainties. The QGB CI are determined to a large extent from space weather indices, as revealed by the feature analysis. They exhibit variations related to daytime/nighttime, solar irradiance, geomagnetic activity, and post-sunset low-latitude ionosphere enhancement.
    Description: Plain Language Summary: Space weather describes the varying conditions in the space environment between the Sun and Earth that can affect satellites and technologies on Earth, such as navigation systems, power grids, radio, and satellite communications. The manifestation of space weather in the ionosphere can be characterized using the Vertical Total Electron Content (VTEC) derived from Global Navigation Satellite Systems observations. In this study, the machine learning (ML) approach is applied to approximate the nonlinear relationships of Sun-Earth processes using data on solar activity, solar wind, magnetic field, and VTEC. However, the measurements and the modeling approaches are subject to errors, increasing the uncertainty of the results when forecasting future instances. For reliable forecasting, it is necessary to quantify the uncertainties. Quantifying the uncertainty is also helpful for understanding the ML-based model and the problem of VTEC and space weather forecasting. Therefore, in this study, ML-based models are developed to forecast VTEC within the ionosphere, including the manifestation of space weather, while the degree of reliability is quantified with a target value of 95% confidence.
    Description: Key Points:
      • Machine learning-based Vertical Total Electron Content models with 95% confidence intervals (CI) are developed for the first time using four approaches to quantify uncertainties
      • Bayesian Neural Network quantifying model and data uncertainties contains ground truth within CIs, but is computationally intensive
      • Quantile Gradient Boosting is fastest with comparable performance in terms of uncertainty; CIs largely determined from space weather indices
    Description: Deutscher Akademischer Austauschdienst http://dx.doi.org/10.13039/501100001655
    Description: https://www.tensorflow.org/
    Description: https://doi.org/10.21105/joss.03021
    Description: http://www.aiub.unibe.ch/download/CODE
    Description: https://kauai.ccmc.gsfc.nasa.gov/instantrun/iri
    Description: https://doi.org/10.5281/zenodo.7741342
    Description: https://doi.org/10.5281/zenodo.7858906
    Description: https://doi.org/10.5281/zenodo.7858661
    Keywords: ddc:551.5 ; machine learning ; uncertainty quantification ; confidence intervals ; probabilistic ionosphere forecast ; space weather ; ensemble ; Bayesian neural network ; quantile gradient boosting
    Language: English
    Type: doc-type:article
  • 3
    Publication Date: 2024-02-15
    Description: Hydraulic fracturing (HF) operations are widely associated with induced seismicity in the Western Canadian Sedimentary Basin. This study correlates injection parameters of 12,903 HF stages in the Kiskatinaw area in northeast British Columbia with an enhanced catalog containing 40,046 earthquakes using a supervised machine learning approach. It identifies relevant combinations of geological and operational parameters related to individual HF stages in efforts to decipher fault activation mechanisms. Our results suggest that stages targeting specific geological units (here, the Lower Montney formation) are more likely to induce an earthquake. Additional parameters positively correlated with earthquake likelihood include target formation thickness, injection volume, and completion date. Furthermore, the COVID‐19 lockdown may have reduced the potential cumulative effect of HF operations. Our results demonstrate the value of machine learning approaches for implementation as guidance tools that help facilitate safe development of unconventional energy technologies.
    Description: Plain Language Summary: Hydraulic fracturing (HF), a technique used in unconventional energy production, increases rock permeability to enhance fluid movement. Its use has led to an unprecedented increase of associated earthquakes in the Western Canadian Sedimentary Basin in the last decade, among other regions. Numerous studies have investigated the relationship between induced earthquakes and HF operations, but the connection between specific geological and operational parameters and earthquake occurrence is only partly understood. Here, we use a supervised machine learning approach with publicly available injection data from the British Columbia Oil and Gas Commission to identify influential HF parameters for increasing the likelihood of a specific operation inducing an earthquake. We find that geological parameters, such as the target formation and its thickness, are most influential. A small number of operational parameters are also important, such as the injected fluid volume and the operation date. Our findings demonstrate an approach with the potential to develop tools to help enable the continued development of alternative energy technology. They also emphasize the need for public access to operational data to estimate and reduce the hazard and associated risk of induced seismicity.
    Description: Key Points:
      • We use supervised machine learning to investigate the relationship between hydraulic fracturing operation parameters and induced seismicity
      • Geological properties and a limited number of operational parameters predominantly influence the probability of an induced earthquake
      • The approach has the potential to guide detailed investigations of injection parameters critical for inducing earthquakes
    Description: Deutsche Forschungsgemeinschaft http://dx.doi.org/10.13039/501100001659
    Description: Gouvernement du Canada Natural Sciences and Engineering Research Council of Canada http://dx.doi.org/10.13039/501100000038
    Description: https://doi.org/10.5281/zenodo.5501399
    Description: https://ds.iris.edu/gmap/XL
    Description: https://files.bcogc.ca/thinclient/
    Description: https://open.canada.ca/data/en/dataset/7f245e4d-76c2-4caa-951a-45d1d2051333
    Description: https://github.com/obspy/obspy
    Description: https://github.com/eqcorrscan/EQcorrscan
    Description: https://github.com/smousavi05/EQTransformer
    Description: https://github.com/Dal-mzhang/REAL
    Description: https://scikit-learn.org/stable/
    Description: https://docs.fast.ai/
    Description: https://xgboost.readthedocs.io/en/stable/
    Description: https://github.com/slundberg/shap
    Description: https://docs.generic-mapping-tools.org/latest/
    Keywords: ddc:551.22 ; induced seismicity ; machine learning ; hydraulic fracturing
    Language: English
    Type: doc-type:article
  • 4
    Publication Date: 2024-02-14
    Description: X‐ray crystallography has witnessed a massive development over the past decade, driven by large increases in the intensity and brightness of X‐ray sources and enabled by employing high‐frame‐rate X‐ray detectors. The analysis of large data sets is done via automatic algorithms that are vulnerable to imperfections in the detector and noise inherent with the detection process. By improving the model of the behaviour of the detector, data can be analysed more reliably and data storage costs can be significantly reduced. One major requirement is a software mask that identifies defective pixels in diffraction frames. This paper introduces a methodology and program based upon concepts of machine learning, called robust mask maker (RMM), for the generation of bad‐pixel masks for large‐area X‐ray pixel detectors based on modern robust statistics. It is proposed to discriminate normally behaving pixels from abnormal pixels by analysing routine measurements made with and without X‐ray illumination. Analysis software typically uses a Bragg peak finder to detect Bragg peaks and an indexing method to detect crystal lattices among those peaks. Without proper masking of the bad pixels, peak finding methods often confuse the abnormal values of bad pixels in a pattern with true Bragg peaks and flag such patterns as useful regardless, leading to storage of enormous uninformative data sets. Also, it is computationally very expensive for indexing methods to search for crystal lattices among false peaks and the solution may be biased. This paper shows how RMM vastly improves peak finders and prevents them from labelling bad pixels as Bragg peaks, by demonstrating its effectiveness on several serial crystallography data sets.
    Description: Attention is focused on perhaps the biggest bottleneck in data analysis for serial crystallography at X-ray free-electron lasers, which has not received serious enough examination to date. An effective and reliable way is presented to identify anomalies in detectors, using machine learning and recently developed mathematical methods in the field referred to as 'robust statistics'.
    Keywords: ddc:548 ; bad‐pixel masks ; robust mask maker ; machine learning ; robust statistics ; serial crystallography
    Language: English
    Type: doc-type:article
  • 5
    Publication Date: 2024-02-14
    Description: Machine learning (ML) has received enormous attention in science and beyond. Discussed here are the status, opportunities, challenges and limitations of ML as applied to X‐ray and neutron scattering techniques, with an emphasis on surface scattering. Typical strategies are outlined, as well as possible pitfalls. Applications to reflectometry and grazing‐incidence scattering are critically discussed. Comment is also given on the availability of training and test data for ML applications, such as neural networks, and a large reflectivity data set is provided as reference data for the community.
    Description: The status, opportunities, challenges and limitations of machine learning are discussed as applied to X‐ray and neutron scattering techniques, with an emphasis on surface scattering.
    Keywords: ddc:548 ; surface scattering ; X‐ray diffraction ; neutron scattering ; machine learning ; data analysis
    Language: English
    Type: doc-type:article
  • 6
    Publication Date: 2023-12-12
    Description: Abstract: To first order, the magnetopause (MP) is defined by a pressure balance between the solar wind and the magnetosphere. The boundary moves under the influence of varying solar wind conditions and transient foreshock phenomena, reaching unusually large and small distances from the Earth. We investigate under which solar wind conditions such extreme MP distortions occur. Therefore, we construct a database of magnetopause crossings (MPCs) observed by the THEMIS spacecraft in the years 2007 to mid-2022 using a simple Random Forest Classifier. Roughly 7% of the found crossing events deviate beyond reported errors in the stand-off distance from the Shue et al. (1998, https://doi.org/10.1029/98JA01103) MP model and thus are termed extreme distortions. We find the occurrence of these extreme events in terms of expansion or compression of the MP to be linked to different solar wind parameters, most notably to the IMF magnitude, cone angle, velocity, Alfvén Mach number and temperature. Foreshock transients like hot-flow anomalies and foreshock bubbles could be responsible for extreme magnetospheric expansions. The results should be incorporated into future magnetopause models and may be helpful for the reconstruction of the MP locations out of soft X-ray images, relevant for the upcoming SMILE mission.
    Description: Key Points:
      • More than 160,000 magnetopause crossings (MPCs) identified in THEMIS data between 2007 and 2022 using a Random Forest Classifier
      • Magnetopause crossings that extremely deviate in location from the Shue et al. (1998, https://doi.org/10.1029/98JA01103) model are quite common
      • Important solar wind parameters associated with deviations include the interplanetary magnetic field cone angle, solar wind velocity and Alfvén Mach number
    Description: German Ministerium für Wirtschaft und Klimaschutz and Deutsches Zentrum für Luft‐und Raumfahrt http://dx.doi.org/10.13039/501100002946
    Description: UKRI Stephen Hawking Fellowship
    Description: German Ministry for Economy and Technology and German Center for Aviation and Space
    Description: https://osf.io/b6kux/
    Description: https://github.com/spedas/pyspedas
    Description: http://themis.ssl.berkeley.edu/data/themis/
    Description: https://omniweb.gsfc.nasa.gov/
    Description: https://scikit-learn.org/stable/supervised_learning.html#supervised-learning
    Keywords: ddc:538.7 ; magnetopause ; solar wind ; statistics ; machine learning ; THEMIS
    Language: English
    Type: doc-type:article
  • 7
    Publication Date: 2023-12-05
    Description: A promising approach to improve cloud parameterizations within climate models and thus climate projections is to use deep learning in combination with training data from storm‐resolving model (SRM) simulations. The ICOsahedral Non‐hydrostatic (ICON) modeling framework permits simulations ranging from numerical weather prediction to climate projections, making it an ideal target to develop neural network (NN) based parameterizations for sub‐grid scale processes. Within the ICON framework, we train NN based cloud cover parameterizations with coarse‐grained data based on realistic regional and global ICON SRM simulations. We set up three different types of NNs that differ in the degree of vertical locality they assume for diagnosing cloud cover from coarse‐grained atmospheric state variables. The NNs accurately estimate sub‐grid scale cloud cover from coarse‐grained data that has similar geographical characteristics as their training data. Additionally, globally trained NNs can reproduce sub‐grid scale cloud cover of the regional SRM simulation. Using the game‐theory based interpretability library SHapley Additive exPlanations, we identify an overemphasis on specific humidity and cloud ice as the reason why our column‐based NN cannot perfectly generalize from the global to the regional coarse‐grained SRM data. The interpretability tool also helps visualize similarities and differences in feature importance between regionally and globally trained column‐based NNs, and reveals a local relationship between their cloud cover predictions and the thermodynamic environment. Our results show the potential of deep learning to derive accurate yet interpretable cloud cover parameterizations from global SRMs, and suggest that neighborhood‐based models may be a good compromise between accuracy and generalizability.
    Description: Plain Language Summary: Climate models, such as the ICOsahedral Non-hydrostatic climate model, operate on low-resolution grids, making it computationally feasible to use them for climate projections. However, physical processes, especially those associated with clouds, that happen on a sub-grid scale (inside a grid box) cannot be resolved, yet they are critical for the climate. In this study, we train neural networks that return the cloudy fraction of a grid box knowing only low-resolution grid-box averaged variables (such as temperature, pressure, etc.) as the climate model sees them. We find that the neural networks can reproduce the sub-grid scale cloud fraction on data sets similar to the one they were trained on. The networks trained on global data also prove to be applicable on regional data coming from a model simulation with an entirely different setup. Since neural networks are often described as black boxes that are therefore difficult to trust, we peek inside the black box to reveal what input features the neural networks have learned to focus on and in what respect the networks differ. Overall, the neural networks prove to be accurate methods of reproducing sub-grid scale cloudiness and could improve climate model projections when implemented in a climate model.
    Description: Key Points:
      • Neural networks can accurately learn sub-grid scale cloud cover from realistic regional and global storm-resolving simulations
      • Three neural network types account for different degrees of vertical locality and differentiate between cloud volume and cloud area fraction
      • Using a game theory based library we find that the neural networks tend to learn local mappings and are able to explain model errors
    Description: EC ERC HORIZON EUROPE European Research Council
    Description: Partnership for Advanced Computing in Europe (PRACE)
    Description: NSF Science and Technology Center, Center for Learning the Earth with Artificial Intelligence and Physics (LEAP)
    Description: Deutsches Klimarechenzentrum
    Description: Columbia sub‐award 1
    Description: https://github.com/agrundner24/iconml_clc
    Description: https://doi.org/10.5281/zenodo.5788873
    Description: https://code.mpimet.mpg.de/projects/iconpublic
    Keywords: ddc:551.5 ; cloud cover ; parameterization ; machine learning ; neural network ; explainable AI ; SHAP
    Language: English
    Type: doc-type:article
  • 8
    Publication Date: 2023-11-17
    Description: One important component of precipitating convection is the formation of convective downdrafts. They can terminate the initial updraft, affect the mean properties of the boundary layer, and cause strong winds at the surface. While the basic forcing mechanisms for downdrafts are well understood, it is difficult to formulate general relationships between updrafts, environmental conditions, and downdrafts. To better understand what controls different downdraft properties, we analyze downdrafts over tropical oceans in a global storm resolving simulation. Using a global model allows us to examine a large number of downdrafts under naturally varying environmental conditions. We analyze the various factors affecting downdrafts using three alternative methods. First, hierarchical clustering is used to examine the correlation between different downdraft, updraft, and environmental variables. Then, either random forests or multiple linear regression are used to estimate the relationships between downdraft properties and the updraft and environmental predictors. We find that these approaches yield similar results. Around 75% of the variability in downdraft mass flux and 37% of the variability in downdraft velocity are predictable. Analyzing the relative importance of our various predictors, we find that downdrafts are coupled to updrafts via the precipitation generation argument. In particular, updraft properties determine rain amount and rate, which then largely control the downdraft mass flux and, albeit to a lesser extent, the downdraft velocity. Among the environmental variables considered, only lapse rate is a valuable predictor: a more unstable environment favors a higher downdraft mass flux and a higher downdraft velocity.
    Description: Plain Language Summary: Once a cloud begins to rain, the air inside or below the cloud can gain negative buoyancy and sink to the ground. This downward movement of air is called a downdraft. Downdrafts can end the life cycle of a cloud and also result in strong, sometimes destructive, wind gusts at the surface. The basic driving forces for downdrafts are well understood. For example, we know that evaporation of rain and the associated latent cooling of air is usually critical in causing the air to become negatively buoyant. Even though the basic driving forces are known, many interrelated processes contribute simultaneously to the strength of the downdraft, making it difficult to predict the strength of a downdraft under specific conditions. In this study, we use an atmospheric simulation whose model domain spans the globe and can explicitly resolve rain clouds. Compared to previous studies, the use of a global domain allows us to study a very large number of rain clouds, and their associated downdrafts, which form under very different, naturally varying environmental conditions. Machine learning techniques and traditional statistical methods agree on the result that the strength of the downdraft can be well predicted if we know the strength of the updraft that caused the downdraft or, even better, if we know the amount of rain that an updraft produced. Surprisingly, we have found that downdrafts can be predicted only slightly better if we also know other environmental conditions of the air surrounding the downdraft, such as the temperature and/or humidity profiles.
    Description: Key Points:
      • The best predictors of downdraft mass flux and velocity are rain amount and rate, respectively
      • Updraft properties impact downdraft properties through their control on rain formation
      • For a given rain amount and rate, environmental conditions add little skill to downdraft prediction
    Description: Max Planck Institute for Meteorology
    Description: ARC Centre of Excellence for Climate Extremes
    Description: https://mpimet.mpg.de/en/science/modeling-with-icon/code-availability
    Description: http://hdl.handle.net/21.11116/0000-0009-A854-B
    Keywords: ddc:551.6 ; convective downdrafts ; global storm resolving simulation ; machine learning ; random forest ; multiple linear regression
    Language: English
    Type: doc-type:article
  • 9
    Publication Date: 2023-11-16
    Description: Abstract: Floods cause average annual losses of more than US$30 billion in the US and are estimated to significantly increase due to global change. Flood resilience, which currently differs strongly between socio-economic groups, needs to be substantially improved by proactive adaptive measures, such as timely purchase of flood insurance. Yet, knowledge about the state and uptake of private adaptation and its drivers is so far scarce and fragmented. Based on interpretable machine learning and large insurance and socio-economic open data sets covering the whole continental US, we reveal that flood insurance purchase is characterized by reactive behavior after severe flood events. However, we observe that the Community Rating System helps overcome this behavior by effectively fostering proactive insurance purchase, irrespective of socio-economic backgrounds in the communities. Thus, we recommend developing additional targeted measures to help overcome existing inequalities, for example, by providing special incentives to the most vulnerable and exposed communities.
    Description: Plain Language Summary: Flood resilience of individuals and communities can be improved by bottom-up strategies, such as insurance purchase, or top-down measures like the US National Flood Insurance Program's Community Rating System (CRS). Our interpretable machine learning approach shows that flood insurance is mostly purchased reactively, after the occurrence of a flood event. Yet, reactive behaviors are ill-suited as more extreme events are expected under future climate, also in areas that were not previously flooded. The CRS counteracts this behavior by fostering proactive adaptation across a widespread range of socio-economic backgrounds. Future risk management including the CRS should support and motivate individuals' proactive adaptation with a particular focus on highly vulnerable social groups to overcome existing inequalities in flood risk.
    Description: Key Points:
      • Flood insurance purchase in the US is dominated by reactive behavior after severe floods
      • The Community Rating System (CRS) fosters proactive insurance adoption irrespective of socio-economic background
      • The CRS should further balance existing inequalities by targeting specific population segments
    Description: https://api.census.gov/data/2018/acs/
    Description: https://www.fema.gov/about/openfema/data-sets#nfip
    Description: https://www.fema.gov/fact-sheet/community-rating-system-overview-and-participation
    Description: https://msc.fema.gov/portal/home
    Description: https://www.fema.gov/case-study/information-about-community-rating-system
    Description: https://doi.org/10.5281/zenodo.8067448
    Keywords: ddc:363.34 ; FEMA ; machine learning ; flood insurance ; human behavior ; flood resilience
    Language: English
    Type: doc-type:article
  • 10
    Publication Date: 2023-11-14
    Description: Viscosity is of great importance in governing the dynamics of volcanoes, including their eruptive style. The viscosity of a volcanic melt is dominated by temperature and chemical composition, both oxides and water content. The changes in melt structure resulting from the interactions between the various chemical components are complex, and the construction of a physical viscosity model that depends on composition has not yet been achieved. We therefore train an artificial neural network (ANN) on a large database of measured compositions, including water, and viscosities that spans virtually the entire chemical space of terrestrial magmas, as well as some technical and extra‐terrestrial silicate melts. The ANN uses composition, temperature, a structural parameter reflecting melt polymerization and the alkaline ratio as input parameters. It successfully reproduces and predicts measurements in the database with significantly higher accuracy than previous global models for volcanic melt viscosities. Viscosity measurements are restricted to low and high viscosity ranges, which exclude typical eruptive temperatures. Without training data at such conditions, the ANN cannot reliably predict viscosities for this important temperature range. To overcome this limitation, we use the ANN to create synthetic viscosity data in the high and low viscosity range and fit these points using a physically motivated, temperature‐dependent viscosity model. Our study introduces a synthetic data approach for the creation of a physically motivated model predicting volcanic melt viscosities based on ANNs.
    Description: Plain Language Summary: Magma viscosity is a key parameter that controls the style of a volcanic eruption, whether it will be effusive or explosive. For this reason, any volcanic hazard mitigation plan requires detailed knowledge of this property. Melt viscosity can vary by up to 15 orders of magnitude (a factor of a quadrillion) with temperature and composition. Unfortunately, it is not possible to perform measurements continuously over this range in the laboratory, but only in two distinct temperature regimes, termed the high‐ and low‐viscosity ranges. To obtain a model that predicts how composition and temperature control viscosity, we use machine learning and train an artificial neural network on a large viscosity database. This allows us to calculate high‐ and low‐temperature viscosity data that we call synthetic. Since most magmas erupt at temperatures between the high‐ and low‐temperature ranges, we combine the synthetic data with a physically motivated equation describing the dependence of viscosity on temperature. The resulting model can compute viscosities in the region without measurements, including typical eruption temperatures of volcanoes. Our model serves the scientific community studying volcanic eruption mechanisms and supports data‐driven prediction of future eruptions.
    Description: Key Points: 〈list list-type="bullet"〉 〈list-item〉 〈p xml:lang="en"〉We train an artificial neural network that calculates temperature‐ and composition‐dependent viscosity of volcanic melts〈/p〉〈/list-item〉 〈list-item〉 〈p xml:lang="en"〉The neural network reproduces and predicts experimental viscosity significantly better than previous global models〈/p〉〈/list-item〉 〈list-item〉 〈p xml:lang="en"〉A synthetic data approach based on the neural network is combined with a physical model to predict viscosity at eruptive temperatures〈/p〉〈/list-item〉 〈/list〉 〈/p〉
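    The synthetic data approach described in this record can be sketched as follows: query a trained viscosity model only in the temperature ranges where measurements exist, then fit a physically motivated temperature-dependent equation to those synthetic points to bridge the unmeasured eruptive-temperature gap. The sketch assumes a Vogel-Fulcher-Tammann (VFT) form, log10(eta) = A + B/(T - C), as the physical model, and a known VFT law stands in for the trained ANN; all parameter values and temperature ranges below are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for the trained ANN: a known VFT law for one hypothetical melt
# composition (A, B, C are assumed values, not the study's fitted parameters).
A_true, B_true, C_true = -4.55, 7000.0, 500.0
def ann_log10_viscosity(T):
    return A_true + B_true / (T - C_true)

# "Synthetic data" step: sample the model only where measurements exist,
# i.e. the low-viscosity (superliquidus) and high-viscosity (near-glass)
# temperature ranges, with small noise to mimic prediction scatter.
T_low = np.linspace(1300.0, 1600.0, 20)   # low-viscosity range, K
T_high = np.linspace(700.0, 850.0, 20)    # high-viscosity range, K
T = np.concatenate([T_low, T_high])
y = ann_log10_viscosity(T) + rng.normal(0.0, 0.02, T.size)

# Fit the VFT equation log10(eta) = A + B/(T - C): grid-search C and solve
# A, B by linear least squares for each candidate C.
best_sse, best_params = np.inf, None
for C in np.arange(300.0, 650.0, 1.0):
    X = np.column_stack([np.ones_like(T), 1.0 / (T - C)])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    sse = np.sum((X @ coef - y) ** 2)
    if sse < best_sse:
        best_sse, best_params = sse, (coef[0], coef[1], C)
A_fit, B_fit, C_fit = best_params

# Interpolate into the unmeasured gap at a typical eruptive temperature.
pred_1100 = A_fit + B_fit / (1100.0 - C_fit)
```

    Because the VFT form is physically motivated rather than a free interpolant, the fit constrained only by the two measurable ranges extrapolates sensibly into the eruptive-temperature gap between them, which is the point of combining the ANN with a physical model.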
    Description: Deutsche Forschungsgemeinschaft http://dx.doi.org/10.13039/501100001659
    Description: https://share.streamlit.io/domlang/visc_calc/main/final_script.py
    Description: https://doi.org/10.5281/zenodo.7317803
    Keywords: ddc:550.728 ; volcanoes ; viscosity ; silicate melt ; machine learning ; artificial neural network ; magma
    Language: English
    Type: doc-type:article