ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Publication Date: 2019-09-10
    Description: A dataset containing quality-controlled wind observations from 222 tall towers has been created. Wind speed and wind direction measurements have been collected from existing tall towers around the world in an effort to boost the utilisation of these non-standard atmospheric datasets, especially within the wind energy and research fields. The observations taken at several heights greater than 10 metres above ground level have been retrieved from various sparse datasets and compiled in a unique collection with a common format, access, documentation and quality control. For the latter, a total of 18 quality control checks have been considered to ensure the high quality of the wind records (a minimal sketch of two such checks follows the result list). Non-quality-controlled temperature, relative humidity and barometric pressure data from the towers have also been obtained and included in the dataset. The Tall Tower Dataset (Ramon and Lledó, 2019a) is published in the repository EUDAT and made available at https://doi.org/10.23728/b2share.0d3a99db75df4238820ee548f35ee36b.
    Electronic ISSN: 1866-3591
    Topics: Geosciences
    Published by Copernicus
  • 2
    Publication Date: 2016-10-25
    Description: The Decadal Climate Prediction Project (DCPP) is a coordinated multi-model investigation into decadal climate prediction, predictability, and variability. The DCPP makes use of past experience in simulating and predicting decadal variability and forced climate change gained from the fifth Coupled Model Intercomparison Project (CMIP5) and elsewhere. It builds on recent improvements in models, in the reanalysis of climate data, in methods of initialization and ensemble generation, and in data treatment and analysis to propose an extended comprehensive decadal prediction investigation as a contribution to CMIP6 (Eyring et al., 2016) and to the WCRP Grand Challenge on Near Term Climate Prediction (Kushnir et al., 2016). The DCPP consists of three components. Component A comprises the production and analysis of an extensive archive of retrospective forecasts to be used to assess and understand historical decadal prediction skill, as a basis for improvements in all aspects of end-to-end decadal prediction, and as a basis for forecasting on annual to decadal timescales. Component B undertakes ongoing production, analysis and dissemination of experimental quasi-real-time multi-model forecasts as a basis for potential operational forecast production. Component C involves the organization and coordination of case studies of particular climate shifts and variations, both natural and naturally forced (e.g. the “hiatus”, volcanoes), including the study of the mechanisms that determine these behaviours. Groups are invited to participate in as many or as few of the DCPP components as are of interest to them; each component is prioritized separately. The Decadal Climate Prediction Project addresses a range of scientific issues involving the ability of the climate system to be predicted on annual to decadal timescales, the skill that is currently and potentially available, the mechanisms involved in long timescale variability, and the production of forecasts of benefit to both science and society.
    Print ISSN: 1991-959X
    Electronic ISSN: 1991-9603
    Topics: Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
  • 3
    Publication Date: 2019-06-13
    Description: Most Earth System Models (ESMs) are running under different high-performance computing (HPC) environments. This has several advantages, from allowing different groups to work with the same tool in parallel to leveraging the burden of ensemble climate simulations, but it also offers alternative solutions in the case of shutdown (expected or not) of any of the environments. However, for obvious scientific reasons, it is critical to ensure that ESMs provide identical results under changes in computing environment. While strict bit-for-bit reproducibility is not always guaranteed with ESMs, it is desirable that results obtained under one computing environment are at least statistically indistinguishable from those obtained under another environment, which we term a replicability condition following the metrology nomenclature. Here, we develop a protocol to assess the replicability of the EC-Earth ESM (a minimal statistical-comparison sketch follows the result list). Using two versions of EC-Earth, we present one case of non-replicability and one case of replicability. The non-replicable case occurs with the older version of the model and likely finds its origin in the treatment of river runoff along Antarctic coasts. By contrast, the more recent version of the model provides replicable results. The methodology presented here has been adopted as a standard test by the EC-Earth consortium (27 institutions in Europe) to evaluate the replicability of any new model version across platforms, including for CMIP6 experiments. To a larger extent, it can be used to assess whether other ESMs can safely be ported from one HPC environment to another for studying climate-related questions. Our results and experience with this work suggest that the default assumption should be that ESMs are not replicable under changes in the HPC environment, until proven otherwise.
    Print ISSN: 1991-9611
    Electronic ISSN: 1991-962X
    Topics: Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
  • 4
    Publication Date: 2019-02-15
    Description: Mixed-precision approaches can provide substantial speed-ups for both computing- and memory-bound codes with little effort. Most scientific codes have overengineered the numerical precision, leading to a situation in which models use more resources than required without it being clear where those resources are needed and where they are not. Consequently, performance benefits can be obtained from a more appropriate choice of precision; the only thing needed is a method to determine which real variables can be represented with fewer bits without affecting the accuracy of the results. This paper presents a novel method to enable modern and legacy codes to benefit from a reduction of precision without sacrificing accuracy. It consists of a simple idea: if we can measure how reducing the precision of a group of variables affects the outputs, we can evaluate the level of precision this group of variables needs. Modifying and recompiling the code for each case that has to be evaluated would require a prohibitive amount of effort. Instead, the method presented in this paper relies on the use of a tool called the Reduced Precision Emulator (RPE) that can significantly streamline the process. Using the RPE and a list of parameters containing the precisions that will be used for each real variable in the code, it is possible within a single binary to emulate the effect on the outputs of a specific choice of precision. Once we can emulate the effects of reduced precision, we can proceed with the design of the tests required to obtain knowledge about all the variables in the model. The number of possible combinations is prohibitively large and impossible to explore exhaustively. The alternative of screening the variables individually can give some insight into the precision needed by each variable, but more complex interactions that involve several variables may remain hidden. Instead, we use a divide-and-conquer algorithm that identifies the parts that cannot handle reduced precision and builds a set of variables that can (a sketch of this divide-and-conquer idea follows the result list). The method has been tested using two state-of-the-art ocean models, NEMO and ROMS, with very promising results. Obtaining this information is crucial for subsequently building an actual mixed-precision version of the code that will bring the promised performance benefits.
    Print ISSN: 1991-9611
    Electronic ISSN: 1991-962X
    Topics: Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
  • 5
    Publication Date: 2019-07-24
    Description: Mixed-precision approaches can provide substantial speed-ups for both computing- and memory-bound codes with little effort. Most scientific codes have overengineered the numerical precision, leading to a situation in which models are using more resources than required without knowing where they are required and where they are not. Consequently, it is possible to improve computational performance by establishing a more appropriate choice of precision. The only input that is needed is a method to determine which real variables can be represented with fewer bits without affecting the accuracy of the results. This paper presents a novel method that enables modern and legacy codes to benefit from a reduction of the precision of certain variables without sacrificing accuracy. It consists of a simple idea: we reduce the precision of a group of variables and measure how it affects the outputs. Then we can evaluate the level of precision that they truly need. Modifying and recompiling the code for each case that has to be evaluated would require a prohibitive amount of effort. Instead, the method presented in this paper relies on the use of a tool called a reduced-precision emulator (RPE) that can significantly streamline the process. Using the RPE and a list of parameters containing the precisions that will be used for each real variable in the code, it is possible within a single binary to emulate the effect on the outputs of a specific choice of precision. When we are able to emulate the effects of reduced precision, we can proceed with the design of the tests that will give us knowledge of the sensitivity of the model variables regarding their numerical precision. The number of possible combinations is prohibitively large and therefore impossible to explore. The alternative of performing a screening of the variables individually can provide certain insight about the required precision of variables, but, on the other hand, other complex interactions that involve several variables may remain hidden. Instead, we use a divide-and-conquer algorithm that identifies the parts that require high precision and establishes a set of variables that can handle reduced precision. This method has been tested using two state-of-the-art ocean models, the Nucleus for European Modelling of the Ocean (NEMO) and the Regional Ocean Modeling System (ROMS), with very promising results. Obtaining this information is crucial to build an actual mixed-precision version of the code in the next phase that will bring the promised performance benefits.
    Print ISSN: 1991-959X
    Electronic ISSN: 1991-9603
    Topics: Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
  • 6
    Publication Date: 2020-02-18
    Description: A dataset containing quality-controlled wind observations from 222 tall towers has been created. Wind speed and wind direction measurements covering the 1984–2017 period have been collected from existing tall towers around the world in an effort to boost the utilization of these non-standard atmospheric datasets, especially within the wind energy and research fields. Observations taken at several heights greater than 10 m above ground level have been retrieved from various sparse datasets and compiled in a unique collection with a common format, access, documentation and quality control. For the last, a total of 18 quality control checks have been considered to ensure the high quality of the wind records. Non-quality-controlled temperature, relative humidity and barometric pressure data from the towers have also been obtained and included in the dataset. The Tall Tower Dataset (Ramon and Lledó, 2019a) is published in the repository EUDAT and made available at https://doi.org/10.23728/b2share.136ecdeee31a45a7906a773095656ddb.
    Print ISSN: 1866-3508
    Electronic ISSN: 1866-3516
    Topics: Geosciences
    Published by Copernicus
  • 7
    Publication Date: 2020-03-12
    Description: Most Earth system models (ESMs) are running under different high-performance computing (HPC) environments. This has several advantages, from allowing different groups to work with the same tool in parallel to leveraging the burden of ensemble climate simulations, but it also offers alternative solutions in the case of shutdown (expected or not) of any of the environments. However, for obvious scientific reasons, it is critical to ensure that ESMs provide identical results under changes in computing environment. While strict bit-for-bit reproducibility is not always guaranteed with ESMs, it is desirable that results obtained under one computing environment are at least statistically indistinguishable from those obtained under another environment, which we term a “replicability” condition following the metrology nomenclature. Here, we develop a protocol to assess the replicability of the EC-Earth ESM. Using two versions of EC-Earth, we present one case of non-replicability and one case of replicability. The non-replicable case occurs with the older version of the model and likely finds its origin in the treatment of river runoff along Antarctic coasts. By contrast, the more recent version of the model provides replicable results. The methodology presented here has been adopted as a standard test by the EC-Earth consortium (27 institutions in Europe) to evaluate the replicability of any new model version across platforms, including for CMIP6 experiments. To a larger extent, it can be used to assess whether other ESMs can safely be ported from one HPC environment to another for studying climate-related questions. Our results and experience with this work suggest that the default assumption should be that ESMs are not replicable under changes in the HPC environment, until proven otherwise.
    Print ISSN: 1991-959X
    Electronic ISSN: 1991-9603
    Topics: Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
  • 8
    Publication Date: 2021-02-11
    Description: In this paper, we present and evaluate the skill of an EC-Earth3.3 decadal prediction system contributing to the Decadal Climate Prediction Project – Component A (DCPP-A). This prediction system is capable of skilfully simulating past global mean surface temperature variations at interannual and decadal forecast times as well as the local surface temperature in regions such as the tropical Atlantic, the Indian Ocean and most of the continental areas, although most of the skill comes from the representation of the external radiative forcings (a minimal anomaly-correlation sketch follows the result list). A benefit of initialization for the predictive skill is evident in some areas of the tropical Pacific and North Atlantic oceans in the first forecast years, an added value that is mostly confined to the south-east tropical Pacific and the eastern subpolar North Atlantic at the longest forecast times (6–10 years). The central subpolar North Atlantic shows poor predictive skill and a detrimental effect of initialization that leads to a quick collapse in Labrador Sea convection, followed by a weakening of the Atlantic Meridional Overturning Circulation (AMOC) and excessive local sea ice growth. The shutdown in Labrador Sea convection responds to a gradual increase in the local density stratification in the first years of the forecast, ultimately related to the different paces at which surface and subsurface temperature and salinity drift towards their preferred mean state. This transition happens rapidly at the surface and more slowly in the subsurface, where, by the 10th forecast year, the model is still far from the typical mean states in the corresponding ensemble of historical simulations with EC-Earth3. Thus, our study highlights the Labrador Sea as a region that can be sensitive to full-field initialization and hamper the final prediction skill, a problem that can be alleviated by improving the regional model biases through model development and by identifying more optimal initialization strategies.
    Print ISSN: 2190-4979
    Electronic ISSN: 2190-4987
    Topics: Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
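
The 18 quality control checks behind the Tall Tower Dataset (entries 1 and 6) are not enumerated in the abstracts. As a purely illustrative sketch of what two such checks can look like, the Python fragment below applies a plausible-range test and a stuck-sensor (repeated-value) test to a short wind series; the thresholds, window length and function names are assumptions for illustration, not the checks actually used by Ramon and Lledó (2019a).

```python
import numpy as np

def plausible_range_check(speed, direction, max_speed=75.0):
    """Flag records outside physically plausible bounds.

    speed in m/s, direction in degrees. Thresholds are illustrative,
    not those of the Tall Tower Dataset.
    """
    speed = np.asarray(speed, dtype=float)
    direction = np.asarray(direction, dtype=float)
    bad_speed = (speed < 0.0) | (speed > max_speed)
    bad_dir = (direction < 0.0) | (direction > 360.0)
    return bad_speed | bad_dir  # True where the record is suspect

def stuck_sensor_check(speed, window=6):
    """Flag runs of identical consecutive values (possible frozen sensor)."""
    speed = np.asarray(speed, dtype=float)
    flags = np.zeros(speed.shape, dtype=bool)
    run = 1
    for i in range(1, speed.size):
        run = run + 1 if speed[i] == speed[i - 1] else 1
        if run >= window:
            flags[i - window + 1:i + 1] = True
    return flags

# Example: a short series with one spike and one stuck stretch.
speed = np.array([5.1, 5.3, 90.0, 5.2, 4.9, 4.9, 4.9, 4.9, 4.9, 4.9, 5.0])
direction = np.array([200, 205, 210, 215, 220, 220, 220, 220, 220, 220, 225])
suspect = plausible_range_check(speed, direction) | stuck_sensor_check(speed)
print(suspect)
```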
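
Entries 3 and 7 describe a protocol for deciding whether two ensembles of the same EC-Earth configuration, run on different HPC platforms, are statistically indistinguishable. The published protocol is not reproduced here; as a minimal stand-in that conveys the idea, the sketch below applies a two-sample Kolmogorov–Smirnov test to one scalar diagnostic per ensemble member. The choice of test, diagnostic and significance level are assumptions, not the consortium's actual procedure.

```python
import numpy as np
from scipy.stats import ks_2samp

def indistinguishable(ens_a, ens_b, alpha=0.05):
    """Two-sample KS test between ensembles from two HPC platforms.

    ens_a, ens_b: 1-D arrays with one scalar diagnostic (e.g. global-mean
    surface temperature) per ensemble member. Returns True when the null
    hypothesis of a common distribution is *not* rejected at level alpha.
    Illustrative stand-in only, not the published EC-Earth test.
    """
    stat, p_value = ks_2samp(np.asarray(ens_a), np.asarray(ens_b))
    return p_value >= alpha

# Synthetic example: same forcing, two "platforms", 20 members each.
rng = np.random.default_rng(0)
platform_a = 287.0 + 0.2 * rng.standard_normal(20)
platform_b = 287.0 + 0.2 * rng.standard_normal(20)
print(indistinguishable(platform_a, platform_b))  # expected: True
```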
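
Entries 4 and 5 outline a divide-and-conquer search over groups of real variables to find those that tolerate reduced precision. Without relying on the interface of the Reduced Precision Emulator (RPE), the sketch below captures only the recursive split-and-recheck structure in Python: run_with_reduced_precision is a hypothetical callback standing in for an emulated model run plus accuracy check, and the fallback when a recombined group fails is a simplification of the authors' algorithm.

```python
def analyse(variables, run_with_reduced_precision):
    """Divide-and-conquer search for variables that tolerate lower precision.

    variables: list of variable names in the model.
    run_with_reduced_precision(subset) -> bool: hypothetical callback that
    emulates the model with `subset` demoted to reduced precision and
    returns True when the outputs stay within an accepted tolerance.
    Returns the set of variables found safe to demote.
    """
    if not variables:
        return set()
    if run_with_reduced_precision(variables):
        return set(variables)          # the whole group passes at once
    if len(variables) == 1:
        return set()                   # a single failing variable stays in high precision
    mid = len(variables) // 2
    left = analyse(variables[:mid], run_with_reduced_precision)
    right = analyse(variables[mid:], run_with_reduced_precision)
    # Re-check the union: interactions between the halves may still fail.
    combined = sorted(left | right)
    if combined and run_with_reduced_precision(combined):
        return set(combined)
    return left if len(left) >= len(right) else right

# Toy example: pretend only 'rho' and 'tau' are precision sensitive.
sensitive = {"rho", "tau"}
ok = lambda subset: not (set(subset) & sensitive)
print(sorted(analyse(["u", "v", "rho", "tau", "zeta"], ok)))  # ['u', 'v', 'zeta']
```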
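
The decadal prediction skill discussed in entry 8 is commonly summarised with verification scores such as the anomaly correlation between ensemble-mean hindcasts and observations across start dates. The sketch below shows a bare-bones anomaly-correlation computation as a rough illustration of that kind of diagnostic; it is not the verification setup of the paper, and operational verification typically adds drift or bias adjustment and significance testing.

```python
import numpy as np

def anomaly_correlation(hindcast, observation):
    """Correlation of hindcast and observed anomalies across start dates.

    hindcast, observation: 1-D arrays with one value per start date
    (e.g. ensemble-mean and observed temperature at a given forecast year).
    Anomalies are taken with respect to each series' own mean. Illustrative only.
    """
    h = np.asarray(hindcast, dtype=float)
    o = np.asarray(observation, dtype=float)
    h_anom, o_anom = h - h.mean(), o - o.mean()
    return float(np.sum(h_anom * o_anom) /
                 np.sqrt(np.sum(h_anom**2) * np.sum(o_anom**2)))

# Synthetic example: a hindcast series that tracks observations with noise.
rng = np.random.default_rng(1)
obs = np.linspace(0.0, 0.6, 30) + 0.1 * rng.standard_normal(30)
hc = obs + 0.1 * rng.standard_normal(30)
print(round(anomaly_correlation(hc, obs), 2))  # prints a high correlation for this pair
```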