ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Publication Date: 2021-12-29
    Description: Commission Decision of 25 February 2016 setting up a Scientific, Technical and Economic Committee for Fisheries, C(2016) 1084, OJ C 74, 26.2.2016, p. 4–10. The Commission may consult the group on any matter relating to marine and fisheries biology, fishing gear technology, fisheries economics, fisheries governance, ecosystem effects of fisheries, aquaculture or similar disciplines. This report, on methods for supporting stock assessment in the Mediterranean (STECF-21-02), addresses the data checking and preparation for stock assessment once the data has been submitted following the annual data calls. The report provides an overview of the data errors and quality control carried out on both commercial landings data and MEDITS survey data. The analyses reported also address the small fraction of commercial catch with sampling gaps, and how these are assigned appropriate length frequency distributions. The results of these checks and assignments are provided by species, GSA and country. Quality checks were carried out on the MEDITS data to check the consistency of the main reporting files and to highlight where data inconsistencies occurred. Additionally, the total landings reported to the European Commission under the Black & Med-Sea data call, the Fisheries Independent Data call and the Annual Economic Report data call were compared by species, aggregated to GSA level. Some important differences were observed and these are reported. In addition, the EWG reviewed a technical report on the sampling of commercial catch in the Greek fisheries; the review and some suggested further work are included in this report.
    Description: European Union, Joint Research Centre
    Description: Published
    Description: Refereed
    Keywords: Stock assessment; Fisheries management
    Repository Name: AquaDocs
    Type: Report
    Format: 1269 pp.
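    A minimal sketch of the kind of cross-data-call comparison described above, aggregating landings by species and GSA and reporting the differences; the column names (species, gsa, landings_t) and the pandas approach are illustrative assumptions, not the actual STECF data formats or tooling.

    import pandas as pd

    def compare_landings(med_bs: pd.DataFrame, fdi: pd.DataFrame) -> pd.DataFrame:
        """Aggregate landings by species and GSA and report absolute and relative differences."""
        # Column names are hypothetical placeholders for the two data-call extracts.
        a = med_bs.groupby(["species", "gsa"])["landings_t"].sum().rename("med_bs_t")
        b = fdi.groupby(["species", "gsa"])["landings_t"].sum().rename("fdi_t")
        out = pd.concat([a, b], axis=1).fillna(0.0)
        out["diff_t"] = out["med_bs_t"] - out["fdi_t"]
        out["rel_diff"] = out["diff_t"].abs() / out[["med_bs_t", "fdi_t"]].max(axis=1)
        return out.sort_values("rel_diff", ascending=False)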
  • 2
    In: 2020 IEEE/ACM HPC for Urgent Decision Making (UrgentHPC), pp. 21-26
    Publication Date: 2021-12-28
    Description: In the context of disaster response, fast and accurate simulations can be integrated into a real-time flow, allowing best situation estimates to be provided at required deadlines for warnings and decisions. In such a pilot centered on earthquakes, the TsunAWI tsunami simulation code has been adapted and improved to provide both fast results with a coarse mesh and more accurate results with a fine mesh. Improvements to the TsunAWI code have allowed simulation runs to reach runtimes compatible with the needs of a workflow with real-time deadlines. A significant component of TsunAWI is the underlying mesh traversal order; the use of heuristics from the GPU rendering community can reach performance comparable to the original space-filling curve ordering, and validation against historical cases is underway for the fast simulation results.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Article, isiRev, info:eu-repo/semantics/article
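    The mesh traversal order mentioned above determines memory locality when looping over unstructured-mesh elements. Below is a minimal sketch of one such ordering, sorting triangle centroids along a Morton (Z-order) space-filling curve; the function names and the quantisation choice are illustrative assumptions and are not taken from TsunAWI.

    import numpy as np

    def morton_key(x: int, y: int, bits: int = 16) -> int:
        """Interleave the bits of two quantised coordinates into a Z-order key."""
        key = 0
        for i in range(bits):
            key |= ((x >> i) & 1) << (2 * i)
            key |= ((y >> i) & 1) << (2 * i + 1)
        return key

    def space_filling_order(centroids: np.ndarray) -> np.ndarray:
        """Return a permutation of element indices sorted along a Morton curve.

        centroids: (n_elements, 2) array of triangle centroid coordinates."""
        lo, hi = centroids.min(axis=0), centroids.max(axis=0)
        # Quantise coordinates to a 16-bit integer grid spanning the mesh extent.
        q = ((centroids - lo) / (hi - lo + 1e-12) * (2**16 - 1)).astype(np.int64)
        keys = [morton_key(int(px), int(py)) for px, py in q]
        return np.argsort(keys)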
  • 3
    Publication Date: 2021-12-28
    Description: Accurate and rapid earthquake loss assessments and tsunami early warnings are critical in modern society to allow for appropriate and timely emergency response decisions. In the LEXIS project, we seek to enhance the workflow of rapid loss assessments and emergency decision support systems by leveraging an orchestrated heterogeneous environment combining high-performance computing resources and Cloud infrastructure. The workflow consists of three main applications: firstly, after an earthquake occurs, its shaking distribution (ShakeMap) is computed based on the OpenQuake code. Secondly, if a tsunami may have been triggered by the earthquake, tsunami simulations (first a fast, coarse analysis and later a high-resolution, computationally intensive one) are performed based on the TsunAWI simulation code that allows for an early warning in potentially affected areas. Finally, based on the previous results, a loss assessment based on a dynamic exposure model using open data such as OpenStreetMap is performed. To consolidate the workflow and ensure that the time constraints are respected, we are developing an extension of a time-constrained dataflow model of computation, layered above and below the workflow management tools of both the high-performance computing resources and the Cloud infrastructure. This model of computation is also used to express tasks in the workflow at the right granularity to benefit from the data management optimisation facilities of the LEXIS project. This paper describes the workflow, the associated computations and the model of computation within the LEXIS platform.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Article, peerRev, info:eu-repo/semantics/article
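    A minimal sketch of a deadline-aware pilot workflow in the spirit described above: the fast coarse tsunami run is produced first, and the high-resolution run is only attempted if the remaining time budget allows. The function names (run_shakemap, run_tsunami, run_loss_assessment) and the 10-minute budget are hypothetical placeholders, not the LEXIS platform API.

    import time

    DEADLINE_S = 600.0  # assumed overall time budget for the warning products

    def run_pilot(event, run_shakemap, run_tsunami, run_loss_assessment):
        """Run ShakeMap, tsunami and loss-assessment steps under a time budget."""
        t0 = time.monotonic()
        shakemap = run_shakemap(event)                       # ground-shaking estimate
        warning = run_tsunami(event, resolution="coarse")    # fast first estimate
        if time.monotonic() - t0 < 0.5 * DEADLINE_S:
            warning = run_tsunami(event, resolution="fine")  # refine only if time permits
        return run_loss_assessment(event, shakemap, warning)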
  • 4
    Publication Date: 2021-12-28
    Description: State-of-the-art tsunami warning systems employ a combined approach of pre-computed scenarios and on-the-fly tsunami simulation in case of an event. The on-the-fly simulations are performed on rather coarse meshes (approx. 1 km resolution), usually neglect e.g. the non-linear advection in the shallow water equations, and can deliver a reasonable estimate of the wave height at the coast within a few seconds of computation time. Since the earthquake source is the major unknown in the early warning situation, these simulations can improve the hazard assessment compared to pre-computed scenarios based on idealized sources. On the other hand, simulating the inundation in the quality needed to derive risk maps for civil protection measures requires a resolution of approximately 10 m on land and the non-linear shallow water equations augmented by terms such as the bottom roughness. With the simulation code TsunAWI, which employs an unstructured triangular mesh to seamlessly change the spatial resolution from a few meters in an area of interest to a few kilometers in the deep ocean, such simulations can be performed with a regional focus in less than 20 min computation time. Hence, with a coarsened resolution, a first estimate of the inundation could be provided within a few minutes, improving the near-real-time assessment of the hazard. We investigate which quality of inundation result can be achieved within a limited computation time on computing platforms based on various generations of Intel Xeon, from Broadwell to Cascade Lake. This investigation is part of the EU-funded LEXIS project led by IT4Innovations, Ostrava, Czech Republic. The overall aim is to build an advanced engineering platform at the confluence of HPC, Cloud and Big Data. Of particular interest is the development of time-constrained workflows over HPC and cloud resources, with a pilot combining tsunami simulations and earthquake damage assessment. Fast tsunami inundation estimates are a key element of that pilot.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Conference, notRev
    Format: application/pdf
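    For reference, one standard form of the non-linear shallow water equations with a quadratic (Manning-type) bottom roughness term, as alluded to above; the exact terms and coefficients used in TsunAWI may differ.

    \begin{aligned}
      \partial_t \eta + \nabla\cdot\bigl[(h+\eta)\,\mathbf{v}\bigr] &= 0, \\
      \partial_t \mathbf{v} + (\mathbf{v}\cdot\nabla)\mathbf{v} + g\,\nabla\eta
        + \frac{g\,n^2}{(h+\eta)^{4/3}}\,\lvert\mathbf{v}\rvert\,\mathbf{v} &= 0,
    \end{aligned}

    where \eta is the sea surface elevation, h the still-water depth, \mathbf{v} the depth-averaged velocity, g the gravitational acceleration and n the Manning roughness coefficient; the non-linear advection term is the one typically neglected in the fast coarse runs, as noted above.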
  • 5
    In: EGU General Assembly 2021, online, 2021-04-19 to 2021-04-30
    Publication Date: 2021-12-28
    Description: Based on the shallow water equations, the tsunami wave propagation in the deep ocean and an assessment of the wave height at the coast can easily be simulated online during an event. Simulating the estimated inundation, however, poses higher demands on model physics and mesh resolution. Whereas in the deep ocean a simple balance between pressure gradient force and acceleration is sufficient for first estimates of the wave propagation, additional nonlinear factors like bottom friction and momentum advection gain importance close to the coast. For a seamless simulation of the transition from wave propagation to inundation, the finite element model TsunAWI has been developed as part of the efforts within the GITEWS project (German Indonesian Tsunami Early Warning System), and in the meantime the code has evolved considerably with applications in several projects. The triangular mesh approach allows for large freedom in the resolution of coastline and bathymetric features, but it is also numerically demanding. In the ongoing EU project LEXIS (Large-scale Execution for Industry & Society), the simulation of earthquake and tsunami events is one of the pilot study cases; on the tsunami side, the focus is on the optimization of TsunAWI on modern HPC architectures. Targeting FPGAs, an accelerator for TsunAWI is being designed. It relies on a software-distributed shared memory (S-DSM) that allows the memory to be shared between distributed nodes and the accelerator(s), and it shows that TsunAWI optimisations, namely single precision and unstructured mesh traversal, are key elements to reach high performance and efficiency. For HPC systems, an MPI parallelization was implemented, based on domain decomposition. The MPI parallel code shows good scaling, making high resolution simulations feasible during an event. The developments are evaluated in simulations of tsunami inundation in hypothetical and real events in Indonesia and Chile. It turns out that the optimized approach allows for improved fast estimates of the tsunami impact in the application cases.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Conference, notRev
    Format: application/pdf
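    The MPI parallelization mentioned above is based on domain decomposition. Below is a minimal sketch of the general pattern, a 1-D decomposition with halo exchange using mpi4py; this only illustrates the idea and is not the actual TsunAWI implementation, which is Fortran and decomposes an unstructured mesh.

    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    n_local = 1000                    # grid points owned by this rank (hypothetical size)
    field = np.zeros(n_local + 2)     # one halo point on each side
    left = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    # Exchange halo values with the neighbouring subdomains before each time step.
    comm.Sendrecv(sendbuf=field[1:2], dest=left, recvbuf=field[-1:], source=right)
    comm.Sendrecv(sendbuf=field[-2:-1], dest=right, recvbuf=field[0:1], source=left)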
  • 6
    In: Workshop: Multi-annual to Decadal Climate Predictability in the North Atlantic-Arctic Sector, online, September 20-22, 2021
    Publication Date: 2021-12-28
    Description: The Parallel Data Assimilation Framework (PDAF, http://pdaf.awi.de) is an open-source software framework for highly efficient ensemble data assimilation with complex models on supercomputers. PDAF was developed to simplify the generation of a data assimilation system from existing models. For coupled data assimilation, PDAF is used for example with the coupled atmosphere-ocean model AWI-CM, with different coupled ocean biogeochemical models, and with the atmosphere-land surface-subsurface model TerrSysMP. However, there is a wide range of further applications of PDAF. PDAF provides functionality to perform ensemble integrations, which can be used for ensemble predictions and ensemble data assimilation. Further, PDAF provides several fully implemented ensemble filter and smoother methods for data assimilation. One can build the data assimilation application either by using model restart files or by directly augmenting the different compartment models of a coupled system with data assimilation functionality. The ensemble data assimilation can then be applied with complex models like AWI-CM on supercomputers with excellent scalability and efficiency. PDAF directly supports both weakly and strongly coupled data assimilation. The presentation discusses the features of PDAF and the structure of data assimilation systems for coupled data assimilation with PDAF.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Conference, notRev
    Format: application/pdf
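    For illustration, a minimal sketch of a stochastic ensemble Kalman filter analysis step in NumPy, the kind of ensemble update such a framework applies between model integrations; this is the generic textbook formulation under assumed array shapes, not PDAF's API or its (localized) filter implementations.

    import numpy as np

    def enkf_analysis(ens, y_obs, H, R, rng):
        """Stochastic EnKF analysis step.

        ens: (n_state, n_ens) forecast ensemble; y_obs: (n_obs,) observations;
        H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs. error covariance."""
        n_ens = ens.shape[1]
        X = ens - ens.mean(axis=1, keepdims=True)       # state anomalies
        HX = H @ ens
        HXp = HX - HX.mean(axis=1, keepdims=True)       # observed-space anomalies
        Pyy = HXp @ HXp.T / (n_ens - 1) + R             # innovation covariance
        Pxy = X @ HXp.T / (n_ens - 1)                   # state-observation cross covariance
        K = Pxy @ np.linalg.inv(Pyy)                    # Kalman gain
        # Perturb the observations for each member (stochastic formulation).
        y_pert = y_obs[:, None] + rng.multivariate_normal(np.zeros(len(y_obs)), R, n_ens).T
        return ens + K @ (y_pert - HX)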
  • 7
    In: WCRP-WWRP Symposium on Data Assimilation and Reanalysis, online, September 13-18, 2021
    Publication Date: 2021-12-28
    Description: PDAF, the Parallel Data Assimilation Framework (http://pdaf.awi.de), is an open-source framework for ensemble data assimilation (DA). PDAF is designed to be particularly easy to use, so that a DA system can be quickly built while PDAF ensures computational efficiency. PDAF's ensemble component provides online-coupled DA functionality, i.e. data is transferred in memory using the MPI parallelization standard, by inserting three function calls into the model code. These additions convert a numerical model into a data-assimilative model, which can be run like the original model, but with additional options. Alternatively, one can use separate programs to compute the forecasts and the DA analysis update. PDAF further provides DA methods (solvers), in particular ensemble Kalman filters and particle filters. Tools for diagnostics, ensemble generation, and for generating synthetic observations for OSSEs or twin experiments provide additional functionality for DA. PDAF is used for research purposes and teaching, but also operationally. In the operational context, PDAF is e.g. used at the CMEMS marine forecasting center for the Baltic Sea and in the Chinese Global Ocean Forecasting System (CGOFS). A recent addition to PDAF is OMI, the Observation Module Infrastructure, a library extension for observation handling. OMI is inspired by object-oriented programming but, for ease of use, it is not coded using classes. Recent developments further include support for strongly coupled DA across components of Earth system models, model bindings for NEMO, SCHISM, and the climate model AWI-CM, and ensemble-variational solvers. This presentation discusses PDAF's features and recent infrastructure developments.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Conference, notRev, info:eu-repo/semantics/conferenceObject
    Format: application/pdf
  • 8
    In: Joint ECMWF/OceanPredict workshop on Advances in Ocean Data Assimilation, online, May 17-20, 2021
    Publication Date: 2021-12-28
    Description: The second-order exact particle filter NETF (nonlinear ensemble transform filter) is combined with the local ensemble transform Kalman filter (LETKF) to build a hybrid filter method (LKNETF). The filter combines the stability of the LETKF with the nonlinear properties of the NETF to obtain improved assimilation results for small ensemble sizes. Both filter components are localized in a consistent way so that the filter can be applied with high-dimensional models. The degree of filter nonlinearity is defined by a hybrid weight, which shifts the analysis between the LETKF and the NETF. Since the NETF is more sensitive to sampling errors than the LETKF, the latter filter should be preferred in linear cases. It is discussed how an adaptive hybrid weight can be defined based on the nonlinearity of the system so that the adaptivity yields a good filter performance in both linear and nonlinear situations. The filter behavior is exemplified based on experiments with the chaotic Lorenz-96 model, in which the nonlinearity can be controlled by the length of the forecast phase, and an idealized configuration of the ocean model NEMO.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Conference, notRev, info:eu-repo/semantics/conferenceObject
    Format: application/pdf
  • 9
    In: BOOS Annual Meeting, online, November 25, 2021
    Publication Date: 2021-12-28
    Repository Name: EPIC Alfred Wegener Institut
    Type: Conference, notRev, info:eu-repo/semantics/conferenceObject
    Format: application/pdf
  • 10
    In: WCRP-WWRP Symposium on Data Assimilation and Reanalysis, online, September 13-18, 2021
    Publication Date: 2021-12-28
    Description: A hybrid nonlinear-Kalman ensemble transform filter (LKNETF) algorithm is built by combining the second-order exact particle filter NETF (nonlinear ensemble transform filter) with the local ensemble transform Kalman filter (LETKF). The hybrid filter combines the stability of the LETKF with the nonlinear properties of the NETF to obtain improved assimilation results for small ensemble sizes. Both filter components are localized in a consistent way so that the filter can be applied with high-dimensional models. The degree of filter nonlinearity is defined by a hybrid weight, which shifts the analysis between the LETKF and the NETF. Since the NETF is more sensitive to sampling errors than the LETKF, the latter filter should be preferred in linear Gaussian cases. An adaptive hybrid weight can be defined based on the nonlinearity of the system so that the adaptivity yields a good filter performance in both linear and nonlinear situations. In particular, the skewness and kurtosis of the ensemble can be applied to quantify the non-Gaussianity. The filter behavior is exemplified based on experiments with the chaotic Lorenz-63 and -96 models, in which the nonlinearity can be controlled by the length of the forecast phase. In these experiments the hybrid filter can yield an error reduction of up to 28% compared to the LETKF.
    Repository Name: EPIC Alfred Wegener Institut
    Type: Conference, notRev
    Format: application/pdf
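    A hypothetical illustration of how ensemble skewness and excess kurtosis, as mentioned above, could be condensed into a non-Gaussianity indicator and mapped to a hybrid weight in [0, 1]; the published LKNETF uses its own adaptive scheme, which is not reproduced here, and the 'scale' parameter is an assumption.

    import numpy as np

    def hybrid_weight(ens: np.ndarray, scale: float = 1.0) -> float:
        """ens: (n_state, n_ens). Returns ~0 for near-Gaussian ensembles, approaching 1 otherwise."""
        anomalies = ens - ens.mean(axis=1, keepdims=True)
        std = anomalies.std(axis=1, ddof=1) + 1e-12
        z = anomalies / std[:, None]
        skew = np.mean(np.mean(z**3, axis=1) ** 2)           # mean squared skewness
        kurt = np.mean((np.mean(z**4, axis=1) - 3.0) ** 2)   # mean squared excess kurtosis
        # Map the combined indicator to (0, 1); 'scale' is a tunable assumption.
        return float(1.0 - np.exp(-(skew + kurt) / scale))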