ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


  • 1
    Publication Date: 2022-02-01
    Description: Abstract
    Description: The River Plume Workflow is part of the Flood Event Explorer (FEE, Eggert et al., 2022), developed at the GFZ German Research Centre for Geosciences in close collaboration with Helmholtz-Zentrum Hereon. It is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/). The focus of the River Plume Workflow is the impact of riverine flood events on the marine environment. At the end of a flood event chain, an unusual amount of nutrients and pollutants is washed into the North Sea, which can have consequences such as increased algae blooms. The workflow aims to enable users to detect a river plume in the North Sea and to determine its spatio-temporal extent. Identifying river plume candidates can happen either manually in the visual interface or through an automatic anomaly detection algorithm using Gaussian regression (an illustrative sketch follows this record). In both cases, a combination of observational data, namely FerryBox transects and satellite data, and model data is used. Once a river plume candidate is found, a statistical analysis supplies additional detail on the anomaly and helps to compare the suspected river plume to the surrounding data. Simulated trajectories of particles starting on the FerryBox transect at the time of the original observation and modelled backwards and forwards in time help to verify the origin of the river plume and allow users to follow the anomaly across the North Sea. An interactive map enables users to load additional observational data into the workflow, such as ocean colour satellite maps, and provides them with an overview of the flood impacts and the river plume's development on its way through the North Sea. In addition, the workflow offers the functionality to assemble satellite-based chlorophyll observations along model trajectories as time series. These allow scientists to understand processes inside the river plume and to determine the timescales on which these developments happen. For example, chlorophyll degradation rates in the Elbe river plume are currently investigated using these time series. The workflow's added value lies in the ease with which users can combine observational FerryBox data with relevant model data and other datasets of their choice. Furthermore, the workflow allows users to visually explore the combined data and contains methods to find and highlight anomalies. The workflow's functionalities also enable users to map the spatio-temporal extent of the river plume and investigate the changes in productivity that occur in the plume. All in all, the River Plume Workflow simplifies the investigation and monitoring of flood events and their impacts in marine environments.
    Description: TechnicalInfo
    Description: Copyright 2022 Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, Potsdam, Germany / DE Flood Event Explorer Licensed under the Apache License, Version 2.0 (the "License"); you may not use these files except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
    Keywords: Digital Earth ; Flood ; DASF ; Workflow ; river plume ; ferrybox ; impact ; EARTH SCIENCE 〉 HUMAN DIMENSIONS 〉 NATURAL HAZARDS 〉 FLOODS ; EARTH SCIENCE SERVICES 〉 DATA ANALYSIS AND VISUALIZATION
    Type: Software
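    A minimal, illustrative sketch of the anomaly detection idea described above: a Gaussian process regression (here via scikit-learn's GaussianProcessRegressor) is fitted to a synthetic FerryBox salinity transect, and observations falling far below the predicted confidence band are flagged as river plume candidates. The data, kernel settings and threshold are assumptions for illustration, not the workflow's actual implementation.

      # Sketch: Gaussian-process-based anomaly detection on a synthetic FerryBox transect.
      # All values, kernel choices and thresholds are illustrative assumptions.
      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(42)

      # Synthetic transect: distance along track (km) and salinity with a local
      # freshwater dip mimicking a river plume.
      distance = np.linspace(0.0, 100.0, 200)
      salinity = 32.0 + 0.3 * np.sin(distance / 15.0) + rng.normal(0.0, 0.1, distance.size)
      salinity[90:110] -= 1.5

      # Fit a smooth background model: an RBF kernel for the large-scale signal plus a
      # white-noise term that absorbs measurement noise and local anomalies.
      kernel = 1.0 * RBF(length_scale=20.0) + WhiteKernel(noise_level=0.05)
      gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
      gpr.fit(distance.reshape(-1, 1), salinity)
      mean, std = gpr.predict(distance.reshape(-1, 1), return_std=True)

      # Flag observations far below the predicted confidence band as plume candidates.
      candidates = salinity < mean - 3.0 * std
      print(f"flagged {int(candidates.sum())} of {distance.size} samples as plume candidates")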
  • 2
    Publication Date: 2022-02-01
    Description: Abstract
    Description: The Socio-Economic Flood Impacts Workflow is part of the Flood Event Explorer (FEE, Eggert et al., 2022), developed at the GFZ German Research Centre for Geosciences. It is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/). The Socio-Economic Flood Impacts Workflow aims to support the identification of relevant controls and useful indicators for the assessment of flood impacts. It supports answering the question: What are useful indicators to assess socio-economic flood impacts? Floods impact individuals and communities and may have significant social, economic and environmental consequences. These impacts result from the interplay of hazard - the meteo-hydrological processes leading to high water levels and inundation of usually dry land, exposure - the elements affected by flooding such as people, built environment or infrastructure, and vulnerability - the susceptibility of exposed elements to be harmed by flooding. In view of the complex interactions of hazard and impact processes, a broad range of data from disparate sources needs to be compiled and analysed across the boundaries of climate and atmosphere, catchment and river network, and socio-economic domains. The workflow addresses this problem and supports scientists in integrating observations, model outputs and other datasets for further analysis in the region of interest. The workflow provides functionalities to select the region of interest, access hazard-, exposure- and vulnerability-related data from different sources, identify flood periods as relevant time ranges, and calculate defined indices. The integrated input data set is further filtered for the relevant flood event periods in the region of interest to obtain a new comprehensive flood data set. This spatio-temporal dataset is analysed using data-science methods such as clustering, classification or correlation algorithms to explore and identify useful indicators for flood impacts (an illustrative sketch follows this record). For instance, the importance of different factors or the interrelationships among multiple variables in shaping flood impacts can be explored. The added value of the Socio-Economic Flood Impacts Workflow is twofold. First, it integrates scattered data from disparate sources and makes it accessible for further analysis. As such, the effort to compile, harmonize and combine a broad range of spatio-temporal data is clearly reduced. Also, the integration of new datasets from additional sources is much more straightforward. Second, it enables a flexible analysis of multivariate data and, by reusing algorithms from other workflows, it fosters more efficient scientific work that can focus on data analysis instead of tedious data wrangling.
    Description: TechnicalInfo
    Description: Copyright 2022 Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, Potsdam, Germany / DE Flood Event Explorer Licensed under the Apache License, Version 2.0 (the "License"); you may not use these files except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
    Keywords: Digital Earth ; Flood ; DASF ; Workflow ; hydrometeorological controls ; indicators ; impact assessment ; EARTH SCIENCE 〉 HUMAN DIMENSIONS 〉 NATURAL HAZARDS 〉 FLOODS ; EARTH SCIENCE SERVICES 〉 DATA ANALYSIS AND VISUALIZATION
    Type: Software
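    A hedged, minimal sketch of the kind of indicator screening described above: hazard, exposure and vulnerability attributes of a few flood events are combined in one table and correlated with reported losses. All column names and values are invented for illustration and are not part of the workflow.

      # Sketch: screening candidate indicators of socio-economic flood impacts by
      # correlating them with reported losses. Columns and values are invented.
      import pandas as pd

      events = pd.DataFrame({
          "event_id": ["event_a", "event_b", "event_c", "event_d"],
          "peak_discharge_m3s": [4700.0, 5200.0, 1200.0, 3100.0],        # hazard
          "inundated_area_km2": [310.0, 420.0, 95.0, 180.0],             # hazard
          "exposed_population": [180000, 230000, 60000, 120000],         # exposure
          "share_vulnerable_buildings": [0.22, 0.18, 0.35, 0.25],        # vulnerability
          "reported_loss_meur": [9000.0, 11000.0, 2500.0, 5200.0],
      })

      # Correlate each candidate indicator with the observed losses to get a first
      # impression of which indicators are informative for impact assessment.
      indicators = events.drop(columns=["event_id", "reported_loss_meur"])
      ranking = indicators.corrwith(events["reported_loss_meur"]).sort_values(ascending=False)
      print(ranking)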
  • 3
    Publication Date: 2022-02-01
    Description: Abstract
    Description: The Flood Similarity Workflow is part of the Flood Event Explorer (FEE, Eggert et al., 2022), developed at the GFZ German Research Centre for Geosciences. It is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/). River floods and associated adverse consequences are caused by complex interactions of hydro-meteorological and socio-economic pre-conditions and event characteristics. The Flood Similarity Workflow supports the identification, assessment and comparison of hydro-meteorological controls of flood events. The analysis of flood events requires the exploration of discharge time series data for hundreds of gauging stations and their auxiliary data. Data availability, data accessibility and standard processing techniques are common challenges in this application and are addressed by this workflow. The Flood Similarity Workflow allows the assessment and comparison of arbitrary flood events. The workflow includes around 500 gauging stations in Germany, comprising discharge data and the associated extreme value statistics as well as precipitation and soil moisture data. This provides the basis to identify and compare flood events based on antecedent catchment conditions, catchment precipitation, discharge hydrographs, and inundation maps. The workflow also enables the analysis of multidimensional flood characteristics including aggregated indicators (in space and time), spatial patterns and time series signatures (an illustrative sketch of an indicator-based comparison follows this record). The added value of the Flood Similarity Workflow comprises two major points. First, scientists work on a common, homogenized database of flood events and their hydro-meteorological controls for a large spatial and temporal domain, with fast and standardized interfaces to access the data. Second, the standardized computation of common flood indicators allows a consistent comparison and exploration of flood events.
    Description: TechnicalInfo
    Description: Copyright 2022 Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, Potsdam, Germany / DE Flood Event Explorer Licensed under the Apache License, Version 2.0 (the "License"); you may not use these files except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
    Keywords: Digital Earth ; Flood ; DASF ; Workflow ; hydrometeorological controls ; compare ; assess ; EARTH SCIENCE 〉 HUMAN DIMENSIONS 〉 NATURAL HAZARDS 〉 FLOODS ; EARTH SCIENCE SERVICES 〉 DATA ANALYSIS AND VISUALIZATION
    Type: Software
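    A minimal sketch of an indicator-based event comparison as referenced above: simple indicators (peak, volume, time to peak) are derived from two synthetic discharge hydrographs and the events are compared by their distance in standardized indicator space. Indicator names, units and data are assumptions; the workflow's actual indicator set differs.

      # Sketch: deriving simple flood indicators from discharge hydrographs and
      # comparing two events in standardized indicator space. Synthetic data only.
      import numpy as np
      import pandas as pd

      def flood_indicators(discharge: pd.Series) -> pd.Series:
          """Aggregate an hourly discharge hydrograph (m^3/s) into a few indicators."""
          return pd.Series({
              "peak_m3s": discharge.max(),
              "volume_m3": discharge.sum() * 3600.0,       # hourly steps assumed
              "time_to_peak_h": float(np.argmax(discharge.to_numpy())),
          })

      hours = pd.date_range("2013-06-01", periods=96, freq="h")
      event_a = pd.Series(1000 + 3000 * np.exp(-((np.arange(96) - 40) / 12.0) ** 2), index=hours)
      event_b = pd.Series(800 + 2500 * np.exp(-((np.arange(96) - 55) / 20.0) ** 2), index=hours)

      table = pd.DataFrame({"event_a": flood_indicators(event_a),
                            "event_b": flood_indicators(event_b)}).T

      # Standardize the indicators and use the Euclidean distance as a similarity measure.
      z = (table - table.mean()) / table.std()
      print(table.round(1))
      print("indicator-space distance:", float(np.linalg.norm(z.loc["event_a"] - z.loc["event_b"])))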
  • 4
    Publication Date: 2022-02-01
    Description: Abstract
    Description: The Smart Monitoring Workflow (Tocap) is part of the Flood Event Explorer (FEE, Eggert et al., 2022), developed at the GFZ German Research Centre for Geosciences in close collaboration with the Helmholtz Centre for Environmental Research (UFZ) in Leipzig. It is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/). A deeper understanding of the Earth system as a whole and its interacting sub-systems depends not only on accurate mathematical approximations of the physical processes but also on the availability of environmental data across time and spatial scales. Even though advanced numerical simulations and satellite-based remote sensing in conjunction with sophisticated algorithms such as machine learning tools can provide 4D environmental datasets, local and mesoscale measurements continue to be the backbone in many disciplines such as hydrology. Considering the limitations of human and technical resources, monitoring strategies for these types of measurements should be well designed to increase the information gain they provide. One helpful set of tools to address these tasks is data exploration frameworks providing qualified data from different sources and tailoring available computational and visual methods to explore and analyse multi-parameter datasets. In this context, we developed a Smart Monitoring Workflow to determine the most suitable time and location for event-driven, ad-hoc monitoring in hydrology, using soil moisture measurements as our target variable. The Smart Monitoring Workflow consists of three main steps. First is the identification of the region of interest, either via user selection or via a recommendation based on spatial environmental parameters provided by the user. Statistical filters and different color schemes can be applied to highlight different regions. The second step is accessing time-dependent environmental parameters (e.g., rainfall and soil moisture estimates of the recent past, weather predictions from numerical weather models and swath forecasts from Earth observation satellites) for the region of interest and visualizing the results. Lastly, a detailed assessment of the region of interest is conducted by applying filter and weight functions in combination with multiple linear regressions on selected input parameters. Depending on the measurement objective (e.g. highest/lowest values, highest/lowest change), the most suitable areas for monitoring are subsequently highlighted visually (a small sketch of such a filtering and scoring step follows this record). In combination with the provided background map, an efficient route for monitoring can be planned directly in the exploration environment. The added value of the Smart Monitoring Workflow is manifold. The workflow gives the user a set of tools to visualize and process their data on a background map and in combination with data from public environmental datasets. For raster data from public databases, tailor-made routines are provided to access the data within the spatio-temporal limits required by the user. Aiming to facilitate the design of terrestrial monitoring campaigns, the platform- and device-independent approach of the workflow gives the user the flexibility to design a campaign at the desktop computer first and to refine it later in the field using mobile devices. In this context, the ability of the workflow to plot time series of forecast data for the region of interest empowers the user to react quickly to changing conditions, e.g. thunderstorm showers, by adapting the monitoring strategy if necessary. Finally, the integrated routing algorithm helps to calculate the duration of a planned campaign as well as the optimal driving route between often scattered monitoring locations.
    Description: TechnicalInfo
    Description: Copyright 2022 Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, Potsdam, Germany / DE Flood Event Explorer Licensed under the Apache License, Version 2.0 (the "License"); you may not use these files except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
    Keywords: Digital Earth ; Flood ; DASF ; Workflow ; smart monitoring ; campaign planning ; tocap ; EARTH SCIENCE 〉 HUMAN DIMENSIONS 〉 NATURAL HAZARDS 〉 FLOODS ; EARTH SCIENCE SERVICES 〉 DATA ANALYSIS AND VISUALIZATION
    Type: Software
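    A hedged sketch of the filtering and weighting step mentioned above: cells of a toy region-of-interest grid are filtered by accessibility and scored by combining forecast rainfall and recent soil moisture, and the best-scoring cell is reported. Weights, thresholds and data are invented assumptions, not the workflow's actual method.

      # Sketch: scoring grid cells for an ad-hoc soil moisture campaign. The
      # objective "largest expected change" favours dry cells expecting heavy rain.
      # All data, weights and thresholds are invented for illustration.
      import numpy as np

      rng = np.random.default_rng(0)
      shape = (50, 50)                                  # toy region-of-interest grid

      soil_moisture = rng.uniform(0.05, 0.45, shape)    # recent volumetric soil moisture
      forecast_rain = rng.gamma(2.0, 5.0, shape)        # forecast rainfall sum (mm)
      slope = rng.uniform(0.0, 25.0, shape)             # terrain slope (degrees)

      # Filter: exclude steep terrain that cannot be reached with the instruments.
      accessible = slope < 15.0

      # Weighting: combine normalized forecast rain and dryness into one score.
      score = 0.6 * forecast_rain / forecast_rain.max() + 0.4 * (1.0 - soil_moisture / soil_moisture.max())
      score = np.where(accessible, score, np.nan)

      row, col = np.unravel_index(np.nanargmax(score), shape)
      print(f"most suitable cell: row {row}, col {col}, score {np.nanmax(score):.2f}")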
  • 5
    Publication Date: 2022-02-01
    Description: Abstract
    Description: The Climate Change Workflow is part of the Flood Event Explorer (FEE, Eggert et al., 2022), developed at the GFZ German Research Centre for Geosciences in close collaboration with Helmholtz-Zentrum Hereon, Climate Service Center Germany. It is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/). The goal of the Climate Change Workflow is to support the analysis of climate-driven changes in flood-generating climate variables, such as precipitation or soil moisture, using regional climate model simulations from the Earth System Grid Federation (ESGF) data archive. It supports answering the geoscientific question: How does precipitation change over the course of the 21st century under different climate scenarios, compared to a 30-year reference period, over a certain region? Extraction of locally relevant data over a region of interest (ROI) requires climate expert knowledge and data processing training to correctly process large ensembles of climate model simulations; the Climate Change Workflow tackles this problem. It supports scientists in defining regions of interest, customizing their ensembles from the climate model simulations available on the Earth System Grid Federation (ESGF), and defining variables of interest and relevant time ranges. The Climate Change Workflow provides: (1) a weighted mask of the ROI; (2) weighted climate data of the ROI; (3) the time series evolution of the climate over the ROI for each ensemble member; (4) ensemble statistics of the projected change; and lastly, (5) an interactive visualization of the region's precipitation change projected by the ensemble of selected climate model simulations for different Representative Concentration Pathways (RCPs). (An illustrative sketch of the masking and ensemble statistics follows this record.) The visualization includes the temporal evolution of precipitation change over the course of the 21st century and statistical characteristics of the ensembles (e.g. median and various percentiles) for two selected 30-year periods in the middle and at the end of the 21st century. The added value of the Climate Change Workflow is threefold. First, there is a reduction in the number of different software programs necessary to extract locally relevant data. Second, the intuitive generation of and access to the weighted mask allows for the further development of locally relevant climate indices. Third, by allowing access to the locally relevant data at different stages of the data processing chain, scientists can work with a vastly reduced data volume, allowing for a greater number of climate model ensembles to be studied, which translates into greater scientific robustness. Thus, the Climate Change Workflow provides much easier access to an ensemble of high-resolution simulations of precipitation over a given ROI, presenting the region's projected precipitation change using standardized approaches and supporting the development of additional locally relevant climate indices.
    Description: TechnicalInfo
    Description: Copyright 2022 Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, Potsdam, Germany / DE Flood Event Explorer Licensed under the Apache License, Version 2.0 (the "License"); you may not use these files except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
    Keywords: Digital Earth ; Flood ; DASF ; Workflow ; Climate Change ; ESGF ; EARTH SCIENCE 〉 HUMAN DIMENSIONS 〉 NATURAL HAZARDS 〉 FLOODS ; EARTH SCIENCE SERVICES 〉 DATA ANALYSIS AND VISUALIZATION
    Type: Software
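    A minimal sketch of steps (1) to (4) above, using a synthetic toy ensemble instead of ESGF output: grid cells are weighted by cos(latitude) as a stand-in for a weighted ROI mask, the ROI time series is extracted per member, and ensemble statistics of the projected change between a reference and a future 30-year period are computed. Coordinates, members and periods are illustrative assumptions.

      # Sketch: area-weighted ROI mean precipitation for a toy ensemble and the
      # ensemble statistics of the projected change between two 30-year periods.
      # Synthetic data; the workflow itself operates on ESGF climate model output.
      import numpy as np
      import xarray as xr

      rng = np.random.default_rng(1)
      years = np.arange(1971, 2101)
      lat = np.linspace(47.0, 55.0, 9)
      lon = np.linspace(6.0, 15.0, 10)

      pr = xr.DataArray(
          rng.gamma(2.0, 1.0, (3, years.size, lat.size, lon.size)),
          coords={"member": ["m1", "m2", "m3"], "year": years, "lat": lat, "lon": lon},
          dims=("member", "year", "lat", "lon"),
          name="pr",
      )

      # (1)+(2): weight grid cells by cos(latitude) as a simple stand-in for a
      # weighted ROI mask, then (3) reduce to a ROI time series per member.
      weights = np.cos(np.deg2rad(pr.lat))
      roi_mean = pr.weighted(weights).mean(dim=("lat", "lon"))

      # (4): projected change between a reference and a future 30-year period.
      reference = roi_mean.sel(year=slice(1971, 2000)).mean("year")
      future = roi_mean.sel(year=slice(2071, 2100)).mean("year")
      change_percent = 100.0 * (future - reference) / reference

      print("median change [%]:", float(change_percent.median("member")))
      print("25th/75th percentile [%]:", change_percent.quantile([0.25, 0.75], dim="member").values)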
  • 6
    Publication Date: 2022-02-01
    Description: Abstract
    Description: The Digital Earth Flood Event Explorer supports geoscientists and experts in analysing flood events along the process cascade of event generation, evolution and impact, across atmospheric, terrestrial, and marine disciplines. It applies the concept of scientific workflows and the component-based Data Analytics Software Framework (DASF, Eggert and Dransch, 2021) to an exemplary showcase. It aims at answering the following geoscientific questions: How does precipitation change over the course of the 21st century under different climate scenarios over a certain region? What are the main hydro-meteorological controls of a specific flood event? What are useful indicators to assess socio-economic flood impacts? How do flood events impact the marine environment? What are the best monitoring sites for upcoming flood events? For each geoscientific question, the Flood Event Explorer provides a scientific workflow with enhanced analysis methods from statistics, machine learning, and visual data exploration; these methods are implemented in different languages and software environments and access data from a variety of distributed databases. The collaborating scientists come from different Helmholtz research centers and belong to different scientific fields such as hydrology, climate, marine and environmental science, and computer and data science. The Flood Event Explorer is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/).
    Description: TechnicalInfo
    Description: Copyright 2022 Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, Potsdam, Germany / DE Flood Event Explorer Licensed under the Apache License, Version 2.0 (the "License"); you may not use these files except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
    Keywords: Digital Earth ; Flood ; DASF ; Workflows ; EARTH SCIENCE 〉 HUMAN DIMENSIONS 〉 NATURAL HAZARDS 〉 FLOODS ; EARTH SCIENCE SERVICES 〉 DATA ANALYSIS AND VISUALIZATION
    Type: Software
  • 7
    Publication Date: 2022-05-20
    Description: Abstract
    Description: floodsimilarity provides classes and methods to conduct a similarity analysis between multiple flood events. The library consists of two main parts: (1) algorithms to compute indices and other statistics based on pandas and xarray, and (2) well-defined data structures for data exchange (e.g. through the Similarity Backend Module; an illustrative sketch follows this record). floodsimilarity is used by the Digital Earth Similarity Backend Module (Eggert, 2021) as part of the Digital Earth Flood Event Explorer. It is developed at the GFZ German Research Centre for Geosciences and funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project.
    Description: TechnicalInfo
    Description: Copyright © 2022 Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, Potsdam, Germany Licensed under the Apache License, Version 2.0 (the "License"); you may not use these files except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
    Keywords: Digital Earth ; Flood ; Flood Event Explorer ; EARTH SCIENCE 〉 TERRESTRIAL HYDROSPHERE 〉 SURFACE WATER 〉 FLOODS ; EARTH SCIENCE 〉 TERRESTRIAL HYDROSPHERE 〉 SURFACE WATER 〉 RUNOFF ; EARTH SCIENCE SERVICES 〉 DATA ANALYSIS AND VISUALIZATION
    Type: Software
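    A small sketch of what a "well-defined data structure for data exchange" could look like: a plain dataclass bundling one station's flood event indicators, serialized to JSON for a backend request. Field names and values are hypothetical and not floodsimilarity's actual data model.

      # Sketch: a hypothetical exchange structure for flood event indicators,
      # serialized to JSON. Not floodsimilarity's actual data model.
      import json
      from dataclasses import dataclass, asdict

      @dataclass
      class FloodEventIndicators:
          station_id: str
          event_start: str              # ISO date
          event_end: str                # ISO date
          peak_discharge_m3s: float
          return_period_years: float

      event = FloodEventIndicators(
          station_id="station-0001",
          event_start="2013-06-01",
          event_end="2013-06-20",
          peak_discharge_m3s=4100.0,
          return_period_years=90.0,
      )

      # Exchange as JSON, e.g. as the payload of a request to a backend module.
      print(json.dumps(asdict(event)))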
  • 8
    Publication Date: 2022-09-27
    Description: Abstract
    Description: DASF: Web is part of the Data Analytics Software Framework (DASF, https://git.geomar.de/digital-earth/dasf), developed at the GFZ German Research Centre for Geosciences (https://www.gfz-potsdam.de). It is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/). DASF: Web collects all web components for the data analytics software framework DASF. It provides ready-to-use interactive data visualization components such as time series charts, radar plots, stacked-parameter-relation (SPR) plots and more, as well as a powerful map component for the visualization of spatio-temporal data. Moreover, dasf-web includes the web bindings for the DASF RCP messaging protocol and therefore makes it possible to connect any algorithm or method (e.g. via the dasf-messaging-python implementation) to the included data visualization components. Because of the component-based architecture, the integrated methods can be deployed anywhere (e.g. close to the data they process), while the interactive data visualizations are executed on the local machine. dasf-web is implemented in Typescript and uses Vuejs/Vuetify, Openlayers and D3 as a technical basis.
    Description: TechnicalInfo
    Description: Copyright 2021 Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, Potsdam, Germany / DASF Data Analytics Software Framework Licensed under the Apache License, Version 2.0 (the "License"); you may not use these files except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
    Description: Other
    Description: The data analytics software framework DASF, developed at the GFZ German Research Centre for Geosciences and funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/), provides a framework for scientists to conduct data analysis in distributed environments. DASF supports scientists in conducting data analysis in distributed IT infrastructures by sharing data analysis tools and data. For this purpose, DASF defines a remote procedure call (RCP) messaging protocol that uses a central message broker instance. Scientists can augment their tools and data with this protocol to share them with others. DASF supports many programming languages and platforms since the implementation of the protocol uses WebSockets. It provides two ready-to-use language bindings for the messaging protocol, one for Python and one for the Typescript programming language. In order to share a Python method or class, users add an annotation in front of it. In addition, users need to specify the connection parameters of the message broker. The central message broker approach allows the method and the client calling the method to actively establish a connection, which enables using methods deployed behind firewalls. DASF uses Apache Pulsar (https://pulsar.apache.org/) as its underlying message broker. The Typescript bindings are primarily used in conjunction with web frontend components, which are also included in the DASF-Web library. They are designed to attach directly to the data returned by the exposed RCP methods. This supports the development of highly exploratory data analysis tools. DASF also provides a progress reporting API that enables users to monitor long-running remote procedure calls.
    Keywords: DASF ; Data Analytics Software Framework ; RCP ; remote procedure call ; interactive visualization ; web components ; typescript ; vuetify ; openlayers ; d3 ; EARTH SCIENCE SERVICES 〉 DATA ANALYSIS AND VISUALIZATION 〉 VISUALIZATION/IMAGE PROCESSING
    Type: Software
  • 9
    Publication Date: 2022-09-27
    Description: Abstract
    Description: The success of scientific projects increasingly depends on using data analysis tools and data in distributed IT infrastructures. Scientists need to use appropriate data analysis tools and data, extract patterns from data using appropriate computational resources, and interpret the extracted patterns. Data analysis tools and data reside on different machines because the volume of the data often demands specific resources for their storage and processing, and data analysis tools usually require specific computational resources and run-time environments. The data analytics software framework DASF, developed at the GFZ German Research Centre for Geosciences (https://www.gfz-potsdam.de) and funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/), provides a framework for scientists to conduct data analysis in distributed environments. The data analytics software framework DASF supports scientists in conducting data analysis in distributed IT infrastructures by sharing data analysis tools and data. For this purpose, DASF defines a remote procedure call (RCP) messaging protocol that uses a central message broker instance. Scientists can augment their tools and data with this protocol to share them with others. DASF supports many programming languages and platforms since the implementation of the protocol uses WebSockets. It provides two ready-to-use language bindings for the messaging protocol, one for Python and one for the Typescript programming language. In order to share a Python method or class, users add an annotation in front of it (see the schematic sketch following this record). In addition, users need to specify the connection parameters of the message broker. The central message broker approach allows the method and the client calling the method to actively establish a connection, which enables using methods deployed behind firewalls. DASF uses Apache Pulsar (https://pulsar.apache.org/) as its underlying message broker. The Typescript bindings are primarily used in conjunction with web frontend components, which are also included in the DASF-Web library. They are designed to attach directly to the data returned by the exposed RCP methods. This supports the development of highly exploratory data analysis tools. DASF also provides a progress reporting API that enables users to monitor long-running remote procedure calls. One application using the framework is the Digital Earth Flood Event Explorer (https://git.geomar.de/digital-earth/flood-event-explorer). The Digital Earth Flood Event Explorer integrates several exploratory data analysis tools and remote procedures deployed at various Helmholtz centers across Germany.
    Keywords: DASF ; RCP ; Python ; Progress ; Data Analytics Software Framework ; EARTH SCIENCE SERVICES 〉 DATA ANALYSIS AND VISUALIZATION ; EARTH SCIENCE SERVICES 〉 DATA ANALYSIS AND VISUALIZATION 〉 STATISTICAL APPLICATIONS ; EARTH SCIENCE SERVICES 〉 DATA MANAGEMENT/DATA HANDLING ; EARTH SCIENCE SERVICES 〉 DATA MANAGEMENT/DATA HANDLING 〉 DATA NETWORKING/DATA TRANSFER TOOLS
    Type: Software
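    A schematic sketch of the sharing pattern described above, i.e. annotating a Python method and specifying the broker connection parameters. The decorator, registry and connection class below are local placeholders written for this sketch; they are not the actual dasf-messaging-python API, whose names differ.

      # Schematic sketch of the DASF sharing pattern. Everything defined here is a
      # local PLACEHOLDER for illustration; it is not the dasf-messaging-python API.
      from dataclasses import dataclass
      from typing import Callable, Dict

      @dataclass
      class BrokerConnection:
          """Connection parameters of the central message broker (placeholder)."""
          host: str
          port: int
          topic: str

      REGISTRY: Dict[str, Callable] = {}

      def shared_method(func: Callable) -> Callable:
          """Placeholder for the annotation that exposes a method via the broker."""
          REGISTRY[func.__name__] = func
          return func

      @shared_method
      def mean_discharge(values: list) -> float:
          """An analysis method that remote clients could call through the broker."""
          return sum(values) / len(values)

      # Connection parameters the backend module would use to reach the broker.
      connection = BrokerConnection(host="pulsar.example.org", port=6650, topic="flood-analysis")

      # A real backend module would now connect and listen for RCP requests;
      # here the registered method is simply called locally.
      print(REGISTRY["mean_discharge"]([1200.0, 3400.0, 2800.0]))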
  • 10
    Publication Date: 2022-09-27
    Description: Abstract
    Description: DASF: Messaging Python is part of the Data Analytics Software Framework (DASF, https://git.geomar.de/digital-earth/dasf), developed at the GFZ German Research Centre for Geosciences. It is funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/). DASF: Messaging Python is an RCP (remote procedure call) wrapper library for the Python programming language. As part of the data analytics software framework DASF, it implements the DASF RCP messaging protocol. This message-broker-based RCP implementation supports the integration of algorithms and methods implemented in Python in a distributed environment. It utilizes pydantic (https://pydantic-docs.helpmanual.io/) for data and model validation using Python type annotations (a small validation sketch follows this record). Currently, the implementation relies on Apache Pulsar (https://pulsar.apache.org/) as a central message broker instance.
    Description: TechnicalInfo
    Description: Copyright 2021 Helmholtz Centre Potsdam GFZ German Research Centre for Geosciences, Potsdam, Germany / DASF Data Analytics Software Framework Licensed under the Apache License, Version 2.0 (the "License"); you may not use these files except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
    Description: Other
    Description: The data analytics software framework DASF, developed at the GFZ German Research Centre for Geosciences (https://www.gfz-potsdam.de) and funded by the Initiative and Networking Fund of the Helmholtz Association through the Digital Earth project (https://www.digitalearth-hgf.de/), provides a framework for scientists to conduct data analysis in distributed environments. DASF supports scientists in conducting data analysis in distributed IT infrastructures by sharing data analysis tools and data. For this purpose, DASF defines a remote procedure call (RCP) messaging protocol that uses a central message broker instance. Scientists can augment their tools and data with this protocol to share them with others. DASF supports many programming languages and platforms since the implementation of the protocol uses WebSockets. It provides two ready-to-use language bindings for the messaging protocol, one for Python and one for the Typescript programming language. In order to share a Python method or class, users add an annotation in front of it. In addition, users need to specify the connection parameters of the message broker. The central message broker approach allows the method and the client calling the method to actively establish a connection, which enables using methods deployed behind firewalls. DASF uses Apache Pulsar (https://pulsar.apache.org/) as its underlying message broker. The Typescript bindings are primarily used in conjunction with web frontend components, which are also included in the DASF-Web library. They are designed to attach directly to the data returned by the exposed RCP methods. This supports the development of highly exploratory data analysis tools. DASF also provides a progress reporting API that enables users to monitor long-running remote procedure calls.
    Keywords: DASF ; Data Analytics Software Framework ; RCP ; remote procedure call ; message broker ; distributed analysis ; python ; EARTH SCIENCE SERVICES 〉 DATA MANAGEMENT/DATA HANDLING ; EARTH SCIENCE SERVICES 〉 DATA MANAGEMENT/DATA HANDLING 〉 DATA NETWORKING/DATA TRANSFER TOOLS
    Type: Software
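    Since the abstract mentions pydantic-based validation from Python type annotations, here is a minimal sketch of that mechanism with an invented request model; the model and its fields are illustrative and not part of the DASF protocol.

      # Sketch: pydantic validates fields from type annotations before a request is
      # handled. The model and field names are invented for illustration.
      from pydantic import BaseModel, ValidationError

      class TimeSeriesRequest(BaseModel):
          station_id: str
          start_year: int
          end_year: int

      # Valid payload: the numeric string is coerced to an integer.
      print(TimeSeriesRequest(station_id="de-1234", start_year="1971", end_year=2000))

      # Invalid payload: pydantic reports which field failed and why.
      try:
          TimeSeriesRequest(station_id="de-1234", start_year="nineteen", end_year=2000)
      except ValidationError as err:
          print(err)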