ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Publication Date: 2012-02-27
    Description: The development of methods for estimating the parameters of hydrologic models under uncertainty has been of high interest in hydrologic research in recent years. In particular, methods that understand the estimation of hydrologic model parameters as a geometric search for a set of robustly performing parameter vectors, using the concept of data depth, have found growing research interest. Bárdossy and Singh (2008) presented a first Robust Parameter Estimation method (ROPE) and applied it to the calibration of a conceptual rainfall-runoff model with a daily time step. The basic idea of this algorithm is to identify a set of model parameter vectors with high model performance, called good parameters, and subsequently to generate a set of parameter vectors with high data depth with respect to the first set. Both steps are repeated iteratively until a stopping criterion is met (a minimal sketch of this depth-based iteration is given after the results list). The results of that case study show the high potential of the principle of data depth for the estimation of hydrologic model parameters. In this paper we present further developments that address the most important shortcomings of the original ROPE approach. We developed a stratified depth-based sampling approach that improves sampling from non-elliptic and multi-modal distributions and is more efficient at sampling deep points in higher-dimensional parameter spaces. Another modification addresses the problem of excessive shrinking of the estimated set of robust parameter vectors, which might lead to overfitting when the model is calibrated with a small amount of calibration data and thus contradicts the principle of robustness. We therefore suggest splitting the available calibration data into two sets and using one set to control overfitting. All modifications were implemented in a further developed ROPE approach called Advanced Robust Parameter Estimation (AROPE). However, in this approach the estimation of the good parameters is still based on an inefficient Monte Carlo approach. We therefore developed another approach, ROPE with Particle Swarm Optimisation (ROPE-PSO), which substitutes a more effective and efficient approach based on particle swarm optimisation for the Monte Carlo step. Two case studies demonstrate the improvements of the developed algorithms over the first ROPE approach and two other classical optimisation approaches for the calibration of a process-oriented hydrologic model with an hourly time step. The focus of both case studies is on modelling flood events in a small catchment characterised by extreme process dynamics. The calibration problem was repeated with higher dimensionality, considering the uncertainty in the soil hydraulic parameters and another conceptual parameter of the soil module. We discuss the estimated results and propose further possibilities for applying ROPE as a well-founded parameter estimation and uncertainty analysis tool.
    Print ISSN: 1027-5606
    Electronic ISSN: 1607-7938
    Topics: Geography, Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
  • 2
    Publication Date: 2012-10-15
    Description: Process-oriented rainfall-runoff models are designed to approximate the complex hydrologic processes within a specific catchment and, in particular, to simulate the discharge at the catchment outlet. Most of these models exhibit a high degree of complexity and require the determination of various parameters by calibration. Recently, automatic calibration methods have become popular for identifying parameter vectors with high corresponding model performance. The model performance is often assessed by a purpose-oriented objective function. Practical experience suggests that in many situations a single objective function cannot adequately describe the model's ability to represent every aspect of the catchment's behaviour, regardless of whether the objective is aggregated from several criteria that measure different (possibly opposing) aspects of the system behaviour. One strategy to circumvent this problem is to define multiple objective functions and to apply a multi-objective optimisation algorithm to identify the set of Pareto-optimal or non-dominated solutions (a minimal non-dominated filtering sketch is given after the results list). Nonetheless, automatic calibration procedures that treat model calibration purely as an optimisation problem have a major disadvantage: owing to the complex-shaped response surface, the estimated solution of the optimisation problem can consist of different near-optimum parameter vectors that lead to very different performance on the validation data. Bárdossy and Singh (2008) studied this problem for single-objective calibration problems using the example of hydrological models and proposed a geometrical sampling approach called Robust Parameter Estimation (ROPE). This approach applies the concept of data depth in order to overcome the shortcomings of automatic calibration procedures and to find a set of robust parameter vectors. Recent studies confirmed the effectiveness of this method. However, all ROPE approaches published so far identify robust model parameter vectors with respect to only a single objective; multiple objectives can only be considered by aggregation. In this paper, we present an approach that combines the principles of multi-objective optimisation and depth-based sampling, entitled Multi-Objective Robust Parameter Estimation (MOROPE). It applies a multi-objective optimisation algorithm to identify non-dominated robust model parameter vectors and subsequently samples parameter vectors with high data depth using a further developed sampling algorithm presented in Krauße and Cullmann (2012a). We study the effectiveness of the proposed method using synthetic test functions and for the calibration of a distributed hydrologic model, with focus on flood events in a small, pre-alpine, fast-responding catchment in Switzerland.
    Print ISSN: 1027-5606
    Electronic ISSN: 1607-7938
    Topics: Geography, Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
  • 3
    Publication Date: 2011-04-15
    Description: Process-oriented rainfall-runoff models are designed to approximate the complex hydrologic processes within a specific catchment and, in particular, to simulate the discharge at the catchment outlet. Most of these models exhibit a high degree of complexity and require the determination of various parameters by calibration. Recently, automatic calibration methods have become popular for identifying parameter vectors with high corresponding model performance. The model performance is often assessed by a purpose-oriented objective function. Practical experience suggests that in many situations a single objective function cannot adequately describe the model's ability to represent every aspect of the catchment's behaviour, regardless of whether the objective is aggregated from several criteria that measure different (possibly opposing) aspects of the system behaviour. One strategy to circumvent this problem is to define multiple objective functions and to apply a multi-objective optimisation algorithm to identify the set of Pareto-optimal or non-dominated solutions. One possible approach to estimate the Pareto set effectively and efficiently is particle swarm optimisation (PSO), which has already been successfully applied in various other fields and has been reported to show effective and efficient performance. Krauße and Cullmann (2011b) presented a method entitled ROPEPSO that merges the strengths of PSO and data depth measures in order to identify robust parameter vectors for hydrological models. In this paper we present a multi-objective parameter estimation algorithm, entitled Multi-Objective Robust Particle Swarm Parameter Estimation (MO-ROPE). The algorithm is a further development of the previously mentioned single-objective ROPEPSO approach. It applies a newly developed multi-objective particle swarm optimisation algorithm to identify non-dominated robust model parameter vectors and subsequently samples robust parameter vectors by the application of data depth metrics. In a preliminary assessment, MO-PSO-GA is compared with other multi-objective optimisation algorithms. In a real-world case study, MO-ROPE is applied to identify robust parameter vectors of a distributed hydrological model, with focus on flood events in a small, pre-alpine, fast-responding catchment in Switzerland, and the method is compared with existing robust parameter estimation methods.
    Print ISSN: 1812-2108
    Electronic ISSN: 1812-2116
    Topics: Geography, Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
  • 4
    Publication Date: 2011-03-07
    Description: The development of methods for estimating the parameters of hydrological models under uncertainty has been of high interest in hydrological research in recent years. In particular, methods that understand the estimation of hydrological model parameters as a geometric search for a set of robustly performing parameter vectors, using the concept of data depth, have found growing research interest. Bárdossy and Singh (2008) presented a first proposal and applied it to the calibration of a conceptual rainfall-runoff model with a daily time step. Krauße and Cullmann (2011) further developed this method and applied it in a case study to calibrate a process-oriented hydrological model with an hourly time step, focussing on flood events in a fast-responding catchment. The results of both studies showed the potential of applying the principle of data depth. However, the weak point of the presented approach also became obvious. The algorithm identifies a set of model parameter vectors with high model performance and subsequently generates a set of parameter vectors with high data depth with respect to the first set; both steps are repeated iteratively until a stopping criterion is met. In the first step, the estimation of the good parameter vectors is based on the Monte Carlo method, whose major shortcoming is that the required number of samples grows exponentially with the dimensionality of the problem. In this paper we present another robust parameter estimation strategy that applies a proven search strategy for high-dimensional parameter spaces, particle swarm optimisation, in order to identify a set of good parameter vectors within given uncertainty bounds (a minimal PSO sketch is given after the results list). The generation of deep parameters follows Krauße and Cullmann (2011). The method was compared with the Monte Carlo based robust parameter estimation algorithm on the case study of Krauße and Cullmann (2011), calibrating the process-oriented distributed hydrological model for flood forecasting in a small catchment characterised by extreme process dynamics. In a second case study the comparison is repeated on a problem with higher dimensionality, considering further parameters of the soil module.
    Print ISSN: 1812-2108
    Electronic ISSN: 1812-2116
    Topics: Geography, Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
  • 5
    Publication Date: 2015-09-10
    Description: Global climate change can have impacts on the characteristics of rainfall-runoff events and subsequently on the hydrological regime. Meanwhile, the catchment itself changes due to anthropogenic influences. In this context, it can be meaningful to detect the temporal changes of catchments independently of climate change by investigating existing long-term discharge records. For this purpose, a new stochastic system based on copulas for time series analysis is introduced. While widely used time series models are based on linear combinations of correlations and assume Gaussian behavior of the variables, a statistical tool like the copula has the advantage of scrutinizing the dependence structure of the data in the uniform domain, independently of the marginals. Two measures in the copula domain are introduced herein: (1) copula asymmetry, calculated for the discharges, which describes the non-symmetric property of the dependence structure and differs from one catchment to another due to the intrinsic nature of both runoff and catchment; and (2) copula distance, defined as a Cramér-von Mises type distance between two copula densities at different time scales, which describes the variability and interdependency of dependence structures, analogous to variance and covariance, and can assist in identifying catchment changes (minimal sketches of both measures are given after the results list). These measures are calculated for 100 years of daily discharges of the Rhine rivers. Comparing the results of copula asymmetry and copula distance between an antecedent precipitation index (API) and discharge time series simulated by a hydrological model, we can show interesting signals of systematic modifications along the Rhine rivers over the last 30 years.
    Print ISSN: 1812-2108
    Electronic ISSN: 1812-2116
    Topics: Geography, Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
  • 6
    Publication Date: 2016-07-11
    Description: Global climate change can have impacts on characteristics of rainfall–runoff events and subsequently on the hydrological regime. Meanwhile, the catchment itself changes due to anthropogenic influences. However, it is not easy to prove the link between the hydrology and the forcings. In this context, it might be meaningful to detect the temporal changes of catchments independently of climate change by investigating existing long-term discharge records. For this purpose, a new stochastic system based on copulas for time series analysis is introduced in this study. A statistical tool like the copula has the advantage of scrutinizing the dependence structure of the data and thus can be used to attribute the catchment behavior by focusing on the following statistics defined in the copula domain: (1) copula asymmetry, which can capture the non-symmetric property of discharge data and differs from one catchment to another due to the intrinsic nature of both runoff and catchment; and (2) copula distances, which can assist in identifying catchment change by revealing the variability and interdependency of dependence structures. These measures were calculated for 100 years of daily discharges of the Rhine River, and the analyses detected epochs of change in the flow sequences. In a follow-up study, we compared the results of copula asymmetry and copula distance applied to two flow models: (i) an antecedent precipitation index (API) and (ii) a simulated discharge time series generated by a hydrological model. The results of the copula-based analysis of hydrological time series seem to support the assumption that the Neckar catchment started to change around 1976 and remained unusual until 1990.
    Print ISSN: 1027-5606
    Electronic ISSN: 1607-7938
    Topics: Geography, Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
  • 7
    Publication Date: 2011-03-07
    Description: The development of methods for estimating the parameters of hydrological models under uncertainty has been of high interest in hydrological research in recent years. Besides the very popular Markov chain Monte Carlo (MCMC) methods, which estimate the uncertainty of model parameters in the setting of a Bayesian framework, the development of depth-based sampling methods, also entitled robust parameter estimation (ROPE), has attracted increasing research interest. These methods understand the estimation of model parameters as a geometric search for a set of robustly performing parameter vectors by applying the concept of data depth. Recent studies showed that the parameter vectors estimated by depth-based sampling perform more robustly in validation. One major advantage of this kind of approach over MCMC methods is that the formulation of a likelihood function within a Bayesian uncertainty framework becomes obsolete, and arbitrary purpose-oriented performance criteria defined by the user can be integrated without further complications. In this paper we present an advanced ROPE method, entitled the Advanced Robust Parameter Estimation by Monte Carlo algorithm (AROPEMC). The AROPEMC algorithm is a modified version of the original robust parameter estimation algorithm ROPEMC developed by Bárdossy and Singh (2008). AROPEMC merges iterative Monte Carlo simulation to identify well-performing parameter vectors, the sampling of robust parameter vectors according to the principle of data depth, and a well-founded stopping criterion borrowed from supervised machine learning. The principles of the algorithm are illustrated by means of Rosenbrock's and Rastrigin's functions, two well-known benchmarks for optimisation algorithms (both are given, in their common forms, after the results list). Two case studies demonstrate the advantage of AROPEMC over state-of-the-art global optimisation algorithms: a distributed process-oriented hydrological model is calibrated and validated for flood forecasting in a small catchment characterised by extreme process dynamics.
    Print ISSN: 1812-2108
    Electronic ISSN: 1812-2116
    Topics: Geography, Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
  • 8
    Publication Date: 2006-09-26
    Description: WaSiM-ETH (Gurtz et al., 2001), a widely used water balance simulation model, is tested for its suitability for flow analysis in the context of rainfall-runoff modelling and flood forecasting. In this paper, special focus is placed on the resolution of the process domain in space as well as in time. We try to couple model runs with different calculation time steps in order to reduce the effort arising from calculating the whole flow hydrograph at the hourly time step: we aim at modelling on the daily time step for water balance purposes and switching to the hourly time step whenever high-resolution information is necessary (flood forecasting). WaSiM-ETH is used at different grid resolutions in order to assess whether the model can be transferred across spatial resolutions. We further use two different approaches for calculating the overland flow time within the sub-basins of the test watershed to gain insight into the process dynamics portrayed by the model. Our findings indicate that the model is very sensitive to temporal and spatial resolution and cannot be transferred across scales without recalibration.
    Print ISSN: 1680-7340
    Electronic ISSN: 1680-7359
    Topics: Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
  • 9
    Publication Date: 2006-09-26
    Description: The one-dimensional hydrodynamic modelling system HEC-RAS has been applied to model flood routing in the lower reaches of the Freiberger Mulde river and its tributaries. Furthermore, this model was used to generate a database to train multilayer feedforward networks. To guarantee numerical stability of the hydrodynamic modelling of some 60 km of stream course, an adequate spatial resolution requires very small calculation time steps, about two orders of magnitude smaller than the input data resolution. This leads to quite high computational requirements, seriously restricting the application, especially in real-time operations such as online flood forecasting. In order to solve this problem we tested the application of artificial neural networks (ANN). First studies show the ability of adequately trained multilayer feedforward networks (MLFN) to reproduce the model performance (a minimal MLFN emulation sketch is given after the results list).
    Print ISSN: 1680-7340
    Electronic ISSN: 1680-7359
    Topics: Geosciences
    Published by Copernicus on behalf of the European Geosciences Union.
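The records above describe several algorithmic building blocks. The short Python sketches below illustrate some of them under simplifying assumptions; they are illustrative sketches, not the authors' implementations, and every function name, variable, and parameter value in them is hypothetical. The first sketch shows the depth-based ROPE loop referred to in records 1, 4 and 7: alternately keep well-performing ("good") parameter vectors, then sample new vectors that lie deep within that set. Mahalanobis depth is used here as a simple stand-in for the halfspace (Tukey) depth used in the ROPE literature.

```python
import numpy as np

def mahalanobis_depth(points, reference):
    """Depth of each point w.r.t. the reference cloud: 1 / (1 + squared Mahalanobis distance)."""
    mean = reference.mean(axis=0)
    cov = np.cov(reference, rowvar=False) + 1e-9 * np.eye(reference.shape[1])
    inv = np.linalg.inv(cov)
    diff = points - mean
    d2 = np.einsum("ij,jk,ik->i", diff, inv, diff)
    return 1.0 / (1.0 + d2)

def rope_sketch(model_performance, bounds, n_sample=2000, keep_frac=0.1, n_iter=5, seed=None):
    """Toy depth-based parameter search; `model_performance` is maximised."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T           # bounds: one (low, high) pair per parameter
    candidates = rng.uniform(lo, hi, size=(n_sample, len(lo)))
    for _ in range(n_iter):
        # Step 1: keep the best-performing fraction ("good" parameter vectors).
        perf = np.array([model_performance(x) for x in candidates])
        n_keep = max(1, int(keep_frac * len(candidates)))
        good = candidates[np.argsort(perf)[-n_keep:]]
        # Step 2: draw new candidates inside the good set's bounding box and keep
        # those with high data depth with respect to the good set.
        new = rng.uniform(good.min(axis=0), good.max(axis=0), size=(n_sample, len(lo)))
        depth = mahalanobis_depth(new, good)
        candidates = new[depth > np.median(depth)]
    return candidates                               # the "robust" parameter vectors

# Toy usage: a 4-dimensional quadratic "model performance" peaking at 0.3 in every parameter.
robust = rope_sketch(lambda x: -np.sum((x - 0.3) ** 2), bounds=[(0.0, 1.0)] * 4, seed=1)
```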
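Records 2 and 3 rest on the notion of Pareto-optimal (non-dominated) solutions. The following minimal sketch of non-dominated filtering assumes all objectives are to be minimised; `non_dominated` is a hypothetical helper, not part of MOROPE or MO-ROPE.

```python
import numpy as np

def non_dominated(objectives):
    """Boolean mask of the rows of `objectives` (all minimised) that no other row dominates."""
    obj = np.asarray(objectives, float)
    mask = np.ones(obj.shape[0], dtype=bool)
    for i in range(obj.shape[0]):
        # Row j dominates row i if it is <= in every objective and < in at least one.
        dominators = np.all(obj <= obj[i], axis=1) & np.any(obj < obj[i], axis=1)
        mask[i] = not np.any(dominators)
    return mask

# Example: two conflicting criteria, e.g. peak error vs. volume error of simulated floods.
scores = np.array([[0.2, 0.9], [0.4, 0.4], [0.3, 0.8], [0.9, 0.1], [0.5, 0.5]])
print(scores[non_dominated(scores)])  # [0.5, 0.5] is dropped; it is dominated by [0.4, 0.4]
```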
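Records 3, 4 and 7 use particle swarm optimisation as the search engine for good parameter vectors. This is a minimal global-best PSO sketch for minimising a function over box bounds, with common textbook default parameters; it is not the ROPE-PSO or MO-ROPE configuration.

```python
import numpy as np

def pso(f, bounds, n_particles=30, n_iter=200, w=0.7, c1=1.5, c2=1.5, seed=None):
    """Global-best PSO minimising f over box bounds; returns the best position and value."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, float).T
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))    # particle positions
    v = np.zeros_like(x)                                    # particle velocities
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, *x.shape))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                          # keep particles inside the bounds
        val = np.array([f(p) for p in x])
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()
```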
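Records 5 and 6 introduce copula asymmetry and a Cramér-von Mises type copula distance. The sketch below is a strongly simplified stand-in: pseudo-observations are obtained by rank transformation, asymmetry is taken as a third moment of the lagged pseudo-observations, and the distance compares empirical copulas (not copula densities) on a grid. The exact definitions in the cited papers may differ.

```python
import numpy as np

def pseudo_obs(x):
    """Rank-transform a sample to (0, 1): margins are removed, only the dependence remains."""
    ranks = np.argsort(np.argsort(x)) + 1
    return ranks / (len(x) + 1.0)

def lagged_pairs(q, lag=1):
    """Pseudo-observations (u, v) of the pair (Q_t, Q_{t+lag})."""
    return pseudo_obs(q[:-lag]), pseudo_obs(q[lag:])

def copula_asymmetry(q, lag=1):
    """Third-moment asymmetry of the lagged dependence structure (0 for a symmetric rise/fall)."""
    u, v = lagged_pairs(q, lag)
    return np.mean((u - v) ** 3)

def empirical_copula(u, v, grid):
    """Empirical copula C(a, b) = P(U <= a, V <= b) evaluated on a regular grid."""
    return np.array([[np.mean((u <= a) & (v <= b)) for b in grid] for a in grid])

def copula_distance(q1, q2, lag=1, n_grid=20):
    """Cramer-von Mises type distance between the empirical copulas of two series."""
    grid = np.linspace(0.05, 0.95, n_grid)
    c1 = empirical_copula(*lagged_pairs(q1, lag), grid)
    c2 = empirical_copula(*lagged_pairs(q2, lag), grid)
    return np.mean((c1 - c2) ** 2)

# Toy usage on a synthetic "discharge" series and an independently shuffled copy of it.
rng = np.random.default_rng(1)
q = np.exp(np.convolve(rng.normal(size=1000), np.ones(5) / 5, mode="same"))
print(copula_asymmetry(q), copula_distance(q, rng.permutation(q)))
```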
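Record 7 illustrates AROPEMC on Rosenbrock's and Rastrigin's functions. Their common textbook forms are given below; the usage comment assumes the PSO sketch above.

```python
import numpy as np

def rosenbrock(x):
    """sum_i [ 100 (x_{i+1} - x_i^2)^2 + (1 - x_i)^2 ]; minimum 0 at x = (1, ..., 1)."""
    x = np.asarray(x, float)
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def rastrigin(x):
    """10 n + sum_i [ x_i^2 - 10 cos(2 pi x_i) ]; minimum 0 at x = (0, ..., 0)."""
    x = np.asarray(x, float)
    return 10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x))

# With the PSO sketch above, e.g. pso(rosenbrock, bounds=[(-5.0, 5.0)] * 2, seed=1) should
# approach the known minimum at (1, 1); Rastrigin's many local minima make it much harder.
```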
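Record 9 trains multilayer feedforward networks (MLFN) to reproduce the output of a database of HEC-RAS runs. The sketch below shows the general emulation pattern with scikit-learn's MLPRegressor on synthetic placeholder data; the real predictors, targets and network layout of the study are not reproduced here.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Placeholder training database: in the study it was generated with HEC-RAS. Predictors could
# be upstream discharges or stages; the target is the routed downstream discharge. Here a
# smooth synthetic function stands in for the hydrodynamic model.
X = rng.uniform(0.0, 1.0, size=(2000, 3))
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] ** 2 + 0.1 * np.sin(3.0 * X[:, 2])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A small multilayer feedforward network; once trained, a prediction is far cheaper than the
# hydrodynamic model run it is meant to reproduce.
mlfn = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=3000, random_state=0)
mlfn.fit(X_train, y_train)
print("R^2 on held-out runs:", mlfn.score(X_test, y_test))
```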