ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Publication Date: 2014-08-22
    Description: Wave climates are fundamental drivers of coastal vulnerability; changing trends in wave heights, periods and directions can severely impact a coastline. In a diverse storm environment, the changes in these parameters are difficult to detect and quantify. Since wave climates are linked to atmospheric circulation patterns, an automated and objective classification scheme was developed to explore links between synoptic-scale circulation patterns and wave climate variables, specifically wave heights. The algorithm uses a set of objective functions based on wave heights to guide the classification and find atmospheric classes with strong links to wave behaviour. Spatially distributed fuzzy numbers define the classes and are used to detect locally high- and low-pressure anomalies. Classes are derived through a process of simulated annealing. The optimized classification focuses on extreme wave events. The east coast of South Africa was used as a case study. The results show that three dominant patterns drive extreme wave events. The circulation patterns exhibit some seasonality with one pattern present throughout the year. Some 50–80% of the extreme wave events are explained by these three patterns. It is evident that strong low-pressure anomalies east of the country drive a wind towards the KwaZulu-Natal coastline which results in extreme wave conditions. We conclude that the methodology can be used to link circulation patterns to wave heights within a diverse storm environment. The circulation patterns agree with qualitative observations of wave climate drivers. There are applications to the assessment of coastal vulnerability and the management of coastlines worldwide.
    Print ISSN: 1561-8633
    Electronic ISSN: 1684-9981
    Topics: Geography , Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
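The record above says the wave-climate classes are derived through simulated annealing. As a minimal generic sketch of that optimisation scheme (a toy quadratic objective stands in for the paper's wave-height objective functions and fuzzy-number classes, which are not given here):

```python
import math
import random

def simulated_annealing(objective, state, neighbour, steps=10_000, t0=1.0, t_end=1e-3):
    """Minimise `objective` by simulated annealing with geometric cooling."""
    best = current = state
    best_cost = current_cost = objective(state)
    cooling = (t_end / t0) ** (1.0 / steps)  # geometric cooling schedule
    t = t0
    for _ in range(steps):
        candidate = neighbour(current)
        cost = objective(candidate)
        # Always accept improvements; accept worse states with Boltzmann probability
        if cost < current_cost or random.random() < math.exp((current_cost - cost) / t):
            current, current_cost = candidate, cost
            if cost < best_cost:
                best, best_cost = candidate, cost
        t *= cooling
    return best, best_cost

# Toy usage: minimise a 1-D quadratic with a random-walk neighbour move
random.seed(0)
x_opt, f_opt = simulated_annealing(
    objective=lambda x: (x - 3.0) ** 2,
    state=0.0,
    neighbour=lambda x: x + random.uniform(-0.5, 0.5))
```

In the paper's setting, `objective` would score a candidate classification by how well its classes separate extreme wave heights.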
  • 2
    Publication Date: 2013-02-20
    Description: In this paper we present a novel approach for flood hazard analysis of the whole Mekong Delta with a particular focus on the Vietnamese part. Based on previous studies identifying the flood regime in the Mekong delta as non-stationary (Delgado et al., 2010), we develop a non-stationary approach for flood hazard analysis. Moreover, the approach is also bi-variate, as the flood severity in the Mekong Delta is determined by both maximum discharge and flood volume, which determines the flood duration. Probabilities of occurrences of peak discharge and flood volume are estimated by a copula. The flood discharges and volumes are used to derive synthetic hydrographs, which in turn constitute the upper boundary condition for a large-scale hydrodynamic model covering the whole Mekong Delta. The hydrodynamic model transforms the hydrographs into hazard maps. In addition, we extrapolate the observed trends in flood peak and volume and their associated non-stationary extreme value distributions to the year 2030 in order to give a flood hazard estimate for the near future. The uncertainty of extreme flood events in terms of different possible combinations of peak discharge and flood volume given by the copula is considered. Also, the uncertainty in flood hydrograph shape is combined with parameter uncertainty of the hydrodynamic model in a Monte Carlo framework yielding uncertainty estimates in terms of quantile flood maps. The proposed methodology sets the frame for the development of probabilistic flood hazard maps for the entire Mekong Delta. The combination of bi-variate, non-stationary extreme value statistics with large-scale flood inundation modeling and uncertainty quantification is novel in itself. Moreover, it is in particular novel for the Mekong Delta: a region where not even a standard hazard analysis based on a univariate, stationary extreme value statistic exists.
    Electronic ISSN: 2195-9269
    Topics: Geography , Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
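The record above couples peak discharge and flood volume through a copula. A minimal sketch of sampling such a joint model, assuming a Gaussian copula and illustrative exponential marginals (the paper's fitted copula family and marginals are not given here):

```python
import math
import numpy as np

def gaussian_copula_sample(n, rho, ppf_x, ppf_y, rng):
    """Draw joint samples whose dependence follows a Gaussian copula with
    correlation `rho` and whose marginals are the inverse CDFs ppf_x, ppf_y."""
    # Correlated standard normals (2x2 Cholesky written out by hand)
    z = rng.standard_normal((n, 2))
    z[:, 1] = rho * z[:, 0] + math.sqrt(1.0 - rho ** 2) * z[:, 1]
    # Probability integral transform: normal -> uniform -> target marginal
    u = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))
    u = np.clip(u, 1e-12, 1.0 - 1e-12)  # guard against u hitting 0 or 1
    return ppf_x(u[:, 0]), ppf_y(u[:, 1])

rng = np.random.default_rng(1)
# Hypothetical exponential marginals as stand-ins for the fitted distributions
# of peak discharge (m^3/s) and flood volume (km^3); rho = 0.8 is illustrative.
peak, vol = gaussian_copula_sample(
    50_000, rho=0.8,
    ppf_x=lambda u: -10_000.0 * np.log1p(-u),
    ppf_y=lambda u: -50.0 * np.log1p(-u),
    rng=rng)
```

Each sampled (peak, volume) pair would then parameterise one synthetic hydrograph for the hydrodynamic model.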
  • 3
    Publication Date: 2014-02-05
    Description: Wave climates are fundamental drivers of coastal vulnerability, and changing trends in wave height, period and direction can severely impact coastlines. In a diverse storm environment, the changes in these parameters are difficult to detect and quantify. Since wave climates are linked to atmospheric circulation patterns, an automated and objective classification scheme was developed to explore links between synoptic-scale circulation patterns and wave climate variables, specifically wave heights. The algorithm uses a set of objective functions based on wave heights to guide the classification. Fuzzy rules define classification types that are used to detect locally high- and low-pressure anomalies through a process of simulated annealing. The optimized classification focuses on extreme wave events. The east coast of South Africa was used as a case study. The results show that three dominant patterns drive extreme wave events. The circulation patterns exhibit some seasonality, with one pattern present throughout the year. Some 50–80% of the extreme wave events are explained by these three patterns. It is evident that strong low-pressure anomalies east of the country drive a wind towards the KwaZulu-Natal coastline which results in extreme wave conditions. We conclude that the methodology can be used to link circulation patterns to wave heights within a diverse storm environment. The circulation patterns agree with qualitative observations of wave climate drivers. There are applications to the assessment of coastal vulnerability and the management of coastlines worldwide.
    Electronic ISSN: 2195-9269
    Topics: Geography , Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
  • 4
    Publication Date: 2007-04-03
    Description: A dynamical downscaling scheme is usually used to provide a short-range flood forecasting system with high-resolution precipitation fields. Unfortunately, a single forecast of this scheme has a high uncertainty concerning intensity and location, especially during extreme events. Alternatively, statistical downscaling techniques like the analogue method can be used, which can supply probabilistic forecasts. However, the performance of the analogue method is affected by the similarity criterion, which is used to identify similar weather situations. To investigate this issue, three different similarity measures are tested in this work: the Euclidean distance (1), the Pearson correlation (2) and a combination of both measures (3). The predictor variables are geopotential height at the 1000 and 700 hPa levels and specific humidity fluxes at the 700 hPa level, derived from the NCEP/NCAR reanalysis project. The study is performed for three mesoscale catchments located in the Rhine basin in Germany. It is validated by a jackknife method for a period of 44 years (1958–2001). The ranked probability skill score, the Brier skill score, the Heidke skill score and the confidence interval of the Cramer association coefficient are calculated to evaluate the system for extreme events. The results show that the combined similarity measure yields the best results in predicting extreme events. However, the confidence interval of the Cramer coefficient indicates that this improvement is only significant compared to the Pearson correlation, not compared to the Euclidean distance. Furthermore, the performance of the presented forecasting system is very low during the summer, and new predictors have to be tested to overcome this problem.
    Print ISSN: 1561-8633
    Electronic ISSN: 1684-9981
    Topics: Geography , Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
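The three similarity measures named in the record above are straightforward to sketch. The equal weighting in the combined measure is an illustrative assumption, not the combination used in the paper:

```python
import numpy as np

def euclidean_similarity(a, b):
    """Similarity as negative Euclidean distance (larger = more similar)."""
    return -float(np.linalg.norm(a - b))

def pearson_similarity(a, b):
    """Similarity as the Pearson correlation of the flattened fields."""
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

def combined_similarity(a, b, w=0.5):
    """Weighted combination of both measures (weighting is illustrative)."""
    return w * pearson_similarity(a, b) + (1.0 - w) * euclidean_similarity(a, b)

def best_analogue(query, archive, measure):
    """Index of the archived predictor field most similar to `query`."""
    return max(range(len(archive)), key=lambda i: measure(query, archive[i]))

# Toy predictor fields: archive[1] equals the query, archive[0] is a shifted copy
query = np.array([[1.0, 2.0], [3.0, 4.0]])
archive = [query + 5.0, query.copy(), -query]
```

Note that Pearson correlation alone cannot distinguish a field from a uniformly shifted copy of it (both correlate perfectly), which is one reason combining it with a distance measure can help.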
  • 5
    Publication Date: 2006-10-30
    Description: Within the present study we shed light on the question of whether objective circulation patterns (CPs) classified from either the 500 hPa or the 700 hPa level may serve as predictors to explain the spatio-temporal variability of monsoon rainfall in the Anas catchment in North West India. To this end we employ a fuzzy rule-based classification approach in combination with a novel objective function as originally proposed by Stehlik and Bárdossy (2002). After the optimisation we compare the obtained circulation classification schemes for the two pressure levels with respect to their conditional rainfall probabilities and amounts. The classification scheme for the 500 hPa level turns out to be much more suitable for separating dry from wet meteorological conditions during the monsoon season. As is shown by a bootstrap test, the CP-conditional rainfall probabilities for the wet and the dry CPs for both pressure levels are highly significant, at levels ranging from 95 to 99%. Furthermore, the monthly frequencies of the wettest CPs show a significant positive correlation with the variation of the total number of rainy days at the monthly scale. Consistently, the monthly frequencies of the dry CPs exhibit a negative correlation with the number of rainy days at the monthly scale. The present results give clear evidence that the circulation patterns from the 500 hPa level are suitable predictors for explaining spatio-temporal monsoon variability. A companion paper shows that the CP time series obtained within this study are suitable input to a stochastic rainfall model.
    Print ISSN: 1027-5606
    Electronic ISSN: 1607-7938
    Topics: Geography , Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
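The significance of a CP-conditional rainfall probability can be checked with a bootstrap of the kind the record above mentions. This is a generic sketch of the idea, not the paper's exact test statistic; the synthetic wet/dry series is invented for illustration:

```python
import numpy as np

def bootstrap_significance(rain_days, cp_days, n_boot=10_000, seed=0):
    """One-sided bootstrap test: is the wet-day probability on days assigned
    to a CP higher than that of equally sized random day subsets?

    rain_days : 0/1 array, one entry per day (1 = rainy)
    cp_days   : boolean mask of the days assigned to the CP in question
    """
    rng = np.random.default_rng(seed)
    rain_days = np.asarray(rain_days)
    cp_days = np.asarray(cp_days, dtype=bool)
    p_cp = rain_days[cp_days].mean()  # CP-conditional wet-day probability
    k = int(cp_days.sum())
    # Null distribution: wet-day probability of random subsets of the same size
    null = np.array([rng.choice(rain_days, size=k, replace=False).mean()
                     for _ in range(n_boot)])
    return float(p_cp), float((null >= p_cp).mean())  # p-value

# Synthetic series: 1000 days, 100 assigned to a "wet CP" with 80% rainy days,
# against a background wet-day probability of 20%.
rain = np.zeros(1000)
rain[:80] = 1.0          # 80 of the first 100 (CP) days are rainy
rain[100:280] = 1.0      # 180 of the remaining 900 days are rainy
cp_mask = np.arange(1000) < 100
p_cp, p_value = bootstrap_significance(rain, cp_mask)
```

A small p-value means random day subsets almost never look as wet as the CP-conditional sample.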
  • 6
    Publication Date: 2006-02-08
    Description: In this study three data-driven water level forecasting models are presented and discussed. One is based on the artificial neural networks approach, while the other two are based on the Mamdani and the Takagi-Sugeno fuzzy logic approaches, respectively. All of them are parameterised with reference to flood events alone, where water levels are higher than a selected threshold. The analysis of the three models is performed by using the same input and output variables. However, in order to evaluate their capability to deal with different levels of information, two different input sets are considered. The former is characterized by significant spatial and time aggregated rainfall information, while the latter considers rainfall information more distributed in space and time. The analysis is made with great attention to the reliability and accuracy of each model, with reference to the Reno river at Casalecchio di Reno (Bologna, Italy). It is shown that the two models based on the fuzzy logic approaches perform better when the physical phenomena considered are synthesised by both a limited number of variables and IF-THEN logic statements, while the ANN approach increases its performance when more detailed information is used. As regards the reliability aspect, it is shown that the models based on the fuzzy logic approaches may fail unexpectedly to forecast the water levels, in the sense that in the testing phase, some input combinations are not recognised by the rule system and thus no forecasting is performed. This problem does not occur in the ANN approach.
    Print ISSN: 1027-5606
    Electronic ISSN: 1607-7938
    Topics: Geography , Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
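One of the record's fuzzy models, Takagi-Sugeno inference, can be sketched in a few lines. The rule shapes and consequents below are illustrative assumptions, not the calibrated rule system of the paper; the `ValueError` branch mirrors the failure mode the abstract notes, where an input fires no rule and no forecast is produced:

```python
def triangle(a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    def mu(x):
        if a < x <= b:
            return (x - a) / (b - a)
        if b < x < c:
            return (c - x) / (c - b)
        return 0.0
    return mu

def takagi_sugeno_forecast(x, rules):
    """First-order Takagi-Sugeno inference: each rule pairs a membership
    function with a linear consequent; the output is the membership-weighted
    average of the consequents."""
    weights = [membership(x) for membership, _ in rules]
    total = sum(weights)
    if total == 0.0:
        # Input not covered by any rule: no forecast possible
        raise ValueError("input not recognised by the rule system")
    return sum(w * consequent(x) for w, (_, consequent) in zip(weights, rules)) / total

# Two hypothetical rules mapping a rainfall index to a water-level increment
rules = [(triangle(0.0, 5.0, 10.0), lambda x: 0.2 * x),
         (triangle(5.0, 10.0, 15.0), lambda x: 0.1 * x + 1.0)]
```

An ANN, by contrast, always produces some output for any input, which is why the abstract reports no such failure for that approach.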
  • 7
    Publication Date: 2007-01-17
    Description: The parameters of hydrological models for catchments with few or no discharge records can be estimated using regional information. One can assume that catchments with similar characteristics show similar hydrological behaviour and can thus be modeled using similar model parameters. A regionalisation of the hydrological model parameters on the basis of catchment characteristics is therefore plausible. However, due to the non-uniqueness of the rainfall-runoff model parameters (equifinality), a workflow of regional parameter estimation by model calibration and a subsequent fit of a regional function is not appropriate. In this paper a different approach, the transfer of entire parameter sets from one catchment to another, is discussed. Parameter sets are considered transferable if the corresponding model performance (defined as the Nash-Sutcliffe efficiency) on the donor catchment is good and the regional statistics (means and variances of annual discharges, estimated from catchment properties and annual climate statistics) for the recipient catchment are well reproduced by the model. The methodology is applied to a set of 16 catchments in the German part of the Rhine basin. Results show that the parameters transferred according to the above criteria perform well on the target catchments.
    Print ISSN: 1027-5606
    Electronic ISSN: 1607-7938
    Topics: Geography , Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
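The transfer criterion in the record above is stated in terms of the Nash-Sutcliffe efficiency, which is simple to compute:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    NSE = 1 is a perfect fit; NSE <= 0 means the model is no better than
    simply predicting the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - (np.sum((observed - simulated) ** 2)
                  / np.sum((observed - observed.mean()) ** 2))
```

In the transfer scheme sketched by the abstract, a donor parameter set would only be carried over if its NSE on the donor catchment is high and the recipient's regional discharge statistics are reproduced.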
  • 8
    Publication Date: 2008-01-25
    Description: The objective of this study is to investigate the influence of the spatial resolution of the rainfall input on model calibration and application. The analysis is carried out by varying the distribution of the raingauge network. A meso-scale catchment located in southwest Germany has been selected for this study. First, the semi-distributed HBV model is calibrated with the precipitation interpolated from the available observed rainfall of the different raingauge networks. An automatic calibration method based on the combinatorial optimization algorithm simulated annealing is applied. The performance of the hydrological model is analyzed as a function of the raingauge density. Second, the calibrated model is validated using interpolated precipitation from the same raingauge density used for the calibration, as well as interpolated precipitation based on networks of reduced and increased raingauge density. Lastly, the effect of missing rainfall data is investigated by using a multiple linear regression approach to fill in the missing measurements. The model, calibrated with the complete set of observed data, is then run in the validation period using the precipitation field described above. The simulated hydrographs obtained in these three sets of experiments are analyzed by comparing the computed Nash-Sutcliffe coefficients and several goodness-of-fit indices. The results show that a model driven by different raingauge networks might need re-calibration of its parameters: specifically, a model calibrated on relatively sparse precipitation information might perform well with dense precipitation information, while a model calibrated on dense precipitation information may fail with sparse precipitation information. Also, the model calibrated with the complete set of observed precipitation and run with incomplete observed data, where the missing measurements are replaced by estimates from the multiple linear regressions, performs well.
    Print ISSN: 1027-5606
    Electronic ISSN: 1607-7938
    Topics: Geography , Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
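The multiple-linear-regression infilling step from the record above can be sketched with an ordinary least-squares fit. The two-donor layout and the exact linear relation below are invented for illustration:

```python
import numpy as np

def fit_gap_filling_model(donor_rain, target_rain):
    """Least-squares regression of target-gauge rainfall on neighbouring
    (donor) gauges, with an intercept term."""
    X = np.column_stack([np.ones(len(donor_rain)), donor_rain])
    coeffs, *_ = np.linalg.lstsq(X, target_rain, rcond=None)
    return coeffs

def fill_missing(donor_rain, coeffs):
    """Estimate missing target-gauge values from the donor gauges."""
    X = np.column_stack([np.ones(len(donor_rain)), donor_rain])
    return X @ coeffs

# Synthetic example: the target gauge is an exact linear blend of two donors
rng = np.random.default_rng(0)
donors = rng.random((200, 2)) * 20.0  # two donor gauges, mm/day
target = 1.0 + 2.0 * donors[:, 0] + 0.5 * donors[:, 1]
coeffs = fit_gap_filling_model(donors, target)
```

In the study's setup, the model would be fitted on periods where all gauges report, then used to estimate the gauge values treated as missing.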
  • 9
    Publication Date: 2011-09-01
    Description: For many environmental variables, measurements cannot deliver exact observation values, as the concentration is below the sensitivity of the measuring device (the detection limit). These observations provide useful information but cannot be treated in the same manner as the other measurements. In this paper a methodology for the spatial interpolation of such values is described. The method is based on spatial copulas; two copula models are used here: the Gaussian copula and a non-Gaussian v-copula. First, a mixed maximum likelihood approach is used to estimate the parameters of the marginal distributions. After removal of the marginal distributions, the next step is the maximum likelihood estimation of the parameters of the spatial dependence, taking the values below the detection limit into account. Interpolation using copulas yields full conditional distributions for the unobserved sites, which can be used to estimate confidence intervals and provide a good basis for spatial simulation. The methodology is demonstrated on three different groundwater quality parameters, i.e. arsenic, chloride and deethylatrazine, measured at more than 2000 locations in south-west Germany. The chloride values are artificially censored at different levels in order to evaluate the procedures on a complete dataset by progressive decimation. Interpolation results are evaluated using a cross-validation approach. The method is compared with ordinary kriging and indicator kriging, and the uncertainty measures of the different approaches are also compared.
    Print ISSN: 1027-5606
    Electronic ISSN: 1607-7938
    Topics: Geography , Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
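The key idea in the record above is that below-detection-limit values still enter the likelihood, via a CDF term rather than a density term. A minimal one-dimensional sketch with a normal marginal and a coarse grid search (the paper fits copula parameters with a proper optimiser; everything here, including the synthetic data, is illustrative):

```python
import math
import numpy as np

def censored_normal_loglik(mu, sigma, detects, n_censored, limit):
    """Log-likelihood of a normal model: density terms for the measured
    values, plus one CDF term Phi((limit - mu) / sigma) per observation
    that fell below the detection limit."""
    z = (detects - mu) / sigma
    ll = -0.5 * np.sum(z ** 2) - len(detects) * math.log(sigma * math.sqrt(2 * math.pi))
    ll += n_censored * math.log(0.5 * (1 + math.erf((limit - mu) / (sigma * math.sqrt(2)))))
    return ll

def fit_censored_normal(detects, n_censored, limit, mus, sigmas):
    """Grid-search MLE; a stand-in for a real optimiser, used only to show
    how the censored observations enter the fit."""
    return max(((m, s) for m in mus for s in sigmas),
               key=lambda p: censored_normal_loglik(p[0], p[1], detects, n_censored, limit))

# Synthetic concentrations ~ N(5, 2), censored below a detection limit of 3
rng = np.random.default_rng(2)
sample = rng.normal(5.0, 2.0, size=2000)
limit = 3.0
detects = sample[sample >= limit]
n_censored = int((sample < limit).sum())
mu_hat, sigma_hat = fit_censored_normal(
    detects, n_censored, limit,
    mus=np.linspace(3.0, 7.0, 41), sigmas=np.linspace(0.5, 4.0, 36))
```

Discarding the censored values instead would bias the estimated mean upward, which is exactly what the censored likelihood avoids.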
  • 10
    Publication Date: 2013-12-05
    Description: The main source of information on future climate conditions is global circulation models (GCMs). While the various GCMs agree on an increase of surface temperature, the predictions for precipitation exhibit a high spread among the models, especially at shorter-than-daily temporal resolution. This paper presents a method to predict regional distributions of the hourly rainfall depth based on daily mean sea level pressure and temperature data. It is an indirect downscaling method that avoids the uncertain precipitation output of the GCM. It is based on a fuzzy logic classification of atmospheric circulation patterns (CPs) that is further subdivided by means of the average daily temperature. The observed empirical distributions at 30 rain gauges for each CP-temperature class are assumed to be constant and are used for projections of the hourly precipitation sums in the future. The method was applied to the CP-temperature sequence derived from the 20th-century run and the scenario A1B run of ECHAM5. For the study region in southwestern Germany, ECHAM5 predicts that the summers will become progressively drier. Nevertheless, the frequency of the highest hourly precipitation sums will increase. According to the predictions, summer water stress and the risk of extreme hourly precipitation will both increase during the next decades. However, the results are yet to be confirmed by further investigation based on other GCMs.
    Print ISSN: 1027-5606
    Electronic ISSN: 1607-7938
    Topics: Geography , Geosciences
    Published by Copernicus on behalf of European Geosciences Union.
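The indirect-downscaling step in the record above, holding one empirical rainfall distribution per CP-temperature class fixed and resampling it along the projected class sequence, can be sketched as follows. The class labels and rainfall values are invented for illustration:

```python
from collections import defaultdict
import random

def conditional_distributions(classes, rainfall):
    """Group observed hourly rainfall depths by CP-temperature class, giving
    one empirical distribution per class (assumed constant in time, as in
    the abstract)."""
    dist = defaultdict(list)
    for c, r in zip(classes, rainfall):
        dist[c].append(r)
    return dist

def project(future_classes, dist, rng):
    """Resample each future step from its class-conditional empirical
    distribution; only the class sequence comes from the GCM, never its
    precipitation output."""
    return [rng.choice(dist[c]) for c in future_classes]

# Toy observed record and a projected class sequence
dist = conditional_distributions(["wet", "dry", "wet"], [5.0, 0.0, 7.0])
future = project(["wet", "dry", "wet", "wet"], dist, random.Random(0))
```

The projected series thus inherits the GCM's circulation statistics while keeping the locally observed rainfall distributions.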