ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

Filter
  • Articles  (39)
  • 550 - Earth sciences
  • Electronic structure and strongly correlated systems
  • Engineering General
  • uncertainty
  • Energy, Environment Protection, Nuclear Power Engineering  (39)
  • 1
    ISSN: 1573-1545
    Keywords: participatory integrated assessment ; methodology ; focus groups ; computer models ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Integrated assessment (IA) can be defined as a structured process of dealing with complex issues, using knowledge from various scientific disciplines and/or stakeholders, such that integrated insights are made available to decision makers (J. Rotmans, Environmental Modelling and Assessment 3 (1998) 155). There is a growing recognition that the participation of stakeholders is a vital element of IA. However, little is known about the methodological requirements for such participatory IA and the possible insights to be gained from these approaches. This paper summarizes some of the experiences gathered in the ULYSSES project, which aims at developing procedures that can bridge the gap between environmental science and democratic policy making for the issue of climate change. The discussion is based on a total of 52 IA focus groups with citizens, run in six European and one US city. In these groups, different computer models were used, ranging from complex and dynamic global models to simple accounting tools. The analysis in this paper focuses on the role of the computer models. The findings suggest that the computer models were successful at conveying to participants the temporal and spatial scale of climate change, the complexity of the system and the uncertainties in our understanding of it. However, most participants felt that the computer models were less instrumental for the exploration of policy options. Furthermore, both research teams and participants agreed that, despite considerable efforts, most models were not sufficiently user-friendly and transparent to be used in an IA focus group. With that background, some methodological conclusions are drawn about the inclusion of computer models in the deliberation process. Furthermore, some suggestions are made about how existing models should be adapted and new ones developed in order to be helpful for participatory IA.
    Type of Medium: Electronic Resource
  • 2
    Electronic Resource
    Springer
    Environmental and resource economics 16 (2000), S. 253-262 
    ISSN: 1573-1502
    Keywords: endogenous future preferences ; stock of the environmental asset ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract A dynamic optimization model is developed in which uncertainty about future preferences is endogenous, namely depending on the state of the environment at the time the change in preferences occurs. Endogenizing preferences not only provides economic intuition for previous results but also implies that optimal policies are less conservative.
    Type of Medium: Electronic Resource
  • 3
    ISSN: 1573-1545
    Keywords: climate change ; technology policy ; uncertainty ; agent-based modeling ; exploratory modeling ; social interactions
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Many governments use technology incentives as an important component of their greenhouse gas abatement strategies. These “carrots” are intended to encourage the initial diffusion of new, greenhouse-gas-emissions-reducing technologies, in contrast to carbon taxes and emissions trading which provide a “stick” designed to reduce emissions by increasing the price of high-emitting technologies for all users. Technology incentives appear attractive, but their record in practice is mixed and economic theory suggests that in the absence of market failures, they are inefficient compared to taxes and trading. This study uses an agent-based model of technology diffusion and exploratory modeling, a new technique for decision-making under conditions of extreme uncertainty, to examine the conditions under which technology incentives should be a key building block of robust climate change policies. We find that a combined strategy of carbon taxes and technology incentives, as opposed to carbon taxes alone, is the best approach to greenhouse gas emissions reductions if the social benefits of early adoption sufficiently exceed the private benefits. Such social benefits can occur when economic actors have a wide variety of cost/performance preferences for new technologies and either new technologies have increasing returns to scale or potential adopters can reduce their uncertainty about the performance of new technologies by querying the experience of other adopters. We find that if decision-makers hold even modest expectations that such social benefits are significant or that the impacts of climate change will turn out to be serious then technology incentive programs may be a promising hedge against the threat of climate change.
    Type of Medium: Electronic Resource
  • 4
    ISSN: 1539-6924
    Keywords: compliance certification application ; engineering analysis ; geochemistry ; geohydrology ; performance assessment ; probabilistic systems analysis ; radioactive waste ; scientific validity ; uncertainty ; 40 CFR 191
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Performance Assessment (PA) is the use of mathematical models to simulate the long-term behavior of engineered and geologic barriers in a nuclear waste repository; methods of uncertainty analysis are used to assess effects of parametric and conceptual uncertainties associated with the model system upon the uncertainty in outcomes of the simulation. PA is required by the U.S. Environmental Protection Agency as part of its certification process for geologic repositories for nuclear waste. This paper is a dialogue to explore the value and limitations of PA. Two “skeptics” acknowledge the utility of PA in organizing the scientific investigations that are necessary for confident siting and licensing of a repository; however, they maintain that the PA process, at least as it is currently implemented, is an essentially unscientific process with shortcomings that may provide results of limited use in evaluating actual effects on public health and safety. Conceptual uncertainties in a PA analysis can be so great that results can be confidently applied only over short time ranges, the antithesis of the purpose behind long-term, geologic disposal. Two “proponents” of PA agree that performance assessment is unscientific, but only in the sense that PA is an engineering analysis that uses existing scientific knowledge to support public policy decisions, rather than an investigation intended to increase fundamental knowledge of nature; PA has different goals and constraints than a typical scientific study. The “proponents” describe an ideal, six-step process for conducting generalized PA, here called probabilistic systems analysis (PSA); they note that virtually all scientific content of a PA is introduced during the model-building steps of a PSA; they contend that a PA based on simple but scientifically acceptable mathematical models can provide useful and objective input to regulatory decision makers. The value of the results of any PA must lie between these two views and will depend on the level of knowledge of the site, the degree to which models capture actual physical and chemical processes, the time over which extrapolations are made, and the proper evaluation of health risks attending implementation of the repository. The challenge is in evaluating whether the quality of the PA matches the needs of decision makers charged with protecting the health and safety of the public.
    Type of Medium: Electronic Resource
  • 5
    ISSN: 1539-6924
    Keywords: risk assessment ; uncertainty ; formaldehyde ; decision analysis
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract A call for risk assessment approaches that better characterize and quantify uncertainty has been made by the scientific and regulatory community. This paper responds to that call by demonstrating a distributional approach that draws upon human data to derive potency estimates and to identify and quantify important sources of uncertainty. The approach is rooted in the science of decision analysis and employs an influence diagram, a decision tree, probabilistic weights, and a distribution of point estimates of carcinogenic potency. Its results estimate the likelihood of different carcinogenic risks (potencies) for a chemical under a specific scenario. For this exercise, human data on formaldehyde were employed to demonstrate the approach. Sensitivity analyses were performed to determine the relative impact of specific levels and alternatives on the potency distribution. The resulting potency estimates are compared with the results of an exercise using animal data on formaldehyde. The paper demonstrates that distributional risk assessment is readily adapted to situations in which epidemiologic data serve as the basis for potency estimates. Strengths and weaknesses of the distributional approach are discussed. Areas for further application and research are recommended.
    Type of Medium: Electronic Resource
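
    A minimal sketch of the decision-analytic weighting described in this abstract: alternative analytical choices form branches of a decision tree, each branch carries a probabilistic weight and a potency point estimate, and the weighted set of point estimates is read as a potency distribution. The weights, potency values and the potency_cdf helper below are illustrative placeholders, not values or code from the paper.

        # Hypothetical branches: (probability weight, potency point estimate).
        branches = [
            (0.40, 2.0e-4),
            (0.35, 5.0e-4),
            (0.20, 1.5e-3),
            (0.05, 4.0e-3),
        ]

        # Weights over the branches of the decision tree must sum to one.
        assert abs(sum(w for w, _ in branches) - 1.0) < 1e-9

        def potency_cdf(x):
            """P(potency <= x) under the weighted set of point estimates."""
            return sum(w for w, p in branches if p <= x)

        for x in (1e-4, 5e-4, 1e-3, 5e-3):
            print(f"P(potency <= {x:g}) = {potency_cdf(x):.2f}")
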
  • 6
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 585-598 
    ISSN: 1539-6924
    Keywords: uncertainty ; threatened plants ; risk ; conservation ; rule sets ; IUCN
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Australian state and federal agencies use a broad range of methods for setting conservation priorities for species at risk. Some of these are based on rule sets developed by the International Union for the Conservation of Nature, while others use point scoring protocols to assess threat. All of them ignore uncertainty in the data. In this study, we assessed the conservation status of 29 threatened vascular plants from Tasmania and New South Wales using a variety of methods including point scoring and rule-based approaches. In addition, several methods for dealing with uncertainty in the data were applied to each of the priority-setting schemes. The results indicate that the choice of a protocol for setting priorities and the choice of the way in which uncertainty is treated may make important differences to the resulting assessments of risk. The choice among methods needs to be rationalized within the management context in which it is to be applied. These methods are not a substitute for more formal risk assessment.
    Type of Medium: Electronic Resource
  • 7
    ISSN: 1539-6924
    Keywords: Variability ; uncertainty ; maximum likelihood ; bootstrap simulation ; Monte Carlo simulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Variability arises due to differences in the value of a quantity among different members of a population. Uncertainty arises due to lack of knowledge regarding the true value of a quantity for a given member of a population. We describe and evaluate two methods for quantifying both variability and uncertainty. These methods, bootstrap simulation and a likelihood-based method, are applied to three datasets. The datasets include a synthetic sample of 19 values from a Lognormal distribution, a sample of nine values obtained from measurements of the PCB concentration in leafy produce, and a sample of five values for the partitioning of chromium in the flue gas desulfurization system of coal-fired power plants. For each of these datasets, we employ the two methods to characterize uncertainty in the arithmetic mean and standard deviation, cumulative distribution functions based upon fitted parametric distributions, the 95th percentile of variability, and the 63rd percentile of uncertainty for the 81st percentile of variability. The latter is intended to show that it is possible to describe any point within the uncertain frequency distribution by specifying an uncertainty percentile and a variability percentile. Using the bootstrap method, we compare results based upon use of the method of matching moments and the method of maximum likelihood for fitting distributions to data. Our results indicate that with only 5–19 data points, as in the datasets we have evaluated, there is substantial uncertainty based upon random sampling error. Both the bootstrap and likelihood-based approaches yield comparable uncertainty estimates in most cases.
    Type of Medium: Electronic Resource
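
    An illustrative sketch (not the authors' code) of how bootstrap simulation can separate variability from uncertainty as described above: each bootstrap resample re-estimates a variability percentile, and the spread of that estimate across resamples expresses uncertainty due to random sampling error. The sample values below are placeholders.

        import numpy as np

        rng = np.random.default_rng(0)
        # Placeholder sample of measurements (e.g., nine concentration values).
        data = np.array([1.2, 0.8, 2.5, 1.9, 0.6, 3.1, 1.4, 2.2, 0.9])

        B = 5000           # number of bootstrap resamples
        p_var, p_unc = 81, 63

        # For each resample, estimate the 81st percentile of variability.
        boot = np.empty(B)
        for b in range(B):
            resample = rng.choice(data, size=data.size, replace=True)
            boot[b] = np.percentile(resample, p_var)

        # The 63rd percentile of uncertainty for the 81st percentile of variability.
        print(np.percentile(boot, p_unc))
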
  • 8
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1193-1204 
    ISSN: 1539-6924
    Keywords: multimedia modeling ; uncertainty ; variability ; exposure efficiency ; toxicity scoring ; toxics release inventory (TRI) ; life cycle assessment (LCA)
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The human toxicity potential, a weighting scheme used to evaluate toxic emissions for life cycle assessment and toxics release inventories, is based on potential dose calculations and toxicity factors. This paper evaluates the variance in potential dose calculations that can be attributed to the uncertainty in chemical-specific input parameters as well as the variability in exposure factors and landscape parameters. A knowledge of the uncertainty allows us to assess the robustness of a decision based on the toxicity potential; a knowledge of the sources of uncertainty allows us to focus our resources if we want to reduce the uncertainty. The potential dose of 236 chemicals was assessed. The chemicals were grouped by dominant exposure route, and a Monte Carlo analysis was conducted for one representative chemical in each group. The variance is typically one to two orders of magnitude. For comparison, the point estimates in potential dose for 236 chemicals span ten orders of magnitude. Most of the variance in the potential dose is due to chemical-specific input parameters, especially half-lives, although exposure factors such as fish intake and the source of drinking water can be important for chemicals whose dominant exposure is through indirect routes. Landscape characteristics are generally of minor importance.
    Type of Medium: Electronic Resource
  • 9
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 995-1002 
    ISSN: 1539-6924
    Keywords: conditional ; uncertainty ; probability ; intervals ; risk analysis ; conservatism ; Waste Isolation Pilot Plant
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Uncertainty analyses and the reporting of their results can be misinterpreted when these analyses are conditional on a set of assumptions generally intended to bring some conservatism in the decisions. In this paper, two cases of conditional uncertainty analysis are examined. The first case includes studies that result, for instance, in a family of risk curves representing percentiles of the probability distribution of the future frequency of exceeding specified consequence levels conditional on a set of hypotheses. The second case involves analyses that result in an interval of outcomes estimated on the basis of conservative assumptions. Both types of results are difficult to use because they are sometimes misinterpreted as if they represented the output of a full uncertainty analysis. In the first case, the percentiles shown on each risk curve may be taken at face value when in reality (in marginal terms) they are lower if the chosen hypotheses are conservative. In the second case, the fact that some segments of the resulting interval are highly unlikely—or that some more benign segments outside the range of results are quite possible—does not appear. Also, these results are difficult to compare to those of analyses of other risks, possibly competing for the same risk management resources, and the decision criteria have to be adapted to the conservatism of the hypotheses. In this paper, the focus is on the first type (conditional risk curves) more than on the second and the discussion is illustrated by the case of the performance assessment of the Waste Isolation Pilot Plant in New Mexico. For policy-making purposes, however, the problems of interpretation, comparison, and use of the results are similar.
    Type of Medium: Electronic Resource
  • 10
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 135-152 
    ISSN: 1539-6924
    Keywords: Probability ; uncertainty ; data ; risk assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Risk assessors attempting to use probabilistic approaches to describe uncertainty often find themselves in a data-sparse situation: available data are only partially relevant to the parameter of interest, so one needs to adjust empirical distributions, use explicit judgmental distributions, or collect new data. In determining whether or not to collect additional data, whether by measurement or by elicitation of experts, it is useful to consider the expected value of the additional information. The expected value of information depends on the prior distribution used to represent current information; if the prior distribution is too narrow, in many risk-analytic cases the calculated expected value of information will be biased downward. The well-documented tendency toward overconfidence, including the neglect of potential surprise, suggests this bias may be substantial. We examine the expected value of information, including the role of surprise, test for bias in estimating the expected value of information, and suggest procedures to guard against overconfidence and underestimation of the expected value of information when developing prior distributions and when combining distributions obtained from multiple experts. The methods are illustrated with applications to potential carcinogens in food, commercial energy demand, and global climate change.
    Type of Medium: Electronic Resource
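
    A minimal sketch of the expected-value-of-information logic discussed above, under assumed payoffs and priors of my own choosing (the action names, payoff functions and prior widths are placeholders). It also illustrates the abstract's point that an overconfident, too-narrow prior biases the computed value of information downward.

        import numpy as np

        rng = np.random.default_rng(1)

        def evpi(draws, actions):
            """Expected value of perfect information for a simple choice."""
            payoffs = [f(draws) for f in actions.values()]
            best_without_info = max(p.mean() for p in payoffs)
            best_with_info = np.maximum.reduce(payoffs).mean()
            return best_with_info - best_without_info

        # Placeholder payoffs as functions of an uncertain damage parameter d.
        actions = {"act now": lambda d: 10 - 2 * d,
                   "wait":    lambda d: 14 - 6 * d}

        wide   = rng.normal(1.0, 1.0, 100_000)  # honest, wide prior on d
        narrow = rng.normal(1.0, 0.2, 100_000)  # overconfident, narrow prior on d

        print("EVPI with wide prior:  ", evpi(wide, actions))
        print("EVPI with narrow prior:", evpi(narrow, actions))
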
  • 11
    Electronic Resource
    Springer
    Environmental and resource economics 14 (1999), S. 75-94 
    ISSN: 1573-1502
    Keywords: CGE ; Costa Rica ; environmental indicators ; Monte Carlo ; parameter values ; trade policy ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract This study explores the role of parameter uncertainty in CGE modeling of the environmental impacts of macroeconomic and sectoral policies, using Costa Rica as a case for study. A CGE model is constructed which includes eight environmental indicators covering deforestation, pesticides, overfishing, hazardous wastes, inorganic wastes, organic wastes, greenhouse gases, and air pollution. The parameters are treated as random variables drawn from prespecified distributions. Evaluation of each policy option consists of a Monte Carlo experiment. The impacts of the policy options on the environmental indicators are relatively robust to different parameter values, in spite of the wide range of parameter values employed.
    Type of Medium: Electronic Resource
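
    A schematic sketch of the Monte Carlo experiment described above, with a toy response function standing in for the CGE model (which would have to be solved for every parameter draw). The parameter ranges and the indicator_change function are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(2)
        N = 10_000

        # Parameters treated as random variables from prespecified distributions.
        elasticity    = rng.uniform(0.2, 1.5, N)
        emission_coef = rng.uniform(0.5, 2.0, N)

        def indicator_change(tariff_cut, e, k):
            # Toy stand-in for solving the CGE model under one parameter draw.
            return k * (e * tariff_cut)

        draws = indicator_change(0.10, elasticity, emission_coef)

        print("mean impact on the indicator:", draws.mean())
        print("5th-95th percentile:", np.percentile(draws, [5, 95]))
        # Robustness check: how often does the impact keep the same sign?
        print("same sign as mean:", np.mean(np.sign(draws) == np.sign(draws.mean())))
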
  • 12
    Electronic Resource
    Springer
    Environmental and resource economics 13 (1999), S. 435-458 
    ISSN: 1573-1502
    Keywords: bioeconomics ; multiple stocks ; humane values ; Minke whales ; Monte Carlo analysis ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract Most bioeconomic models of efficient renewable resource management are constructed for a single harvesting ground. A bioeconomic model is developed in this paper to study the optimal management of renewable resources that are found in spatially distinct harvesting grounds. The model is applied to Minke whale management. Important inter-regional substitution effects are shown to exist. In addition, comparison with previous studies shows that multiple stock management is necessary for efficient management. Finally, the current Minke whale moratorium is shown to be inefficient unless significant nonmarket values exist.
    Type of Medium: Electronic Resource
  • 13
    Electronic Resource
    Springer
    Mitigation and adaptation strategies for global change 4 (1999), S. 267-281 
    ISSN: 1573-1596
    Keywords: societal adaptation ; globalisation ; institutional capacity ; resilience ; uncertainty ; vulnerability
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Geography
    Notes: Abstract Institutions in many wealthy industrialised countries are robust and their societies appear to be relatively well insulated against the impacts of climate variability, economic problems elsewhere and so on. However, many countries are not in this position, and there is a growing group of humanity which is not benefiting from the apparent global adaptive trends. Worst case scenarios reinforce the impact of this uneven distribution of adaptive capacity, both between and within countries. Nevertheless, at the broad global scale human societies are strongly adaptive and not threatened by climate change for many decades. At the local level the picture is quite different and the survival of some populations at their present locations is in doubt. In the absence of abatement, the longer term outlook is highly uncertain. Adaptation research needs to begin with an understanding of social and economic vulnerability. It requires a different approach to the traditional IPCC impacts assessment, as human behaviour, institutional capacity and culture are more important than biophysical impacts. This is consistent with the intellectual history of the IPCC which has gradually embraced an increasing range of disciplines.
    Type of Medium: Electronic Resource
  • 14
    Electronic Resource
    Springer
    Mitigation and adaptation strategies for global change 4 (1999), S. 319-329 
    ISSN: 1573-1596
    Keywords: uncertainty ; risk ; adaptation ; extreme events ; (credible) information ; integrated assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Geography
    Notes: Abstract This paper draws ten lessons from analyses of adaptation to climate change under conditions of risk and uncertainty: (1) Socio-economic systems will likely respond most to extreme realizations of climate change. (2) Systems have been responding to variations in climate for centuries. (3) Future change will affect future citizens and their institutions. (4) Human systems can be the sources of surprise. (5) Perceptions of risk depend upon welfare valuations that depend upon expectations. (6) Adaptive decisions will be made in response to climate change and climate change policy. (7) Analysis of adaptive decisions should recognize the second-best context of those decisions. (8) Climate change offers opportunity as well as risk. (9) All plausible futures should be explored. (10) Multiple methodological approaches should be accommodated. These lessons support two pieces of advice for the Third Assessment Report: (1) Work toward consensus, but not at the expense of thorough examination and reporting of the "tails" of the distributions of the future. (2) Integrated assessment is only one unifying methodology; others that can better accommodate those tails should be encouraged and embraced.
    Type of Medium: Electronic Resource
  • 15
    Electronic Resource
    Springer
    Environmental monitoring and assessment 58 (1999), S. 151-172 
    ISSN: 1573-2959
    Keywords: GIS ; ground water vulnerability ; leaching index ; nitrate ; pesticide ; phosphorus ; potassium ; statistical analysis ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Statistical methods and a Geographic Information System (GIS) were used to investigate potential indicators of ground water vulnerability to agricultural chemical contamination in a representative area of the Mississippi River alluvial aquifer. A total of 47 wells were sampled for analysis of nitrate, phosphorus, potassium, and 13 pesticides commonly-used in the area. Ten soil and hydrogeologic variables and five ground water vulnerability indices were examined to explain the variations of chemical concentrations. The results showed that no individual soil or hydrogeologic variables or their linear combinations could explain more than 25% of the variation of the chemical concentrations. A quadratic response surface model with the values of confining unit thickness, slope, soil permeability, depth to ground water, and recharge rate accounted for 62% of the variation of nitrate, 43% of P, and 83% of K, suggesting that the interactions among soil and hydrogeologic variables were significant. Observed trends of decreasing nitrate and P concentrations with increasing well depth and/or depth to ground water seemed to correlate with carbonate equilibrium in the aquifer and more reduced environment with depth. In view of uncertainties involved, it was recognized that the limitations associated with input data resolution used in GIS and the formulation of leaching indices limited their use for predicting ground water vulnerability. Misuse of pesticides could be another factor that would complicate the relationships between pesticide concentrations and the vulnerability indices.
    Type of Medium: Electronic Resource
  • 16
    Electronic Resource
    Springer
    Environmental modeling and assessment 4 (1999), S. 217-234 
    ISSN: 1573-2967
    Keywords: climate change ; climate policy ; integrated assessment ; inverse modeling ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The Tolerable Windows Approach (TWA) to Integrated Assessments (IA) of global warming is based on external normative specifications of tolerable sets of climate impacts as well as proposed emission quotas and policy instruments for implementation. In a subsequent step, the complete set of admissible climate protection strategies which are compatible with these normative inputs is determined by scientific analysis. In doing so, minimum requirements concerning global and national greenhouse gas emission paths can be determined. In this paper we present the basic methodological elements of TWA, discuss its relation to more conventional approaches to IA like cost–benefit analyses, and present some preliminary results obtained by a reduced-form climate model.
    Type of Medium: Electronic Resource
  • 17
    Electronic Resource
    Springer
    Water, air & soil pollution 110 (1999), S. 313-333 
    ISSN: 1573-2932
    Keywords: Florida Everglades ; Lake Erie ; mercury ; paleoecology ; sediment cores ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Increased recognition of the ecological damage of mercury (Hg) has focused attention on quantifying spatial and temporal patterns of Hg deposition. Studies are commonly based on core chronologies and use a combination of techniques to measure parameters such as bulk density, percent solids, Hg concentration, and radionuclide activity. Little attention is generally devoted to the propagated error associated with these measurements. We identified the impact of sources of uncertainty on stratigraphic Hg determinations for Florida Everglades and Lake Erie cores. Large errors may be introduced by converting wet sample Hg content to dry-weight concentrations. Drying of sediments at 55 °C caused Hg losses of 18%. Samples, air-dried at room temperature, retained considerable moisture and required corrections for remaining water content. Frozen sediments did not lose Hg during a 72-day storage. Random error in radionuclide analysis of cores resulted in dating uncertainty of ±1.2 yr in 10 yr old deposits. This error increased to ±20 yr in 100 yr old sediments. Propagation of small errors in each step of the analysis (while adhering to strict QA/QC criteria) produced compounded uncertainties of ±11 and ±29% in Hg concentrations under different analytical rigor, and errors of up to ±73% in Hg accumulation rates in older sediments. Enrichment factors, comparing uncertain recent and historic Hg accumulation rates, differed by as much as ±48%. Uncertainty in paleoecological studies of mercury needs to be documented in order to correctly evaluate trends and remediation efforts.
    Type of Medium: Electronic Resource
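
    A small sketch of the kind of error propagation discussed above: for quantities combined by multiplication or division, independent relative errors add in quadrature. The numerical error terms below are placeholders, not the paper's measured uncertainties.

        import math

        def combined_relative_error(*rel_errors):
            """Relative error of a product/quotient of independent quantities."""
            return math.sqrt(sum(r * r for r in rel_errors))

        rel_hg_wet   = 0.05  # Hg measurement on the wet sample
        rel_solids   = 0.08  # percent-solids / moisture correction
        rel_density  = 0.06  # bulk density
        rel_sed_rate = 0.20  # dated sedimentation rate (grows with sediment age)

        rel_conc  = combined_relative_error(rel_hg_wet, rel_solids)
        rel_accum = combined_relative_error(rel_conc, rel_density, rel_sed_rate)

        print(f"Hg concentration (dry weight): +/-{100 * rel_conc:.0f}%")
        print(f"Hg accumulation rate:          +/-{100 * rel_accum:.0f}%")
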
  • 18
    Electronic Resource
    Springer
    Environmental and resource economics 11 (1998), S. 635-646 
    ISSN: 1573-1502
    Keywords: damages ; global warming ; irreversibility ; optimal stopping ; timing ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract Although there is widespread agreement about the dangers of global warming and the resulting need to cut down emissions, there does not seem to be general agreement about the exact form the policy should take or the timing of its adoption. Failure to adopt and implement policies against global warming reflects the complexity of the problem, the uncertainties of climate change and the cost of policy adoption. Issues associated with the interactions between uncertainties and irreversibilities in determining the timing of policy adoption are analyzed by using the methodology of optimal stopping rules. Optimal policy functions are derived for cooperative and noncooperative solutions, with differential game representation. Issues associated with the empirical application of the optimal policy rules are also considered.
    Type of Medium: Electronic Resource
  • 19
    Electronic Resource
    Springer
    Environmental and resource economics 11 (1998), S. 177-195 
    ISSN: 1573-1502
    Keywords: uncertainty ; externalities ; Pigouvian taxes ; nuclear power
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract The external effects arising from the use of nuclear power are, in a fundamental way, related to uncertainty. In this paper we locate these external effects and derive a dynamic Pigouvian tax in order to make the decentralized economy support the command optimum. Another interesting result is that a small constant energy tax (which we interpret as a second best policy) can take the decentralized economy reasonably close to the command optimum.
    Type of Medium: Electronic Resource
  • 20
    Electronic Resource
    Springer
    Water, air & soil pollution 101 (1998), S. 289-308 
    ISSN: 1573-2932
    Keywords: emission factor ; emissions ; inventory ; mercury ; operating rate ; sources ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Estimates of mercury emissions from individual sources and source categories are needed to understand relationships between the emissions and resulting deposition and to evaluate possible approaches to reducing those emissions. We have developed geographically-resolved estimates of annual average mercury emission rates from current anthropogenic sources in the 48 contiguous United States. These estimates were made by applying emission factors to individual facility operating data and to county-wide source activity levels. We apportioned the emissions to an Eulerian modeling grid system using point source coordinates and the fractions of county areas in each grid cell. Point sources account for about 89% of the 48-state total mercury emissions of 146.4 Mg/yr. Most of the emissions in the inventory are from combustion of mercury-containing fossil fuels and municipal waste, located primarily in the mid-Atlantic and Great Lakes states as well as in the Southeast. The major uncertainties in the emission estimates are caused by uncertainties in the emission factors used to develop the estimates. This uncertainty is likely a result of variability in the mercury content of the combusted materials and in the removal of mercury by air pollution control devices. The greatest research need to reduce uncertainties in mercury emission estimates is additional measurements to improve emission factors.
    Type of Medium: Electronic Resource
  • 21
    Electronic Resource
    Springer
    Stochastic environmental research and risk assessment 11 (1997), S. 433-448 
    ISSN: 1436-3259
    Keywords: Risk ; uncertainty ; reservoir operation ; sedimentation ; computer application
    Source: Springer Online Journal Archives 1860-2000
    Topics: Architecture, Civil Engineering, Surveying , Energy, Environment Protection, Nuclear Power Engineering , Geography , Geosciences
    Notes: Abstract This study attempts to use stochastic hydrologic techniques to assess the intrinsic risk of reservoir operation. A stochastic simulation model for reservoir operation is developed. The model consists of three components: a synthetic generation model for streamflow and sediment sequences, a one-dimensional delta deposit model for sediment transport processes in reservoirs, and a simulation model for reservoir operation. This kind of integrated simulation model can be used to simulate not only the inflow uncertainty of streamflow and sedimentation, but also the variation in the operation rules of reservoirs. It is used here for the risk assessment of a reservoir, and the simulation is performed for different operation scenarios. Simulation of a 100-year period of sediment transport and deposition in the river-reservoir system indicates that the navigation risk is much higher than that of hydropower generation or sediment deposition in the reservoir. The risk of sediment deposition at the river section near the backwater profile is also high; consequently, navigation near this profile carries a high risk because of inadequate navigation depth.
    Type of Medium: Electronic Resource
  • 22
    Electronic Resource
    Springer
    Environmental and resource economics 9 (1997), S. 451-466 
    ISSN: 1573-1502
    Keywords: global warming ; uncertainty ; learning ; irreversibility ; value of information ; dynamic games ; international agreements
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract In this paper we construct a simple model of global warming which captures a number of key features of the global warming problem: (i) environmental damages are related to the stock of greenhouse gases in the atmosphere; (ii) the global commons nature of the problem means that there are strategic interactions between the emissions policies of the governments of individual nation states; (iii) there is uncertainty about the extent of the future damages that will be incurred by each country from any given level of concentration of greenhouse gases but there is the possibility that at a future date better information about the true extent of environmental damages may become available; an important aspect of the problem is the extent to which damages in different countries may be correlated. In the first part of the paper we consider a simple model with two symmetric countries and show that the value of perfect information is an increasing function of the correlation between damages in the two countries in both the cooperative and non-cooperative equilibria. However, while the value of perfect information is always non-negative in the cooperative equilibrium, in the non-cooperative equilibrium there is a critical value of the correlation coefficient below which the value of perfect information will be negative. In the second part of the paper we construct an empirical model of global warming distinguishing between OECD and non-OECD countries and show that in the non-cooperative equilibrium the value of perfect information for OECD countries is negative when the correlation coefficient between environmental damages for OECD and non-OECD countries is negative. The implications of these results for international agreements are discussed.
    Type of Medium: Electronic Resource
  • 23
    Electronic Resource
    Springer
    Environmental and resource economics 9 (1997), S. 103-124 
    ISSN: 1573-1502
    Keywords: climate change ; uncertainty ; irreversibility ; intergenerational ; stochastic dynamic programming ; resource extraction
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract A three-generation planning model incorporating uncertain climate change is developed. Each generation features a production activity based on capital and an exhaustible resource. An irreversible climate change may occur in period two or three, reducing productivity for that and the remaining generations. The model is solved by stochastic dynamic programming. If the climate impact and the climate change probability are constant, the optimal period one (and two) resource extraction is larger than for the reference case of climate stability. If, however, the climate impact and climate change probability increase with increased aggregate resource use, this result is reversed.
    Type of Medium: Electronic Resource
  • 24
    Electronic Resource
    Springer
    Environmental and resource economics 9 (1997), S. 451-466 
    ISSN: 1573-1502
    Keywords: global warming ; uncertainty ; learning ; irreversibility ; value of information ; dynamic games ; international agreements
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract In this paper we construct a simple model of global warming which captures a number of key features of the global warming problem: (i) environmental damages are related to the stock of greenhouse gases in the atmosphere; (ii) the global commons nature of the problem means that there are strategic interactions between the emissions policies of the governments of individual nation states; (iii) there is uncertainty about the extent of the future damages that will be incurred by each country from any given level of concentration of greenhouse gases but there is the possibility that at a future date better information about the true extent of environmental damages may become available; an important aspect of the problem is the extent to which damages in different countries may be correlated. In the first part of the paper we consider a simple model with two symmetric countries and show that the value of perfect information is an increasing function of the correlation between damages in the two countries in both the cooperative and non-cooperative equilibria. However, while the value of perfect information is always non-negative in the cooperative equilibrium, in the non-cooperative equilibrium there is a critical value of the correlation coefficient below which the value of perfect information will be negative. In the second part of the paper we construct an empirical model of global warming distinguishing between OECD and non-OECD countries and show that in the non-cooperative equilibrium the value of perfect information for OECD countries is negative when the correlation coefficient between environmental damages for OECD and non-OECD countries is negative. The implications of these results for international agreements are discussed.
    Type of Medium: Electronic Resource
  • 25
    Electronic Resource
    Springer
    Environmental and resource economics 9 (1997), S. 103-124 
    ISSN: 1573-1502
    Keywords: climate change ; uncertainty ; irreversibility ; intergenerational ; stochastic dynamic programming ; resource extraction
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract A three-generation planning model incorporating uncertain climate change is developed. Each generation features a production activity based on capital and an exhaustible resource. An irreversible climate change may occur in period two or three, reducing productivity for that and the remaining generations. The model is solved by stochastic dynamic programming. If the climate impact and the climate change probability are constant, the optimal period one (and two) resource extraction is larger than for the reference case of climate stability. If, however, the climate impact and climate change probability increase with increased aggregate resource use, this result is reversed.
    Type of Medium: Electronic Resource
  • 26
    Electronic Resource
    Springer
    Environmental and resource economics 8 (1996), S. 39-61 
    ISSN: 1573-1502
    Keywords: tropical forests ; irreversibility ; uncertainty ; Thai parks
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract This paper develops a framework for the valuation and management of tropical forests that reflects their ecological and economic characteristics. The analysis demonstrates the importance of modeling the feasible use patterns and the information structure in tropical forest management decisions. The model predicts that cases exist where the foresighted management of forests leads to more preservation than the traditional expected value approach. An application in Thailand provides evidence that such cases occur in relevant ranges of benefit flows. The model focuses tropical forest management on assessments of sustainability and feasible sequences in light of uncertainty and information flows.
    Type of Medium: Electronic Resource
  • 27
    Electronic Resource
    Springer
    Environmental modeling and assessment 1 (1996), S. 71-90 
    ISSN: 1573-2967
    Keywords: Integrated assessment ; uncertainty ; subjectivity in modelling ; perspectives ; risk
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract In the context of integrated assessment, the authors address the issue of uncertainty and subjectivity in modelling. In relating bias to different perspectives, the authors introduce the methodology of multiple model routes, which reflect different perceptions of reality and various policy preferences. As a heuristic they use three perspectives distinguished in cultural theory. The article describes case studies on the population and health controversy to illustrate the possibilities of their approach, and concludes by discussing the lessons learned from applying this methodology.
    Type of Medium: Electronic Resource
  • 28
    Electronic Resource
    Springer
    Environmental and resource economics 8 (1996), S. 399-416 
    ISSN: 1573-1502
    Keywords: environmental taxes ; tradable permits ; excess burden ; tax revenues ; uncertainty ; second-best policy
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract This paper analyses the optimal choice of second-best optimal environmental policies. Using a partial equilibrium model, the paper first reconfirms the well-known result that the existence of a double dividend (in its weak definition) favours environmental policy instruments which maximise tax revenues for a given improvement in environmental quality. Additional revenues can be used to reduce the distortion of existing taxes such as taxes on labour and capital income. Without uncertainty, environmental taxes and auctioned permits are equally appropriate. In the presence of uncertainty, however, the optimal choice of taxes or tradable permits depends on the relative magnitudes of the marginal environmental damage and the marginal benefit from consuming a polluting good. In the second part, the paper, therefore, analyses how the revenue capacity affects the optimal choice of environmental policy instruments in the presence of uncertainty. The paper shows that the first-best choice rule between price and quantity regulation (Weitzman, 1974) remains valid in a second-best world with distortionary taxation.
    Type of Medium: Electronic Resource
  • 29
    Electronic Resource
    Springer
    Environmental and resource economics 5 (1995), S. 353-374 
    ISSN: 1573-1502
    Keywords: Climate change damage costs ; cost functions ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract It is argued that estimating the damage costs of a certain benchmark climate change is not sufficient. What is needed are cost functions and confidence intervals. Although these are contained in the integrated models and their technical manuals, this paper brings them into the open in order to stimulate discussion. After briefly reviewing the benchmark climate change damage costs, region-specific cost functions are presented which distinguish tangible from intangible losses and the losses due to a changing climate from those due to a changed climate. Furthermore, cost functions are assumed to be quadratic, as an approximation of the unknown but presumably convex functions. Results from the damage module of the integrated climate economy model FUND are presented. Next, uncertainties are incorporated and expected damages are calculated. It is shown that because of convex loss functions and right-skewed uncertainties, the risk premium is substantial, calling for more action than analysis based on best-guess estimates. The final section explores some needs for further scientific research.
    Type of Medium: Electronic Resource
  • 30
    Electronic Resource
    Springer
    Water, air & soil pollution 85 (1995), S. 2521-2526 
    ISSN: 1573-2932
    Keywords: Critical loads ; percentiles ; scale ; resolution ; uncertainty ; UK ; UNECE
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Critical loads are estimated in the UK by the Department of Environment's Critical Loads Advisory Group and sub-groups. The Mapping and Data Centre at ITE Monks Wood acts as the National Focal Centre for the UNECE programme for mapping critical loads. The centre is responsible for the generation of UK data sets and their application for national and European purposes. To make effective use of these data, it is necessary to draw upon other environmental data and examine the issues of scale, uncertainty and the way that data are presented. This paper outlines the methodologies which have been employed to derive national maps. Early critical load maps were not vegetation specific, but now critical loads for acidity and for nutrient nitrogen for soils, critical levels maps for ozone and sulphur dioxide, and sulphur deposition maps, have been generated on a vegetation or ecosystem specific basis. These have been used to derive a number of different types of critical load and exceedance maps. The results show the importance of the method selected and the data used for the interpretation. The visualisation of critical loads and the corresponding exceedance data is an important aspect in producing information for pollution abatement strategies.
    Type of Medium: Electronic Resource
  • 31
    Electronic Resource
    Springer
    Environmental and resource economics 5 (1995), S. 71-82 
    ISSN: 1573-1502
    Keywords: Externality ; greenhouse ; estimation ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Economics
    Notes: Abstract The shadow price of carbon dioxide is the value of the external damage caused by an emission. A shadow price model for calculating the present value of the external damage of a carbon dioxide emission is derived explicitly. Sixteen experts provided subjective high, low and most likely parameter estimates because correct values for the eight model parameters are uncertain. The estimation procedure retains parameter uncertainty while generating the main result, which is a distribution of shadow price estimates. Major assumptions made in the estimation identify the basis for the results. Of the eight model parameters, the discount rate dominates the determination of the shadow price. For comparison, expert estimates of the shadow price itself provide a second distribution of shadow price estimates.
    Type of Medium: Electronic Resource
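
    A hedged sketch of the general shadow-price calculation described above: uncertain parameters are drawn from distributions built from low / most-likely / high estimates, each draw yields one present value of the marginal damage stream, and the output is a distribution of shadow prices in which the discount rate dominates. The triangular ranges, damage stream and horizon below are placeholders, not the paper's model.

        import numpy as np

        rng = np.random.default_rng(3)
        N = 20_000

        # Low / most-likely / high expert-style estimates as triangular draws.
        discount_rate = rng.triangular(0.01, 0.03, 0.08, N)   # per year
        annual_damage = rng.triangular(0.1, 0.5, 2.0, N)      # $ per tonne per year
        horizon = 200                                          # years

        years = np.arange(1, horizon + 1)
        # Present value of the damage stream for each parameter draw.
        shadow_price = (annual_damage[:, None]
                        / (1.0 + discount_rate[:, None]) ** years).sum(axis=1)

        print("shadow price percentiles (5th, 50th, 95th):",
              np.round(np.percentile(shadow_price, [5, 50, 95]), 1))
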
  • 32
    Electronic Resource
    Springer
    Water, air & soil pollution 85 (1995), S. 2503-2508 
    ISSN: 1573-2932
    Keywords: critical load ; deposition model ; spatial scale ; uncertainty ; probability distribution
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The critical loads approach to quantifying areas at risk of damage requires deposition and critical loads data at the same spatial scale to calculate exceedance. While maps of critical loads for soil acidification are available at a 1 km scale, no monitoring networks in Europe measure wet and dry inputs at this scale; further, the models currently used to estimate deposition incorporate a number of assumptions which are not valid at the 1 km scale. Simulations of 1 km deposition from 20 km data show that the uncertainty introduced by using 20 km scale estimates of deposition is small, except in mountain areas where it can give misleading results, but a major problem is the uncertainty in estimates of deposition at the 20 km scale produced by the current models.
    Type of Medium: Electronic Resource
  • 33
    Electronic Resource
    Springer
    Water, air & soil pollution 85 (1995), S. 2515-2520 
    ISSN: 1573-2932
    Keywords: Regionalization ; critical loads ; forest ; uncertainty ; risk assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The steady-state model PROFILE was used to perform Monte Carlo simulations of critical loads of acidity and exceedances of forest soils for 128 sites in the province of Scania, southern Sweden. Statistical tests showed that 100 sites had normally distributed critical loads and exceedances and that the variance of these parameters was statistically equal for all sites. Pooled estimates of the standard deviation were 0.19 and 0.31 kmolc ha−1 yr−1 for the critical loads and exceedances, respectively. Introducing uncertainties, expressed as confidence intervals, into the cumulative distribution function for critical loads showed that overlaps between percentiles were substantial. The 5%-ile was systematically equal to the 57%-ile using a 67% confidence interval and equal to the 87%-ile when a 95% confidence level was chosen. Because of these overlaps, reducing acidic deposition to the mean value of the 5%-ile protects only 68% of the ecosystem area with an 84% probability, rather than the guaranteed 95% protection that would hold if uncertainties did not exist. Thus, uncertainties make it possible to advocate reductions to levels of deposition below the 5%-ile of critical loads.
    Type of Medium: Electronic Resource
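
    A simplified sketch of the Monte Carlo argument above: if deposition is set to the 5%-ile of the site mean critical loads while each site's true load is uncertain, the fraction of sites actually protected falls short of the nominal 95%. The site means and number of draws are invented; only the pooled standard deviation is reused from the abstract.

        import numpy as np

        rng = np.random.default_rng(4)

        n_sites = 128
        site_means = rng.uniform(0.5, 3.0, n_sites)   # placeholder critical loads
        pooled_sd = 0.19                              # kmolc/ha/yr, as in the abstract

        # Deposition target chosen from the means alone (uncertainty ignored).
        deposition = np.percentile(site_means, 5)

        # Re-draw each site's "true" critical load and count protected sites.
        M = 5000
        protected = np.array([
            np.mean(rng.normal(site_means, pooled_sd) >= deposition)
            for _ in range(M)
        ])

        print("median protected fraction:", np.median(protected))
        print("5th-95th percentile:", np.percentile(protected, [5, 95]))
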
  • 34
    ISSN: 1573-2932
    Keywords: ozone ; AOT40 ; linear modelling ; spatial correlation ; uncertainty
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Ozone critical levels in Europe are defined in terms of an accumulated exposure over a threshold of 40 ppb, AOT40. For agricultural crops, for example, the critical level is an AOT40 of 5300 ppb.h during daylight hours in May to July in the year with the highest cumulative exposure over the last five years. In a region the size of the UK, however, the worst-case year is not the same across the whole region, and maps become difficult to interpret; predicting crop losses from a single year out of five also wastes potentially valuable information. An alternative approach estimates the distribution of aggregate exceedances over the threshold by means of a compound Poisson model for episodes of raised ozone concentration, with linear modelling techniques used to incorporate covariate information directly. The use of spatial and environmental covariates, along with temporal and spatially correlated random effects, is explored using data from the UK ozone monitoring network. The model produces results similar to those from other mapping methods. Combining it with a crop loss relationship predicts crop losses of 5–15% for the UK, but the errors range between 2% and 6%, indicating that fine detail in crop loss mapping is unlikely to be very accurate. (A minimal AOT40 calculation follows this record.)
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
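    The compound Poisson exceedance model and its covariate structure are beyond a short example, but the AOT40 index itself is easy to compute. In the sketch below the series construction, the fixed 08:00-20:00 daylight window and all concentration values are assumptions made for illustration only.

        import numpy as np
        import pandas as pd

        def aot40(hourly_ozone: pd.Series) -> float:
            # Accumulated exposure over 40 ppb (ppb.h) for daylight hours in May-July
            daylight = hourly_ozone.between_time("08:00", "20:00")    # crude daylight proxy
            may_july = daylight[daylight.index.month.isin([5, 6, 7])]
            return float((may_july - 40.0).clip(lower=0.0).sum())

        # Hypothetical one-summer hourly series for a single monitoring site
        idx = pd.date_range("1994-05-01", "1994-07-31 23:00", freq="h")
        ozone = pd.Series(30.0 + 25.0 * np.random.default_rng(3).random(len(idx)), index=idx)

        print("AOT40 =", round(aot40(ozone)), "ppb.h   (crop critical level: 5300 ppb.h)")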
  • 35
    Electronic Resource
    Electronic Resource
    Springer
    Stochastic environmental research and risk assessment 8 (1994), S. 259-268 
    ISSN: 1436-3259
    Keywords: Rainfall ; runoff ; modeling ; uncertainty ; stochastics ; stochastic integral equations
    Source: Springer Online Journal Archives 1860-2000
    Topics: Architecture, Civil Engineering, Surveying , Energy, Environment Protection, Nuclear Power Engineering , Geography , Geosciences
    Notes: Abstract In this paper a very general rainfall-runoff model structure (described below) is shown to reduce to a unit hydrograph model structure. For the general model, a multi-linear unit hydrograph approach is used to develop subarea runoff and is coupled to a multi-linear channel flow routing method to develop a link-node rainfall-runoff model network. The spatial and temporal rainfall distribution over the catchment is probabilistically related to a known rainfall data source located in the catchment in order to account for the stochastic nature of rainfall with respect to the rain gauge measurements. The resulting link-node model structure is a series of stochastic integral equations, one for each subarea. A cumulative stochastic integral equation, developed as the sum of this series, includes the complete spatial and temporal variability of the rainfall over the catchment. This equation is an extension of the well-known single-area unit hydrograph method, except that when applied to prediction of storm runoff the model output is a distribution of outcomes (or realizations), i.e. a set of probable runoff hydrographs, each outcome being the result of calibration to a known storm event. (A convolution-based sketch follows this record.)
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
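    The stochastic integral equations themselves are not reproduced here; the sketch below only shows the deterministic core the abstract builds on, namely subarea runoff as a discrete convolution of rainfall with a unit hydrograph, with a crude random rainfall multiplier standing in for the probabilistic link to the gauge data. All ordinates and numbers are hypothetical.

        import numpy as np

        rng = np.random.default_rng(4)

        unit_hydrograph = np.array([0.10, 0.30, 0.35, 0.15, 0.07, 0.03])   # response per unit rain
        gauge_rainfall = np.array([0.0, 5.0, 12.0, 7.0, 2.0, 0.0])         # mm per time step

        def subarea_runoff(rainfall, uh, multiplier):
            # Discrete convolution of (randomly scaled) rainfall with the unit hydrograph
            return np.convolve(multiplier * rainfall, uh)

        # A distribution of outcomes: one hydrograph per sampled rainfall multiplier
        multipliers = rng.lognormal(mean=0.0, sigma=0.3, size=1000)
        peaks = np.array([subarea_runoff(gauge_rainfall, unit_hydrograph, m).max()
                          for m in multipliers])

        print("median peak runoff         :", round(float(np.median(peaks)), 2))
        print("90th-percentile peak runoff:", round(float(np.percentile(peaks, 90)), 2))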
  • 36
    Electronic Resource
    Electronic Resource
    Springer
    Natural hazards 9 (1994), S. 215-233 
    ISSN: 1573-0840
    Keywords: Seismic hazard ; random source location ; random boundary ; source zone boundary ; seismic sources ; uncertainty ; earthquakes ; statistical analysis
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Geography , Geosciences
    Notes: Abstract Demarcation of areal and linear seismic sources involves a certain degree of uncertainty, and this should be reflected in the final seismic hazard results. The uncertainty associated with the geographical coordinates of a source zone boundary is modeled by introducing the concept of a 'random boundary', where the location of the boundary is assumed to follow a spatial bivariate Gaussian distribution: the mean vector denotes the best estimate of location and the variance reflects the magnitude of the location uncertainty, which may be isotropic or may show spatial directivity. Treating the boundaries as spatially random smooths the seismicity parameters and permits gradual transitions of these parameters across border zones. Seismic sources modeled as lines can also be given random geometrical properties. The sensitivity of seismic hazard results to isotropic and direction-dependent location uncertainty is examined through hypothetical case studies; area and line source location uncertainties are examined separately because they affect the eventual outcome of the analyses in a complicated manner. The effect of random source zone boundaries on the expected peak ground acceleration is tested for a specific site in Turkey by conducting a comprehensive seismic hazard analysis. (A vertex-perturbation sketch follows this record.)
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
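    A minimal sketch of the 'random boundary' idea: every vertex of a source-zone polygon is perturbed by a zero-mean bivariate Gaussian whose covariance encodes isotropic or direction-dependent location uncertainty. The polygon, the covariance values and the number of realizations are hypothetical; each realization would feed an otherwise standard hazard computation.

        import numpy as np

        rng = np.random.default_rng(5)

        # Best-estimate source-zone boundary (km), vertices of a simple polygon
        best_boundary = np.array([[0.0, 0.0], [50.0, 0.0], [50.0, 30.0], [0.0, 30.0]])

        # Direction-dependent location uncertainty: larger variance along x than along y
        cov = np.array([[25.0, 0.0],
                        [0.0,  4.0]])    # km^2

        def random_boundary(vertices, cov, rng):
            # One realization: each vertex shifted by a bivariate Gaussian location error
            noise = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=len(vertices))
            return vertices + noise

        realizations = [random_boundary(best_boundary, cov, rng) for _ in range(1000)]
        print("first perturbed boundary (km):")
        print(np.round(realizations[0], 1))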
  • 37
    Electronic Resource
    Electronic Resource
    Springer
    Natural hazards 6 (1992), S. 201-226 
    ISSN: 1573-0840
    Keywords: Seismic hazard ; Jordan ; earthquakes ; uncertainty ; Bayesian method ; intensity attenuation ; expert opinion
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering , Geography , Geosciences
    Notes: Abstract Probabilistic methods are used to quantify the seismic hazard in Jordan and neighbouring regions. The hazard model incorporates the uncertainties associated with the seismicity parameters and the attenuation equation. Seven seismic sources are identified in the region, and their seismicity parameters are estimated using all the available information. Seismic hazard computations, and the selection of peak ground acceleration and modified Mercalli intensity values at the nodes of a 25 × 25 km mesh covering the region under study, are carried out by two different computer programs. The results are presented as a set of seismic hazard maps displaying iso-acceleration and iso-intensity contours corresponding to specified return periods. The first set of maps is derived from the seismicity data assessed in this study and displays our 'best' estimate of the seismic hazard for Jordan and the neighbouring areas. The second set, showing the 'alternative' estimate of seismic hazard, is based solely on the seismicity parameters reported by other researchers. The third set, the 'Bayesian' estimate of seismic hazard, reflects the influence of expert opinion involving more conservative assumptions regarding the Red Sea and Araba faults. (A return-period conversion sketch follows this record.)
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
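    The maps in the abstract are drawn for specified return periods. The conversion assumed below is not stated in the abstract; it is the commonly used Poisson relation between a return period and the probability of at least one exceedance during an exposure time.

        import math

        def prob_of_exceedance(return_period_yr: float, exposure_yr: float) -> float:
            # Probability of at least one exceedance in exposure_yr under a Poisson model
            return 1.0 - math.exp(-exposure_yr / return_period_yr)

        # e.g. a 475-year return period corresponds to roughly a 10% chance in 50 years
        print(round(prob_of_exceedance(475.0, 50.0), 3))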
  • 38
    Electronic Resource
    Electronic Resource
    Amsterdam : Elsevier
    Radiation Physics and Chemistry 42 (1993), S. 731-738 
    ISSN: 0969-806X
    Keywords: Dosimetry ; radiation ; standards ; sterilization ; traceability ; uncertainty
    Source: Elsevier Journal Backfiles on ScienceDirect 1907 - 2002
    Topics: Chemistry and Pharmacology , Energy, Environment Protection, Nuclear Power Engineering , Physics
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 39
    Electronic Resource
    Electronic Resource
    Amsterdam : Elsevier
    Environmental Pollution 83 (1994), S. 87-93 
    ISSN: 0269-7491
    Keywords: artificial intelligence ; climate change ; modelling ; potato ; uncertainty
    Source: Elsevier Journal Backfiles on ScienceDirect 1907 - 2002
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...