ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 461-486
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Yucca Mountain, NV, is being characterized for disposal of U.S. high-level nuclear waste, which consists predominantly of spent fuel from nuclear reactors and radioactive waste from reprocessing. In this paper, the program is presented in the context of global and U.S. nuclear energy systems and of international plans for high-level waste disposal. The potential impact of the proposed repository is discussed in the context of the U.S. Department of Energy's Total System Performance Assessment-Viability Assessment, the primary tool for assessing how the repository might operate.
    Type of Medium: Electronic Resource
  • 2
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 487-512
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract In this paper, we estimate the value of energy technology research and development (R&D) as an insurance investment to reduce four risks to the United States. These four risks are (a) the costs of climate stabilization, (b) oil price shocks and cartel pricing, (c) urban air pollution, and (d) other energy disruptions. The total value is estimated conservatively to be >$12 billion/year. However, only about half of this total may be warranted because some R&D is applicable to more than one risk. Nevertheless, the total Department of Energy investment in energy technology R&D [~$1.5 billion/year in fiscal year 1999 (FY99)] seems easily justified by its insurance value alone. In fact, a larger investment might be justified, particularly in the areas related to climate change, oil price shock, and urban air pollution. This conclusion appears robust even if the private sector is assumed to be investing a comparable amount relevant to these risks. No additional benefit is credited for the value to the economy and to the competitiveness of the U.S. from better energy technologies that may result from the R&D; only the insurance value for reducing the potential cost of these four risks to society was estimated.
    Type of Medium: Electronic Resource
  • 3
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 513-544
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Climate policy is often discussed as a lever with which to bring about climate-friendly technical innovation and diffusion. However, quantitative policy assessments routinely treat technological change as a factor that is independent of policy. Stabilizing atmospheric concentrations of CO2 cannot be achieved through marginal changes in the way we supply and use energy. The only path to stabilization of climate over the next century that is consistent with widely accepted population and economic-growth scenarios involves substantial decoupling of energy services from carbon emissions. The required rate of structural and technical change for such a goal has been experienced only in the wake of economic and resource crises and for periods of a decade or less. Historic rates of structural and technical change averaged over a century are far from adequate for stabilizing climate. In this paper, we review technical changes in the energy system and a few instances in which energy economic models have begun to include technical change as an endogenous feature of their assessments. Finally, we consider the implications of considering endogenous technical change for critical climate policy questions, such as the cost of control and the appropriate timing of the emissions mitigation effort.
    Type of Medium: Electronic Resource
  • 4
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 545-569
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Technology largely determines economic development and its impact on the environment; yet technological change is one of the least developed parts of existing global change models. This paper reviews two approaches developed at the International Institute for Applied Systems Analysis, both of which use the concept of technological learning and aid modeling of technological change. The first approach is a micromodel ("bottom-up") of three electricity generation technologies that rigorously endogenizes technological change by incorporating both uncertainty (stochasticity) and learning into the model's decision rules. This model, with its endogenous technological change, allows radical innovations to penetrate the energy market and generates S-shaped patterns of technological diffusion that are observed in the real world. The second approach is a macro ("top-down") model that consists of coupled economic- and technological-system models. Although more stylistic in its representation of endogenous technological change, the macro model can be applied on a worldwide scale and can generate long-term scenarios that are critical for policy analysis. Both the micro- and macro models generate radical departures from currently dominant technological systems ("surprises"), including long-term scenarios with low carbon and sulfur emissions. Our focus is modeling, but for policy, the work underscores the need for huge investments before environmentally superior technologies can compete in the market.
    Type of Medium: Electronic Resource
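The record above describes models built around technological learning. As a point of reference only, the sketch below works through the standard single-factor experience curve, in which unit cost falls by a fixed fraction for every doubling of cumulative capacity; the initial cost and the 20% learning rate are illustrative assumptions, not parameters of the IIASA models discussed in the abstract.

```python
# Minimal experience-curve sketch (illustrative assumptions only).
import math

initial_cost = 1000.0    # $/kW at the first unit of cumulative capacity (assumed)
learning_rate = 0.20     # 20% cost reduction per doubling of capacity (assumed)
b = -math.log2(1.0 - learning_rate)   # learning exponent implied by the learning rate

def unit_cost(cumulative_capacity):
    """Unit cost after a given cumulative capacity, in multiples of the first unit."""
    return initial_cost * cumulative_capacity ** (-b)

for capacity in [1, 2, 4, 8, 16, 32]:
    print(f"cumulative capacity {capacity:3d}x -> cost {unit_cost(capacity):7.1f} $/kW")
```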
  • 5
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 571-605
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract A total of 176 countries have ratified the United Nations Framework Convention on Climate Change, thereby agreeing to limit emissions of greenhouse gases that threaten to interfere with the Earth's climate. While compliance procedures are being developed, the best indicators of implementation of the Convention are the emissions inventories of greenhouse gases that member countries must submit to the Convention as part of their national communications. We review some of the first emissions inventories from non-Annex I (developing) countries. We focus on land-use change and forestry because these activities are responsible for the major emissions of carbon in many non-Annex I parties, and because they are the only activities with the potential to remove carbon from the atmosphere and sequester it on land. The review shows first, that some developing countries have already begun to reduce emissions and second, that there are significant discrepancies between the data used in the emissions inventories and the data available in international surveys. Conceptual uncertainties also exist, such as distinguishing anthropogenic from nonanthropogenic sinks of carbon, and these will require political rather than scientific resolution. We discuss several options for counting terrestrial sources and sinks of carbon in light of the Kyoto Protocol.
    Type of Medium: Electronic Resource
  • 6
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 645-661
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Although global warming is generally linked to increasing levels of carbon dioxide, there are many other gases produced from industrial, agricultural, and energy-generating sources that can also cause the Earth's temperature to rise. Individually these gases are not likely to make a significant contribution, but, taken together, it is believed that they can rival the effects of carbon dioxide. This paper reviews the current trends of the most abundant or the most effective of these non-CO2 greenhouse gases. Methane, nitrous oxide, and the major chlorofluorocarbons (F-11 and F-12) have been the most notable greenhouse gases other than CO2. Although these gases will continue to play a role in global warming, new compounds are likely to become increasingly important. These include the fluorocarbon replacement compounds in the hydrofluorocarbon and the hydrochlorofluorocarbon groups and gases that are nearly inert in the atmosphere, persisting for thousands of years, such as the perfluorocarbons and sulfur hexafluoride.
    Type of Medium: Electronic Resource
  • 7
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 607-643
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The US-Mexico border region illustrates the challenges of binational environmental management in the context of a harsh physical environment, rapid growth, and economic integration. Transboundary and shared resources and conflicts include limited surface water supplies, depletion of groundwater, air and water pollution, hazardous waste, and conservation of important natural ecosystems. Public policy responses to environmental problems on the border include binational institutions such as the IBWC, BECC and CEC, the latter two established in response to environmental concerns about the North American Free Trade Agreement (NAFTA). Environmental social movements and nongovernmental organizations have also become important agents in the region. These new institutions and social movements are especially interesting on the Mexican side of the border where political and economic conditions have often limited environmental enforcement and conservation, and where recent policy changes also include changes in land and water law, political democratization, and government decentralization.
    Type of Medium: Electronic Resource
  • 8
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 1-31
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: This review presents a personal view of the development of plant physiological ecology, the science of studying biological diversity, and the functioning of the Earth as a system. The need for interaction among these disciplines is becoming increasingly urgent as we are faced with the challenge of "managing" the Earth system that is increasingly impacted by the activities of humans.
    Type of Medium: Electronic Resource
  • 9
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 113-137
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The worldwide future of nuclear energy is a highly disputed subject; one side is certain that nuclear energy will have to expand in the next century to meet energy demand, whereas the other side is equally certain that this energy form is too dangerous and uneconomical to be of longer-term use. By looking at the way such beliefs are formed, the history of nuclear power, and the energy scene in the next century, this paper tests both points of view and concludes that both are flawed, but there is a strong case for keeping the option for nuclear expansion open. Yet, there has to be doubt whether today's technology is adequate for such expansion. There are alternative technologies under development that may make nuclear power more acceptable; however, although there is the time to develop such new processes, the question has to be asked whether such work can be funded, unless public opposition to nuclear power can be reduced and international collaboration improved. The long-term future of nuclear power depends more on successful research and development than on achieving early orders for more nuclear plants.
    Type of Medium: Electronic Resource
  • 10
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 189-226
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Ethanol made from lignocellulosic biomass sources, such as agricultural and forestry residues and herbaceous and woody crops, provides unique environmental, economic, and strategic benefits. Through sustained research funding, primarily by the U.S. Department of Energy, the estimated cost of biomass ethanol production has dropped from ~$4.63/gallon in 1980 to ~$1.22/gallon today, and it is now potentially competitive for blending with gasoline. Advances in pretreatment by acid-catalyzed hemicellulose hydrolysis and enzymes for cellulose breakdown coupled with recent development of genetically engineered bacteria that ferment all five sugars in biomass to ethanol at high yields have been the key to reducing costs. However, through continued advances in accessing the cellulose and hemicellulose fractions, the cost of biomass ethanol can be reduced to the point at which it is competitive as a pure fuel without subsidies. A major challenge to realizing the great benefits of biomass ethanol remains to substantially reduce the risk of commercializing first-of-a-kind technology, and greater emphasis on developing a fundamental understanding of the technology for biomass conversion to ethanol would reduce application costs and accelerate commercialization. Teaming of experts to cooperatively research key processing steps would be a particularly powerful and effective approach to meeting these needs.
    Type of Medium: Electronic Resource
  • 11
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 281-328
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This chapter on fuel cells covers the following topics: (a) fundamental electrochemical aspects and performance analysis; (b) technology research and development and demonstrations of fuel cell power sources for power generation, transportation, portable power, and space applications; (c) the role of fuel cells vs competing technologies, and (d) prospects for the applications and commercialization of fuel cell technologies in the twenty-first century. Although the fuel cell was invented in the nineteenth century, the twentieth century has been the period for technology development rather than widespread use. The fuel cell faces a great deal of competition in the proposed applications of power generation, transportation, and portable power. Significant work is still necessary, but intensified research and development activities could lead to the dawn of fuel cell commercialization and widespread use in the early part of the twenty-first century.
    Type of Medium: Electronic Resource
  • 12
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 227-279
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract About two-thirds of primary energy today is used directly as transportation and heating fuels. Any discussion of energy-related issues, such as air pollution, global climate change, and energy supply security, raises the issue of future use of alternative fuels. Hydrogen offers large potential benefits in terms of reduced emissions of pollutants and greenhouse gases and diversified primary energy supply. Like electricity, hydrogen is a premium-quality energy carrier, which can be used with high efficiency and zero emissions. Hydrogen can be made from a variety of feedstocks, including natural gas, coal, biomass, wastes, solar sources, wind, or nuclear sources. Hydrogen vehicles, heating, and power systems have been technically demonstrated. Key hydrogen end-use technologies such as fuel cells are making rapid progress toward commercialization. If hydrogen were made from renewable or decarbonized fossil sources, it would be possible to have a large-scale energy system with essentially no emissions of pollutants or greenhouse gases. Despite these potential benefits, the development of a large-scale hydrogen energy infrastructure is often seen as an insurmountable technical and economic barrier. Here we review the current status of technologies for hydrogen production, storage, transmission, and distribution; describe likely areas for technological progress; and discuss the implications for developing hydrogen as an energy carrier.
    Type of Medium: Electronic Resource
  • 13
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 391-430
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper reviews the empirical evidence for the following five hypotheses from the economic growth-liberalization-pollution debate: (a) economic growth will lead to a worsening pollution problem; (b) tighter environmental regulation will reduce economic growth; (c) trade liberalization will exacerbate environmental degradation, especially in developing countries with weak environmental protection; (d) tighter environmental protection in the developed countries will lead to a loss of competitiveness compared with that of countries with lower standards, especially in polluting industries; and (e) tighter environmental protection in the developed countries will lead to relocation of investment to developing countries with lax regulation, especially in polluting industries (the pollution haven hypothesis). Overall, the evidence for these hypotheses is found to be ambiguous and weak. It is further suggested that the growth-liberalization-environment empirical literature has neglected three important elements: (a) environmental innovation, (b) the international diffusion of environmental technologies, and (c) the economic benefits of a cleaner environment. Future research should integrate these elements into the debate. Analyses of endogenous environmental innovation in response to environmental policy, the tradable nature of environmental technologies, the role of trade and foreign direct investment as channels of environmental-technology transfer to developing countries, the effects of local environmental policies in encouraging the adoption of such technologies in developing countries, and the economic benefits of a cleaner environment would contribute to the development of sound, well-coordinated economic and environmental policies.
    Type of Medium: Electronic Resource
  • 14
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 367-390
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Of the thousands of species of microalgae that form the base of the marine food chain, only a small number are toxic or harmful. However, when these toxic species proliferate, they can cause massive kills of fish and shellfish, mortality among marine mammals and seabirds, substantive alterations of marine habitats, and human illness and death. Currently, six distinct human clinical syndromes associated with harmful algal blooms are recognized: ciguatera fish poisoning, paralytic shellfish poisoning, neurotoxic shellfish poisoning, diarrhetic shellfish poisoning, amnesic shellfish poisoning, and Pfiesteria-associated syndrome. Human illnesses are caused by toxins produced by these microorganisms, acquired either by passage through the food chain or direct skin or respiratory contact. Syndromes frequently include debilitating neurologic manifestations and, in some instances, may progress to death. There is a perception among investigators that the number of harmful algal blooms is increasing, as is the range of toxic species. It has been postulated that this increase is caused by human-related phenomena such as disruption of ecosystems, nutrient enrichment of waterways, and climatic change. In environmental studies, attention has traditionally focused on direct human health effects of pollutants. Harmful algal blooms are an example of an alternative paradigm, in which human-induced stress on complex ecologic systems leads to the emergence of new, potentially harmful microorganisms (or the reemergence of "old" pathogens from previously restricted environmental niches), which, in turn, cause human disease. Although data are lacking to fully substantiate this latter model, it provides a useful conceptual framework to assess data needs and consider public health interventions.
    Type of Medium: Electronic Resource
  • 15
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 329-365
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Six methods for attributing ambient pollutants to emission sources are reviewed: emissions analysis, trend analysis, tracer studies, trajectory analysis, receptor modeling, and dispersion modeling. The ranges of applicability, types of information provided, limitations, performance capabilities, and areas of active research of the different methods are compared. For primary, nonreactive pollutants whose effects of concern occur on a global scale, an accounting of emissions rates by source type and location largely characterizes source contributions. For other pollutants or smaller spatial scales, accurate estimates of emissions are needed for identifying the emissions reduction potentials of possible control measures and as inputs to dispersion models. Emission levels are frequently known with factor-of-two accuracy or worse, and improved estimates are needed for dispersion modeling. The analysis of regional or urban-scale trends in emissions and ambient pollutant concentrations can provide qualitative information on source contributions, but quantitative results are limited by the confounding influence of variations in meteorology and uncertainties in the areas over which emissions affect concentrations. Tracer studies are useful for quantifying dispersion characteristics of plumes, qualitatively characterizing transport directions, and providing empirical data for evaluating trajectory and dispersion models. Data are usually temporally limited to a short study period, typically do not provide information on vertical pollutant distributions, and are most applicable to the transport of primary, nonreactive pollutants. Trajectory analyses are routinely used to estimate atmospheric transport directions. Trajectory errors of about 20% of travel distance are considered typical of the better models and data sets. Receptor models use measurements of ambient pollutant concentrations to quantify the contributions of different source types to primary particulate matter or volatile organic compounds, or to characterize source-region contributions to a single pollutant. Accuracy rates of ~30% are often achieved when quantifying the contributions from different types of emission sources. Dispersion models are well-suited for estimating quantitative source-receptor relationships, as the effects of individual emission sources or source regions can be studied. Lagrangian and Gaussian dispersion models are computationally efficient and can simulate the transport of nonreactive primary or linear secondary species. Eulerian models are computationally intensive but lend themselves to the simulation of nonlinear chemistry. Careful evaluation of modeling accuracy is needed for a model application to fulfill its potential for source attribution. Accuracy can be evaluated through a combination of performance evaluation, sensitivity analysis, diagnostic evaluation, and corroborating analyses.
    Type of Medium: Electronic Resource
  • 16
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 431-460
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper focuses on the desirability, from an economic perspective, of setting fixed and relatively short-term targets and timetables, such as those contained in the Kyoto Protocol, as a means of achieving longer-term climate change mitigation goals. The paper argues that whatever long-term policy goals are adopted, greater flexibility lowers implementation costs. Lower implementation costs, in turn, increase the likelihood that the policies will actually be followed and the goals achieved. Importantly, the Kyoto Protocol incorporates key elements of both "what" and "where" flexibility. That is, the "Kyoto basket" includes all six of the major greenhouse gases plus sinks, and the Protocol incorporates several mechanisms that allow emission reductions to take place at the least-cost geographic location, regardless of nation-state boundaries. The Protocol also provides substantial "how" flexibility in the sense that countries can use a variety of means to achieve domestic policy goals. However, the Protocol does not allow emission reductions to take place at a point in time when they can be achieved at lowest cost as long as they are consistent with the long-term environmental goals ("when" flexibility). Additionally, it does not allow the use of efficient price-based policy instruments to define targets and, thereby, balance environmental goals and compliance costs (which could be thought of as a broader version of "when" flexibility). Instead, the Protocol relies exclusively on strict, short-term quantity targets. The relative inflexibility of the Protocol with respect to the timing of reductions and definitions of the targets may derive, in part, from a misplaced analogy between the global warming issue and the highly successful effort to phase out CFCs under the Montreal Protocol. The lack of when flexibility may be a key barrier to achieving the broader goals of the Kyoto Protocol, particularly if where flexibility is constrained in the implementation process.
    Type of Medium: Electronic Resource
  • 17
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 139-171
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract In the last decade, the economics of existing nuclear power plants have improved in the United States and worldwide. Further economic improvements could be realized by better management of planned outages, understanding of unplanned outages, resource sharing among several plants, and more efficient use of nuclear fuel. Dry spent-fuel storage has removed the limitation of on-site storage caused by the limited size of the originally designed storage pools. However, delays and uncertainty about the date by which the national U.S. program will receive the spent fuel have significant financial penalties that could be mitigated if a central interim facility were established to receive spent fuel. Furthermore, a higher incentive for spent-fuel minimization could be obtained if the waste fees were based on spent-fuel volume rather than on electricity sales. Introducing thorium into the fuel cycle has the potential to improve the economics of the fuel cycle while reducing the volume of spent fuel and improving its proliferation resistance.
    Type of Medium: Electronic Resource
  • 18
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 83-111
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The activities of oil and other energy companies are increasingly being challenged by nongovernmental organizations and media to justify their behavior in ethical terms. Activities that visibly damage the environment have long been challenged by advocacy groups. In recent years public interest has broadened into calls to respect "sustainability," human rights, and other ethical imperatives. This article attempts to set these developments in the context of international promotion of the idea of a global "civil society." Ethical codes reach, by persuasion, beyond coercive legal obligations. They have the character and role of "repeated games." Codes of behavior for business are rooted in national and cultural values, which may conflict at the international level. However, many governments following the lead of the United States are often developing sanctions to promote ethical behavior by businesses, to redress the failure of markets to manage common access to resources and to protect aspects of the natural world for its own sake. Examples are efforts to uphold human rights, fight against corruption, and promote sustainability of resources. Business leaders and the nongovernment organizations that advocate international values on these subjects have the opportunity to contribute to the development of global civil society by working together to establish persuasive codes that do not require slow and difficult international intervention by government.
    Type of Medium: Electronic Resource
  • 19
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 33-82
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract After a first career as Professor of Physics, University of California at Berkeley, working in experimental particle physics at Lawrence Berkeley National Laboratory (LBNL), I was prompted by the 1973 Organization of Petroleum Exporting Countries (OPEC) oil embargo to switch to improving energy end-use efficiency, particularly in buildings. I cofounded and directed the Energy Efficient Buildings program at LBNL, which later became the Center for Building Science. At the Center we developed high-frequency solid-state ballasts for fluorescent lamps, low-emissivity and selective windows, and the DOE-2 computer program for the energy analysis and design of buildings. The ballasts in turn stimulated Philips lighting to produce compact fluorescent lamps. When they achieve their expected market share, energy savings from products started or developed at the Center for Building Sciences are projected to save American consumers $30 billion/year, net of the cost of the better buildings and products. In terms of pollution control, this is equivalent to displacing approximately 100 million cars. We did the analysis on which the California and later the U.S. appliance standards are based, and we also worked on indoor air quality and discovered how radon is sucked into homes. We worked closely with the California utilities to develop programs in "Demand Side Management" and "Integrated Utility Planning." I also worked in California and New England on utility "collaboratives" under which we changed their profit rules to favor investment in customer energy efficiency (and sharing the savings with the customer) over selling raw electricity. I cofounded a successful nonprofit, the American Council for an Energy-Efficient Economy, and a University of California research unit, the California Institute for Energy Efficiency, and I served on the Steering Committee of Pacific Gas and Electric's ACT2 project, in which we cost-effectively cut the energy use of six sites by one half. Starting in 1994, my third career has been as Senior Advisor to the U.S. Department of Energy Assistant Secretary for Energy Efficiency and Renewable Energy.
    Type of Medium: Electronic Resource
  • 20
    Electronic Resource
    Palo Alto, Calif. : Annual Reviews
    Annual Review of Environment and Resources 24 (1999), pp. 173-188
    ISSN: 1056-3466
    Source: Annual Reviews Electronic Back Volume Collection 1932-2001ff
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Conventional hydroelectric generation uses a renewable energy source and currently supplies ~10% of the annual output of electricity in the United States and ~20% of electricity generated worldwide. To provide a significant contribution to sustainable development, the hydropower industry must address a variety of environmental concerns, including water quality and fish passage issues. The paper discusses new technologies for turbine design and control systems to improve dissolved oxygen levels in turbine discharges and survival of fish during turbine passage. The paper describes development, testing, and test results for these technologies, with an emphasis on collaboration of stakeholders and balance between environmental stewardship and economical power production.
    Type of Medium: Electronic Resource
  • 21
    Electronic Resource
    Springer
    Risk analysis 19 (1999), pp. 283-294
    ISSN: 1539-6924
    Keywords: Risk perception ; pesticides ; pest management ; health effects ; agricultural pollution
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Water pollution from agricultural pesticides continues to be a public concern. Given that the use of such pesticides on the farm is largely governed by voluntary behavior, it is important to understand what drives farmer behavior. Health belief models in public health and social psychology argue that persons who have adverse health experiences are likely to undertake preventive behavior. An analogous hypothesis set was tested here: farmers who believe they have had adverse health experiences from pesticides are likely to have heightened concerns about pesticides and are more likely to take greater precautions in dealing with pesticides. This work is based on an original survey of a population of 2700 corn and soybean growers in Maryland, New York, and Pennsylvania using the U.S. Department of Agriculture data base. It was designed as a mail survey with telephone follow-up, and resulted in a 60 percent response rate. Farm operators report experiencing adverse health problems they believe are associated with pesticides at a rate that is higher than the reported incidence of occupational pesticide poisonings, but similar to the reported incidence of all pesticide poisonings. Farmers who report experiencing such problems have more heightened concerns about water pollution from fertilizers and pesticides, and illness and injury from mixing, loading, and applying pesticides than farmers who have not experienced such problems. Farmers who report experiencing such problems also are more likely to report using alternative pest management practices than farmers who do not report having such problems. This implies that farmers who have had such experiences do care about the effects of application and do engage in alternative means of pest management, which at least involve the reduction in pesticide use.
    Type of Medium: Electronic Resource
  • 22
    ISSN: 1539-6924
    Keywords: Ethnicity ; fish consumption ; advisories ; Savannah River ; methylmercury ; risk perception
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract South Carolina has issued fish consumption advisories for the Savannah River based on mercury and radionuclide levels. We examine differences in fishing rates and fish consumption of 258 people interviewed while fishing along the Savannah River, as a function of age, education, ethnicity, employment history, and income, and test the assumption that the average consumption of fish is less than the recreational value of 19 kg/year assumed by risk assessors. Ethnicity and education contributed significantly to explaining variations in number of fish meals per month, serving size, and total quantity of fish consumed per year. Blacks fished more often, ate more fish meals of slightly larger serving sizes, and consumed more fish per year than did Whites. Although education and income were correlated, education contributed most significantly to behavior; people who did not graduate from high school ate fish more often, ate more fish per year, and ate more whole fish than people who graduated from high school. Computing consumption of fish for each person individually indicates that (1) people who eat fish more often also eat larger portions, (2) a substantial number of people consume more than the amount of fish used to compute risk to recreational fishermen, (3) some people consume more than the subsistence level default assumption (50 kg/year) and (4) Blacks consume more fish per year than Whites, putting them at greater risk from contaminants in fish. Overall, ethnicity, age, and education contributed to variations in fishing behavior and consumption.
    Type of Medium: Electronic Resource
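The consumption figures in the record above come from converting reported meal frequency and serving size into kilograms of fish per year and comparing the result with default intake rates. A minimal sketch of that arithmetic follows; the example angler's meal frequency and serving size are hypothetical, while the 19 kg/year recreational and 50 kg/year subsistence defaults are the values cited in the abstract.

```python
# Hypothetical survey responses for one angler (illustrative only)
MEALS_PER_MONTH = 5        # fish meals per month
SERVING_SIZE_G = 350       # grams of fish per meal

# Annual consumption in kg/year
annual_kg = MEALS_PER_MONTH * 12 * SERVING_SIZE_G / 1000.0
print(f"estimated consumption: {annual_kg:.1f} kg/year")

RECREATIONAL_DEFAULT_KG = 19   # recreational-fisher default cited in the abstract
SUBSISTENCE_DEFAULT_KG = 50    # subsistence-level default cited in the abstract
print("exceeds recreational default:", annual_kg > RECREATIONAL_DEFAULT_KG)
print("exceeds subsistence default:", annual_kg > SUBSISTENCE_DEFAULT_KG)
```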
  • 23
    Electronic Resource
    Springer
    Risk analysis 19 (1999), pp. 453-459
    ISSN: 1539-6924
    Keywords: Efficiency ; nonquantal ; probit ; quantal
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Methods of quantitative risk assessment for toxic responses that are measured on a continuous scale are not well established. Although risk-assessment procedures that attempt to utilize the quantitative information in such data have been proposed, there is no general agreement that these procedures are appreciably more efficient than common quantal dose–response procedures that operate on dichotomized continuous data. This paper points out an equivalence between the dose–response models of the nonquantal approach of Kodell and West(1) and a quantal probit procedure, and provides results from a Monte Carlo simulation study to compare coverage probabilities of statistical lower confidence limits on dose corresponding to specified additional risk based on applying the two procedures to continuous data from a dose–response experiment. The nonquantal approach is shown to be superior, in terms of both statistical validity and statistical efficiency.
    Type of Medium: Electronic Resource
  • 24
    ISSN: 1539-6924
    Keywords: Threshold ; measurement error ; mortality ; air pollution
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The association between daily fluctuations in ambient particulate matter and daily variations in nonaccidental mortality has been extensively investigated. Although it is now widely recognized that such an association exists, the form of the concentration–response model is still in question. Linear, no threshold and linear threshold models have been most commonly examined. In this paper we considered methods to detect and estimate threshold concentrations using time series data of daily mortality rates and air pollution concentrations. Because exposure is measured with error, we also considered the influence of measurement error in distinguishing between these two competing model specifications. The methods were illustrated on a 15-year daily time series of nonaccidental mortality and particulate air pollution data in Toronto, Canada. Nonparametric smoothed representations of the association between mortality and air pollution were adequate to graphically distinguish between these two forms. Weighted nonlinear regression methods for relative risk models were adequate to give nearly unbiased estimates of threshold concentrations even under conditions of extreme exposure measurement error. The uncertainty in the threshold estimates increased with the degree of exposure error. Regression models incorporating threshold concentrations could be clearly distinguished from linear relative risk models in the presence of exposure measurement error. The assumption of a linear model given that a threshold model was the correct form usually resulted in overestimates in the number of averted premature deaths, except for low threshold concentrations and large measurement error.
    Type of Medium: Electronic Resource
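The record above contrasts linear and threshold concentration–response models fitted to daily mortality and air pollution series. The sketch below shows one way such a comparison could be set up on simulated data, fitting a threshold relative-risk model and a linear model by Poisson maximum likelihood; it is a simplified illustration under assumed parameter values and omits the exposure measurement error and weighted-regression machinery the paper itself examines.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated daily series (all values are illustrative assumptions)
n_days = 5000
pm = rng.gamma(shape=2.0, scale=15.0, size=n_days)   # daily particulate level, ug/m3
true_tau, true_beta, base_rate = 25.0, 0.004, 20.0   # threshold, log-linear slope, baseline deaths/day
deaths = rng.poisson(base_rate * np.exp(true_beta * np.maximum(pm - true_tau, 0.0)))

def nll_threshold(params):
    """Poisson negative log-likelihood (up to a constant) for a threshold relative-risk model."""
    log_b, beta, tau = params
    lam = np.exp(log_b + beta * np.maximum(pm - tau, 0.0))
    return np.sum(lam - deaths * np.log(lam))

def nll_linear(params):
    """Poisson negative log-likelihood for a linear (no-threshold) relative-risk model."""
    log_b, beta = params
    lam = np.exp(log_b + beta * pm)
    return np.sum(lam - deaths * np.log(lam))

fit_thr = minimize(nll_threshold, x0=[np.log(20.0), 0.001, 10.0], method="Nelder-Mead")
fit_lin = minimize(nll_linear, x0=[np.log(20.0), 0.001], method="Nelder-Mead")

print("estimated threshold (true value 25):", round(fit_thr.x[2], 1))
print("log-likelihood gain of threshold over linear model:", round(fit_lin.fun - fit_thr.fun, 1))
```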
  • 25
    Electronic Resource
    Springer
    Risk analysis 19 (1999), pp. 527-545
    ISSN: 1539-6924
    Keywords: breast-feeding ; chlorinated compounds ; risk assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Exposure to persistent organochlorines in breast milk was estimated probabilistically for Canadian infants. Noncancer health effects were evaluated by comparing the predicted exposure distributions to published guidance values. For chemicals identified as potential human carcinogens, cancer risks were evaluated using standard methodology typically applied in Canada, as well as an alternative method developed under the Canadian Environmental Protection Act. Potential health risks associated with exposure to persistent organochlorines were quantitatively and qualitatively weighed against the benefits of breast-feeding. Current levels of the majority of contaminants identified in Canadian breast milk do not pose unacceptable risks to infants. Benefits of breast-feeding are well documented and qualitatively appear to outweigh potential health concerns associated with organochlorine exposure. Furthermore, the risks of mortality from not breast-feeding estimated by Rogan and colleagues exceed the theoretical cancer risks estimated for infant exposure to potential carcinogens in Canadian breast milk. Although levels of persistent compounds have been declining in Canadian breast milk, potentially significant risks were estimated for exposure to polychlorinated biphenyls, dibenzo-p-dioxins, and dibenzofurans. Follow-up work is suggested that would involve the use of a physiologically based toxicokinetic model with probabilistic inputs to predict dioxin exposure to the infant. A more detailed risk analysis could be carried out by coupling the exposure estimates with a dose–response analysis that accounts for uncertainty.
    Type of Medium: Electronic Resource
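The record above describes a probabilistic (Monte Carlo) estimate of infant exposure to organochlorines in breast milk compared against published guidance values. A minimal sketch of that kind of calculation follows; every distribution and the guidance value are placeholder assumptions, not the Canadian data used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000  # Monte Carlo iterations

# Hypothetical lognormal/normal input distributions (placeholders, not survey data)
conc_ng_per_g_fat = rng.lognormal(mean=np.log(100.0), sigma=0.5, size=N)  # contaminant in milk fat
milk_intake_g_per_day = rng.lognormal(mean=np.log(750.0), sigma=0.2, size=N)
fat_fraction = 0.035                                   # assumed milk fat content (~3.5%)
body_weight_kg = rng.normal(loc=6.0, scale=0.8, size=N)

# Daily dose in ng per kg body weight per day
dose = conc_ng_per_g_fat * fat_fraction * milk_intake_g_per_day / body_weight_kg

guidance_value = 1000.0  # hypothetical guidance value, ng/kg-day
print("median dose:", round(float(np.median(dose)), 1))
print("95th percentile:", round(float(np.percentile(dose, 95)), 1))
print("fraction of simulated infants above guidance value:", float(np.mean(dose > guidance_value)))
```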
  • 26
    ISSN: 1539-6924
    Keywords: air dispersion ; models ; validation ; Rocky Flats
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Five atmospheric transport models were evaluated for use in Phase II of the Historical Public Exposures Studies at the Rocky Flats Plant. Models included a simple straight-line Gaussian plume model (ISCST2), several integrated puff models (RATCHET, TRIAD, and INPUFF2), and a complex terrain model (TRAC). Evaluations were based on how well model predictions compared with sulfur hexafluoride tracer measurements taken in the vicinity of Rocky Flats in February 1991. Twelve separate tracer experiments were conducted, each lasting 9 hr and measured at 140 samplers in arcs 8 and 16 km from the release point at Rocky Flats. Four modeling objectives were defined based on the endpoints of the overall study: (1) the unpaired maximum hourly average concentration, (2) paired time-averaged concentration, (3) unpaired time-averaged concentration, and (4) arc-integrated concentration. Performance measures were used to evaluate models and focused on the geometric mean and standard deviation of the predicted-to-observed ratio and the correlation coefficient between predicted and observed concentrations. No one model consistently outperformed the others in all modeling objectives and performance measures. About 75% of the maximum hourly concentration predictions were within a factor of 5 of the observations. About 64% of the paired and 80% of the unpaired time-averaged model predictions were within a factor of 5 of the observations. The overall performance of the RATCHET model was somewhat better than the other models. All models appeared to experience difficulty defining plume trajectories, which was attributed to the influence of multilayered flow initiated by terrain complexities and the diurnal flow patterns characteristic of the Colorado Front Range.
    Type of Medium: Electronic Resource
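The record above evaluates atmospheric transport models using the geometric mean and standard deviation of the predicted-to-observed ratio, the correlation coefficient, and the fraction of predictions within a factor of five of the observations. A small helper along those lines is sketched below; the predicted/observed pairs are made-up numbers, not the Rocky Flats tracer data.

```python
import numpy as np

def performance_measures(predicted, observed):
    """Model-evaluation statistics of the kind described in the record above.
    Assumes strictly positive concentration pairs (zeros must be screened out first)."""
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    log_ratio = np.log(predicted / observed)
    return {
        "geometric_mean_ratio": float(np.exp(log_ratio.mean())),
        "geometric_std_ratio": float(np.exp(log_ratio.std(ddof=1))),
        "correlation": float(np.corrcoef(predicted, observed)[0, 1]),
        "fraction_within_factor_5": float(np.mean(np.abs(log_ratio) <= np.log(5.0))),
    }

# Hypothetical predicted/observed tracer concentrations (arbitrary units)
pred = [1.2, 0.8, 3.5, 0.05, 2.0, 0.6]
obs = [1.0, 1.5, 1.0, 0.20, 1.8, 0.5]
print(performance_measures(pred, obs))
```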
  • 27
    ISSN: 1539-6924
    Keywords: initiation ; Monte Carlo methods ; promotion
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract We present the results of a quantitative assessment of the lung cancer risk associated with occupational exposure to refractory ceramic fibers (RCF). The primary sources of data for our risk assessment were two long-term oncogenicity studies in male Fischer rats conducted to assess the potential pathogenic effects associated with prolonged inhalation of RCF. An interesting feature of the data was the availability of the temporal profile of fiber burden in the lungs of experimental animals. Because of this information, we were able to conduct both exposure–response and dose–response analyses. Our risk assessment was conducted within the framework of a biologically based model for carcinogenesis, the two-stage clonal expansion model, which allows for the explicit incorporation of the concepts of initiation and promotion in the analyses. We found that a model positing that RCF was an initiator had the highest likelihood. We proposed an approach based on biological considerations for the extrapolation of risk to humans. This approach requires estimation of human lung burdens for specific exposure scenarios, which we did by using an extension of a model due to Yu. Our approach acknowledges that the risk associated with exposure to RCF depends on exposure to other lung carcinogens. We present estimates of risk in two populations: (1) a population of nonsmokers and (2) an occupational cohort of steelworkers not exposed to coke oven emissions, a mixed population that includes both smokers and nonsmokers.
    Type of Medium: Electronic Resource
  • 28
    ISSN: 1539-6924
    Keywords: accident risk ; population distribution ; RADTRAN ; transportation ; radioactive materials
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Calculation of accident dose-risk estimates with the RADTRAN code requires input data describing the population likely to be affected by the plume of radioactive material (RAM) released in a hypothetical transportation accident. In the existing model, population densities within 1/2 mile (0.8 km) of the route centerline are tabulated in three ranges (Rural, Suburban, and Urban). These population densities may be of questionable validity since the plume in the RADTRAN analysis is assumed to extend out to 120 km from the hypothetical accident site. We present a GIS-based population model which accounts for the actual distribution of population under a potential plume, and compare accident-risk estimates based on the resulting population densities with those based on the existing model. Results for individual points along a route differ greatly, but the cumulative accident risks for a sample route of a few hundred kilometers are found to be comparable, if not identical. We conclude, therefore, that for estimation of aggregate accident risks over typical routes of several hundred kilometers, the existing, simpler RADTRAN model is sufficiently detailed and accurate.
    Type of Medium: Electronic Resource
  • 29
    Electronic Resource
    Springer
    Risk analysis 19 (1999), pp. 685-687
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
  • 30
    Electronic Resource
    Springer
    Risk analysis 19 (1999), pp. 703-710
    ISSN: 1539-6924
    Keywords: probabilistic risk analysis ; subjective judgment ; risk-informed regulation ; robust Bayesian analysis ; human performance ; human error ; management and organizational factors ; corporate culture
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper discusses a number of the key challenges to the acceptance and application of probabilistic risk analysis (PRA). Those challenges include: (a) the extensive reliance on subjective judgment in PRA, requiring the development of guidance for the use of PRA in risk-informed regulation, and possibly the development of “robust” or “reference” prior distributions to minimize the reliance on judgment; and (b) the treatment of human performance in PRA, including not only human error per se but also management and organizational factors more broadly. All of these areas are seen as presenting interesting research challenges at the interface between engineering and other disciplines.
    Type of Medium: Electronic Resource
  • 31
    Electronic Resource
    Springer
    Risk analysis 19 (1999), pp. 689-701
    ISSN: 1539-6924
    Keywords: risk ; risk perception ; risk assessment ; risk communication ; risk management
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Risk management has become increasingly politicized and contentious. Polarized views, controversy, and conflict have become pervasive. Research has begun to provide a new perspective on this problem by demonstrating the complexity of the concept “risk” and the inadequacies of the traditional view of risk assessment as a purely scientific enterprise. This paper argues that danger is real, but risk is socially constructed. Risk assessment is inherently subjective and represents a blending of science and judgment with important psychological, social, cultural, and political factors. In addition, our social and democratic institutions, remarkable as they are in many respects, breed distrust in the risk arena. Whoever controls the definition of risk controls the rational solution to the problem at hand. If risk is defined one way, then one option will rise to the top as the most cost-effective or the safest or the best. If it is defined another way, perhaps incorporating qualitative characteristics and other contextual factors, one will likely get a different ordering of action solutions. Defining risk is thus an exercise in power. Scientific literacy and public education are important, but they are not central to risk controversies. The public is not irrational. Their judgments about risk are influenced by emotion and affect in a way that is both simple and sophisticated. The same holds true for scientists. Public views are also influenced by worldviews, ideologies, and values; so are scientists' views, particularly when they are working at the limits of their expertise. The limitations of risk science, the importance and difficulty of maintaining trust, and the complex, sociopolitical nature of risk point to the need for a new approach—one that focuses upon introducing more public participation into both risk assessment and risk decision making in order to make the decision process more democratic, improve the relevance and quality of technical analysis, and increase the legitimacy and public acceptance of the resulting decisions.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 32
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 727-738 
    ISSN: 1539-6924
    Keywords: mitigation ; insurance ; catastrophic risk ; building codes
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper examines the impact that insurance coupled with specific risk mitigation measures (RMMs) could have on reducing losses from hurricanes and earthquakes as well as improving the solvency position of insurers who provide coverage against these hazards. We first explore why relatively few individuals adopt cost-effective RMMs by reporting on the results of empirical studies and controlled laboratory studies. We then investigate the impact that an RMM has on both the expected losses and those from a worst-case scenario in two model cities constructed with the assistance of two modeling firms: Oakland (an earthquake-prone area) and Miami/Dade County (a hurricane-prone area). The paper then explores three programs for forging a meaningful public-private sector partnership: well-enforced building codes, insurance premium reductions linked with long-term loans, and lower deductibles on insurance policies tied to mitigation. We conclude by briefly examining four issues for future research on linking mitigation with insurance.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 33
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 711-726 
    ISSN: 1539-6924
    Keywords: variability ; exposure ; susceptibility ; risk assessment ; pharmacokinetics ; pharmacodynamics
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper reviews existing data on the variability in parameters relevant for health risk analyses. We cover both exposure-related parameters and parameters related to individual susceptibility to toxicity. The toxicity/susceptibility database under construction is part of a longer-term research effort to lay the groundwork for quantitative distributional analyses of non-cancer toxic risks. These data are broken down into a variety of parameter types that encompass different portions of the pathway from external exposure to the production of biological responses. The discrete steps in this pathway, as we now conceive them, are:
    • Contact Rate (breathing rates per body weight; fish consumption per body weight)
    • Uptake or Absorption as a Fraction of Intake or Contact Rate
    • General Systemic Availability Net of First-Pass Elimination and Dilution via Distribution Volume (e.g., initial blood concentration per mg/kg of uptake)
    • Systemic Elimination (half-life or clearance)
    • Active Site Concentration per Systemic Blood or Plasma Concentration
    • Physiological Parameter Change per Active Site Concentration (expressed as the dose required to make a given percentage change in different people, or the dose required to achieve some proportion of an individual's maximum response to the drug or toxicant)
    • Functional Reserve Capacity, i.e., the Change in Baseline Physiological Parameter Needed to Produce a Biological Response or Pass a Criterion of Abnormal Function
    Comparison of the amounts of variability observed for the different parameter types suggests that appreciable variability is associated with the final step in the process: differences among people in “functional reserve capacity.” This implies that relevant information for estimating effective toxic susceptibility distributions may be gleaned by direct studies of the population distributions of key physiological parameters in people who are not exposed to the environmental and occupational toxicants thought to perturb those parameters. This is illustrated with some recent observations of the population distributions of Low Density Lipoprotein Cholesterol from the second and third National Health and Nutrition Examination Surveys.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
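    As a rough illustration of how per-step variability might combine along the pathway listed in the abstract above, the following Python snippet assumes that each step contributes independent lognormal variability, so log-variances add; the per-step geometric standard deviations are invented for illustration and are not the paper's data.

        # Illustrative sketch (assumed lognormal, independent steps; hypothetical GSDs):
        # compose an overall geometric standard deviation from per-step GSDs.
        import math

        step_gsds = {
            "contact rate": 1.4,
            "uptake fraction": 1.2,
            "systemic availability": 1.6,
            "elimination half-life": 1.5,
            "active-site concentration": 1.3,
            "response per concentration": 1.8,
            "functional reserve capacity": 2.5,   # dominant step, per the abstract
        }

        total_log_var = sum(math.log(g) ** 2 for g in step_gsds.values())
        overall_gsd = math.exp(math.sqrt(total_log_var))
        print(f"overall GSD ≈ {overall_gsd:.2f}")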
  • 34
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 751-758 
    ISSN: 1539-6924
    Keywords: nuclear waste ; high-level waste ; performance assessment ; Yucca Mountain
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The management of spent nuclear fuel and high-level nuclear waste has the deserved reputation as one of the most intractable policy issues facing the United States and other nations using nuclear reactors for electric power generation. This paper presents the author's perspective on this complex issue, based on a decade of service with the Nuclear Waste Technical Review Board and Board on Radioactive Waste Management of the National Research Council.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 35
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 763-807 
    ISSN: 1539-6924
    Keywords: risk assessment ; probabilistic risk assessment ; performance assessment ; policy analysis ; history of technology
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This article describes the evolution of the process for assessing the hazards of a geologic disposal system for radioactive waste and, similarly, nuclear power reactors, and the relationship of this process with other assessments of risk, particularly assessments of hazards from manufactured carcinogenic chemicals during use and disposal. This perspective reviews the common history of scientific concepts for risk assessment developed until the 1950s. Computational tools and techniques developed in the late 1950s and early 1960s to analyze the reliability of nuclear weapon delivery systems were adopted in the early 1970s for probabilistic risk assessment of nuclear power reactors, a technology for which behavior was unknown. In turn, these analyses became an important foundation for performance assessment of nuclear waste disposal in the late 1970s. The evaluation of risk to human health and the environment from chemical hazards is built on methods for assessing the dose response of radionuclides in the 1950s. Despite a shared background, however, societal events, often in the form of legislation, have affected the development path for risk assessment for human health, producing dissimilarities between these risk assessments and those for nuclear facilities. An important difference is the regulator's interest in accounting for uncertainty.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 36
    ISSN: 1539-6924
    Keywords: performance assessment ; nuclear waste ; risk-informed regulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The U.S. Nuclear Regulatory Commission (NRC) staff has developed a performance assessment capability to address three programmatic areas in nuclear waste management: high-level waste, low-level waste, and decommissioning of licensed facilities (license termination). The NRC capability consists of: (1) methodologies for performance assessment; (2) models and computer codes for estimating system performance; (3) regulatory guidance in various forms, such as regulations, Branch Technical Positions, and Standard Review Plans; and (4) a technical staff experienced in executing and evaluating performance assessments for a variety of waste systems. Although the tools and techniques are refined for each programmatic area, general approaches and similar issues are encountered in all areas.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 37
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 903-913 
    ISSN: 1539-6924
    Keywords: nuclear waste ; performance assessment ; Yucca Mountain ; probability ; repository ; high-level waste ; risk ; engineered barriers
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract In this paper the problem of high-level nuclear waste disposal is viewed as a five-stage, cascaded decision problem. The first four of these decisions having essentially been made, the work of recent years has been focused on the fifth stage, which concerns specifics of the repository design. The probabilistic performance assessment (PPA) work is viewed as the outcome prediction for this stage, and the site characterization work as the information gathering option. This brief examination of the proposed Yucca Mountain repository through a decision analysis framework resulted in three conclusions: (1) A decision theory approach to the process of selecting and characterizing Yucca Mountain would enhance public understanding of the issues and solutions to high-level waste management; (2) engineered systems are an attractive alternative to offset uncertainties in the containment capability of the natural setting and should receive greater emphasis in the design of the repository; and (3) a strategy of “waste management” should be adopted, as opposed to “waste disposal,” as it allows for incremental confirmation and confidence building of a permanent solution to the high-level waste problem.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 38
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 915-931 
    ISSN: 1539-6924
    Keywords: Yucca Mountain ; performance assessment ; logic tree ; high-level radioactive waste ; Monte Carlo ; expert judgment ; repository ; groundwater ; climate ; infiltration ; percolation ; hydrothermal ; corrosion
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The Electric Power Research Institute (EPRI) has sponsored the development of a model to assess the long-term, overall “performance” of the candidate spent fuel and high-level radioactive waste (HLW) disposal facility at Yucca Mountain, Nevada. The model simulates the processes that lead to HLW container corrosion, mobilization of HLW from the spent fuel, transport by groundwater, and use of contaminated groundwater by hypothetical future individuals, leading to radiation doses to those individuals. The model must incorporate a multitude of complex, coupled processes across a variety of technical disciplines. Furthermore, because of the very long time frames involved in the modeling effort (≫10⁴ years), the relative lack of directly applicable data, and the many uncertainties and variabilities in those data, a probabilistic approach to model development was necessary. The developers of the model chose a logic tree approach, which they considered the most appropriate, to represent uncertainties in both conceptual models and model parameter values. This paper discusses the value and use of logic trees applied to assessing the uncertainties in HLW disposal, the components of the model, and a few of the results of that model. The paper concludes with a comparison of logic tree and Monte Carlo approaches.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
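    The abstract above contrasts a logic tree with Monte Carlo sampling. The minimal Python sketch below, with entirely hypothetical branches, weights, and dose factors rather than EPRI's model, shows the basic mechanics of a logic tree: every path through the tree carries the product of its branch weights, yielding a discrete outcome distribution that a Monte Carlo calculation would instead approximate by sampling the same branches.

        # Illustrative logic-tree sketch (all branches, weights, and dose values hypothetical).
        from itertools import product

        infiltration = [("low", 0.5, 1.0), ("high", 0.5, 4.0)]      # (label, weight, factor)
        corrosion    = [("slow", 0.7, 0.2), ("fast", 0.3, 1.0)]
        transport    = [("sorbing", 0.6, 0.1), ("non-sorbing", 0.4, 1.0)]

        outcomes = []
        for (_, wi, fi), (_, wc, fc), (_, wt, ft) in product(infiltration, corrosion, transport):
            weight = wi * wc * wt                 # product of branch weights along the path
            dose = 10.0 * fi * fc * ft            # hypothetical peak dose, mrem/yr
            outcomes.append((weight, dose))

        expected_dose = sum(w * d for w, d in outcomes)
        print(f"{len(outcomes)} end branches, expected peak dose ≈ {expected_dose:.2f} mrem/yr")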
  • 39
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 40
    ISSN: 1539-6924
    Keywords: compliance certification application ; engineering analysis ; geochemistry ; geohydrology ; performance assessment ; probabilistic systems analysis ; radioactive waste ; scientific validity ; uncertainty ; 40 CFR 191
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Performance Assessment (PA) is the use of mathematical models to simulate the long-term behavior of engineered and geologic barriers in a nuclear waste repository; methods of uncertainty analysis are used to assess effects of parametric and conceptual uncertainties associated with the model system upon the uncertainty in outcomes of the simulation. PA is required by the U.S. Environmental Protection Agency as part of its certification process for geologic repositories for nuclear waste. This paper is a dialogue to explore the value and limitations of PA. Two “skeptics” acknowledge the utility of PA in organizing the scientific investigations that are necessary for confident siting and licensing of a repository; however, they maintain that the PA process, at least as it is currently implemented, is an essentially unscientific process whose shortcomings may yield results of limited use in evaluating actual effects on public health and safety. Conceptual uncertainties in a PA analysis can be so great that results can be confidently applied only over short time ranges, the antithesis of the purpose behind long-term, geologic disposal. Two “proponents” of PA agree that performance assessment is unscientific, but only in the sense that PA is an engineering analysis that uses existing scientific knowledge to support public policy decisions, rather than an investigation intended to increase fundamental knowledge of nature; PA has different goals and constraints than a typical scientific study. The “proponents” describe an ideal, six-step process for conducting generalized PA, here called probabilistic systems analysis (PSA); they note that virtually all scientific content of a PA is introduced during the model-building steps of a PSA; they contend that a PA based on simple but scientifically acceptable mathematical models can provide useful and objective input to regulatory decision makers. The value of the results of any PA must lie between these two views and will depend on the level of knowledge of the site, the degree to which models capture actual physical and chemical processes, the time over which extrapolations are made, and the proper evaluation of health risks attending implementation of the repository. The challenge is in evaluating whether the quality of the PA matches the needs of decision makers charged with protecting the health and safety of the public.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 41
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1003-1016 
    ISSN: 1539-6924
    Keywords: WIPP ; radioactive waste ; repository ; performance assessment ; transuranic waste
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The Waste Isolation Pilot Plant (WIPP) is a geological repository for disposal of U.S. defense transuranic radioactive waste. Built and operated by the U.S. Department of Energy (DOE), it is located in Permian-age salt beds in southeastern New Mexico at a depth of 655 m. Performance assessment of the repository's compliance with the 10,000-year containment standards was completed in 1996, and in 1998 the U.S. Environmental Protection Agency (EPA) certified that the repository complies with the EPA standards 40 CFR 191 and 40 CFR 194. The Environmental Evaluation Group (EEG) review of the DOE's application for certification identified a number of issues. These related to the scenarios, conceptual models, and values of the input parameters used in the calculations. It is expected that these issues will be addressed and resolved during the first 5-year recertification process, which began with the first receipt of waste at WIPP on March 26, 1999, and is scheduled to be completed in March 2004.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 42
    ISSN: 1539-6924
    Keywords: risk perception ; CRESP ; trust ; DOE Savannah River site ; risk assessment ; stakeholder ; economic dependence
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Environmental managers are increasingly charged with involving the public in the development and modification of policies regarding risks to human health and the environment. Involving the public in environmental decision making first requires a broad understanding of how and why the public perceives various risks. The Savannah River Stakeholder Study was conducted to investigate individual, economic, and social characteristics of risk perceptions among those living near the Savannah River Site (SRS), a nuclear weapons site. A number of factors were found to shape these perceptions. Estimated proximity to the site and location relative to the river surfaced as strong determinants of risk perceptions among SRS residents. Additionally, living in a quality neighborhood and demonstrating a willingness to accept health risks for economic gain strongly abated heightened risk perceptions.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 43
    ISSN: 1539-6924
    Keywords: risk assessment ; uncertainty ; formaldehyde ; decision analysis
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract A call for risk assessment approaches that better characterize and quantify uncertainty has been made by the scientific and regulatory community. This paper responds to that call by demonstrating a distributional approach that draws upon human data to derive potency estimates and to identify and quantify important sources of uncertainty. The approach is rooted in the science of decision analysis and employs an influence diagram, a decision tree, probabilistic weights, and a distribution of point estimates of carcinogenic potency. Its results estimate the likelihood of different carcinogenic risks (potencies) for a chemical under a specific scenario. For this exercise, human data on formaldehyde were employed to demonstrate the approach. Sensitivity analyses were performed to determine the relative impact of specific levels and alternatives on the potency distribution. The resulting potency estimates are compared with the results of an exercise using animal data on formaldehyde. The paper demonstrates that distributional risk assessment is readily adapted to situations in which epidemiologic data serve as the basis for potency estimates. Strengths and weaknesses of the distributional approach are discussed. Areas for further application and research are recommended.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
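    To make the mechanics of the distributional approach described above concrete, the following Python sketch combines probability-weighted analytic choices into a discrete distribution of potency estimates; the weights and potency values are hypothetical and are not the paper's formaldehyde results.

        # Illustrative sketch: probability-weighted analytic choices -> discrete potency distribution.
        from itertools import product

        exposure_model = [(0.6, 1.0), (0.4, 2.0)]       # (weight, slope multiplier), hypothetical
        latency_assumption = [(0.5, 0.5), (0.5, 1.0)]   # (weight, latency multiplier), hypothetical
        base_potency = 1.0e-3                           # hypothetical unit-risk scale

        distribution = {}
        for (we, fe), (wl, fl) in product(exposure_model, latency_assumption):
            potency = base_potency * fe * fl
            distribution[potency] = distribution.get(potency, 0.0) + we * wl

        for potency, prob in sorted(distribution.items()):
            print(f"potency {potency:.1e} with probability {prob:.2f}")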
  • 44
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1059-1069 
    ISSN: 1539-6924
    Keywords: spatial statistics ; optimal sequential search ; adaptive sampling ; simulation-optimization ; multiple imputation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Suppose that a residential neighborhood may have been contaminated by a nearby abandoned hazardous waste site. The suspected contamination consists of elevated soil concentrations of chemicals that are also found in the absence of site-related contamination. How should a risk manager decide which residential properties to sample and which ones to clean? This paper introduces an adaptive spatial sampling approach which uses initial observations to guide subsequent search. Unlike some recent model-based spatial data analysis methods, it does not require any specific statistical model for the spatial distribution of hazards, but instead constructs an increasingly accurate nonparametric approximation to it as sampling proceeds. Possible cost-effective sampling and cleanup decision rules are described by decision parameters such as the number of randomly selected locations used to initialize the process, the number of highest-concentration locations searched around, the number of samples taken at each location, a stopping rule, and a remediation action threshold. These decision parameters are optimized by simulating the performance of each decision rule. The simulation is performed using the data collected so far to impute multiple probable values of unknown soil concentration distributions during each simulation run. This optimized adaptive spatial sampling technique has been applied to real data using error probabilities for wrongly cleaning or wrongly failing to clean each location (compared to the action that would be taken if perfect information were available) as evaluation criteria. It provides a practical approach for quantifying trade-offs between these different types of errors and expected cost. It also identifies strategies that are undominated with respect to all of these criteria.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
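    The following Python sketch illustrates one form the adaptive loop described above could take, with a hypothetical concentration field, invented decision-parameter values, and a fixed number of rounds standing in for a stopping rule; it is a toy illustration, not the optimized procedure of the paper.

        # Illustrative adaptive spatial sampling sketch (all values hypothetical).
        import math
        import random

        random.seed(1)

        def concentration(x, y):
            # Hypothetical soil-concentration field: one hot spot plus background noise.
            return 100.0 * math.exp(-((x - 0.7) ** 2 + (y - 0.3) ** 2) / 0.02) + random.gauss(5.0, 1.0)

        n_initial, n_neighbors, n_rounds = 10, 3, 4     # decision parameters (hypothetical)
        action_threshold = 30.0                         # remediation action threshold (hypothetical)

        samples = []
        for _ in range(n_initial):                      # random initialization
            p = (random.random(), random.random())
            samples.append((p, concentration(*p)))

        for _ in range(n_rounds):                       # search around the hottest location so far
            hottest = max(samples, key=lambda s: s[1])[0]
            for _ in range(n_neighbors):
                p = (hottest[0] + random.gauss(0, 0.05), hottest[1] + random.gauss(0, 0.05))
                samples.append((p, concentration(*p)))

        to_clean = [p for p, c in samples if c > action_threshold]
        print(f"{len(samples)} samples taken, {len(to_clean)} locations above the action threshold")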
  • 45
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1113-1125 
    ISSN: 1539-6924
    Keywords: OSHA ; environmental health regulation ; risk ambiguity ; indoor/workplace air quality
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Political context may play a large role in influencing the efficiency of environmental and health regulations. This case study uses data from a 1989 update of the Occupational Safety and Health Administration (OSHA) Permissible Exposure Limits (PELs) program to determine the relative effects on exposure standards of legislative mandates, costly acquisition of information by the agency, and pressure applied by special interest groups. The empirical analysis suggests that federal agencies successfully thwart legislative attempts to limit agency discretion, and that agencies exercise bounded rationality by placing greater emphasis on more easily obtained information. The 1989 PELs were less strongly related to more costly information and contained “safety factors” for chemicals presenting relatively more ambiguous risks, and the stringency of the proposed standards showed evidence of being influenced by vying industry and labor interests.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 46
    ISSN: 1539-6924
    Keywords: aldrin ; dieldrin ; epidemiology ; occupational exposure ; cancer dose-response modeling ; proportional hazards ; hormesis ; distributional characterizations of added cancer risk
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The paper applies classical statistical principles to yield new tools for risk assessment and makes new use of epidemiological data for human risk assessment. An extensive clinical and epidemiological study of workers engaged in the manufacturing and formulation of aldrin and dieldrin provides occupational hygiene and biological monitoring data on individual exposures over the years of employment and provides unusually accurate measures of individual lifetime average daily doses. In the cancer dose-response modeling, each worker is treated as a separate experimental unit with his own unique dose. Maximum likelihood estimates of added cancer risk are calculated for multistage, multistage-Weibull, and proportional hazards models. Distributional characterizations of added cancer risk are based on bootstrap and relative likelihood techniques. The cancer mortality data on these male workers suggest that low-dose exposures to aldrin and dieldrin do not significantly increase human cancer risk and may even decrease the human hazard rate for all types of cancer combined at low doses (e.g., 1 μg/kg/day). The apparent hormetic effect in the best fitting dose-response models for this data set is statistically significant. The decrease in cancer risk at low doses of aldrin and dieldrin is in sharp contrast to the U.S. Environmental Protection Agency's upper bound on cancer potency based on mouse liver tumors. The EPA's upper bound implies that lifetime average daily doses of 0.0000625 and 0.00625 μg/kg body weight/day would correspond to increased cancer risks of 0.000001 and 0.0001, respectively. However, the best estimate from the Pernis epidemiological data is that there is no increase in cancer risk in these workers at these doses or even at doses as large as 2 μg/kg/day.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 47
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1157-1171 
    ISSN: 1539-6924
    Keywords: risk assessment ; transportation risk ; diesel exhaust ; fugitive dust ; vehicle emissions
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract When the transportation risk posed by shipments of hazardous chemical and radioactive materials is being assessed, it is necessary to evaluate both vehicle-related and cargo-related risks. Diesel exhaust and fugitive dust emissions from vehicles transporting hazardous shipments increase air pollution, which increases the risk of latent fatalities in the affected population along the transport route. The estimated risk from these vehicle-related sources can often be as large as or larger than the estimated risk associated with the material being transported. In this paper, data from the U.S. Environmental Protection Agency's Motor Vehicle-Related Air Toxics Study are first used to develop latent cancer fatality estimates per kilometer of travel in rural and urban areas for all diesel truck classes. These unit risk factors are based on studies investigating the carcinogenic nature of diesel exhaust. With the same methodology, the current per-kilometer latent fatality risk factor used in transportation risk assessments for heavy diesel trucks in urban areas is revised, and the analysis is expanded to provide risk factors for rural areas and all diesel truck classes. These latter fatality estimates may include, but are not limited to, cancer fatalities and are based primarily on the most recent epidemiological data available on mortality rates associated with ambient air PM-10 concentrations.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 48
    ISSN: 1539-6924
    Keywords: transportation risk ; hydrazine ; sensitivity analysis ; simulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The U.S. Department of Transportation was interested in the risks associated with transporting hydrazine in tanks with and without relief devices. Hydrazine is highly toxic, flammable, and corrosive, so there was a conflict as to whether a relief device should be used. Data were not available on the impact of relief devices on release probabilities or on the impact of hydrazine on the likelihood of fires and explosions. In this paper, a Monte Carlo sensitivity analysis of the unknown parameters was used to assess the risks associated with highway transport of hydrazine. To help determine whether relief devices should be used, fault trees and event trees were used to model the sequences of events that could lead to adverse consequences during transport. The event probabilities in the event trees were derived as functions of the parameters whose effects were not known. The impacts of these parameters on the risk of toxic exposures, fires, and explosions were examined through the Monte Carlo sensitivity analysis and tested statistically through an analysis of variance. The analysis determined which of the unknown parameters had a significant impact on the risks, and it provided the necessary support for a critical transportation decision even though the values of several key parameters were not known.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
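    As a rough illustration of the kind of Monte Carlo sensitivity analysis described above, the following Python sketch samples unknown event-tree branch probabilities over wide ranges and propagates them to a per-shipment fire/explosion frequency; the accident rate, the probability ranges, and the assumed effect of a relief device are all hypothetical, not DOT data.

        # Illustrative Monte Carlo sensitivity sketch (all ranges and rates hypothetical).
        import random

        random.seed(2)
        accident_rate_per_trip = 1.0e-4            # hypothetical point value

        def simulate(with_relief_device, n=10_000):
            results = []
            for _ in range(n):
                # Unknown branch probabilities sampled over assumed wide ranges.
                p_release = random.uniform(0.05, 0.4) * (0.5 if with_relief_device else 1.0)
                p_ignition_given_release = random.uniform(0.01, 0.3)
                results.append(accident_rate_per_trip * p_release * p_ignition_given_release)
            results.sort()
            return results[len(results) // 2], results[int(0.95 * len(results))]

        for flag in (True, False):
            median, p95 = simulate(flag)
            print(f"relief device={flag}: median {median:.2e}, 95th percentile {p95:.2e} per trip")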
  • 49
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1205-1214 
    ISSN: 1539-6924
    Keywords: Monte Carlo ; correlation ; copulas ; bivariate distributions ; dioxins
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Monte Carlo methods in risk assessment are finding increasingly widespread application. With the recognition that inputs may be correlated, the incorporation of such correlations into the simulation has become important. Most implementations rely upon the method of Iman and Conover for generating correlated random variables. In this work, alternative methods using copulas are presented for deriving correlated random variables. It is further shown that the particular algorithm or assumption used may have a substantial effect on the output results, due to differences in higher order bivariate moments.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
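    One copula-based alternative to the Iman and Conover method mentioned above is the Gaussian copula, sketched below in Python with hypothetical marginals and correlation: correlated uniforms are generated from a correlated multivariate normal and then mapped through the inverse CDFs of the desired marginal distributions.

        # Illustrative Gaussian copula sketch (hypothetical marginals and correlation).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        rho = 0.7
        cov = np.array([[1.0, rho], [rho, 1.0]])

        z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=10_000)
        u = stats.norm.cdf(z)                                   # correlated uniforms (the copula)

        intake = stats.lognorm.ppf(u[:, 0], s=0.5, scale=1.0)   # hypothetical marginal 1
        potency = stats.lognorm.ppf(u[:, 1], s=0.8, scale=2.0)  # hypothetical marginal 2

        rho_s, _ = stats.spearmanr(intake, potency)
        print(f"rank correlation ≈ {rho_s:.2f}, mean of product = {np.mean(intake * potency):.2f}")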
  • 50
    ISSN: 1539-6924
    Keywords: municipal waste incineration ; risk assessment ; Monte-Carlo simulation ; time activity patterns
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract During the modernization of the municipal waste incinerator (MWI, maximum capacity of 180,000 tons per year) of Metropolitan Grenoble (405,000 inhabitants), in France, a risk assessment was conducted, based on four tracer pollutants: two volatile organic compounds (benzene and 1,1,1-trichloroethane) and two heavy metals (nickel and cadmium, measured in particles). A Gaussian plume dispersion model, applied to maximum emissions measured at the MWI stacks, was used to estimate the distribution of these pollutants in the atmosphere throughout the metropolitan area. A random-sample telephone survey (570 subjects) gathered data on time-activity patterns according to demographic characteristics of the population. Life-long exposure was assessed as a time-weighted average of ambient air concentrations. Inhalation alone was considered because, in the Grenoble urban setting, other routes of exposure are not likely. A Monte Carlo simulation was used to describe probability distributions of exposures and risks. The median of the distribution of life-long personal exposure to MWI benzene was 3.2·10⁻⁵ μg/m³ (20th and 80th percentiles = 1.5·10⁻⁵ and 6.5·10⁻⁵ μg/m³), yielding a 2.6·10⁻¹⁰ carcinogenic risk (1.2·10⁻¹⁰–5.4·10⁻¹⁰). For nickel, the corresponding life-long exposure and cancer risk were 1.8·10⁻⁴ μg/m³ (0.9·10⁻⁴–3.6·10⁻⁴ μg/m³) and 8.6·10⁻⁸ (4.3·10⁻⁸–17.3·10⁻⁸); for cadmium they were respectively 8.3·10⁻⁶ μg/m³ (4.0·10⁻⁶–17.6·10⁻⁶) and 1.5·10⁻⁸ (7.2·10⁻⁹–3.1·10⁻⁸). Inhalation exposure to cadmium emitted by the MWI represented less than 1% of the WHO Air Quality Guideline (5 ng/m³), while there was a margin of exposure of more than 10⁹ between the NOAEL (150 ppm) and the exposure estimates for trichloroethane. Neither dioxins nor mercury, a volatile metal, was measured, so the attributable life-long risks may be underestimated. The minute (VOCs and cadmium) to moderate (nickel) exposure and risk estimates are nevertheless in accord with other studies of modern MWIs meeting recent emission regulations.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
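    The exposure and risk arithmetic described above can be illustrated with a minimal Python sketch: life-long exposure is a time-weighted average of modeled concentrations over the microenvironments a person occupies, multiplied by an inhalation unit risk. The time fractions, concentrations, and benzene unit risk below are assumptions chosen only to give numbers of a plausible order of magnitude, not values from the study.

        # Illustrative time-weighted exposure and risk sketch (all inputs assumed).
        time_fraction = {"home": 0.6, "work": 0.3, "outdoors": 0.1}
        mwi_benzene_conc = {"home": 2.0e-5, "work": 6.0e-5, "outdoors": 4.0e-5}  # μg/m³, hypothetical

        exposure = sum(time_fraction[k] * mwi_benzene_conc[k] for k in time_fraction)
        unit_risk = 8.0e-6                    # per μg/m³, assumed order of magnitude for benzene
        print(f"life-long exposure ≈ {exposure:.1e} μg/m³, risk ≈ {exposure * unit_risk:.1e}")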
  • 51
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1235-1249 
    ISSN: 1539-6924
    Keywords: soil contamination ; remediation urgency ; standards ; human exposure ; ecotoxicological risks ; risk due to contaminant migration
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract To assess soil and groundwater quality, two generic (i.e., multifunctional) risk-based standards, the Target Value and the Intervention Value, have been developed within the framework of the Dutch Soil Protection Act. These standards allow soil and groundwater to be classified as clean, slightly contaminated, or seriously contaminated. The Target Value is based on potential risks to ecosystems, while the Intervention Value is based on potential risks to humans and ecosystems. In the case of serious soil contamination the site has, in principle, to be remediated, making it necessary to determine the remediation urgency on the basis of actual (i.e., site-specific) risks to humans and ecosystems, as well as actual risks due to contaminant migration.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    ISSN: 1539-6924
    Keywords: Environment ; equity ; coke ; oil ; history ; risk
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Facility-specific information on pollution was obtained for 36 coke plants and 46 oil refineries in the United States and matched with information on populations surrounding these 82 facilities. These data were analyzed to determine whether environmental inequities were present, whether they were more economic or racial in nature, and whether the racial composition of nearby communities has changed significantly since plants began operations. The Census tracts near coke plants have a disproportionate share of poor and nonwhite residents. Multivariate analyses suggest that existing inequities are primarily economic in nature. The findings for oil refineries are not strongly supportive of the environmental inequity hypothesis. Rank ordering of facilities by race, poverty, and pollution produces limited (although not consistent) evidence that the more risky facilities tend to be operating in communities with above-median proportions of nonwhite residents (near coke plants) and Hispanic residents (near oil refineries). Over time, the racial makeup of many communities near facilities has changed significantly, particularly in the case of coke plants sited in the early 1900s. Further risk-oriented studies of multiple manufacturing facilities in various industrial sectors of the economy are recommended.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 54
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 231-247 
    ISSN: 1539-6924
    Keywords: Health risk assessment ; hazard characterization ; Acceptable Daily Intake ; Reference Dose ; paradigm ; practices ; cancer ; non-cancer ; Bayesian ; default options
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract We investigated the way results of human health risk assessments are used, and the theory used to describe those methods, sometimes called the “NAS paradigm.” Contrary to a key tenet of that theory, current methods have strictly limited utility. The characterizations now considered standard, Safety Indices such as “Acceptable Daily Intake,” “Reference Dose,” and so on, usefully inform only decisions that require a choice between two policy alternatives (e.g., approve a food additive or not), decided solely on the basis of a finding of safety. Risk is characterized as the quotient of one of these Safety Indices divided by an estimate of exposure: a quotient greater than one implies that the situation may be considered safe. Such decisions are very widespread, both in the U.S. federal government and elsewhere. No current method is universal; different policies lead to different practices, for example, in California's “Proposition 65,” where statutory provisions specify some practices. Further, an important kind of human health risk assessment is not recognized by this theory: this kind characterizes risk as the likelihood of harm, given estimates of exposure consequent to various decision choices. Likelihood estimates are necessary whenever decision makers have many possible decision choices and must weigh more than two societal values, such as in EPA's implementation of “conventional air pollutants.” These estimates cannot be derived using current methods; different methods are needed. Our analysis suggests changes needed in both the theory and practice of human health risk assessment, and in how that practice is described.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
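    A minimal Python sketch of the Safety Index quotient described above, with hypothetical numbers: the quotient is the Safety Index divided by the exposure estimate, and a value greater than one supports a finding of safety in a two-alternative decision.

        # Illustrative Safety Index quotient (hypothetical numbers).
        reference_dose = 0.02        # mg/kg-day, hypothetical Safety Index
        exposure_estimate = 0.005    # mg/kg-day, hypothetical exposure

        quotient = reference_dose / exposure_estimate
        verdict = "may be considered safe" if quotient > 1 else "not shown safe"
        print(f"quotient = {quotient:.1f} -> {verdict}")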
  • 55
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 249-259 
    ISSN: 1539-6924
    Keywords: Reliability ; Monte Carlo simulation ; hazardous waste treatment ; safety factor ; packed tower ; activated sludge
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The reliability of a treatment process is addressed in terms of achieving a regulatory effluent concentration standard and in terms of the design safety factors associated with the treatment process. This methodology was then applied to two aqueous hazardous waste treatment processes: packed tower aeration and activated sludge (aerobic) biological treatment. The designs achieving 95 percent reliability were compared with designs based on conventional practice to determine their patterns of conservatism. Scoping-level treatment costs were also related to reliability levels for these treatment processes. The results indicate that the reliability levels for the physical/chemical treatment process (packed tower aeration) based on the deterministic safety factors range from 80 percent to over 99 percent, whereas those for the biological treatment process range from near 0 percent to over 99 percent, depending on the compound evaluated. Increases in reliability per unit increase in treatment costs are most pronounced at lower reliability levels (less than about 80 percent) than at higher reliability levels (greater than 90 percent), indicating a point of diminishing returns. Additional research focused on process parameters that presently contain large uncertainties may reduce those uncertainties, with attendant increases in the reliability levels of the treatment processes.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
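    The link between design safety factors and reliability described above can be illustrated with a simple Python sketch that assumes lognormal variability in effluent concentration; the standard, geometric standard deviation, and safety factors are hypothetical, and the diminishing returns at high reliability show up in the computed probabilities.

        # Illustrative sketch: reliability = P(effluent concentration <= standard),
        # assuming lognormal effluent variability (hypothetical numbers).
        import math
        from statistics import NormalDist

        standard = 100.0      # μg/L effluent limit, hypothetical
        gsd = 1.8             # geometric standard deviation of effluent concentration, assumed

        for safety_factor in (1.0, 1.5, 2.0, 3.0):
            median_effluent = standard / safety_factor     # design targets the limit / safety factor
            z = math.log(standard / median_effluent) / math.log(gsd)
            reliability = NormalDist().cdf(z)
            print(f"safety factor {safety_factor:.1f}: reliability ≈ {reliability:.2f}")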
  • 56
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 321-321 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 57
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 295-308 
    ISSN: 1539-6924
    Keywords: Noncancer risk assessment ; uncertainty analysis ; systematic error ; calibration ; censoring ; relative potency ; safety factor
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The prominent role of animal bioassay evidence in environmental regulatory decisions compels a careful characterization of extrapolation uncertainties. In noncancer risk assessment, uncertainty factors are incorporated to account for each of several extrapolations required to convert a bioassay outcome into a putative subthreshold dose for humans. Measures of relative toxicity taken between different dosing regimens, different endpoints, or different species serve as a reference for establishing the uncertainty factors. Ratios of no observed adverse effect levels (NOAELs) have been used for this purpose; statistical summaries of such ratios across sets of chemicals are widely used to guide the setting of uncertainty factors. Given the poor statistical properties of NOAELs, the informativeness of these summary statistics is open to question. To evaluate this, we develop an approach to “calibrate” the ability of NOAEL ratios to reveal true properties of a specified distribution for relative toxicity. A priority of this analysis is to account for dependencies of NOAEL ratios on experimental design and other exogenous factors. Our analysis of NOAEL ratio summary statistics finds (1) that such dependencies are complex and produce pronounced systematic errors and (2) that sampling error associated with typical sample sizes (50 chemicals) is non-negligible. These uncertainties strongly suggest that NOAEL ratio summary statistics cannot be taken at face value; conclusions based on such ratios reported in well over a dozen published papers should be reconsidered.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    ISSN: 1539-6924
    Keywords: Cancer risk ; in vivo doses ; linear multiplicative model ; ethylene oxide ; relative potency ; butadiene ; acrylamide
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract A mechanistic model and associated procedures are proposed for cancer risk assessment of genotoxic chemicals. As previously shown for ionizing radiation, a linear multiplicative model was found to be compatible with published experimental data for ethylene oxide, acrylamide, and butadiene. The validity of this model was anticipated in view of the multiplicative interaction of mutation with inherited and acquired growth-promoting conditions. Concurrent analysis led to rejection of an additive model (i.e., the model commonly applied for cancer risk assessment). A reanalysis of data for radiogenic cancer in mouse, dog, and man shows that the relative risk coefficient is approximately the same (0.4 to 0.5 percent per rad) for tumours induced in the three species. Doses in vivo, defined as the time-integrated concentrations of ultimate mutagens and expressed in millimol × kg⁻¹ × h (mMh), are, like radiation doses given in Gy or rad, proportional to the frequencies of potentially mutagenic events. The radiation dose equivalents of chemical doses are calculated by multiplying the chemical doses (in mMh) by the relative genotoxic potencies (in rad × mMh⁻¹) determined in vitro. In this way the relative cancer incidence increments in rats and mice exposed to ethylene oxide were shown to be about 0.4 percent per rad-equivalent, in agreement with the data for radiogenic cancer. Our analyses suggest that the relative risk coefficients for genotoxic chemicals are independent of species and that relative cancer risks determined in animal tests also apply to humans. If reliable animal test data are not available, cancer risks may be estimated from the relative potency. In both cases, exposure dose/target dose relationships, the latter measured via macromolecule adducts, should be determined.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
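    The rad-equivalent arithmetic described above reduces to a short worked example; in the Python sketch below the in vivo dose and genotoxic potency are hypothetical, while the 0.4 percent per rad risk coefficient is the figure quoted in the abstract.

        # Worked sketch of the rad-equivalent calculation (dose and potency hypothetical).
        in_vivo_dose_mMh = 0.05          # time-integrated concentration, mMh (hypothetical)
        potency_rad_per_mMh = 2.0        # relative genotoxic potency from in vitro data (hypothetical)
        risk_coeff_per_rad = 0.004       # ~0.4 percent relative risk increment per rad-equivalent

        rad_equivalents = in_vivo_dose_mMh * potency_rad_per_mMh
        relative_risk_increment = rad_equivalents * risk_coeff_per_rad
        print(f"{rad_equivalents:.2f} rad-equivalents -> {relative_risk_increment:.2%} relative increase")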
  • 59
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 585-598 
    ISSN: 1539-6924
    Keywords: uncertainty ; threatened plants ; risk ; conservation ; rule sets ; IUCN
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Australian state and federal agencies use a broad range of methods for setting conservation priorities for species at risk. Some of these are based on rule sets developed by the International Union for the Conservation of Nature, while others use point scoring protocols to assess threat. All of them ignore uncertainty in the data. In this study, we assessed the conservation status of 29 threatened vascular plants from Tasmania and New South Wales using a variety of methods including point scoring and rule-based approaches. In addition, several methods for dealing with uncertainty in the data were applied to each of the priority-setting schemes. The results indicate that the choice of a protocol for setting priorities and the choice of the way in which uncertainty is treated may make important differences to the resulting assessments of risk. The choice among methods needs to be rationalized within the management context in which it is to be applied. These methods are not a substitute for more formal risk assessment.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    ISSN: 1539-6924
    Keywords: MeHg ; pharmacokinetics ; PBPK model ; variability ; risk assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract An analysis of the uncertainty in guidelines for the ingestion of methylmercury (MeHg) due to human pharmacokinetic variability was conducted using a physiologically based pharmacokinetic (PBPK) model that describes MeHg kinetics in the pregnant human and fetus. Two alternative derivations of an ingestion guideline for MeHg were considered: the U.S. Environmental Protection Agency reference dose (RfD) of 0.1 μg/kg/day derived from studies of an Iraqi grain poisoning episode, and the Agency for Toxic Substances and Disease Registry chronic oral minimal risk level (MRL) of 0.5 μg/kg/day based on studies of a fish-eating population in the Seychelles Islands. Calculation of an ingestion guideline for MeHg from either of these epidemiological studies requires calculation of a dose conversion factor (DCF) relating a hair mercury concentration to a chronic MeHg ingestion rate. To evaluate the uncertainty in this DCF across the population of U.S. women of child-bearing age, Monte Carlo analyses were performed in which distributions for each of the parameters in the PBPK model were randomly sampled 1000 times. The 1st and 5th percentiles of the resulting distribution of DCFs were a factor of 1.8 and 1.5 below the median, respectively. This estimate of variability is consistent with, but somewhat less than, previous analyses performed with empirical, one-compartment pharmacokinetic models. The use of a consistent factor in both guidelines of 1.5 for pharmacokinetic variability in the DCF, and keeping all other aspects of the derivations unchanged, would result in an RfD of 0.2 μg/kg/day and an MRL of 0.3 μg/kg/day.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
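    The following Python sketch mimics the Monte Carlo step described above using a simplified one-compartment stand-in rather than the paper's PBPK model; all parameter distributions are assumptions, and the output is the spread of the dose conversion factor (DCF) distribution relative to its median.

        # Illustrative Monte Carlo DCF sketch (one-compartment stand-in, assumed distributions).
        import numpy as np

        rng = np.random.default_rng(3)
        n = 1000

        b = rng.lognormal(np.log(0.014), 0.2, n)        # elimination rate, 1/day (assumed)
        V = rng.lognormal(np.log(5.0), 0.1, n)          # blood volume, L (assumed)
        A = rng.uniform(0.90, 0.98, n)                  # absorbed fraction (assumed)
        f_blood = rng.lognormal(np.log(0.05), 0.2, n)   # fraction of absorbed dose in blood (assumed)
        R = rng.lognormal(np.log(250.0), 0.25, n)       # hair:blood ratio, (μg/g)/(mg/L) (assumed)
        bw = rng.normal(65.0, 10.0, n)                  # body weight, kg (assumed)

        # DCF: chronic intake (μg/kg/day) corresponding to 1 μg/g mercury in hair
        dcf = (1000.0 / R) * b * V / (A * f_blood * bw)

        median = np.median(dcf)
        p01, p05 = np.percentile(dcf, [1, 5])
        print(f"median DCF ≈ {median:.3f} μg/kg/day per μg/g hair")
        print(f"median / 1st percentile ≈ {median / p01:.2f}, median / 5th percentile ≈ {median / p05:.2f}")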
  • 61
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 577-584 
    ISSN: 1539-6924
    Keywords: risk assessment ; exposure point concentration ; bootstrapping ; gamma distribution ; lognormal
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The U.S. Environmental Protection Agency (EPA) recommends the use of the one-sided 95% upper confidence limit of the arithmetic mean based on either a normal or lognormal distribution for the contaminant (or exposure point) concentration term in the Superfund risk assessment process. When the data are not normal or lognormal, this recommended approach may overestimate the exposure point concentration (EPC) and may lead to unnecessary cleanup at a hazardous waste site. The EPA concentration term seems to perform like alternative EPC methods only when the data are well fit by a lognormal distribution. Several alternative methods for calculating the EPC are investigated and compared using soil data collected from three hazardous waste sites in Montana, Utah, and Colorado. For data sets that are well fit by a lognormal distribution, values from the Chebychev inequality or the EPA concentration term may be appropriate EPCs. For data sets where the soil concentration data are well fit by gamma distributions, Wong's method may be used for calculating EPCs. The studentized bootstrap-t and Hall's bootstrap-t transformation are recommended for EPC calculation when all distribution fits are poor. If a data set is well fit by a distribution, the parametric bootstrap may provide a suitable EPC.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
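    Of the alternatives listed above, the studentized bootstrap-t is easy to sketch; the Python example below computes a one-sided 95% UCL on the mean for a synthetic lognormal sample (not data from the three sites), leaving Hall's transformation and the Chebychev bound aside.

        # Illustrative studentized bootstrap-t 95% UCL (synthetic data).
        import numpy as np

        rng = np.random.default_rng(4)
        data = rng.lognormal(mean=2.0, sigma=1.2, size=30)      # synthetic soil concentrations
        n, xbar, s = len(data), data.mean(), data.std(ddof=1)

        t_stats = []
        for _ in range(5000):
            boot = rng.choice(data, size=n, replace=True)
            sb = boot.std(ddof=1)
            if sb > 0:
                t_stats.append((boot.mean() - xbar) / (sb / np.sqrt(n)))

        t_lower = np.percentile(t_stats, 5)                     # 5th percentile of bootstrap t
        ucl95 = xbar - t_lower * s / np.sqrt(n)
        print(f"sample mean {xbar:.1f}, bootstrap-t 95% UCL {ucl95:.1f}")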
  • 62
    ISSN: 1539-6924
    Keywords: risk perception ; air quality ; environmental justice ; community health survey
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper describes a multi-stakeholder process designed to assess the potential health risks associated with adverse air quality in an urban industrial neighborhood. The paper briefly describes the quantitative health risk assessment conducted by scientific experts, with input by a grassroots community group concerned about the impacts of adverse air quality on their health and quality of life. In this case, rather than accept the views of the scientific experts, the community used their powers of perception to advantage by successfully advocating for a professionally conducted community health survey. This survey was designed to document, systematically and rigorously, the health risk perceptions community members associated with exposure to adverse air quality in their neighborhood. This paper describes the institutional and community contexts within which the research is situated as well as the design, administration, analysis, and results of the community health survey administered to 402 households living in an urban industrial neighborhood in Hamilton, Ontario, Canada. These survey results served to legitimate the community's concerns about air quality and to help broaden operational definitions of ‘health.’ In addition, the results of both health risk assessment exercises served to keep issues of air quality on the local political agenda. Implications of these findings for our understanding of the environmental justice process as well as the ability of communities to influence environmental health policy are discussed.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 63
    ISSN: 1539-6924
    Keywords: risk perception ; risk characteristics ; outrage factors ; rbGH ; ordered probit
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This study estimates the effect that risk characteristics, described as outrage factors by Hadden, have on consumers' risk perceptions toward a food-related biotechnology, recombinant bovine growth hormone (rbGH). The outrage factors applicable to milk from rbGH-treated herds are involuntary risk exposure, unfamiliarity with the product's production process, unnatural product characteristics, lack of trust in regulators' ability to protect consumers in the marketplace, and consumers' inability to distinguish milk from rbGH-treated herds from milk from untreated herds. An empirical analysis of data from a national survey of household food shoppers reveals that outrage factors mediate risk perceptions. The results support the inclusion of outrage factors in the risk perception model for the rbGH product, as they add significantly to the explanatory power of the model and therefore reduce bias compared to a simpler model of attitudinal and demographic factors. The study indicates that the outrage factors with a significant impact on risk perceptions are lack of trust in the FDA as a food-related information source and perceiving no consumer benefits from farmers' use of rbGH. Communication strategies to reduce consumer risk perceptions therefore could rely on agencies perceived as more trustworthy and emphasize the benefits of rbGH use to consumers.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    ISSN: 1539-6924
    Keywords: risk perceptions ; psychometric paradigm ; multilevel modeling ; random coefficient models
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Psychometric data on risk perceptions are often collected using the method developed by Slovic, Fischhoff, and Lichtenstein, where an array of risk issues are evaluated with respect to a number of risk characteristics, such as how dreadful, catastrophic or involuntary exposure to each risk is. The analysis of these data has often been carried out at an aggregate level, where mean scores for all respondents are compared between risk issues. However, this approach may conceal important variation between individuals, and individual analyses have also been performed for single risk issues. This paper presents a new methodological approach using a technique called multilevel modelling for analysing individual and aggregated responses simultaneously, to produce unconditional and unbiased results at both individual and aggregate levels of the data. Two examples are given using previously published data sets on risk perceptions collected by the authors, and results between the traditional and new approaches compared. The discussion focuses on the implications of and possibilities provided by the new methodology.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 65
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 739-749 
    ISSN: 1539-6924
    Keywords: probabilistic forecasting ; uncertainty quantification ; Bayesian method ; Monte-Carlo simulation ; decision making
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Rational decision making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state of knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of two Bayesian methods for producing a probabilistic forecast via any deterministic model. The Bayesian Processor of Forecast (BPF) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution. The BFS is compared with Monte Carlo simulation and the “ensemble forecasting” technique, neither of which can alone produce a probabilistic forecast that quantifies the total uncertainty, but each of which can serve as a component of the BFS.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
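    The Bayesian Processor of Forecast described above conditions a prior distribution of the predictand on the output of a deterministic model. A minimal sketch is possible under Gaussian-linear assumptions that the abstract does not state and that are introduced here only for illustration; all numerical values are placeholders.

        import numpy as np

        # Prior on the predictand w, w ~ N(m0, s0^2)  (placeholder values).
        m0, s0 = 2.0, 0.8

        # Calibration of the deterministic model output x against past observations of w:
        # assume x | w ~ N(a + b*w, sigma^2), with a, b, sigma estimated from archived
        # forecast/observation pairs (illustrative values).
        a, b, sigma = 0.1, 0.9, 0.5

        def bpf_posterior(x_obs):
            """Posterior N(mean, var) of the predictand given the model output x_obs."""
            precision = 1.0 / s0**2 + b**2 / sigma**2
            var = 1.0 / precision
            mean = var * (m0 / s0**2 + b * (x_obs - a) / sigma**2)
            return mean, var

        mean, var = bpf_posterior(x_obs=2.6)
        print(f"posterior mean {mean:.2f}, posterior std {np.sqrt(var):.2f}")

    The posterior spread is narrower than the prior spread to the extent that the model output is informative, which is the sense in which the processor quantifies total uncertainty conditional on the model.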
  • 66
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 759-761 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 67
    ISSN: 1539-6924
    Keywords: regulation ; radioactive waste ; performance assessment ; risk assessment ; regulatory assessment ; bias evaluation ; international collaboration ; underground disposal ; quantitative risk analysis ; public debate ; decision process
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Much has been written about the development and application of quantitative methods for estimating under uncertainty the long-term radiological performance of underground disposal of radioactive wastes. Until recently, interest has been focused almost entirely on the technical challenges, with little attention to the role of the organization responsible for these analyses. Now the dialogue between regulators, the repository developer or operator, and other interested parties in the decision-making process receives increasing attention, especially in view of some current difficulties in obtaining approvals to construct or operate deep facilities for intermediate or high-level wastes. Consequently, it is timely to consider the options for regulators' review and evaluation of safety submissions, at the various stages in the site selection to repository closure process, and to consider, especially, the role for performance assessment (PA) within the programs of a regulator both before and after delivery of such a submission. The origins and broad character of present regulations in the European Union (EU) and in the OECD countries are outlined and some regulatory PAs are reviewed. The issues raised are discussed, especially in regard to the interpretation of regulations, the dangers from the desire for simplicity in argument, the use of regulatory PA to review and challenge the PA in the safety case, and the effects of the relationship between proponent and regulator. Finally, a very limited analysis of the role of PA in public hearings is outlined and recommendations are made, together with proposals for improving the mechanisms for international collaboration on technical issues of regulatory concern.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 23-32 
    ISSN: 1539-6924
    Keywords: Software failures ; software hazard analysis ; safety-critical systems ; risk assessment ; context
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract As the use of digital computers for instrumentation and control of safety-critical systems has increased, there has been a growing debate over the issue of whether probabilistic risk assessment techniques can be applied to these systems. This debate has centered on the issue of whether software failures can be modeled probabilistically. This paper describes a “context-based” approach to software risk assessment that explicitly recognizes the fact that the behavior of software is not probabilistic. The perceived uncertainty in its behavior arises both from the input to the software and from the application and environment in which the software is operating. Failures occur as the result of encountering some context for which the software was not properly designed, as opposed to the software simply failing “randomly.” The paper elaborates on the concept of “error-forcing context” as it applies to software. It also illustrates a methodology that uses event trees, fault trees, and the Dynamic Flowgraph Methodology (DFM) to identify “error-forcing contexts” for software in the form of fault tree prime implicants.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 69
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 47-68 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 70
    ISSN: 1539-6924
    Keywords: Variability ; uncertainty ; maximum likelihood ; bootstrap simulation ; Monte Carlo simulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Variability arises due to differences in the value of a quantity among different members of a population. Uncertainty arises due to lack of knowledge regarding the true value of a quantity for a given member of a population. We describe and evaluate two methods for quantifying both variability and uncertainty. These methods, bootstrap simulation and a likelihood-based method, are applied to three datasets. The datasets include a synthetic sample of 19 values from a Lognormal distribution, a sample of nine values obtained from measurements of the PCB concentration in leafy produce, and a sample of five values for the partitioning of chromium in the flue gas desulfurization system of coal-fired power plants. For each of these datasets, we employ the two methods to characterize uncertainty in the arithmetic mean and standard deviation, cumulative distribution functions based upon fitted parametric distributions, the 95th percentile of variability, and the 63rd percentile of uncertainty for the 81st percentile of variability. The latter is intended to show that it is possible to describe any point within the uncertain frequency distribution by specifying an uncertainty percentile and a variability percentile. Using the bootstrap method, we compare results based upon use of the method of matching moments and the method of maximum likelihood for fitting distributions to data. Our results indicate that with only 5–19 data points as in the datasets we have evaluated, there is substantial uncertainty based upon random sampling error. Both the bootstrap and likelihood-based approaches yield comparable uncertainty estimates in most cases.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
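    The bootstrap half of the comparison above can be sketched as a two-dimensional simulation: resample the small dataset, refit a Lognormal distribution to each resample (method of matching moments here, for brevity), and summarize the uncertainty in statistics of variability such as the mean or the 95th percentile. The 19-point sample below is synthetic, not one of the paper's datasets.

        import numpy as np

        rng = np.random.default_rng(1)
        data = rng.lognormal(mean=0.0, sigma=0.6, size=19)   # synthetic sample of 19 values

        def fit_lognormal_moments(x):
            """Method of matching moments: returns (mu, sigma) of ln X for a Lognormal."""
            m, v = np.mean(x), np.var(x, ddof=1)
            sigma2 = np.log(1.0 + v / m**2)
            return np.log(m) - 0.5 * sigma2, np.sqrt(sigma2)

        means, p95 = [], []
        for _ in range(2000):                                 # bootstrap replicates
            boot = rng.choice(data, size=data.size, replace=True)
            mu, sig = fit_lognormal_moments(boot)
            means.append(np.exp(mu + 0.5 * sig**2))           # arithmetic mean of the fit
            p95.append(np.exp(mu + 1.645 * sig))              # 95th percentile of variability

        # Uncertainty (5th/50th/95th percentiles) about each statistic of variability.
        print("mean:", np.percentile(means, [5, 50, 95]))
        print("95th percentile of variability:", np.percentile(p95, [5, 50, 95]))

    Reading, say, the 63rd uncertainty percentile of the bootstrap distribution for a chosen variability percentile reproduces the kind of two-dimensional statement quoted in the abstract.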
  • 71
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 159-169 
    ISSN: 1539-6924
    Keywords: Trust ; geography ; personality ; environment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract A sample of 323 residents of New Jersey stratified by neighborhood quality (excellent, good, fair, poor) was gathered to determine if trust in science and technology to protect public health and environment at the societal scale was associated with trust of the local officials, such as the mayor, health officer, developers, mass media, and legislators who are guardians of the local environment. Societal (trust of science and technology) and neighborhood (mayor, health officer) dimensions of trust were found. These societal and neighborhood trust dimensions were weakly correlated. Respondents were divided into four trust-of-authority groups: high societal–high neighborhood, low societal–low neighborhood, high societal–low neighborhood, and low societal–high neighborhood. High societal–high neighborhood trust respondents were older, had lived in the neighborhoods for many years, were not troubled much by neighborhood or societal environmental threats, and had a strong sense of control over their environment. In strong contrast, low societal–low neighborhood trust respondents were relatively young, typically had lived in their present neighborhood for a short time, were troubled by numerous neighborhood and societal environmental threats, did not practice many personal public health practices, and felt little control over their environment.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 72
    ISSN: 1539-6924
    Keywords: Risk ; fishing ; ethnicity ; perception ; toxics ; consumption
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Recreational and subsistence angling are important aspects of urban culture for much of North America where people are concentrated near the coasts or major rivers. Yet there are fish and shellfish advisories for many estuaries, rivers, and lakes, and these are not always heeded. This paper examines fishing behavior, sources of information, perceptions, and compliance with fishing advisories as a function of ethnicity for people fishing in the Newark Bay Complex of the New York–New Jersey Harbor. We test the null hypothesis that there were no ethnic differences in sources of information, perceptions of the safety of fish consumption, and compliance with advisories. There were ethnic differences in consumption rates, sources of information about fishing, knowledge about the safety of the fish, awareness of fishing advisories or of the correct advisories, and knowledge about risks for increased cancer and to unborn and young children. In general, the knowledge base was much lower for Hispanics, was intermediate for blacks, and was greatest for whites. When presented with a statement about the potential risks from eating fish, there were no differences in their willingness to stop eating fish or to encourage pregnant women to stop. These results indicate a willingness to comply with advisories regardless of ethnicity, but a vast difference in the base knowledge necessary to make informed risk decisions about the safety of fish and shellfish. Although the overall median income level of the population was in the $25,000–34,999 income category, for Hispanics it was on the border between $15,000–24,999 and $25,000–34,999.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 73
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 74
    ISSN: 1539-6924
    Keywords: Cancer dose–response modeling ; multistage model ; two-stage model ; hazard functions ; carcinogenesis ; Benzene ; Dieldrin ; Ethylene Thiourea ; Trichloroethylene ; Vinyl Chloride
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract We compare the regulatory implications of applying the traditional (linearized) and exact two-stage dose–response models to animal carcinogenic data. We analyze dose–response data from six studies, representing five different substances, and we determine the “goodness-of-fit” of each model as well as the 95% confidence lower limit of the dose corresponding to a target excess risk of 10−5 (the target risk dose TRD). For the two concave datasets, we find that the exact model gives a substantially better fit to the data than the traditional model, and that the exact model gives a TRD that is an order of magnitude lower than that given by the traditional model. In the other cases, the exact model gives a fit equivalent to or better than the traditional model. We also show that although the exact two-stage model may exhibit dose–response concavity at moderate dose levels, it is always linear or sublinear, and never supralinear, in the low-dose limit. Because regulatory concern is almost always confined to extrapolation in the low-dose region, supralinear behavior seems not to be of regulatory concern in the exact two-stage model. Finally, we find that when performing this low-dose extrapolation in cases of dose–response concavity, extrapolating the model fit leads to a more conservative TRD than taking a linear extrapolation from 10% excess risk. We conclude with a set of recommendations.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
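    The target risk dose (TRD) calculation described above reduces, for any fitted dose-response model, to inverting the excess-risk function at the target level. The sketch below does this for a generic two-term exponential (multistage-type) excess-risk expression with placeholder coefficients; the paper's exact two-stage model would substitute its own excess-risk function, and the 95% confidence lower limit on the dose would additionally require profiling the likelihood or bootstrapping the fit.

        import numpy as np
        from scipy.optimize import brentq

        # Hypothetical fitted coefficients (per unit dose and per unit dose squared).
        q1, q2 = 3.0e-3, 5.0e-4

        def excess_risk(d):
            """Extra risk over background at dose d under the illustrative model."""
            return 1.0 - np.exp(-(q1 * d + q2 * d**2))

        target = 1.0e-5
        trd = brentq(lambda d: excess_risk(d) - target, 1e-12, 1e3)
        print(f"central-estimate dose at excess risk 1e-5: {trd:.3e}")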
  • 75
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 69-81 
    ISSN: 1539-6924
    Keywords: Parameters ; probability distributions ; validity
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract In any model the values of estimates for various parameters are obtained from different sources, each with its own level of uncertainty. When the probability distributions of the estimates are obtained as opposed to point values only, the measurement uncertainties in the parameter estimates may be addressed. However, the sources used for obtaining the data and the models used to select appropriate distributions are of differing degrees of uncertainty. A hierarchy of different sources of uncertainty based upon one's ability to validate data and models empirically is presented. When model parameters are aggregated with different levels of the hierarchy represented, this implies distortion or degradation in the utility and validity of the models used. Means to identify and deal with such heterogeneous data sources are explored, and a number of approaches to addressing this problem are presented. One approach, using Range/Confidence Estimates coupled with an Information Value Analysis Process, is presented as an example.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 76
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1-2 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 77
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 78
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 33-42 
    ISSN: 1539-6924
    Keywords: Uncertainty ; model uncertainty ; epistemic uncertainty ; integrated assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The characterization and treatment of uncertainty poses special challenges when modeling indeterminate or complex coupled systems such as those involved in the interactions between human activity, climate and the ecosystem. Uncertainty about model structure may become as important as, or more important than, uncertainty about parameter values. When uncertainty grows so large that prediction or optimization no longer makes sense, it may still be possible to use the model as a “behavioral test bed” to examine the relative robustness of alternative observational and behavioral strategies. When models must be run into portions of their phase space that are not well understood, different submodels may become unreliable at different rates. A common example involves running a time-stepped model far into the future. Several strategies can be used to deal with such situations. The probability of model failure can be reported as a function of time. Possible alternative “surprises” can be assigned probabilities, modeled separately, and combined. Finally, through the use of subjective judgments, one may be able to combine, and over time shift between models, moving from more detailed to progressively simpler order-of-magnitude models, and perhaps ultimately, on to simple bounding analysis.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 79
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 187-203 
    ISSN: 1539-6924
    Keywords: Combining probabilities ; expert judgment ; probability assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper concerns the combination of experts' probability distributions in risk analysis, discussing a variety of combination methods and attempting to highlight the important conceptual and practical issues to be considered in designing a combination process in practice. The role of experts is important because their judgments can provide valuable information, particularly in view of the limited availability of “hard data” regarding many important uncertainties in risk analysis. Because uncertainties are represented in terms of probability distributions in probabilistic risk analysis (PRA), we consider expert information in terms of probability distributions. The motivation for the use of multiple experts is simply the desire to obtain as much information as possible. Combining experts' probability distributions summarizes the accumulated information for risk analysts and decision-makers. Procedures for combining probability distributions are often compartmentalized as mathematical aggregation methods or behavioral approaches, and we discuss both categories. However, an overall aggregation process could involve both mathematical and behavioral aspects, and no single process is best in all circumstances. An understanding of the pros and cons of different methods and the key issues to consider is valuable in the design of a combination process for a specific PRA. The output, a “combined probability distribution,” can ideally be viewed as representing a summary of the current state of expert opinion regarding the uncertainty of interest.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
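    One of the simplest mathematical aggregation methods covered by reviews of this kind is the linear opinion pool, a weighted average of the experts' distributions. The sketch below pools three hypothetical Normal judgments on a grid and reads off summary quantiles; the experts, weights, and distributions are all placeholders.

        import numpy as np
        from scipy import stats

        # Hypothetical expert judgments about one uncertain quantity, plus weights.
        experts = [stats.norm(10.0, 2.0), stats.norm(14.0, 3.0), stats.norm(11.0, 1.5)]
        weights = np.array([0.5, 0.3, 0.2])                  # must sum to 1

        x = np.linspace(0.0, 25.0, 501)
        pooled_cdf = sum(w * e.cdf(x) for w, e in zip(weights, experts))

        # Invert the pooled CDF for the median and a 90% interval.
        median = np.interp(0.5, pooled_cdf, x)
        lo, hi = np.interp([0.05, 0.95], pooled_cdf, x)
        print(f"pooled median {median:.2f}, 90% interval ({lo:.2f}, {hi:.2f})")

    Behavioral approaches would instead revise the individual judgments through structured interaction before (or instead of) any such mechanical pooling.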
  • 80
    ISSN: 1539-6924
    Keywords: Hormesis ; U-shaped ; adaptive response ; low dose ; β-curve ; stimulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract From a comprehensive search of the literature, the hormesis phenomenon was found to occur over a wide range of chemicals, taxonomic groups, and endpoints. By use of computer searches and extensive cross-referencing, nearly 3000 potentially relevant articles were identified. Evidence of chemical and radiation hormesis was judged to have occurred in approximately 1000 of these by use of a priori criteria. These criteria included study design features (e.g., number of doses, dose range), dose–response relationship, statistical analysis, and reproducibility of results. Numerous biological endpoints were assessed, with growth responses the most prevalent, followed by metabolic effects, reproductive responses, longevity, and cancer. Hormetic responses were generally observed to be of limited magnitude with an average maximum stimulation of 30 to 60 percent over that of the controls. This maximum usually occurred 4- to 5-fold below the NOAEL for a particular endpoint. The present analysis suggests that hormesis is a reproducible and generalizable biological phenomenon and is a fundamental component of many, if not most, dose–response relationships. The relatively infrequent observation of hormesis in the literature is believed to be due primarily to experimental design considerations, especially with respect to the number and range of doses and endpoint selection. Because of regulatory considerations, most toxicologic studies have been carried out at high doses above the low-dose region where the hormesis phenomenon occurs.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 81
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 323-326 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 82
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 327-334 
    ISSN: 1539-6924
    Keywords: Biological introductions ; binucleate Rhizoctonia ; biocontrol ; risk assessment ; seedlings ; susceptibility
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This article describes an application of a method for assessing risks associated with the introduction of an organism into a new environment. The test organism was a binucleate Rhizoctonia fungal isolate that has potential for commercial development as a biological control agent for damping-off diseases in bedding plants. A test sample of host plant species was selected using the centrifugal phylogenetic host range principles, but with an emphasis on economic species. The effect of the fungus on the plant was measured for each species and expressed on a logarithmic scale. The effects on weights of shoots and roots per container were not normally distributed, nor were the effects on the number of plants standing (those which survived). Statements about the effect on the number standing and the shoot weight per container involved using the observed (empirical) distribution. This is illustrated with an example. Problems were encountered in defining the population of species at risk, and in deciding how this population should be formally sampled. The limitations of the method are discussed.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 83
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper discusses a successful public involvement effort that addressed and resolved several highly controversial water management issues involving environmental and flood risks associated with an electrical generation facility in British Columbia. It begins with a discussion of concepts for designing public involvement, summarizing research that indicates why individuals and groups may find it difficult to make complex choices. Reasons for public involvement, and the range of current practices are discussed. Next, four principles for designing group decision process are outlined, emphasizing decision-aiding concepts that include “value-focused thinking” and “adaptive management.” The next sections discuss the Alouette River Stakeholder Committee process in terms of objectives, participation, process, methods for structuring values and creating alternatives, information sources, and results. Discussion and conclusions complete the paper.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 84
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 417-426 
    ISSN: 1539-6924
    Keywords: Pathway analysis ; radiological risk assessment ; dose assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Many different radionuclides have been released to the environment from the Savannah River Site (SRS) during the facility's operational history. However, as shown by this analysis, only a small number of the released radionuclides have been significant contributors to potential doses and risks to off-site people. This article documents the radiological critical contaminant/critical pathway analysis performed for SRS. If site missions and operations remain constant over the next 30 years, only tritium oxide releases are projected to exceed a maximally exposed individual (MEI) risk of 1.0E-06 for either the airborne or liquid pathways. The critical exposure pathways associated with site airborne releases are inhalation and vegetation consumption, whereas the critical exposure pathways associated with liquid releases are drinking water and fish consumption. For the SRS-specific, nontypical exposure pathways (i.e., recreational fishing and deer and hog hunting), cesium-137 is the critical radionuclide.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 85
    ISSN: 1539-6924
    Keywords: nuclear weapons sites ; accelerated cleanup ; economic impact ; Savannah River ; Rocky Flats ; Hanford ; INEEL ; Oak Ridge ; Los Alamos ; Sandia
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The regional economic impacts of the U.S. Department of Energy's accelerated environmental cleanup plan are estimated for the major nuclear weapons sites in Colorado, Idaho, New Mexico, South Carolina, Tennessee, and Washington. The analysis shows that the impact falls heavily on the three relatively rural regions around the Savannah River (SC), Hanford (WA), and Idaho National Engineering and Environmental Laboratory (ID) sites. A less aggressive phase-down of environmental management funds and separate funds to invest in education and infrastructure in the regions helps buffer the impacts on jobs, personal income, and gross regional product. Policy options open to the federal and state and local governments are discussed.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 86
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 649-659 
    ISSN: 1539-6924
    Keywords: risk perception ; risk communication ; intuitive toxicology ; mental models
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The concept of exposure is central to chemical risk assessment and plays an important role in communicating to the public about the potential health risks of chemicals. Research on chemical risk perception has found some indication that the model lay people use to judge chemical exposure differs from that of toxicologists, thereby leading to different conclusions about chemical safety. This paper presents the results of a series of studies directed toward developing a model for understanding how lay people interpret the concept of chemical exposure. The results indicate that people's beliefs about chemical exposure (and its risks) are based on two broad categories of inferences. One category of inferences relates to the nature in which contact with a chemical has taken place, including the amount of a chemical involved and its potential health consequences. A second category of inferences about chemical exposure relates to the pragmatics of language interpretation, leading to beliefs about the motives and purposes behind chemical risk communication. Risk communicators are encouraged to consider how alternative models of exposure and language interpretation can lead to conflicting conclusions on the part of the public about chemical safety.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 87
    ISSN: 1539-6924
    Keywords: Cancer model ; cell proliferation ; two-stage model ; approximate solution ; MVK model ; hazard rate
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The approximate solution of the two-stage clonal expansion model of cancer may substantially deviate from the exact solution, and may therefore lead to erroneous conclusions in particular applications. However, for time-varying parameters the exact solution (method of characteristics) is not easy to implement, hampering the accessibility of the model to nonmathematicians. Based on intuitive reasoning, Clewell et al. (1995) proposed an improved approximate solution that is easy to implement whatever time-varying behavior the parameters may have. Here we provide the mathematical foundation for the approximation suggested by Clewell et al. (1995) and show that, after a slight modification, it is in fact an exact solution for the case of time-constant parameters. We were not able to prove that it is an exact solution for time-varying parameters as well. However, several computer simulations showed that the numerical results do not differ from the exact solution as proposed by Moolgavkar and Luebeck (1990). The advantage of this alternative solution is that the hazard rate of the first malignant cell can be evaluated by numerically integrating a single differential equation.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
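    The "single differential equation" route mentioned at the end of the abstract can be illustrated for the constant-parameter case. In one common formulation of the two-stage clonal expansion model (not necessarily the parameterization used by the authors), the probability q(u) that a single initiated cell and its clone produce a malignant cell within time u satisfies a Riccati-type backward equation, and the hazard of the first malignant cell is nu*X*q(t). All parameter values below are placeholders.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative constant parameters: X normal cells initiated at rate nu per cell;
        # intermediate cells divide at rate alpha, die at rate beta, convert at rate mu.
        X, nu = 1.0e7, 1.0e-7
        alpha, beta, mu = 1.0, 0.9, 1.0e-6

        def riccati(u, q):
            # Backward Kolmogorov equation for q(u), the probability that one
            # intermediate cell (and its progeny) yields a malignant cell within time u.
            return mu + (alpha - beta - mu) * q - alpha * q**2

        t = np.linspace(0.0, 80.0, 161)
        sol = solve_ivp(riccati, (0.0, 80.0), [0.0], t_eval=t, rtol=1e-8)

        hazard = nu * X * sol.y[0]          # hazard of the first malignant cell, h(t)
        print(hazard[::40])                 # h at t = 0, 20, 40, 60, 80

    The attraction noted in the abstract is that this kind of single-equation integration stays tractable even when the parameters vary with time, whereas the exact method of characteristics becomes harder to implement.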
  • 88
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 43-46 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 89
    ISSN: 1539-6924
    Keywords: risk-tradeoff analysis ; building codes ; housing ; health effects ; QALY
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Residential building codes intended to promote health and safety may produce unintended countervailing risks by adding to the cost of construction. Higher construction costs increase the price of new homes and may increase health and safety risks through “income” and “stock” effects. The income effect arises because households that purchase a new home have less income remaining for spending on other goods that contribute to health and safety. The stock effect arises because suppression of new-home construction leads to slower replacement of less safe housing units. These countervailing risks are not presently considered in code debates. We demonstrate the feasibility of estimating the approximate magnitude of countervailing risks by combining the income effect with three relatively well understood and significant home-health risks. We estimate that a code change that increases the nationwide cost of constructing and maintaining homes by $150 (0.1% of the average cost to build a single-family home) would induce offsetting risks yielding between 2 and 60 premature fatalities or, including morbidity effects, between 20 and 800 lost quality-adjusted life years (both discounted at 3%) each year the code provision remains in effect. To provide a net health benefit, the code change would need to reduce risk by at least this amount. Future research should refine these estimates, incorporate quantitative uncertainty analysis, and apply a full risk-tradeoff approach to real-world case studies of proposed code changes.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
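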
  • 90
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1071-1076 
    ISSN: 1539-6924
    Keywords: upper confidence limit ; likelihood-based confidence limit ; multistage carcinogenesis model ; Monte Carlo simulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The estimation of health risks from exposure to a mixture of chemical carcinogens is generally based on the combination of information from several available single compound studies. The current practice of directly summing the upper bound risk estimates of individual carcinogenic components as an upper bound on the total risk of a mixture is known to be generally too conservative. Gaylor and Chen (1996, Risk Analysis) proposed a simple procedure to compute an upper bound on the total risk using only the upper confidence limits and central risk estimates of individual carcinogens. The Gaylor-Chen procedure was derived based on an underlying assumption of normality for the distributions of individual risk estimates. In this paper we evaluated the Gaylor-Chen approach in terms of the coverage probability. The performance of the Gaylor-Chen approach depends on the coverage of the upper confidence limits on the true risks of the individual carcinogens. In general, if the coverage probabilities for the individual carcinogens are all approximately equal to the nominal level, then the Gaylor-Chen approach should perform well. However, the Gaylor-Chen approach can be conservative or anti-conservative if some or all individual upper confidence limit estimates are conservative or anti-conservative.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
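    A sketch of the procedure under evaluation, assuming (as in the normal-theory derivation the abstract refers to) that the combined upper bound adds the central estimates and pools the individual upper-bound margins in quadrature; the Monte Carlo loop then checks the coverage probability when each component estimate really is normal. All risk values and standard errors are placeholders.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical true risks and standard errors for three carcinogens in a mixture.
        true_risk = np.array([2e-6, 5e-6, 1e-6])
        se = np.array([1e-6, 2e-6, 5e-7])
        z95 = 1.645                                   # one-sided 95% normal quantile

        def combined_upper_limit(central, ucl):
            """Sum of central estimates plus upper-bound margins combined in quadrature."""
            return central.sum() + np.sqrt(((ucl - central) ** 2).sum())

        n_sim, covered = 20000, 0
        for _ in range(n_sim):
            est = rng.normal(true_risk, se)           # simulated central estimates
            ucl = est + z95 * se                      # individual 95% upper limits
            covered += combined_upper_limit(est, ucl) >= true_risk.sum()
        print("coverage of the combined upper limit:", covered / n_sim)

    Replacing the normal simulation with skewed or biased individual estimates is one way to reproduce the conservative or anti-conservative behavior discussed in the abstract.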
  • 91
    ISSN: 1539-6924
    Keywords: dose-response ; models ; food-borne ; pathogens ; risk assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Food-related illness in the United States is estimated to affect over six million people per year and cost the economy several billion dollars. These illnesses and costs could be reduced if minimum infectious doses were established and used as the basis of regulations and monitoring. However, standard methodologies for dose-response assessment are not yet formulated for microbial risk assessment. The objective of this study was to compare dose-response models for food-borne pathogens and determine which models were most appropriate for a range of pathogens. The statistical models proposed in the literature and chosen for comparison purposes were log-normal, log-logistic, exponential, β-Poisson and Weibull-Gamma. These were fit to four data sets taken from the published literature, for Shigella flexneri, Shigella dysenteriae, Campylobacter jejuni, and Salmonella typhosa, using the method of maximum likelihood. The Weibull-Gamma, the only model with three parameters, was also the only model capable of fitting all the data sets examined by maximum likelihood estimation. Infectious doses were also calculated using each model. Within any given data set, the infectious dose estimated to affect one percent of the population varied across models by one to as many as nine orders of magnitude, illustrating the differences in extrapolation of the dose-response models. More data are needed to compare models and examine extrapolation from high to low doses for food-borne pathogens.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
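    As an illustration of the model-fitting step discussed above, the sketch below fits the widely used β-Poisson approximation, P(infection | dose d) = 1 − (1 + d/β)^(−α), to a hypothetical feeding-trial dataset by maximum likelihood and then inverts it for the dose expected to infect one percent of the exposed population. The dose groups and counts are invented, not one of the four published datasets.

        import numpy as np
        from scipy.optimize import minimize

        # Hypothetical feeding-trial data: dose (organisms), subjects, number infected.
        dose = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
        n = np.array([10, 10, 10, 10, 10])
        infected = np.array([0, 2, 4, 7, 9])

        def p_inf(d, alpha, beta):
            """beta-Poisson approximation to the probability of infection at dose d."""
            return 1.0 - (1.0 + d / beta) ** (-alpha)

        def neg_log_lik(theta):
            alpha, beta = np.exp(theta)                       # optimize on the log scale
            p = np.clip(p_inf(dose, alpha, beta), 1e-12, 1 - 1e-12)
            return -np.sum(infected * np.log(p) + (n - infected) * np.log(1.0 - p))

        fit = minimize(neg_log_lik, x0=np.log([0.3, 1.0e3]), method="Nelder-Mead")
        alpha_hat, beta_hat = np.exp(fit.x)

        id01 = beta_hat * ((1.0 - 0.01) ** (-1.0 / alpha_hat) - 1.0)   # 1% infectious dose
        print(f"alpha {alpha_hat:.3g}, beta {beta_hat:.3g}, ID01 {id01:.3g} organisms")

    The exponential and Weibull-Gamma models differ only in the form of p_inf, so the same likelihood machinery supports the comparisons the study describes.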
  • 92
    ISSN: 1539-6924
    Keywords: ethylene oxide ; risk assessment ; epidemiology ; cancer guidelines
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Ethylene oxide (EO) research has increased significantly since the 1980s, when regulatory risk assessments were last completed on the basis of the animal cancer chronic bioassays. In tandem with the new scientific understanding, there have been evolutionary changes in regulatory risk assessment guidelines that encourage flexibility and greater use of scientific information. The results of an updated meta-analysis of the findings from 10 unique EO study cohorts from five countries, including nearly 33,000 workers and over 800 cancers, are presented, indicating that EO does not cause increased risk of cancers overall or of brain, stomach or pancreatic cancers. The findings for leukemia and non-Hodgkin's lymphoma (NHL) are inconclusive. Two studies with the requisite attributes of size, individual exposure estimates, and follow-up are the basis for dose-response modeling and added lifetime risk predictions under environmental and occupational exposure scenarios and a variety of plausible alternative assumptions. A point of departure analysis, with various margins of exposure, is also illustrated using human data. The two datasets produce remarkably similar leukemia added risk predictions, orders of magnitude lower than prior animal-based predictions under conservative, default assumptions, with risks on the order of 1 × 10−6 or lower for exposures in the low ppb range. Inconsistent results for “lymphoid” tumors, a non-standard grouping based on histologic information from death certificates, are discussed. This assessment demonstrates the applicability of the current risk assessment paradigm to epidemiological data.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 93
    ISSN: 1539-6924
    Keywords: physiologically-based toxicokinetics ; empirical Bayes ; MAP estimation ; mathematical model ; toluene ; error analysis
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Physiologically-based toxicokinetic (PBTK) models are widely used to quantify whole-body kinetics of various substances. However, since they attempt to reproduce anatomical structures and physiological events, they have a high number of parameters. Their identification from kinetic data alone is often impossible, and other information about the parameters is needed to render the model identifiable. The most commonly used approach consists of independently measuring, or taking from literature sources, some of the parameters, fixing them in the kinetic model, and then performing model identification on a reduced number of less certain parameters. This results in a substantial reduction of the degrees of freedom of the model. In this study, we show that this method results in final estimates of the free parameters whose precision is overestimated. We then compared this approach with an empirical Bayes approach, which takes into account not only the mean value, but also the error associated with the independently determined parameters. Blood and breath 2H8-toluene washout curves, obtained in 17 subjects, were analyzed with a previously presented PBTK model suitable for person-specific dosimetry. Model parameters with the greatest effect on predicted levels were alveolar ventilation rate QPC, fat tissue fraction VFC, blood-air partition coefficient Kb, fraction of cardiac output to fat Qa/co and rate of extrahepatic metabolism Vmax-p. Differences in the measured and Bayesian-fitted values of QPC, VFC and Kb were significant (p < 0.05), and the precision of the fitted values Vmax-p and Qa/co went from 11 ± 5% to 75 ± 170% (NS) and from 8 ± 2% to 9 ± 2% (p < 0.05) respectively. The empirical Bayes approach did not result in less reliable parameter estimates: rather, it pointed out that the precision of parameter estimates can be overly optimistic when other parameters in the model, either directly measured or taken from literature sources, are treated as known without error. In conclusion, an empirical Bayes approach to parameter estimation resulted in a better model fit, different final parameter estimates, and more realistic parameter precisions.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
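    The empirical Bayes (MAP) idea above, treating independently measured parameters as priors with error rather than as fixed constants, can be sketched generically: the objective adds a prior penalty for every parameter to the data misfit, so 'known' parameters are allowed to move within their measurement uncertainty. The one-compartment washout model and every value below are placeholders, not the authors' PBTK model.

        import numpy as np
        from scipy.optimize import minimize

        # Toy washout model C(t) = D/V * exp(-k*t); a real PBTK model would replace it.
        t_obs = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
        c_obs = np.array([8.1, 6.5, 4.1, 1.8, 0.4])       # hypothetical concentrations
        D, sigma_meas = 10.0, 0.3                         # known dose, assumed meas. SD

        # Priors from independent measurements / literature: mean and SD for [V, k].
        prior_mean = np.array([1.0, 0.5])
        prior_sd = np.array([0.2, 0.2])

        def neg_log_posterior(theta):
            V, k = theta
            pred = D / V * np.exp(-k * t_obs)
            data_term = 0.5 * np.sum(((c_obs - pred) / sigma_meas) ** 2)
            prior_term = 0.5 * np.sum(((theta - prior_mean) / prior_sd) ** 2)
            return data_term + prior_term                 # MAP balances both terms

        fit = minimize(neg_log_posterior, x0=prior_mean, method="Nelder-Mead")
        print("MAP estimates (V, k):", fit.x)

    Dropping the prior term and fixing one parameter at its measured value reproduces the conventional approach whose overstated precision the study criticizes.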
  • 94
    ISSN: 1539-6924
    Keywords: computer animation ; probability plots ; mixture models ; LogNormal distributions
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Risk assessors often use different probability plots as a way to assess the fit of a particular distribution or model by comparing the plotted points to a straight line and to obtain estimates of the parameters in parametric distributions or models. When empirical data do not fall in a sufficiently straight line on a probability plot, and when no other single parametric distribution provides an acceptable (graphical) fit to the data, the risk assessor may consider a mixture model with two component distributions. Animated probability plots are a way to visualize the possible behaviors of mixture models with two component distributions. When no single parametric distribution provides an adequate fit to an empirical dataset, animated probability plots can help an analyst pick some plausible mixture models for the data based on their qualitative fit. After using animations during exploratory data analysis, the analyst must then use other statistical tools, including but not limited to: Maximum Likelihood Estimation (MLE) to find the optimal parameters, Goodness of Fit (GoF) tests, and a variety of diagnostic plots to check the adequacy of the fit. Using a specific example with two LogNormal components, we illustrate the use of animated probability plots as a tool for exploring the suitability of a mixture model with two component distributions. Animations work well with other types of probability plots, and they may be extended to analyze mixture models with three or more component distributions.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
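    A static version of the exploration described above can be drawn with an ordinary Lognormal probability plot; an animation would simply redraw the mixture curve as its parameters are varied. The data below are synthetic and the mixture parameters are set by eye, standing in for the MLE/EM fitting and goodness-of-fit checks the abstract recommends.

        import numpy as np
        from scipy import stats
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(3)
        # Synthetic data from a mixture of two LogNormal components (weights 0.6 / 0.4).
        x = np.concatenate([rng.lognormal(0.0, 0.3, 60), rng.lognormal(1.5, 0.4, 40)])

        # Lognormal probability plot: sorted log-data against standard normal quantiles.
        z = stats.norm.ppf((np.arange(1, x.size + 1) - 0.5) / x.size)
        plt.plot(z, np.log(np.sort(x)), "o", label="data")

        # A single fitted Lognormal is a straight line on this plot.
        mu, sig = np.mean(np.log(x)), np.std(np.log(x), ddof=1)
        plt.plot(z, mu + sig * z, "-", label="single LogNormal")

        # Candidate two-component mixture, drawn via its CDF (parameters chosen by eye).
        w, m1, s1, m2, s2 = 0.6, 0.0, 0.3, 1.5, 0.4
        grid = np.linspace(x.min(), x.max(), 400)
        cdf = w * stats.lognorm.cdf(grid, s1, scale=np.exp(m1)) \
            + (1 - w) * stats.lognorm.cdf(grid, s2, scale=np.exp(m2))
        plt.plot(stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6)), np.log(grid), "--",
                 label="two-component mixture")
        plt.xlabel("standard normal quantile"); plt.ylabel("ln(x)"); plt.legend()
        plt.show()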
  • 95
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1193-1204 
    ISSN: 1539-6924
    Keywords: multimedia modeling ; uncertainty ; variability ; exposure efficiency ; toxicity scoring ; toxics release inventory (TRI) ; life cycle assessment (LCA)
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The human toxicity potential, a weighting scheme used to evaluate toxic emissions for life cycle assessment and toxics release inventories, is based on potential dose calculations and toxicity factors. This paper evaluates the variance in potential dose calculations that can be attributed to the uncertainty in chemical-specific input parameters as well as the variability in exposure factors and landscape parameters. A knowledge of the uncertainty allows us to assess the robustness of a decision based on the toxicity potential; a knowledge of the sources of uncertainty allows us to focus our resources if we want to reduce the uncertainty. The potential dose of 236 chemicals was assessed. The chemicals were grouped by dominant exposure route, and a Monte Carlo analysis was conducted for one representative chemical in each group. The variance is typically one to two orders of magnitude. For comparison, the point estimates in potential dose for 236 chemicals span ten orders of magnitude. Most of the variance in the potential dose is due to chemical-specific input parameters, especially half-lives, although exposure factors such as fish intake and the source of drinking water can be important for chemicals whose dominant exposure is through indirect routes. Landscape characteristics are generally of minor importance.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
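    A generic version of the Monte Carlo exercise above: sample uncertain chemical-specific parameters and variable exposure factors, propagate them through a simple potential-dose expression, and see which inputs drive the variance via rank correlations. The one-line 'multimedia' expression and every distribution below are placeholders, not the paper's model.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        n = 10000

        # Placeholder lognormal inputs: half-life in water (d), drinking-water intake (L/d),
        # fish intake (kg/d), and a bioconcentration factor (L/kg).
        half_life = rng.lognormal(np.log(30.0), 0.8, n)
        water_intake = rng.lognormal(np.log(1.4), 0.3, n)
        fish_intake = rng.lognormal(np.log(0.02), 0.7, n)
        bcf = rng.lognormal(np.log(100.0), 1.0, n)

        # Toy steady-state water concentration proportional to half-life (unit emission).
        c_water = 1.0e-3 * half_life / 30.0                             # mg/L, placeholder
        potential_dose = c_water * (water_intake + bcf * fish_intake)   # mg/day

        print("5th/50th/95th percentiles:", np.percentile(potential_dose, [5, 50, 95]))
        for name, v in [("half-life", half_life), ("water intake", water_intake),
                        ("fish intake", fish_intake), ("BCF", bcf)]:
            rho, _ = stats.spearmanr(v, potential_dose)
            print(f"rank correlation with dose, {name}: {rho:.2f}")

    The rank correlations identify which inputs matter most; in the paper's full analysis these were typically the chemical-specific parameters, especially half-lives.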
  • 96
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1223-1234 
    ISSN: 1539-6924
    Keywords: risk assessment ; standard-setting ; carcinogens ; OSHA ; ACGIH
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract For carcinogens, this paper provides a quantitative examination of the roles of potency and weight-of-evidence (WOE) in setting permissible exposure limits (PELs) at the U.S. Occupational Safety and Health Administration (OSHA) and threshold limit values (TLVs) at the private American Conference of Governmental Industrial Hygienists (ACGIH). On normative grounds, both of these factors should influence choices about the acceptable level of exposures. Our major objective is to examine whether and in what ways these factors have been considered by these organizations. A lesser objective is to identify outliers, which might be candidates for further regulatory scrutiny. Our sample (N=48) includes chemicals for which EPA has estimated a unit risk as a measure of carcinogenic potency and for which OSHA or the ACGIH has a PEL or TLV. Different assessments of the strength of the evidence of carcinogenicity were obtained from EPA, ACGIH, and the International Agency for Research on Cancer. We found that potency alone explains 49% of the variation in PELs and 62% of the variation in TLVs. For the ACGIH, WOE plays a much smaller role than potency. TLVs set by the ACGIH since 1989 appear to be stricter than earlier TLVs. We suggest that this change represents evidence that the ACGIH had responded to criticisms leveled at it in the late 1980s for failing to adopt sufficiently protective standards. The models developed here identify 2-nitropropane, ethylene dibromide, and chromium as having OSHA PELs significantly higher than predicted on the basis of potency and WOE.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
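    The kind of analysis described above can be sketched as a log-log regression of exposure limits on potency, with weight-of-evidence class as a categorical term. The records below are entirely hypothetical stand-ins for the paper's sample of 48 chemicals.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical records: limit (mg/m3), potency proxy (unit risk), WOE class.
        df = pd.DataFrame({
            "limit": [0.1, 5.0, 0.02, 1.0, 10.0, 0.5, 0.05, 2.0],
            "unit_risk": [1e-3, 2e-6, 5e-3, 4e-5, 1e-6, 1e-4, 2e-3, 1e-5],
            "woe": ["A", "B2", "A", "B1", "C", "B2", "A", "C"],
        })

        # How much variation in the limits does potency explain, and do
        # weight-of-evidence classes shift the limits further?
        model = smf.ols("np.log(limit) ~ np.log(unit_risk) + C(woe)", data=df).fit()
        print(model.params)
        print("R-squared:", round(model.rsquared, 2))

    Large positive residuals from such a fit flag chemicals whose limits are higher than potency and weight of evidence would predict, which is the role played by the outliers named in the abstract.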
  • 97
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1251-1260 
    ISSN: 1539-6924
    Keywords: dose response ; gastroenteritis ; mathematical model ; hazard
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract When pathogenic microorganisms enter the human body via ingestion with food or drinking water, they encounter a system of barriers mounted by the host. In order to reach parts of the intestinal tract that are suitable for growth and attachment, each of the barriers must be overcome successfully. The present view on infection states that at least one of the ingested pathogens must survive to start colonization. This is the basis for dose response models, used for quantitative risk assessment. In this paper, the usefulness of the Beta Poisson model for multiple barriers is corroborated. Infection is associated with the presence of elevated numbers of reproducing pathogens in the intestinal tract. This does not necessarily imply illness symptoms: when intestinal microorganisms engage in damaging activities, this may lead to illness symptoms. At the same time, these activities probably elicit defensive measures from the host, promoting the removal of pathogens and terminating infection. The duration of the period of colonization reflects the balance between the colonization potential of pathogens and the strength of host defenses. Starting from the assumption that during infection the host has a certain hazard of becoming ill, a simple dose response relation for acute gastroenteritis is developed. With the use of literature data from volunteer experiments, we show that examples can be found for three possible alternatives: an increase in the probability of illness with increasing dose, a decrease with higher doses, and a probability of illness (given infection) independent of the ingested dose. These alternatives may reflect different modes of interaction between pathogens and host.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 98
    ISSN: 1539-6924
    Keywords: Extreme events ; risk assessment ; risk management ; extreme value theory ; judgmental distributions
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract In this paper, we review methods for assessing and managing the risk of extreme events, where “extreme events” are defined to be rare, severe, and outside the normal range of experience of the system in question. First, we discuss several systematic approaches for identifying possible extreme events. We then discuss some issues related to risk assessment of extreme events, including what type of output is needed (e.g., a single probability vs. a probability distribution), and alternatives to the probabilistic approach. Next, we present a number of probabilistic methods. These include: guidelines for eliciting informative probability distributions from experts; maximum entropy distributions; extreme value theory; other approaches for constructing prior distributions (such as reference or noninformative priors); the use of modeling and decomposition to estimate the probability (or distribution) of interest; and bounding methods. Finally, we briefly discuss several approaches for managing the risk of extreme events, and conclude with recommendations and directions for future research.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
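    Of the probabilistic methods listed above, extreme value theory is the most mechanical to illustrate: fit a Generalized Extreme Value distribution to block maxima and read off a return level. The 'annual maxima' below are synthetic placeholders.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        annual_max = rng.gumbel(loc=100.0, scale=15.0, size=60)   # synthetic block maxima

        # Fit a Generalized Extreme Value distribution to the annual maxima.
        shape, loc, scale = stats.genextreme.fit(annual_max)

        # 100-year return level: the value exceeded in any given year with probability 1%.
        rl100 = stats.genextreme.ppf(1.0 - 1.0 / 100.0, shape, loc=loc, scale=scale)
        print(f"GEV fit: shape={shape:.2f}, loc={loc:.1f}, scale={scale:.1f}")
        print(f"estimated 100-year return level: {rl100:.1f}")

    For events rarer than the observational record supports, the abstract's other tools (expert elicitation, maximum entropy priors, decomposition, and bounding) take over where such curve fitting stops being defensible.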
  • 99
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 153-157 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 100
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 335-348 
    ISSN: 1539-6924
    Keywords: ethics ; risk communication
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Proposed in this article is one possible framework for classifying multiple types of ethical issues in risk communication research and practice to help continue a discussion initiated in 1990 by Morgan and Lave. Some of the questions that each stage of the process for planning risk communication strategies appears to pose for ethics are discussed (e.g., selecting issues to be communicated, knowing the issue, dealing with constraints). Also discussed briefly are some issues raised by the possibility that risk communicators aspire to the status of a profession. The purpose is to foster discussion rather than issue a conclusive statement on the topic, because its very nature makes a definitive pronouncement indefensible.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...