ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


Filter
  Collection
    • Books
    • Articles  (5,819)
    • Data
  Years
    • 1995-1999  (5,819)
  Year
    • 1999  (5,819)
  Publisher
    • Springer  (5,819)
  Topics
    • Energy, Environment Protection, Nuclear Power Engineering  (3,986)
    • Electrical Engineering, Measurement and Control Technology  (1,833)
  • 1
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 27-42 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Of concern is the propagation of distortionless surface waves in a medium that may be nonuniform relative to depth. Distortionless wave propagation in inhomogeneous media was discussed by V. Burke, R. J. Duffin and D. Hazony, in Quart. Appl. Math., 183–194 (1976). Accordingly, the media could be modeled by a distributed electrical ladder network, nonuniform along the axis. We give a two-dimensional development based on Hooke's law and Newton's law which leads to the well-known case of Rayleigh waves in homogeneous media. It will be seen that the available pool of propagation modes greatly increases when high-pass propagation is included. The emphasis is on media where the elastic coefficients track one another as a function of depth. Special cases are studied in detail showing that as a disturbance travels along the surface, it may assume a broadband phase change, which translates into a shape distortion in the time domain, which is periodic with distance. Applications may be found in acousto-optics, in situ monitoring of elongated bodies, high-frequency SAW filters, microstrips, and any situations where surface waves are used in an environment of high precision or relatively large distances.
    Type of Medium: Electronic Resource
  • 2
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 131-147 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract This study presents a linear output-based controller for stabilizing a rigid-link flexible-joint electrically driven (RLFJED) robot manipulator. The proposed controller ensures local exponential stability under some uncertainty conditions. It is assumed that the velocity signals from the link side are not measurable. The controller is analyzed using tools for pole placement by output feedback in the framework of linear system theory. Some useful structural properties of the systems under consideration have been studied. Applications of the results to the set-point regulation control problem are considered.
    Type of Medium: Electronic Resource
  • 3
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 205-223 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In this paper we consider an adaptive controller with vanishing gain and excitation of the reference signal. We use the burst recovery concept to show that all signals in the adaptive loop remain uniformly bounded. We also show that the mean-square performance converges so that the adaptive system is optimal in the sense that the parameter estimation error and the one-step ahead prediction error are uncorrelated in the mean despite the presence of the unmodeled dynamics.
    Type of Medium: Electronic Resource
  • 4
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 191-204 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Finite homogeneous Markov chains ξ, which admit invariant probability distributions, can be defined by the cycloids $\{\bar C_k\}$ (closed polygonal lines whose consecutive edges have various orientations that do not necessarily determine a common direction for $\bar C_k$) occurring in their graphs. These Markov chains are called cycloid chains, and the corresponding finite-dimensional distributions are linear expressions on the cycloids $\{\bar C_k\}$ with real coefficients $\alpha_k$. Then the collection $\{\{\bar C_k\}, \{\alpha_k\}\}$, called the cycloid decomposition of ξ, gives a minimal description of the finite-dimensional distributions that, except for a choice of the maximal tree, uniquely determines the chain ξ. Furthermore, the cycloid decompositions have an interpretation in terms of the transition probability functions, expressing the same essence as the known Chapman–Kolmogorov equations.
    Type of Medium: Electronic Resource
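The invariant distribution and Chapman–Kolmogorov equations referenced in the abstract above can be illustrated with a small sketch. The transition matrix here is a made-up stand-in, and the cycloid decomposition itself is not implemented:

```python
import numpy as np

# Hypothetical 3-state homogeneous Markov chain (illustrative values only).
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

# Chapman-Kolmogorov: n-step transition probabilities are matrix powers,
# so P(2) = P @ P.
P2 = P @ P
assert np.allclose(P2, np.linalg.matrix_power(P, 2))

# The invariant probability distribution pi solves pi @ P = pi with
# pi >= 0 and sum(pi) = 1; here via the eigenvector of P.T for eigenvalue 1.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
pi = pi / pi.sum()
print(np.allclose(pi @ P, pi))  # True: pi is invariant
```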
  • 5
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 241-267 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract We study solutions of the “linear system in a saturated mode” $(M)\colon\ x' \in Tx + c - \partial I_{D^n}x$. We show that a trajectory is in a constant face of the cube $D^n$ on some interval (0, d]. We answer a question about comparing the two systems (M) and $(H)\colon\ Cu' = Tv + c - R^{-1}u,\ v = G(\lambda u)$. As λ→∞, limits of v corresponding to asymptotically stable equilibrium points of (H) are asymptotically stable equilibrium points of (M), and the converse is also true. We study the assumptions to see which are required and which may be weakened.
    Type of Medium: Electronic Resource
  • 6
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 291-314 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In this paper we introduce a new computational method for solving the diffusion equation. In particular, we construct a “generalized” state-space system and compute the impulse response of an equivalent truncated state-space system. In this effort, we use a 3D finite element method (FEM) to obtain the state-space system. We then use the Arnoldi iteration to approximate the state impulse response by projecting on the dominant controllable subspace. The idea exploited here is the approximation of the impulse response of the linear system. We study the homogeneous and heterogeneous cases and discuss the approximation error. Finally, we compare our computational results to our experimental setup.
    Type of Medium: Electronic Resource
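The Arnoldi projection step described in the abstract above can be sketched as follows. This is a generic Arnoldi iteration, not the authors' FEM-based state-space construction, and the matrix and vector are random stand-ins:

```python
import numpy as np

def arnoldi(A, b, m):
    """Build an orthonormal basis Q of the Krylov subspace
    span{b, Ab, ..., A^(m-1) b} and the projected Hessenberg matrix H,
    so that H = Q.T @ A @ Q (the reduced operator used for approximation)."""
    n = b.size
    Q = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        v = A @ Q[:, j]
        for i in range(j + 1):          # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ v
            v = v - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < 1e-12:         # invariant subspace found
            break
        Q[:, j + 1] = v / H[j + 1, j]
    return Q[:, :m], H[:m, :m]

rng = np.random.default_rng(0)
n, m = 200, 10
A = rng.standard_normal((n, n)) / np.sqrt(n)   # stand-in system matrix
b = rng.standard_normal(n)                     # stand-in input vector
Q, H = arnoldi(A, b, m)
print(np.allclose(Q.T @ Q, np.eye(m)))         # orthonormal basis
print(np.allclose(Q.T @ A @ Q, H))             # projected (reduced) operator
```

Projecting the impulse response onto this subspace is the standard Krylov model-reduction idea the abstract alludes to.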
  • 7
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 351-364 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract A simple state-space approach for the four-block singular nonlinear H∞ control problem is proposed in this paper. This approach combines a (J, J′)-lossless and a class of conjugate (J, J′)-expansive systems to yield a family of nonlinear H∞ output feedback controllers. The singular nonlinear H∞ control problem is thus transformed into a simple lossless network problem that is easy to deal with in a network-theory context.
    Type of Medium: Electronic Resource
  • 8
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 395-406 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The stability of time-varying autoregressive (TVAR) models is an important issue in many applications such as time-varying spectral estimation, EEG simulation and analysis, and time-varying linear prediction coding (TVLPC). For stationary AR models there are methods that guarantee stability, but for nonadaptive time-varying approaches there are no such methods. On the other hand, in some situations, such as in EEG analysis, models that temporarily exhibit roots with almost unit moduli are difficult to use. Thus we may need a tighter stability condition such as stability with margin 1−ϱ. In this paper we propose a method for the estimation of TVAR models that guarantees stability with margin 1−ϱ, that is, the moduli of the roots of the time-varying characteristic polynomial are less than or equal to some arbitrary positive number ϱ for every time instant. The model class is the Subba Rao-Liporace class, in which the time-varying coefficients are constrained to a subspace of the coefficient time evolutions. The method is based on sequential linearization of the associated nonlinear constraints and the subsequent use of a Gauss-Newton-type algorithm. The method is also applied to a simulated autoregressive process.
    Type of Medium: Electronic Resource
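The margin condition in the abstract above, that every root of the characteristic polynomial has modulus at most ϱ at each time instant, is easy to check numerically. This sketch illustrates the condition only, not the authors' constrained Gauss-Newton estimator, and the coefficient values are made up:

```python
import numpy as np

def stable_with_margin(ar_coeffs, rho):
    """Check that all roots of the AR characteristic polynomial
    z^p - a_1 z^(p-1) - ... - a_p have modulus <= rho
    (rho = 1 recovers ordinary stability)."""
    a = np.asarray(ar_coeffs, dtype=float)
    roots = np.roots(np.concatenate(([1.0], -a)))
    return bool(np.all(np.abs(roots) <= rho))

# A time-varying AR(2) model: verify the condition at every time instant.
coeffs_over_time = [[0.5, 0.3], [0.6, 0.2], [0.8, 0.1]]
print(all(stable_with_margin(a, rho=0.95) for a in coeffs_over_time))  # True

# An unstable AR(1) coefficient violates the condition.
print(stable_with_margin([1.2], rho=0.95))  # False
```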
  • 9
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 443-443 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
  • 10
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 283-294 
    ISSN: 1539-6924
    Keywords: Risk perception ; pesticides ; pest management ; health effects ; agricultural pollution
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Water pollution from agricultural pesticides continues to be a public concern. Given that the use of such pesticides on the farm is largely governed by voluntary behavior, it is important to understand what drives farmer behavior. Health belief models in public health and social psychology argue that persons who have adverse health experiences are likely to undertake preventive behavior. An analogous hypothesis set was tested here: farmers who believe they have had adverse health experiences from pesticides are likely to have heightened concerns about pesticides and are more likely to take greater precautions in dealing with pesticides. This work is based on an original survey of a population of 2700 corn and soybean growers in Maryland, New York, and Pennsylvania using the U.S. Department of Agriculture data base. It was designed as a mail survey with telephone follow-up, and resulted in a 60 percent response rate. Farm operators report experiencing adverse health problems they believe are associated with pesticides at a rate higher than the reported incidence of occupational pesticide poisonings, but similar to the reported incidence of all pesticide poisonings. Farmers who report experiencing such problems have more heightened concerns about water pollution from fertilizers and pesticides, and illness and injury from mixing, loading, and applying pesticides than farmers who have not experienced such problems. Farmers who report experiencing such problems also are more likely to report using alternative pest management practices than farmers who do not report having such problems. This implies that farmers who have had such experiences do care about the effects of application and do engage in alternative means of pest management, which at least involve a reduction in pesticide use.
    Type of Medium: Electronic Resource
  • 11
    ISSN: 1539-6924
    Keywords: Ethnicity ; fish consumption ; advisories ; Savannah River ; methylmercury ; risk perception
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract South Carolina has issued fish consumption advisories for the Savannah River based on mercury and radionuclide levels. We examine differences in fishing rates and fish consumption of 258 people interviewed while fishing along the Savannah River, as a function of age, education, ethnicity, employment history, and income, and test the assumption that the average consumption of fish is less than the recreational value of 19 kg/year assumed by risk assessors. Ethnicity and education contributed significantly to explaining variations in number of fish meals per month, serving size, and total quantity of fish consumed per year. Blacks fished more often, ate more fish meals of slightly larger serving sizes, and consumed more fish per year than did Whites. Although education and income were correlated, education contributed most significantly to behavior; people who did not graduate from high school ate fish more often, ate more fish per year, and ate more whole fish than people who graduated from high school. Computing consumption of fish for each person individually indicates that (1) people who eat fish more often also eat larger portions, (2) a substantial number of people consume more than the amount of fish used to compute risk to recreational fishermen, (3) some people consume more than the subsistence level default assumption (50 kg/year) and (4) Blacks consume more fish per year than Whites, putting them at greater risk from contaminants in fish. Overall, ethnicity, age, and education contributed to variations in fishing behavior and consumption.
    Type of Medium: Electronic Resource
  • 12
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 453-459 
    ISSN: 1539-6924
    Keywords: Efficiency ; nonquantal ; probit ; quantal
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Methods of quantitative risk assessment for toxic responses that are measured on a continuous scale are not well established. Although risk-assessment procedures that attempt to utilize the quantitative information in such data have been proposed, there is no general agreement that these procedures are appreciably more efficient than common quantal dose–response procedures that operate on dichotomized continuous data. This paper points out an equivalence between the dose–response models of the nonquantal approach of Kodell and West(1) and a quantal probit procedure, and provides results from a Monte Carlo simulation study to compare coverage probabilities of statistical lower confidence limits on dose corresponding to specified additional risk based on applying the two procedures to continuous data from a dose–response experiment. The nonquantal approach is shown to be superior, in terms of both statistical validity and statistical efficiency.
    Type of Medium: Electronic Resource
  • 13
    ISSN: 1539-6924
    Keywords: Threshold ; measurement error ; mortality ; air pollution
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The association between daily fluctuations in ambient particulate matter and daily variations in nonaccidental mortality has been extensively investigated. Although it is now widely recognized that such an association exists, the form of the concentration–response model is still in question. Linear no-threshold and linear threshold models have been most commonly examined. In this paper we considered methods to detect and estimate threshold concentrations using time series data of daily mortality rates and air pollution concentrations. Because exposure is measured with error, we also considered the influence of measurement error in distinguishing between these two competing model specifications. The methods were illustrated on a 15-year daily time series of nonaccidental mortality and particulate air pollution data in Toronto, Canada. Nonparametric smoothed representations of the association between mortality and air pollution were adequate to graphically distinguish between these two forms. Weighted nonlinear regression methods for relative risk models were adequate to give nearly unbiased estimates of threshold concentrations even under conditions of extreme exposure measurement error. The uncertainty in the threshold estimates increased with the degree of exposure error. Regression models incorporating threshold concentrations could be clearly distinguished from linear relative risk models in the presence of exposure measurement error. The assumption of a linear model given that a threshold model was the correct form usually resulted in overestimates in the number of averted premature deaths, except for low threshold concentrations and large measurement error.
    Type of Medium: Electronic Resource
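As a toy illustration of threshold estimation, one can profile a hockey-stick concentration-response model over candidate thresholds on synthetic data. This is ordinary least squares on made-up numbers, not the weighted nonlinear relative-risk regression used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear threshold concentration-response: the response is flat
# below threshold t and rises linearly above it.
def threshold_response(x, t, slope, base):
    return base + slope * np.maximum(x - t, 0.0)

x = rng.uniform(0, 50, 500)                       # pollutant concentrations
y = threshold_response(x, 20.0, 0.4, 5.0) + rng.normal(0, 0.5, x.size)

def fit_threshold(x, y, grid):
    """Profile the threshold over a grid; fit intercept and slope by least
    squares at each candidate and keep the threshold with the smallest SSE."""
    best = None
    for t in grid:
        X = np.column_stack([np.ones_like(x), np.maximum(x - t, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sse = np.sum((y - X @ beta) ** 2)
        if best is None or sse < best[0]:
            best = (sse, t)
    return best[1]

t_hat = fit_threshold(x, y, np.linspace(5, 40, 71))
print(t_hat)  # recovers a value near the true threshold of 20
```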
  • 14
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 527-545 
    ISSN: 1539-6924
    Keywords: breast-feeding ; chlorinated compounds ; risk assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Exposure to persistent organochlorines in breast milk was estimated probabilistically for Canadian infants. Noncancer health effects were evaluated by comparing the predicted exposure distributions to published guidance values. For chemicals identified as potential human carcinogens, cancer risks were evaluated using standard methodology typically applied in Canada, as well as an alternative method developed under the Canadian Environmental Protection Act. Potential health risks associated with exposure to persistent organochlorines were quantitatively and qualitatively weighed against the benefits of breast-feeding. Current levels of the majority of contaminants identified in Canadian breast milk do not pose unacceptable risks to infants. Benefits of breast-feeding are well documented and qualitatively appear to outweigh potential health concerns associated with organochlorine exposure. Furthermore, the risks of mortality from not breast-feeding estimated by Rogan and colleagues exceed the theoretical cancer risks estimated for infant exposure to potential carcinogens in Canadian breast milk. Although levels of persistent compounds have been declining in Canadian breast milk, potentially significant risks were estimated for exposure to polychlorinated biphenyls, dibenzo-p-dioxins, and dibenzofurans. Follow-up work is suggested that would involve the use of a physiologically based toxicokinetic model with probabilistic inputs to predict dioxin exposure to the infant. A more detailed risk analysis could be carried out by coupling the exposure estimates with a dose–response analysis that accounts for uncertainty.
    Type of Medium: Electronic Resource
  • 15
    ISSN: 1539-6924
    Keywords: air dispersion ; models ; validation ; Rocky Flats
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Five atmospheric transport models were evaluated for use in Phase II of the Historical Public Exposures Studies at the Rocky Flats Plant. Models included a simple straight-line Gaussian plume model (ISCST2), several integrated puff models (RATCHET, TRIAD, and INPUFF2), and a complex terrain model (TRAC). Evaluations were based on how well model predictions compared with sulfur hexafluoride tracer measurements taken in the vicinity of Rocky Flats in February 1991. Twelve separate tracer experiments were conducted, each lasting 9 hr and measured at 140 samplers in arcs 8 and 16 km from the release point at Rocky Flats. Four modeling objectives were defined based on the endpoints of the overall study: (1) the unpaired maximum hourly average concentration, (2) paired time-averaged concentration, (3) unpaired time-averaged concentration, and (4) arc-integrated concentration. Performance measures were used to evaluate models and focused on the geometric mean and standard deviation of the predicted-to-observed ratio and the correlation coefficient between predicted and observed concentrations. No one model consistently outperformed the others in all modeling objectives and performance measures. About 75% of the maximum hourly concentration predictions were within a factor of 5 of the observations. About 64% of the paired and 80% of the unpaired time-averaged model predictions were within a factor of 5 of the observations. The overall performance of the RATCHET model was somewhat better than the other models. All models appeared to experience difficulty defining plume trajectories, which was attributed to the influence of multilayered flow initiated by terrain complexities and the diurnal flow patterns characteristic of the Colorado Front Range.
    Type of Medium: Electronic Resource
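The performance measures named in the abstract above (geometric mean and geometric standard deviation of the predicted-to-observed ratio, and the fraction of predictions within a factor of 5 of observations) can be computed with a short routine. The ratio values below are hypothetical stand-ins, not the study's data:

```python
import math

def geo_stats(ratios):
    """Geometric mean (overall bias) and geometric standard deviation
    (scatter) of predicted-to-observed ratios, plus the fraction of
    predictions within a factor of 5 of the observations."""
    logs = [math.log(r) for r in ratios]
    mu = sum(logs) / len(logs)
    var = sum((x - mu) ** 2 for x in logs) / len(logs)
    mg = math.exp(mu)                 # geometric mean of P/O
    vg = math.exp(math.sqrt(var))     # geometric standard deviation of P/O
    fac5 = sum(1 for r in ratios if 0.2 <= r <= 5.0) / len(ratios)
    return mg, vg, fac5

ratios = [0.8, 1.5, 2.0, 0.4, 6.0, 1.0, 0.9, 3.0]  # hypothetical P/O ratios
mg, vg, fac5 = geo_stats(ratios)
print(fac5)  # 0.875: 7 of the 8 ratios fall within a factor of 5
```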
  • 16
    ISSN: 1539-6924
    Keywords: initiation ; Monte Carlo methods ; promotion
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract We present the results of a quantitative assessment of the lung cancer risk associated with occupational exposure to refractory ceramic fibers (RCF). The primary sources of data for our risk assessment were two long-term oncogenicity studies in male Fischer rats conducted to assess the potential pathogenic effects associated with prolonged inhalation of RCF. An interesting feature of the data was the availability of the temporal profile of fiber burden in the lungs of experimental animals. Because of this information, we were able to conduct both exposure–response and dose–response analyses. Our risk assessment was conducted within the framework of a biologically based model for carcinogenesis, the two-stage clonal expansion model, which allows for the explicit incorporation of the concepts of initiation and promotion in the analyses. We found that a model positing that RCF was an initiator had the highest likelihood. We proposed an approach based on biological considerations for the extrapolation of risk to humans. This approach requires estimation of human lung burdens for specific exposure scenarios, which we did by using an extension of a model due to Yu. Our approach acknowledges that the risk associated with exposure to RCF depends on exposure to other lung carcinogens. We present estimates of risk in two populations: (1) a population of nonsmokers and (2) an occupational cohort of steelworkers not exposed to coke oven emissions, a mixed population that includes both smokers and nonsmokers.
    Type of Medium: Electronic Resource
  • 17
    ISSN: 1539-6924
    Keywords: accident risk ; population distribution ; RADTRAN ; transportation ; radioactive materials
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Calculation of accident dose-risk estimates with the RADTRAN code requires input data describing the population likely to be affected by the plume of radioactive material (RAM) released in a hypothetical transportation accident. In the existing model, population densities within 1/2 mile (0.8 km) of the route centerline are tabulated in three ranges (Rural, Suburban, and Urban). These population densities may be of questionable validity since the plume in the RADTRAN analysis is assumed to extend out to 120 km from the hypothetical accident site. We present a GIS-based population model which accounts for the actual distribution of population under a potential plume, and compare accident-risk estimates based on the resulting population densities with those based on the existing model. Results for individual points along a route differ greatly, but the cumulative accident risks for a sample route of a few hundred kilometers are found to be comparable, if not identical. We conclude, therefore, that for estimation of aggregate accident risks over typical routes of several hundred kilometers, the existing, simpler RADTRAN model is sufficiently detailed and accurate.
    Type of Medium: Electronic Resource
  • 18
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 685-687 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
  • 19
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 703-710 
    ISSN: 1539-6924
    Keywords: probabilistic risk analysis ; subjective judgment ; risk-informed regulation ; robust Bayesian analysis ; human performance ; human error ; management and organizational factors ; corporate culture
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper discusses a number of the key challenges to the acceptance and application of probabilistic risk analysis (PRA). Those challenges include: (a) the extensive reliance on subjective judgment in PRA, requiring the development of guidance for the use of PRA in risk-informed regulation, and possibly the development of “robust” or “reference” prior distributions to minimize the reliance on judgment; and (b) the treatment of human performance in PRA, including not only human error per se but also management and organizational factors more broadly. All of these areas are seen as presenting interesting research challenges at the interface between engineering and other disciplines.
    Type of Medium: Electronic Resource
  • 20
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 689-701 
    ISSN: 1539-6924
    Keywords: risk ; risk perception ; risk assessment ; risk communication ; risk management
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Risk management has become increasingly politicized and contentious. Polarized views, controversy, and conflict have become pervasive. Research has begun to provide a new perspective on this problem by demonstrating the complexity of the concept “risk” and the inadequacies of the traditional view of risk assessment as a purely scientific enterprise. This paper argues that danger is real, but risk is socially constructed. Risk assessment is inherently subjective and represents a blending of science and judgment with important psychological, social, cultural, and political factors. In addition, our social and democratic institutions, remarkable as they are in many respects, breed distrust in the risk arena. Whoever controls the definition of risk controls the rational solution to the problem at hand. If risk is defined one way, then one option will rise to the top as the most cost-effective or the safest or the best. If it is defined another way, perhaps incorporating qualitative characteristics and other contextual factors, one will likely get a different ordering of action solutions. Defining risk is thus an exercise in power. Scientific literacy and public education are important, but they are not central to risk controversies. The public is not irrational. Their judgments about risk are influenced by emotion and affect in a way that is both simple and sophisticated. The same holds true for scientists. Public views are also influenced by worldviews, ideologies, and values; so are scientists' views, particularly when they are working at the limits of their expertise. 
The limitations of risk science, the importance and difficulty of maintaining trust, and the complex, sociopolitical nature of risk point to the need for a new approach—one that focuses upon introducing more public participation into both risk assessment and risk decision making in order to make the decision process more democratic, improve the relevance and quality of technical analysis, and increase the legitimacy and public acceptance of the resulting decisions.
    Type of Medium: Electronic Resource
  • 21
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 727-738 
    ISSN: 1539-6924
    Keywords: mitigation ; insurance ; catastrophic risk ; building codes
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper examines the impact that insurance coupled with specific risk mitigation measures (RMMs) could have on reducing losses from hurricanes and earthquakes as well as improving the solvency position of insurers who provide coverage against these hazards. We first explore why relatively few individuals adopt cost-effective RMMs by reporting on the results of empirical studies and controlled laboratory studies. We then investigate the impact that an RMM has on both the expected losses and those from a worst case scenario in two model cities—Oakland (an earthquake-prone area) and Miami/Dade County (a hurricane-prone area) which were constructed respectively with the assistance of two modeling firms. The paper then explores three programs for forging a meaningful public-private sector partnership: well-enforced building codes, insurance premium reductions linked with long-term loans, and lower deductibles on insurance policies tied to mitigation. We conclude by briefly examining four issues for future research on linking mitigation with insurance.
    Type of Medium: Electronic Resource
  • 22
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 711-726 
    ISSN: 1539-6924
    Keywords: variability ; exposure ; susceptibility ; risk assessment ; pharmacokinetics ; pharmacodynamics
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper reviews existing data on the variability in parameters relevant for health risk analyses. We cover both exposure-related parameters and parameters related to individual susceptibility to toxicity. The toxicity/susceptibility data base under construction is part of a longer-term research effort to lay the groundwork for quantitative distributional analyses of non-cancer toxic risks. These data are broken down into a variety of parameter types that encompass different portions of the pathway from external exposure to the production of biological responses. The discrete steps in this pathway, as we now conceive them, are:
    • Contact Rate (breathing rates per body weight; fish consumption per body weight)
    • Uptake or Absorption as a Fraction of Intake or Contact Rate
    • General Systemic Availability Net of First-Pass Elimination and Dilution via Distribution Volume (e.g., initial blood concentration per mg/kg of uptake)
    • Systemic Elimination (half-life or clearance)
    • Active Site Concentration per Systemic Blood or Plasma Concentration
    • Physiological Parameter Change per Active Site Concentration (expressed as the dose required to make a given percentage change in different people, or the dose required to achieve some proportion of an individual's maximum response to the drug or toxicant)
    • Functional Reserve Capacity – Change in Baseline Physiological Parameter Needed to Produce a Biological Response or Pass a Criterion of Abnormal Function
    Comparison of the amounts of variability observed for the different parameter types suggests that appreciable variability is associated with the final step in the process – differences among people in “functional reserve capacity.” This has the implication that relevant information for estimating effective toxic susceptibility distributions may be gleaned by direct studies of the population distributions of key physiological parameters in people who are not exposed to the environmental and occupational toxicants that are thought to perturb those parameters. This is illustrated with some recent observations of the population distributions of Low Density Lipoprotein Cholesterol from the second and third National Health and Nutrition Examination Surveys.
    Type of Medium: Electronic Resource
  • 23
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 751-758 
    ISSN: 1539-6924
    Keywords: nuclear waste ; high-level waste ; performance assessment ; Yucca Mountain
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The management of spent nuclear fuel and high-level nuclear waste has the deserved reputation as one of the most intractable policy issues facing the United States and other nations using nuclear reactors for electric power generation. This paper presents the author's perspective on this complex issue, based on a decade of service with the Nuclear Waste Technical Review Board and Board on Radioactive Waste Management of the National Research Council.
    Type of Medium: Electronic Resource
  • 24
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 763-807 
    ISSN: 1539-6924
    Keywords: risk assessment ; probabilistic risk assessment ; performance assessment ; policy analysis ; history of technology
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This article describes the evolution of the process for assessing the hazards of a geologic disposal system for radioactive waste and, similarly, nuclear power reactors, and the relationship of this process with other assessments of risk, particularly assessments of hazards from manufactured carcinogenic chemicals during use and disposal. This perspective reviews the common history of scientific concepts for risk assessment developed until the 1950s. Computational tools and techniques developed in the late 1950s and early 1960s to analyze the reliability of nuclear weapon delivery systems were adopted in the early 1970s for probabilistic risk assessment of nuclear power reactors, a technology for which behavior was unknown. In turn, these analyses became an important foundation for performance assessment of nuclear waste disposal in the late 1970s. The evaluation of risk to human health and the environment from chemical hazards is built on methods for assessing the dose response of radionuclides in the 1950s. Despite a shared background, however, societal events, often in the form of legislation, have affected the development path for risk assessment for human health, producing dissimilarities between these risk assessments and those for nuclear facilities. An important difference is the regulator's interest in accounting for uncertainty.
    Type of Medium: Electronic Resource
  • 25
    ISSN: 1539-6924
    Keywords: performance assessment ; nuclear waste ; risk-informed regulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The U.S. Nuclear Regulatory Commission (NRC) staff has developed a performance assessment capability to address three programmatic areas in nuclear waste management: high-level waste, low-level waste, and decommissioning of licensed facilities (license termination). The NRC capability consists of: (1) methodologies for performance assessment; (2) models and computer codes for estimating system performance; (3) regulatory guidance in various forms, such as regulations, Branch Technical Positions, and Standard Review Plans; and (4) a technical staff experienced in executing and evaluating performance assessments for a variety of waste systems. Although the tools and techniques are refined for each programmatic area, general approaches and similar issues are encountered in all areas.
    Type of Medium: Electronic Resource
  • 26
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 903-913 
    ISSN: 1539-6924
    Keywords: nuclear waste ; performance assessment ; Yucca Mountain ; probability ; repository ; high-level waste ; risk ; engineered barriers
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract In this paper the problem of high-level nuclear waste disposal is viewed as a five-stage, cascaded decision problem. The first four of these decisions having essentially been made, the work of recent years has been focused on the fifth stage, which concerns specifics of the repository design. The probabilistic performance assessment (PPA) work is viewed as the outcome prediction for this stage, and the site characterization work as the information gathering option. This brief examination of the proposed Yucca Mountain repository through a decision analysis framework resulted in three conclusions: (1) A decision theory approach to the process of selecting and characterizing Yucca Mountain would enhance public understanding of the issues and solutions to high-level waste management; (2) engineered systems are an attractive alternative to offset uncertainties in the containment capability of the natural setting and should receive greater emphasis in the design of the repository; and (3) a strategy of “waste management” should be adopted, as opposed to “waste disposal,” as it allows for incremental confirmation and confidence building of a permanent solution to the high-level waste problem.
    Type of Medium: Electronic Resource
  • 27
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 915-931 
    ISSN: 1539-6924
    Keywords: Yucca Mountain ; performance assessment ; logic tree ; high-level radioactive waste ; Monte Carlo ; expert judgment ; repository ; groundwater ; climate ; infiltration ; percolation ; hydrothermal ; corrosion
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The Electric Power Research Institute (EPRI) has sponsored the development of a model to assess the long-term, overall “performance” of the candidate spent fuel and high-level radioactive waste (HLW) disposal facility at Yucca Mountain, Nevada. The model simulates the processes that lead to HLW container corrosion, HLW mobilization from the spent fuel, transport by groundwater, and use of contaminated groundwater by hypothetical future individuals, resulting in radiation doses to those individuals. The model must incorporate a multitude of complex, coupled processes across a variety of technical disciplines. Furthermore, because of the very long time frames involved in the modeling effort (≫10⁴ years), the relative lack of directly applicable data, and the many uncertainties and variabilities in those data, a probabilistic approach to model development was necessary. The developers chose a logic tree approach, which they considered the most appropriate, to represent uncertainties in both conceptual models and model parameter values. This paper discusses the value and use of logic trees applied to assessing the uncertainties in HLW disposal, the components of the model, and a few of the results of that model. The paper concludes with a comparison of logic tree and Monte Carlo approaches.
    Type of Medium: Electronic Resource
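The logic tree mechanics described in this abstract can be illustrated with a small sketch. Everything below is hypothetical: the two branch nodes (infiltration model, container corrosion), the expert weights, and the scalar “dose index” are invented for illustration and are not taken from the EPRI model. The sketch only shows how branch weights multiply along each path and how a weighted summary of outcomes falls out.

```python
from itertools import product

# Hypothetical two-node logic tree: alternative conceptual models for
# infiltration and alternative corrosion outcomes, each with expert weights.
infiltration = [(0.6, 1.0), (0.4, 5.0)]   # (weight, infiltration in mm/yr)
corrosion = [(0.3, 0.10), (0.7, 0.02)]    # (weight, fraction of containers failed)

def leaf_outcomes():
    """Enumerate every path through the tree. Each leaf carries the product
    of its branch weights and a toy consequence measure for that path."""
    leaves = []
    for (w_inf, inf), (w_cor, failed) in product(infiltration, corrosion):
        leaves.append((w_inf * w_cor, inf * failed))  # (path weight, toy dose index)
    return leaves

leaves = leaf_outcomes()
# Weighted mean outcome over all conceptual-model combinations.
mean_dose = sum(w * dose for w, dose in leaves)
```

Each leaf is one fully specified model, so, unlike Monte Carlo sampling, the discrete enumeration keeps every conceptual-model combination individually visible — the trade-off the abstract's closing comparison refers to.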
  • 28
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
  • 29
    ISSN: 1539-6924
    Keywords: compliance certification application ; engineering analysis ; geochemistry ; geohydrology ; performance assessment ; probabilistic systems analysis ; radioactive waste ; scientific validity ; uncertainty ; 40 CFR 191
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Performance Assessment (PA) is the use of mathematical models to simulate the long-term behavior of engineered and geologic barriers in a nuclear waste repository; methods of uncertainty analysis are used to assess effects of parametric and conceptual uncertainties associated with the model system upon the uncertainty in outcomes of the simulation. PA is required by the U.S. Environmental Protection Agency as part of its certification process for geologic repositories for nuclear waste. This paper is a dialogue to explore the value and limitations of PA. Two “skeptics” acknowledge the utility of PA in organizing the scientific investigations that are necessary for confident siting and licensing of a repository; however, they maintain that the PA process, at least as it is currently implemented, is an essentially unscientific process with shortcomings that may provide results of limited use in evaluating actual effects on public health and safety. Conceptual uncertainties in a PA analysis can be so great that results can be confidently applied only over short time ranges, the antithesis of the purpose behind long-term, geologic disposal. Two “proponents” of PA agree that performance assessment is unscientific, but only in the sense that PA is an engineering analysis that uses existing scientific knowledge to support public policy decisions, rather than an investigation intended to increase fundamental knowledge of nature; PA has different goals and constraints than a typical scientific study. The “proponents” describe an ideal, six-step process for conducting generalized PA, here called probabilistic systems analysis (PSA); they note that virtually all scientific content of a PA is introduced during the model-building steps of a PSA; they contend that a PA based on simple but scientifically acceptable mathematical models can provide useful and objective input to regulatory decision makers. 
The value of the results of any PA must lie between these two views and will depend on the level of knowledge of the site, the degree to which models capture actual physical and chemical processes, the time over which extrapolations are made, and the proper evaluation of health risks attending implementation of the repository. The challenge is in evaluating whether the quality of the PA matches the needs of decision makers charged with protecting the health and safety of the public.
    Type of Medium: Electronic Resource
  • 30
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1003-1016 
    ISSN: 1539-6924
    Keywords: WIPP ; radioactive waste ; repository ; performance assessment ; transuranic waste
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The Waste Isolation Pilot Plant (WIPP) is a geological repository for disposal of U.S. defense transuranic radioactive waste. Built and operated by the U.S. Department of Energy (DOE), it is located in the Permian-age salt beds in southeastern New Mexico at a depth of 655 m. Performance assessment for the repository's compliance with the 10,000-year containment standards was completed in 1996, and the U.S. Environmental Protection Agency (EPA) certified in 1998 that the repository complies with the EPA standards 40 CFR 191 and 40 CFR 194. The Environmental Evaluation Group (EEG) review of the DOE's application for certification identified a number of issues. These related to the scenarios, conceptual models, and values of the input parameters used in the calculations. It is expected that these issues will be addressed and resolved during the first 5-year recertification process, which began with the first receipt of waste at WIPP on March 26, 1999, and is scheduled to be completed in March 2004.
    Type of Medium: Electronic Resource
  • 31
    ISSN: 1539-6924
    Keywords: risk perception ; CRESP ; trust ; DOE Savannah River site ; risk assessment ; stakeholder ; economic dependence
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Environmental managers are increasingly charged with involving the public in the development and modification of policies regarding risks to human health and the environment. Involving the public in environmental decision making first requires a broad understanding of how and why the public perceives various risks. The Savannah River Stakeholder Study was conducted with the purpose of investigating individual, economic, and social characteristics of risk perceptions among those living near the Savannah River Nuclear Weapons Site. A number of factors were found to impact risk perceptions among those living near the site. One's estimated proximity to the site and relative river location surfaced as strong determinants of risk perceptions among SRS residents. Additionally, living in a quality neighborhood and demonstrating a willingness to accept health risks for economic gain strongly abated heightened risk perceptions.
    Type of Medium: Electronic Resource
  • 32
    ISSN: 1539-6924
    Keywords: risk assessment ; uncertainty ; formaldehyde ; decision analysis
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract A call for risk assessment approaches that better characterize and quantify uncertainty has been made by the scientific and regulatory community. This paper responds to that call by demonstrating a distributional approach that draws upon human data to derive potency estimates and to identify and quantify important sources of uncertainty. The approach is rooted in the science of decision analysis and employs an influence diagram, a decision tree, probabilistic weights, and a distribution of point estimates of carcinogenic potency. Its results estimate the likelihood of different carcinogenic risks (potencies) for a chemical under a specific scenario. For this exercise, human data on formaldehyde were employed to demonstrate the approach. Sensitivity analyses were performed to determine the relative impact of specific levels and alternatives on the potency distribution. The resulting potency estimates are compared with the results of an exercise using animal data on formaldehyde. The paper demonstrates that distributional risk assessment is readily adapted to situations in which epidemiologic data serve as the basis for potency estimates. Strengths and weaknesses of the distributional approach are discussed. Areas for further application and research are recommended.
    Type of Medium: Electronic Resource
  • 33
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1059-1069 
    ISSN: 1539-6924
    Keywords: spatial statistics ; optimal sequential search ; adaptive sampling ; simulation-optimization ; multiple imputation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Suppose that a residential neighborhood may have been contaminated by a nearby abandoned hazardous waste site. The suspected contamination consists of elevated soil concentrations of chemicals that are also found in the absence of site-related contamination. How should a risk manager decide which residential properties to sample and which ones to clean? This paper introduces an adaptive spatial sampling approach which uses initial observations to guide subsequent search. Unlike some recent model-based spatial data analysis methods, it does not require any specific statistical model for the spatial distribution of hazards, but instead constructs an increasingly accurate nonparametric approximation to it as sampling proceeds. Possible cost-effective sampling and cleanup decision rules are described by decision parameters such as the number of randomly selected locations used to initialize the process, the number of highest-concentration locations searched around, the number of samples taken at each location, a stopping rule, and a remediation action threshold. These decision parameters are optimized by simulating the performance of each decision rule. The simulation is performed using the data collected so far to impute multiple probable values of unknown soil concentration distributions during each simulation run. This optimized adaptive spatial sampling technique has been applied to real data using error probabilities for wrongly cleaning or wrongly failing to clean each location (compared to the action that would be taken if perfect information were available) as evaluation criteria. It provides a practical approach for quantifying trade-offs between these different types of errors and expected cost. It also identifies strategies that are undominated with respect to all of these criteria.
    Type of Medium: Electronic Resource
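The adaptive search loop described here can be sketched in a few lines. This is not the authors' optimized procedure: the grid, the concentration field, the decision-parameter values, and the simple hot-spot neighborhood rule below are all invented for illustration. Only the general idea follows the abstract — initial random samples guide follow-up sampling near high readings, and a threshold drives the cleanup decision.

```python
import random

def adaptive_sample(conc, grid, n_init=10, n_top=3, n_round=5, rounds=4,
                    threshold=50.0, seed=1):
    """Toy adaptive spatial search: take random initial samples, then cluster
    extra samples around the highest observed concentrations.
    Returns the set of sampled locations flagged for cleanup."""
    rng = random.Random(seed)
    grid_set = set(grid)
    sampled = {}
    for loc in rng.sample(grid, n_init):
        sampled[loc] = conc(loc)
    for _ in range(rounds):
        # Focus the next round of samples around the hottest readings so far.
        hot = sorted(sampled, key=sampled.get, reverse=True)[:n_top]
        for hx, hy in hot:
            for _ in range(n_round):
                cand = (hx + rng.randint(-1, 1), hy + rng.randint(-1, 1))
                if cand in grid_set and cand not in sampled:
                    sampled[cand] = conc(cand)
    # Flag every sampled location at or above the remediation action threshold.
    return {loc for loc, c in sampled.items() if c >= threshold}

# Hypothetical 20x20 neighborhood with one contamination hotspot at (5, 5).
grid = [(x, y) for x in range(20) for y in range(20)]

def conc(loc):
    x, y = loc
    return 1000.0 / (1.0 + (x - 5) ** 2 + (y - 5) ** 2)

flagged = adaptive_sample(conc, grid)
```

In the paper's framework, decision parameters such as `n_init`, `n_top`, and `threshold` would themselves be optimized by simulating each decision rule's error probabilities and expected cost rather than fixed by hand.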
  • 34
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1113-1125 
    ISSN: 1539-6924
    Keywords: OSHA ; environmental health regulation ; risk ambiguity ; indoor/workplace air quality
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Political context may play a large role in influencing the efficiency of environmental and health regulations. This case study uses data from a 1989 update of the Occupational Safety and Health Administration (OSHA) Permissible Exposure Limits (PELs) program to determine the relative effects of legislative mandates, costly acquisition of information by the agency, and pressure applied by special interest groups upon exposure standards. The empirical analysis suggests that federal agencies successfully thwart legislative attempts to limit agency discretion, and that agencies exercise bounded rationality by placing greater emphasis on more easily obtained information. The 1989 PELs were less significantly related to more costly information, contained “safety factors” for chemicals presenting relatively more ambiguous risks, and the proposed standard stringencies showed evidence of being influenced by vying industry and labor interests.
    Type of Medium: Electronic Resource
  • 35
    ISSN: 1539-6924
    Keywords: aldrin ; dieldrin ; epidemiology ; occupational exposure ; cancer dose-response modeling ; proportional hazards ; hormesis ; distributional characterizations of added cancer risk
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The paper applies classical statistical principles to yield new tools for risk assessment and makes new use of epidemiological data for human risk assessment. An extensive clinical and epidemiological study of workers engaged in the manufacturing and formulation of aldrin and dieldrin provides occupational hygiene and biological monitoring data on individual exposures over the years of employment and provides unusually accurate measures of individual lifetime average daily doses. In the cancer dose-response modeling, each worker is treated as a separate experimental unit with his own unique dose. Maximum likelihood estimates of added cancer risk are calculated for multistage, multistage-Weibull, and proportional hazards models. Distributional characterizations of added cancer risk are based on bootstrap and relative likelihood techniques. The cancer mortality data on these male workers suggest that low-dose exposures to aldrin and dieldrin do not significantly increase human cancer risk and may even decrease the human hazard rate for all types of cancer combined at low doses (e.g., 1 μg/kg/day). The apparent hormetic effect in the best fitting dose-response models for this data set is statistically significant. The decrease in cancer risk at low doses of aldrin and dieldrin is in sharp contrast to the U.S. Environmental Protection Agency's upper bound on cancer potency based on mouse liver tumors. The EPA's upper bound implies that lifetime average daily doses of 0.0000625 and 0.00625 μg/kg body weight/day would correspond to increased cancer risks of 0.000001 and 0.0001, respectively. However, the best estimate from the Pernis epidemiological data is that there is no increase in cancer risk in these workers at these doses or even at doses as large as 2 μg/kg/day.
    Type of Medium: Electronic Resource
  • 36
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1157-1171 
    ISSN: 1539-6924
    Keywords: risk assessment ; transportation risk ; diesel exhaust ; fugitive dust ; vehicle emissions
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract When the transportation risk posed by shipments of hazardous chemical and radioactive materials is being assessed, it is necessary to evaluate both vehicle-related and cargo-related risks. Diesel exhaust and fugitive dust emissions from vehicles transporting hazardous shipments lead to increased air pollution, which increases the risk of latent fatalities in the affected population along the transport route. The estimated risk from these vehicle-related sources can often be as large as or larger than the estimated risk associated with the material being transported. In this paper, data from the U.S. Environmental Protection Agency's Motor Vehicle-Related Air Toxics Study are first used to develop latent cancer fatality estimates per kilometer of travel in rural and urban areas for all diesel truck classes. These unit risk factors are based on studies investigating the carcinogenic nature of diesel exhaust. With the same methodology, the current per-kilometer latent fatality risk factor used in transportation risk assessments for heavy diesel trucks in urban areas is revised, and the analysis is expanded to provide risk factors for rural areas and all diesel truck classes. These latter fatality estimates may include, but are not limited to, cancer fatalities and are based primarily on the most recent epidemiological data available on mortality rates associated with ambient air PM-10 concentrations.
    Type of Medium: Electronic Resource
  • 37
    ISSN: 1539-6924
    Keywords: transportation risk ; hydrazine ; sensitivity analysis ; simulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The U.S. Department of Transportation was interested in the risks associated with transporting hydrazine in tanks with and without relief devices. Hydrazine is highly toxic, flammable, and corrosive. Consequently, there was a conflict as to whether a relief device should be used. Data were not available on the impact of relief devices on release probabilities or the impact of hydrazine on the likelihood of fires and explosions. In this paper, a Monte Carlo sensitivity analysis of the unknown parameters was used to assess the risks associated with highway transport of hydrazine. To help determine whether relief devices should be used, fault trees and event trees were used to model the sequences of events that could lead to adverse consequences during transport of hydrazine. The event probabilities in the event trees were derived as functions of the parameters whose effects were not known. The impacts of these parameters on the risk of toxic exposures, fires, and explosions were analyzed through a Monte Carlo sensitivity analysis and, statistically, through an analysis of variance. The analysis allowed the determination of which of the unknown parameters had a significant impact on the risks. It also provided the necessary support for a critical transportation decision even though the values of several key parameters were not known.
    Type of Medium: Electronic Resource
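The approach of propagating unknown parameters through an event tree and then asking which of them drives the risk can be sketched as follows. The tree structure, the parameter ranges, and the outcome expression below are hypothetical placeholders, not the study's actual fault/event-tree model.

```python
import random

def release_risk(p_accident, p_relief_fail, p_ignite):
    """Toy event tree: an accident either ignites (fire) or, if the relief
    device also fails, produces a toxic release."""
    p_fire = p_accident * p_ignite
    p_toxic = p_accident * p_relief_fail * (1.0 - p_ignite)
    return p_fire + p_toxic

def corr(xs, ys):
    """Pearson correlation, used here as a crude sensitivity measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

def mc_sensitivity(n=2000, seed=7):
    """Sample the unknown parameters from assumed ranges and record the
    resulting risk for each draw."""
    rng = random.Random(seed)
    rows = []
    for _ in range(n):
        pa = rng.uniform(1e-6, 1e-4)  # accident probability (assumed range)
        pr = rng.uniform(0.01, 0.5)   # relief-device failure (assumed range)
        pi = rng.uniform(0.05, 0.3)   # ignition given release (assumed range)
        rows.append((pa, pr, pi, release_risk(pa, pr, pi)))
    return rows

rows = mc_sensitivity()
```

Correlating each input column with the risk column then ranks the parameters by influence; in this toy setup the accident probability dominates because its range spans two orders of magnitude. The paper uses an analysis of variance for the same purpose.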
  • 38
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1205-1214 
    ISSN: 1539-6924
    Keywords: Monte Carlo ; correlation ; copulas ; bivariate distributions ; dioxins
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Monte Carlo methods in risk assessment are finding increasingly widespread application. With the recognition that inputs may be correlated, the incorporation of such correlations into the simulation has become important. Most implementations rely upon the method of Iman and Conover for generating correlated random variables. In this work, alternative methods using copulas are presented for deriving correlated random variables. It is further shown that the particular algorithm or assumption used may have a substantial effect on the output results, due to differences in higher order bivariate moments.
    Type of Medium: Electronic Resource
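The copula construction mentioned in this abstract can be sketched directly. Assuming a Gaussian copula for concreteness (the paper considers several families, and the exponential margins and rho value below are invented for the example), correlated uniforms are generated in normal space and then pushed through the inverse CDFs of the desired marginal distributions:

```python
import math
import random

def gaussian_copula_pairs(rho, n, inv_cdf_x, inv_cdf_y, seed=0):
    """Draw n correlated (x, y) pairs via a Gaussian copula."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        # Correlate in standard-normal space with coefficient rho ...
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        # ... map through the normal CDF to get correlated uniforms ...
        u1 = 0.5 * (1.0 + math.erf(z1 / math.sqrt(2.0)))
        u2 = 0.5 * (1.0 + math.erf(z2 / math.sqrt(2.0)))
        # ... then apply the inverse CDF of each desired margin.
        pairs.append((inv_cdf_x(u1), inv_cdf_y(u2)))
    return pairs

# Hypothetical exposure inputs: exponential margins (means 2.0 and 0.5);
# clamping u keeps log(1 - u) finite for extreme normal draws.
def inv_exp(mean):
    return lambda u: -mean * math.log(1.0 - min(u, 1.0 - 1e-12))

pairs = gaussian_copula_pairs(0.8, 5000, inv_exp(2.0), inv_exp(0.5))
```

Different copula families (or the Iman–Conover rank shuffle) reproduce the same margins and a similar rank correlation but differ in higher-order joint moments, which is the source of the output differences the abstract reports.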
  • 39
    ISSN: 1539-6924
    Keywords: municipal waste incineration ; risk assessment ; Monte-Carlo simulation ; time activity patterns
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract During the modernization of the municipal waste incinerator (MWI, maximum capacity of 180,000 tons per year) of Metropolitan Grenoble (405,000 inhabitants), in France, a risk assessment was conducted, based on four tracer pollutants: two volatile organic compounds (benzene and 1,1,1-trichloroethane) and two heavy metals (nickel and cadmium, measured in particles). A Gaussian plume dispersion model, applied to maximum emissions measured at the MWI stacks, was used to estimate the distribution of these pollutants in the atmosphere throughout the metropolitan area. A random-sample telephone survey (570 subjects) gathered data on time-activity patterns, according to demographic characteristics of the population. Life-long exposure was assessed as a time-weighted average of ambient air concentrations. Inhalation alone was considered because, in the Grenoble urban setting, other routes of exposure are not likely. A Monte Carlo simulation was used to describe probability distributions of exposures and risks. The median of the distribution of life-long personal exposures to MWI benzene was 3.2·10⁻⁵ μg/m³ (20th and 80th percentiles = 1.5·10⁻⁵ and 6.5·10⁻⁵ μg/m³), yielding a 2.6·10⁻¹⁰ carcinogenic risk (1.2·10⁻¹⁰–5.4·10⁻¹⁰). For nickel, the corresponding life-time exposure and cancer risk were 1.8·10⁻⁴ μg/m³ (0.9·10⁻⁴–3.6·10⁻⁴ μg/m³) and 8.6·10⁻⁸ (4.3·10⁻⁸–17.3·10⁻⁸); for cadmium they were respectively 8.3·10⁻⁶ μg/m³ (4.0·10⁻⁶–17.6·10⁻⁶) and 1.5·10⁻⁸ (7.2·10⁻⁹–3.1·10⁻⁸). Inhalation exposure to cadmium emitted by the MWI represented less than 1% of the WHO Air Quality Guideline (5 ng/m³), while there was a margin of exposure of more than 10⁹ between the NOAEL (150 ppm) and exposure estimates for trichloroethane. Neither dioxins nor mercury, a volatile metal, were measured; this could lessen the attributable life-long risks estimated. However, the minute (VOCs and cadmium) to moderate (nickel) exposure and risk estimates are in accord with other studies of modern MWIs meeting recent emission regulations.
    Type of Medium: Electronic Resource
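The dispersion step of such an assessment is often written as a reflecting Gaussian plume. The sketch below uses textbook linear dispersion coefficients (σy = 0.08x, σz = 0.06x) chosen only for illustration; the study's calibrated model and stability-class coefficients are not reproduced here.

```python
import math

def gaussian_plume(q, u, x, y, z, stack_h, a=0.08, b=0.06):
    """Ground-reflecting Gaussian plume concentration at downwind distance x,
    crosswind offset y, and height z, for emission rate q and wind speed u.
    Dispersion widths grow linearly with x (illustrative assumption)."""
    sy, sz = a * x, b * x
    lateral = math.exp(-y * y / (2.0 * sy * sy))
    # Image-source term reflects the plume at the ground surface.
    vertical = (math.exp(-((z - stack_h) ** 2) / (2.0 * sz * sz)) +
                math.exp(-((z + stack_h) ** 2) / (2.0 * sz * sz)))
    return q / (2.0 * math.pi * u * sy * sz) * lateral * vertical
```

Applying such a kernel to maximum stack emissions at every receptor point, then weighting by surveyed time-activity patterns, yields the exposure distributions that feed the Monte Carlo step.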
  • 40
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1235-1249 
    ISSN: 1539-6924
    Keywords: soil contamination ; remediation urgency ; standards ; human exposure ; ecotoxicological risks ; risk due to contaminant migration
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract To assess soil and groundwater quality, two generic (i.e., multifunctional) risk-based standards, the Target Value and the Intervention Value, have been developed within the framework of the Dutch Soil Protection Act. These standards allow soil and groundwater to be classified as clean, slightly contaminated, or seriously contaminated. The Target Value is based on potential risks to ecosystems, while the Intervention Value is based on potential risks to humans and ecosystems. In the case of serious soil contamination the site must, in principle, be remediated, making it necessary to determine the remediation urgency on the basis of actual (i.e., site-specific) risks to humans and ecosystems, as well as actual risks due to contaminant migration.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 41
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 17-25 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract A complete analytic characterization and solution construction (done either explicitly or by recursion) for the minimax control problem using optimal rate feedback is given for the case when the plant consists of a known fixed set of coupled oscillators of cardinality not exceeding three. When this is not the case, the problem appears to be analytically intractable, and suboptimal solutions based on numerical techniques are currently the only recourse.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 42
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 1-16 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The task of constructing an energy function is essential for direct stability analysis of electric power systems. This paper presents a general procedure for constructing analytical energy functions for detailed lossless network-reduction power system stability models. Specifically, the paper (i) develops canonical representations for lossless network-reduction power system models and shows that such canonical representations cover existing stability models, (ii) derives theoretical results regarding the existence of analytical energy functions for the canonical representations, and (iii) presents a systematic procedure to construct corresponding energy functions.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 43
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 89-110 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The problem of finding all the DC solutions of a certain class of piecewise-linear electronic circuits containing locally passive and locally active one-ports is considered in this paper. An effective method enabling us to locate the solutions is developed. The method constitutes the crucial point of an algorithm based on the idea of successive contraction, division, and elimination that is capable of determining all the solutions. Several numerical examples are given, and some comparison analyses are performed confirming the usefulness of the proposed approach.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 44
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 269-290 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The controllability and observability properties of a singular system are extensively studied. The definitions of controllability, R-controllability, and impulse controllability are introduced via characteristics of the original state vector. Analogous definitions are presented for the case of observability. The criteria established for controllability and observability are simple rank criteria related to the Markov parameters from the inputs to the states and from the initial conditions to the outputs, respectively. The present results can be considered as the direct extension of Kalman's controllability and observability criteria to the case of singular systems. Finally, the controllability and observability subspaces are derived from the image and the kernel of the controllability and the observability matrices, respectively.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 45
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Given a stationary time series X and another stationary time series Y (with a different power spectral density), we describe an algorithm for constructing a stationary time series Z that contains exactly the same values as X, permuted in an order such that the power spectral density of Z closely resembles that of Y. We call this method spectral mimicry. We prove (under certain restrictions) that, if the univariate cumulative distribution function (CDF) of X is identical to the CDF of Y, then the power spectral density of Z equals the power spectral density of Y. We also show, for a class of examples, that when the CDFs of X and Y differ modestly, the power spectral density of Z closely approximates the power spectral density of Y. The algorithm, developed to design an experiment in microbial population dynamics, has a variety of other applications.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
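The rank-matching idea behind spectral mimicry can be sketched as follows. This is a simplified reading of the abstract (the published algorithm's details may differ), and all variable names and the example series are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_mimicry(x, y):
    """Permute the values of x so the result follows the rank order of y.

    A minimal sketch: Z contains exactly the values of x, arranged so that
    its temporal ordering (and hence power spectrum) resembles that of y.
    """
    x_sorted = np.sort(x)               # the values to redistribute
    ranks = np.argsort(np.argsort(y))   # rank of each y value in time order
    return x_sorted[ranks]              # smallest x where y is smallest, etc.

# Usage: x is white noise, y is a smooth low-frequency series.
n = 512
x = rng.normal(size=n)
t = np.arange(n)
y = np.sin(2 * np.pi * t / 64) + 0.1 * rng.normal(size=n)
z = spectral_mimicry(x, y)

assert np.allclose(np.sort(z), np.sort(x))  # same values, merely permuted
```

Because Z is a permutation of X, its univariate CDF is exactly that of X, which is the property the abstract's theorem exploits.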
  • 46
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 479-487 
    ISSN: 1531-5878
    Keywords: Cramér-Rao bounds ; direction-of-arrival estimation ; unknown noise
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The deterministic and stochastic direction estimation Cramér-Rao bounds (CRBs) are studied in the presence of one signal and spatially uncorrelated sensor noise with unknown nonequal variances in array sensors. The explicit CRB expressions are obtained, and their relationship is studied showing some typical properties inherent in the nonidentical noise case.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 47
    ISSN: 1531-5878
    Keywords: Nonlinear circuit theory ; co-content ; functional minimization ; image processing
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The solutions of many physical-mathematical problems can be obtained by minimizing proper functionals. In the literature, some methods for the synthesis of analog circuits (mainly cellular neural networks) are presented that find the solution of some of these problems by implementing the discretized Euler-Lagrange equations associated with the pertinent functionals. In this paper, we propose a method for defining analog circuits that directly minimize (in a parallel way) a class of discretized functionals in the frequently occurring case where the solution depends on two spatial variables. The method is a generalization of the one presented in Parodi et al., Internat. J. Circuit Theory Appl., 26, 477–498, 1998. The analog circuits consist of both a (nonlinear) resistive part and a set of linear capacitors, whose steady-state voltages represent the discrete solution to the problem. The method is based on the potential (co-content) functions associated with voltage-controlled resistive elements. As an example, we describe an application in the field of image processing: the restoration of color images corrupted by additive noise.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 48
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. i 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 49
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 111-130 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The cumulants defined in terms of moments are basic to the study of higher-order statistics (HOS) of a stationary stochastic process. This paper presents a concurrent systolic array system for the computation of higher-order moments. The system allows for the simultaneous computation of the second-, third-, and fourth-order moments. The architecture achieves good speedup through its excellent exploitation of parallelism, pipelining, and reusability of some intermediate results. The computational complexity and system performance issues related to the architecture are discussed. The concurrent system is designed with the CMOS VLSI technology and is capable of operating at 3.9 MHz.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
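In scalar form, what the systolic array above computes concurrently is a one-pass accumulation of the second-, third-, and fourth-order moments, reusing the intermediate square of each sample. The array pipelines these products across processing elements; this sketch shows only the underlying arithmetic, with invented sample values.

```python
def higher_order_moments(samples):
    """One-pass estimates of E[x^2], E[x^3], and E[x^4].

    Mirrors the reuse of intermediate results in the systolic design:
    x*x is computed once per sample and reused for the higher powers.
    """
    n = len(samples)
    s2 = s3 = s4 = 0.0
    for x in samples:
        x2 = x * x          # intermediate result, reused twice below
        s2 += x2
        s3 += x2 * x
        s4 += x2 * x2
    return s2 / n, s3 / n, s4 / n

m2, m3, m4 = higher_order_moments([1.0, -1.0, 2.0, -2.0])
# E[x^2] = (1+1+4+4)/4 = 2.5; E[x^3] = 0.0; E[x^4] = (1+1+16+16)/4 = 8.5
assert (m2, m3, m4) == (2.5, 0.0, 8.5)
```

Cumulants up to fourth order are then simple polynomial combinations of these moments, which is why simultaneous moment computation is the natural building block for HOS hardware.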
  • 50
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 183-187 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Necessary and sufficient conditions for the invertibility of a (not necessarily linear) operator N between normed linear spaces are given. It is shown that N is invertible precisely if a certain operator associated with N is a contraction.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 51
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    ISSN: 1539-6924
    Keywords: Environment ; equity ; coke ; oil ; history ; risk
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Facility-specific information on pollution was obtained for 36 coke plants and 46 oil refineries in the United States and matched with information on populations surrounding these 82 facilities. These data were analyzed to determine whether environmental inequities were present, whether they were more economic or racial in nature, and whether the racial composition of nearby communities has changed significantly since plants began operations. The Census tracts near coke plants have a disproportionate share of poor and nonwhite residents. Multivariate analyses suggest that existing inequities are primarily economic in nature. The findings for oil refineries are not strongly supportive of the environmental inequity hypothesis. Rank ordering of facilities by race, poverty, and pollution produces limited (although not consistent) evidence that the more risky facilities tend to be operating in communities with above-median proportions of nonwhite residents (near coke plants) and Hispanic residents (near oil refineries). Over time, the racial makeup of many communities near facilities has changed significantly, particularly in the case of coke plants sited in the early 1900s. Further risk-oriented studies of multiple manufacturing facilities in various industrial sectors of the economy are recommended.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 231-247 
    ISSN: 1539-6924
    Keywords: Health risk assessment ; hazard characterization ; Acceptable Daily Intake ; Reference Dose ; paradigm ; practices ; cancer ; non-cancer ; Bayesian ; default options
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract We investigated the way results of human health risk assessments are used, and the theory used to describe those methods, sometimes called the “NAS paradigm.” Contrary to a key tenet of that theory, current methods have strictly limited utility. The characterizations now considered standard, Safety Indices such as “Acceptable Daily Intake,” “Reference Dose,” and so on, usefully inform only decisions that require a choice between two policy alternatives (e.g., approve a food additive or not), decided solely on the basis of a finding of safety. Risk is characterized as the quotient of one of these Safety Indices divided by an estimate of exposure: a quotient greater than one implies that the situation may be considered safe. Such decisions are very widespread, both in the U.S. federal government and elsewhere. No current method is universal; different policies lead to different practices, for example, in California's “Proposition 65,” where statutory provisions specify some practices. Further, an important kind of human health risk assessment is not recognized by this theory: this kind characterizes risk as likelihood of harm, given estimates of exposure consequent to various decision choices. Likelihood estimates are necessary whenever decision makers have many possible decision choices and must weigh more than two societal values, such as in EPA's implementation of “conventional air pollutants.” These estimates cannot be derived using current methods; different methods are needed. Our analysis suggests changes needed in both the theory and practice of human health risk assessment, and in how its practice is depicted.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 54
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 249-259 
    ISSN: 1539-6924
    Keywords: Reliability ; Monte Carlo simulation ; hazardous waste treatment ; safety factor ; packed tower ; activated sludge
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The reliability of a treatment process is addressed in terms of achieving a regulatory effluent concentration standard and the design safety factors associated with the treatment process. This methodology was applied to two aqueous hazardous waste treatment processes: packed tower aeration and activated sludge (aerobic) biological treatment. The designs achieving 95 percent reliability were compared with designs based on conventional practice to determine their patterns of conservatism. Scoping-level treatment costs were also related to reliability levels for these treatment processes. The results indicate that the reliability levels for the physical/chemical treatment process (packed tower aeration) based on the deterministic safety factors range from 80 percent to over 99 percent, whereas those for the biological treatment process range from near 0 percent to over 99 percent, depending on the compound evaluated. Increases in reliability per unit increase in treatment costs are more pronounced at lower reliability levels (less than about 80 percent) than at higher reliability levels (greater than 90 percent), indicating a point of diminishing returns. Additional research focused on process parameters that presently contain large uncertainties may reduce those uncertainties, with attendant increases in the reliability levels of the treatment processes.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 55
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 321-321 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 295-308 
    ISSN: 1539-6924
    Keywords: Noncancer risk assessment ; uncertainty analysis ; systematic error ; calibration ; censoring ; relative potency ; safety factor
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The prominent role of animal bioassay evidence in environmental regulatory decisions compels a careful characterization of extrapolation uncertainties. In noncancer risk assessment, uncertainty factors are incorporated to account for each of several extrapolations required to convert a bioassay outcome into a putative subthreshold dose for humans. Measures of relative toxicity taken between different dosing regimens, different endpoints, or different species serve as a reference for establishing the uncertainty factors. Ratios of no observed adverse effect levels (NOAELs) have been used for this purpose; statistical summaries of such ratios across sets of chemicals are widely used to guide the setting of uncertainty factors. Given the poor statistical properties of NOAELs, the informativeness of these summary statistics is open to question. To evaluate this, we develop an approach to “calibrate” the ability of NOAEL ratios to reveal true properties of a specified distribution for relative toxicity. A priority of this analysis is to account for dependencies of NOAEL ratios on experimental design and other exogenous factors. Our analysis of NOAEL ratio summary statistics finds (1) that such dependencies are complex and produce pronounced systematic errors and (2) that sampling error associated with typical sample sizes (50 chemicals) is non-negligible. These uncertainties strongly suggest that NOAEL ratio summary statistics cannot be taken at face value; conclusions based on such ratios reported in well over a dozen published papers should be reconsidered.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 57
    ISSN: 1539-6924
    Keywords: Cancer risk ; in vivo doses ; linear multiplicative model ; ethylene oxide ; relative potency ; butadiene ; acrylamide
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract A mechanistic model and associated procedures are proposed for cancer risk assessment of genotoxic chemicals. As previously shown for ionizing radiation, a linear multiplicative model was found to be compatible with published experimental data for ethylene oxide, acrylamide, and butadiene. The validity of this model was anticipated in view of the multiplicative interaction of mutation with inherited and acquired growth-promoting conditions. Concurrent analysis led to rejection of an additive model (i.e., the model commonly applied for cancer risk assessment). A reanalysis of data for radiogenic cancer in mouse, dog, and man shows that the relative risk coefficient is approximately the same (0.4 to 0.5 percent per rad) for tumours induced in the three species. Doses in vivo, defined as the time-integrated concentrations of ultimate mutagens and expressed in millimol × kg⁻¹ × h (mMh), are, like radiation doses given in Gy or rad, proportional to frequencies of potentially mutagenic events. The radiation dose equivalents of chemical doses are calculated by multiplying chemical doses (in mMh) by the relative genotoxic potencies (in rad × mMh⁻¹) determined in vitro. In this way the relative cancer incidence increments in rats and mice exposed to ethylene oxide were shown to be about 0.4 percent per rad-equivalent, in agreement with the data for radiogenic cancer. Our analyses suggest that the relative risk coefficients for genotoxic chemicals are independent of species and that relative cancer risks determined in animal tests apply also to humans. If reliable animal test data are not available, cancer risks may be estimated via relative potency. In both cases exposure dose/target dose relationships, the latter via macromolecule adducts, should be determined.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
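The dose-conversion step stated in the abstract above (radiation dose equivalent = chemical dose in mMh × relative genotoxic potency in rad × mMh⁻¹, then scaled by a relative risk coefficient of roughly 0.4 percent per rad) reduces to simple arithmetic. The dose and potency values below are invented for illustration; only the 0.4%-per-rad coefficient comes from the abstract.

```python
def rad_equivalent(chemical_dose_mMh, potency_rad_per_mMh):
    """Radiation-dose equivalent (rad-equivalents) of an in vivo chemical dose."""
    return chemical_dose_mMh * potency_rad_per_mMh

def relative_risk_increment(dose_rad_eq, risk_coeff_per_rad=0.004):
    """Relative cancer-incidence increment at ~0.4 percent per rad-equivalent."""
    return dose_rad_eq * risk_coeff_per_rad

# Invented example: a 2.0 mMh target dose with a potency of 1.5 rad/mMh.
d_eq = rad_equivalent(2.0, 1.5)        # 3.0 rad-equivalents
risk = relative_risk_increment(d_eq)   # ~1.2 percent relative increment
assert d_eq == 3.0
assert abs(risk - 0.012) < 1e-9
```

The species-independence claim in the abstract is what licenses applying the same risk coefficient across animal data and humans.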
  • 58
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 585-598 
    ISSN: 1539-6924
    Keywords: uncertainty ; threatened plants ; risk ; conservation ; rule sets ; IUCN
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Australian state and federal agencies use a broad range of methods for setting conservation priorities for species at risk. Some of these are based on rule sets developed by the International Union for the Conservation of Nature, while others use point scoring protocols to assess threat. All of them ignore uncertainty in the data. In this study, we assessed the conservation status of 29 threatened vascular plants from Tasmania and New South Wales using a variety of methods including point scoring and rule-based approaches. In addition, several methods for dealing with uncertainty in the data were applied to each of the priority-setting schemes. The results indicate that the choice of a protocol for setting priorities and the choice of the way in which uncertainty is treated may make important differences to the resulting assessments of risk. The choice among methods needs to be rationalized within the management context in which it is to be applied. These methods are not a substitute for more formal risk assessment.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    ISSN: 1539-6924
    Keywords: MeHg ; pharmacokinetics ; PBPK model ; variability ; risk assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract An analysis of the uncertainty in guidelines for the ingestion of methylmercury (MeHg) due to human pharmacokinetic variability was conducted using a physiologically based pharmacokinetic (PBPK) model that describes MeHg kinetics in the pregnant human and fetus. Two alternative derivations of an ingestion guideline for MeHg were considered: the U.S. Environmental Protection Agency reference dose (RfD) of 0.1 μg/kg/day derived from studies of an Iraqi grain poisoning episode, and the Agency for Toxic Substances and Disease Registry chronic oral minimal risk level (MRL) of 0.5 μg/kg/day based on studies of a fish-eating population in the Seychelles Islands. Calculation of an ingestion guideline for MeHg from either of these epidemiological studies requires calculation of a dose conversion factor (DCF) relating a hair mercury concentration to a chronic MeHg ingestion rate. To evaluate the uncertainty in this DCF across the population of U.S. women of child-bearing age, Monte Carlo analyses were performed in which distributions for each of the parameters in the PBPK model were randomly sampled 1000 times. The 1st and 5th percentiles of the resulting distribution of DCFs were a factor of 1.8 and 1.5 below the median, respectively. This estimate of variability is consistent with, but somewhat less than, previous analyses performed with empirical, one-compartment pharmacokinetic models. The use of a consistent factor in both guidelines of 1.5 for pharmacokinetic variability in the DCF, and keeping all other aspects of the derivations unchanged, would result in an RfD of 0.2 μg/kg/day and an MRL of 0.3 μg/kg/day.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 577-584 
    ISSN: 1539-6924
    Keywords: risk assessment ; exposure point concentration ; bootstrapping ; gamma distribution ; lognormal
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The U.S. Environmental Protection Agency (EPA) recommends the use of the one-sided 95% upper confidence limit of the arithmetic mean, based on either a normal or lognormal distribution, for the contaminant (or exposure point) concentration term in the Superfund risk assessment process. When the data are neither normal nor lognormal, this recommended approach may overestimate the exposure point concentration (EPC) and may lead to unnecessary cleanup at a hazardous waste site. The EPA concentration term only seems to perform like alternative EPC methods when the data are well fit by a lognormal distribution. Several alternative methods for calculating the EPC are investigated and compared using soil data collected from three hazardous waste sites in Montana, Utah, and Colorado. For data sets that are well fit by a lognormal distribution, values from the Chebychev inequality or the EPA concentration term may be appropriate EPCs. For data sets where the soil concentration data are well fit by gamma distributions, Wong's method may be used for calculating EPCs. The studentized bootstrap-t and Hall's bootstrap-t transformation are recommended for EPC calculation when all distribution fits are poor. If a data set is well fit by a distribution, a parametric bootstrap may provide a suitable EPC.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
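One of the alternatives named in the abstract above, the studentized bootstrap-t upper confidence limit for the mean, can be sketched as follows. This is a minimal version under stated assumptions (no bias correction, degenerate resamples simply skipped), and the data values are invented, not from the Montana/Utah/Colorado sites.

```python
import random
import statistics

random.seed(42)

def bootstrap_t_ucl(data, level=0.95, n_boot=2000):
    """One-sided studentized bootstrap-t upper confidence limit for the mean."""
    n = len(data)
    xbar = statistics.fmean(data)
    se = statistics.stdev(data) / n ** 0.5
    t_stats = []
    for _ in range(n_boot):
        resample = random.choices(data, k=n)
        rb_se = statistics.stdev(resample) / n ** 0.5
        if rb_se > 0:  # skip degenerate all-identical resamples
            t_stats.append((statistics.fmean(resample) - xbar) / rb_se)
    t_stats.sort()
    # The one-sided UCL uses the lower tail of the bootstrap t distribution.
    t_low = t_stats[int((1 - level) * len(t_stats))]
    return xbar - t_low * se

# Invented right-skewed soil-concentration-like sample (mean = 1.875).
ucl = bootstrap_t_ucl([1.2, 0.8, 3.5, 0.4, 2.1, 0.9, 5.0, 1.1])
assert ucl > 1.875  # the UCL exceeds the sample mean
```

Because the bootstrap t distribution reflects the sample's skewness, this limit adapts to non-normal data better than the normal-theory UCL the EPA term relies on, which is why the abstract recommends it when all distribution fits are poor.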
  • 61
    ISSN: 1539-6924
    Keywords: risk perception ; air quality ; environmental justice ; community health survey
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper describes a multi-stakeholder process designed to assess the potential health risks associated with adverse air quality in an urban industrial neighborhood. The paper briefly describes the quantitative health risk assessment conducted by scientific experts, with input by a grassroots community group concerned about the impacts of adverse air quality on their health and quality of life. In this case, rather than accept the views of the scientific experts, the community used their powers of perception to advantage by successfully advocating for a professionally conducted community health survey. This survey was designed to document, systematically and rigorously, the health risk perceptions community members associated with exposure to adverse air quality in their neighborhood. This paper describes the institutional and community contexts within which the research is situated as well as the design, administration, analysis, and results of the community health survey administered to 402 households living in an urban industrial neighborhood in Hamilton, Ontario, Canada. These survey results served to legitimate the community's concerns about air quality and to help broaden operational definitions of ‘health.’ In addition, the results of both health risk assessment exercises served to keep issues of air quality on the local political agenda. Implications of these findings for our understanding of the environmental justice process as well as the ability of communities to influence environmental health policy are discussed.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 62
    ISSN: 1539-6924
    Keywords: risk perception ; risk characteristics ; outrage factors ; rbGH ; ordered probit
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This study estimates the effect risk characteristics, described as outrage factors by Hadden, have on consumers' risk perceptions toward the food-related biotechnology, recombinant bovine growth hormone (rbGH). The outrage factors applicable to milk from rbGH treated herds are involuntary risk exposure, unfamiliarity with the product's production process, unnatural product characteristics, lack of trust in regulator's ability to protect consumers in the marketplace, and consumers' inability to distinguish milk from rbGH treated herds compared to milk from untreated herds. An empirical analysis of data from a national survey of household food shoppers reveals that outrage factors mediate risk perceptions. The results support the inclusion of outrage factors into the risk perception model for the rbGH product, as they add significantly to the explanatory power of the model and therefore reduce bias compared to a simpler model of attitudinal and demographic factors. The study indicates that outrage factors which have a significant impact on risk perceptions are the lack of trust in the FDA as a food-related information source, and perceiving no consumer benefits from farmers' use of rbGH. Communication strategies to reduce consumer risk perceptions therefore could utilize agencies perceived as more trustworthy and emphasize the benefits of rbGH use to consumers.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 63
    ISSN: 1539-6924
    Keywords: risk perceptions ; psychometric paradigm ; multilevel modeling ; random coefficient models
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Psychometric data on risk perceptions are often collected using the method developed by Slovic, Fischhoff, and Lichtenstein, where an array of risk issues are evaluated with respect to a number of risk characteristics, such as how dreadful, catastrophic or involuntary exposure to each risk is. The analysis of these data has often been carried out at an aggregate level, where mean scores for all respondents are compared between risk issues. However, this approach may conceal important variation between individuals, and individual analyses have also been performed for single risk issues. This paper presents a new methodological approach using a technique called multilevel modelling for analysing individual and aggregated responses simultaneously, to produce unconditional and unbiased results at both individual and aggregate levels of the data. Two examples are given using previously published data sets on risk perceptions collected by the authors, and results between the traditional and new approaches compared. The discussion focuses on the implications of and possibilities provided by the new methodology.
    Type of Medium: Electronic Resource
  • 64
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 739-749 
    ISSN: 1539-6924
    Keywords: probabilistic forecasting ; uncertainty quantification ; Bayesian method ; Monte-Carlo simulation ; decision making
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Rational decision making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state of knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of two Bayesian methods for producing a probabilistic forecast via any deterministic model. The Bayesian Processor of Forecast (BPF) quantifies the total uncertainty in terms of a posterior distribution, conditional on the model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution. The BFS is compared with Monte Carlo simulation and the “ensemble forecasting” technique, neither of which can alone produce a probabilistic forecast that quantifies the total uncertainty, but each of which can serve as a component of the BFS.
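    The BPF's posterior-update step can be illustrated with a minimal conjugate (normal-normal) sketch. The Gaussian assumptions, the prior, and the model-error variance below are hypothetical choices for illustration, not taken from the article:

```python
# Minimal normal-normal illustration of a Bayesian Processor of Forecast:
# the deterministic model output is treated as a noisy estimate of the
# predictand, and the total uncertainty is the posterior distribution.
# All numbers here are hypothetical, not taken from the article.

def bpf_posterior(prior_mean, prior_var, model_output, model_error_var):
    """Posterior N(mean, var) of the predictand given the model output."""
    precision = 1.0 / prior_var + 1.0 / model_error_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var +
                            model_output / model_error_var)
    return post_mean, post_var

# Climatological prior N(10, 4); the model says 14 with error variance 1.
mean, var = bpf_posterior(10.0, 4.0, 14.0, 1.0)
print(mean, var)  # posterior mean 13.2, variance 0.8
```

    The posterior shrinks toward the more precise source of information (here the model output), and its variance is smaller than either input variance, which is the sense in which the total uncertainty is quantified.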
    Type of Medium: Electronic Resource
  • 65
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 759-761 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
  • 66
    ISSN: 1539-6924
    Keywords: regulation ; radioactive waste ; performance assessment ; risk assessment ; regulatory assessment ; bias evaluation ; international collaboration ; underground disposal ; quantitative risk analysis ; public debate ; decision process
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Much has been written about the development and application of quantitative methods for estimating under uncertainty the long-term radiological performance of underground disposal of radioactive wastes. Until recently, interest focused almost entirely on the technical challenges, with little regard for the role of the organization responsible for these analyses. Now the dialogue between regulators, the repository developer or operator, and other interested parties in the decision-making process receives increasing attention, especially in view of some current difficulties in obtaining approvals to construct or operate deep facilities for intermediate- or high-level wastes. Consequently, it is timely to consider the options for regulators' review and evaluation of safety submissions, at the various stages from site selection to repository closure, and to consider, especially, the role for performance assessment (PA) within the programs of a regulator both before and after delivery of such a submission. The origins and broad character of present regulations in the European Union (EU) and in the OECD countries are outlined and some regulatory PAs are reviewed. The issues raised are discussed, especially in regard to the interpretation of regulations, the dangers arising from the desire for simplicity in argument, the use of regulatory PA to review and challenge the PA in the safety case, and the effects of the relationship between proponent and regulator. Finally, a very limited analysis of the role of PA in public hearings is outlined and recommendations are made, together with proposals for improving the mechanisms for international collaboration on technical issues of regulatory concern.
    Type of Medium: Electronic Resource
  • 67
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 23-32 
    ISSN: 1539-6924
    Keywords: Software failures ; software hazard analysis ; safety-critical systems ; risk assessment ; context
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract As the use of digital computers for instrumentation and control of safety-critical systems has increased, there has been a growing debate over the issue of whether probabilistic risk assessment techniques can be applied to these systems. This debate has centered on the issue of whether software failures can be modeled probabilistically. This paper describes a “context-based” approach to software risk assessment that explicitly recognizes the fact that the behavior of software is not probabilistic. The source of the perceived uncertainty in its behavior results from both the input to the software as well as the application and environment in which the software is operating. Failures occur as the result of encountering some context for which the software was not properly designed, as opposed to the software simply failing “randomly.” The paper elaborates on the concept of “error-forcing context” as it applies to software. It also illustrates a methodology which utilizes event trees, fault trees, and the Dynamic Flowgraph Methodology (DFM) to identify “error-forcing contexts” for software in the form of fault tree prime implicants.
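    For a coherent (monotone) fault tree, the prime implicants reduce to minimal cut sets; the identification step can be sketched on a hypothetical two-gate tree. This is a generic enumeration, not the event-tree/DFM machinery of the paper:

```python
from itertools import combinations

# Hypothetical fault tree: TOP = (A and B) or (A and C).
# For a coherent (monotone) tree the prime implicants are the minimal
# cut sets: minimal sets of basic events whose occurrence forces TOP.

EVENTS = ["A", "B", "C"]

def top(failed):
    f = set(failed)
    return ("A" in f and "B" in f) or ("A" in f and "C" in f)

def minimal_cut_sets(events, top):
    # every event subset that triggers TOP ...
    cuts = [set(c) for r in range(1, len(events) + 1)
            for c in combinations(events, r) if top(c)]
    # ... keeping only the minimal ones (no triggering proper subset)
    return [c for c in cuts if not any(o < c for o in cuts)]

print(minimal_cut_sets(EVENTS, top))  # the minimal cut sets {A,B} and {A,C}
```

    Exhaustive enumeration is only feasible for small trees; the point is the notion of a minimal triggering context, which the paper extends to software inputs and environment.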
    Type of Medium: Electronic Resource
  • 68
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 47-68 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
  • 69
    ISSN: 1539-6924
    Keywords: Variability ; uncertainty ; maximum likelihood ; bootstrap simulation ; Monte Carlo simulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Variability arises due to differences in the value of a quantity among different members of a population. Uncertainty arises due to lack of knowledge regarding the true value of a quantity for a given member of a population. We describe and evaluate two methods for quantifying both variability and uncertainty. These methods, bootstrap simulation and a likelihood-based method, are applied to three datasets. The datasets include a synthetic sample of 19 values from a lognormal distribution, a sample of nine values obtained from measurements of the PCB concentration in leafy produce, and a sample of five values for the partitioning of chromium in the flue gas desulfurization system of coal-fired power plants. For each of these datasets, we employ the two methods to characterize uncertainty in the arithmetic mean and standard deviation, cumulative distribution functions based upon fitted parametric distributions, the 95th percentile of variability, and the 63rd percentile of uncertainty for the 81st percentile of variability. The latter is intended to show that it is possible to describe any point within the uncertain frequency distribution by specifying an uncertainty percentile and a variability percentile. Using the bootstrap method, we compare results based upon use of the method of matching moments and the method of maximum likelihood for fitting distributions to data. Our results indicate that with only 5–19 data points, as in the datasets we have evaluated, there is substantial uncertainty due to random sampling error. Both the bootstrap and likelihood-based approaches yield comparable uncertainty estimates in most cases.
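    The bootstrap side of the comparison can be sketched in a few lines: resample the data with replacement, and read uncertainty intervals off the bootstrap distributions of the mean and of a variability percentile. The data values, resample count, and nearest-rank percentile rule below are illustrative, not those of the paper's datasets:

```python
import random
import statistics

# Bootstrap sketch of sampling uncertainty in a mean and in the 95th
# percentile of variability, in the spirit of the two-dimensional
# (variability vs. uncertainty) view above. The data are made up.

random.seed(1)
data = [0.8, 1.1, 1.3, 1.9, 2.4, 3.0, 4.2, 5.1, 6.7]  # hypothetical sample

def percentile(values, p):
    """Simple empirical percentile (nearest rank), p in [0, 1]."""
    s = sorted(values)
    return s[max(0, min(len(s) - 1, round(p * (len(s) - 1))))]

B = 2000
boot_means, boot_p95 = [], []
for _ in range(B):
    resample = [random.choice(data) for _ in data]  # sample with replacement
    boot_means.append(statistics.mean(resample))
    boot_p95.append(percentile(resample, 0.95))

# 90% uncertainty intervals: 5th and 95th percentiles of each
# bootstrap distribution.
lo, hi = percentile(boot_means, 0.05), percentile(boot_means, 0.95)
p95_lo, p95_hi = percentile(boot_p95, 0.05), percentile(boot_p95, 0.95)
print(lo, hi, p95_lo, p95_hi)
```

    With so few data points the intervals come out wide, which mirrors the paper's observation of substantial uncertainty from random sampling error in small samples.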
    Type of Medium: Electronic Resource
  • 70
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 159-169 
    ISSN: 1539-6924
    Keywords: Trust ; geography ; personality ; environment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract A sample of 323 residents of New Jersey stratified by neighborhood quality (excellent, good, fair, poor) was gathered to determine if trust in science and technology to protect public health and environment at the societal scale was associated with trust of the local officials, such as the mayor, health officer, developers, mass media, and legislators who are guardians of the local environment. Societal (trust of science and technology) and neighborhood (mayor, health officer) dimensions of trust were found. These societal and neighborhood trust dimensions were weakly correlated. Respondents were divided into four trust-of-authority groups: high societal–high neighborhood, low societal–low neighborhood, high societal–low neighborhood, and low societal–high neighborhood. High societal–high neighborhood trust respondents were older, had lived in the neighborhoods for many years, were not troubled much by neighborhood or societal environmental threats, and had a strong sense of control over their environment. In strong contrast, low societal–low neighborhood trust respondents were relatively young, typically had lived in their present neighborhood for a short time, were troubled by numerous neighborhood and societal environmental threats, did not practice many personal public health practices, and felt little control over their environment.
    Type of Medium: Electronic Resource
  • 71
    ISSN: 1539-6924
    Keywords: Risk ; fishing ; ethnicity ; perception ; toxics ; consumption
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Recreational and subsistence angling are important aspects of urban culture for much of North America where people are concentrated near the coasts or major rivers. Yet there are fish and shellfish advisories for many estuaries, rivers, and lakes, and these are not always heeded. This paper examines fishing behavior, sources of information, perceptions, and compliance with fishing advisories as a function of ethnicity for people fishing in the Newark Bay Complex of the New York–New Jersey Harbor. We test the null hypothesis that there were no ethnic differences in sources of information, perceptions of the safety of fish consumption, and compliance with advisories. There were ethnic differences in consumption rates, sources of information about fishing, knowledge about the safety of the fish, awareness of fishing advisories or of the correct advisories, and knowledge about risks for increased cancer and to unborn and young children. In general, the knowledge base was much lower for Hispanics, was intermediate for blacks, and was greatest for whites. When presented with a statement about the potential risks from eating fish, there were no differences in their willingness to stop eating fish or to encourage pregnant women to stop. These results indicate a willingness to comply with advisories regardless of ethnicity, but a vast difference in the base knowledge necessary to make informed risk decisions about the safety of fish and shellfish. Although the overall median income level of the population was in the $25,000–34,999 income category, for Hispanics it was on the border between $15,000–24,999 and $25,000–34,999.
    Type of Medium: Electronic Resource
  • 72
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 315-329 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In earlier studies a multiclass vector quantization (MVQ)-based neural network design was explored for pattern classification. We reconsider that design here in the context of function emulation. With proper adjustment, the MVQ design demonstrates excellent performance. Moreover, the design algorithms sense discontinuities in the data and replicate them in the network.
    Type of Medium: Electronic Resource
  • 73
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 365-376 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In earlier studies the concept of a fast form was generalized to a format that unified such discrete transform examples as the FFT, the FCT, the FST, and the FHT. In this study we consider the approximation of arbitrary linear maps by fast forms. Using simulation we evaluate the approximation capabilities of the generalized fast form.
    Type of Medium: Electronic Resource
  • 74
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 377-393 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract On-line running spectral analysis is of considerable interest in many electrophysiological signals, such as the EEG (electroencephalograph). This paper presents a new method of implementing the fast Fourier transform (FFT) algorithm. Our “real-time FFT algorithm” efficiently utilizes computer time to perform the FFT computation while data acquisition proceeds, so that local butterfly modules are built using the data points that are already available. The real-time FFT algorithm is developed using the decimation-in-time split-radix FFT (DIT sr-FFT) butterfly structure. In order to demonstrate the synchronization ability of the proposed algorithm, the authors develop a method of evaluating the number of arithmetic operations that it requires. Both the derivation and the experimental result show that the real-time FFT algorithm is superior to the conventional whole-block FFT algorithm in synchronizing with the data acquisition process. Given that the FFT size N = 2^r, real-time implementation of the FFT algorithm requires only 2/r of the computational time required by the whole-block FFT algorithm.
    Type of Medium: Electronic Resource
  • 75
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 523-523 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
  • 76
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 539-551 
    ISSN: 1531-5878
    Keywords: H2 deconvolution filter ; envelope-constrained filter ; finite impulse response filter ; linear matrix inequality
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In this paper, we consider an envelope-constrained (EC) H2 optimal finite impulse response (FIR) filtering problem. Our aim is to design a filter such that the H2 norm of the filtering error transfer function is minimized subject to the constraint that the filter output with a given input to the signal system is contained or bounded by a prescribed envelope. The filter design problem is formulated as a standard optimization problem with linear matrix inequality (LMI) constraints. Furthermore, by relaxing the H2 norm constraint, we propose a robust EC FIR filter design algorithm based on the LMI approach.
    Type of Medium: Electronic Resource
  • 77
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 565-585 
    ISSN: 1531-5878
    Keywords: Two-dimensional system ; model conversion ; Roesser's model
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract This paper presents a geometric-series method for finding two-dimensional (2D) discrete-time (continuous-time) state-space models from 2D continuous-time (discrete-time) systems. This method allows the use of well-developed theorems and algorithms in the 2D discrete-time (continuous-time) domain to indirectly carry out analysis and design of hybrid 2D composite systems.
    Type of Medium: Electronic Resource
  • 78
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 505-521 
    ISSN: 1531-5878
    Keywords: Multirate systems ; multirate signal processing ; filter banks ; optimal design
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The design of general nonuniform filter banks is studied. Contrary to uniform filter banks, in nonuniform filter banks it may not be possible to achieve perfect reconstruction, but in some cases, by using optimization techniques, we can design acceptable filter banks. Here, the initial finite impulse response (FIR) analysis filters are designed according to the characteristics of the input. By the design procedure, the FIR synthesis filters are found so that the H-norm of an error system is minimized over all synthesis filters that have a prespecified order. Then, the synthesis filters obtained in the previous step are fixed, and the analysis filters are found similarly. By iteration, the H-norm of the error system decreases until it converges to its final value. At each iteration, the coefficients of the analysis or synthesis filters are obtained by finding the least squares solution of a system of linear equations. If necessary, the frequency characteristics of the filters can be altered by adding penalty terms to the objective function.
    Type of Medium: Electronic Resource
  • 79
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 617-617 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
  • 80
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 43-57 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract We consider multirate digital control systems that consist of an interconnection of a continuous-time nonlinear plant (described by ordinary differential equations) and a digital lifted controller (described by ordinary difference equations). The input to the digital controller consists of the multirate sampled output of the plant, and the input to the continuous-time plant consists of the multirate hold output of the digital controller. In this paper we show that when quantizer nonlinearities are neglected, then under reasonable conditions (which exclude the critical cases), the stability properties (in the Lyapunov sense) of the trivial solution of the nonlinear multirate digital control system can be deduced from the stability properties of the trivial solution of its linearization. We also point out that certain results involving quantization effects and stabilizing controllers can be established which are in the spirit of some existing results.
    Type of Medium: Electronic Resource
  • 81
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 59-73 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Chaotic systems provide a simple means of generating deterministic signals that resemble white noise. It is this noise-like property that provides the potential for applying chaotic systems in communications. In this work, we report a detailed study of the logistic map for use as direct-sequence spread-spectrum (DS/SS) codes. The advantages of the chaotic DS/SS codes are the almost unlimited number of distinct sequences of arbitrary lengths, the ease of generating these sequences, and the increased privacy afforded by the noise-like appearance of these sequences. Some design criteria are provided from the correlation properties of these sequences, and bit-error rate (BER) results are generated by Monte Carlo simulations.
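    The generation step described above is a few lines of code: iterate the logistic map and threshold it into a binary spreading sequence. The map parameter r = 4 is the standard fully chaotic choice; the seeds, sequence length, and 0.5 threshold below are illustrative:

```python
# Sketch of a chaotic spreading sequence from the logistic map
# x_{n+1} = r * x_n * (1 - x_n) with r = 4. Thresholding at 0.5
# gives a binary (+1/-1) code; different seeds x0 give distinct
# sequences of arbitrary length.

def logistic_code(x0, n, r=4.0):
    x, bits = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else -1)
    return bits

def correlation(a, b):
    """Normalized zero-lag correlation of two equal-length codes."""
    return sum(p * q for p, q in zip(a, b)) / len(a)

c1 = logistic_code(0.3141, 1000)
c2 = logistic_code(0.2718, 1000)
print(correlation(c1, c1), correlation(c1, c2))
# autocorrelation is exactly 1; cross-correlation is near 0
```

    The near-zero cross-correlation between sequences from different seeds is what makes such codes usable as DS/SS spreading codes; the paper's design criteria refine exactly these correlation properties.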
    Type of Medium: Electronic Resource
  • 82
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 75-84 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The asymptotic behavior of block floating-point and floating-point digital filters is analyzed. As a result, mantissa wordlength conditions are derived guaranteeing the absence of limit cycles in the regular dynamic range. Explicitly, the requirements are given for block floating-point state space filters with different quantization formats. Although these conditions are only sufficient, examples are given in which they are also necessary. In most cases the conditions are easily satisfied.
    Type of Medium: Electronic Resource
  • 83
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 85-85 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
  • 84
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 87-88 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
  • 85
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 189-190 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
  • 86
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 169-181 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In this paper the Cramér-Rao bound (CRB) for a general nonparametric spectral estimation problem is derived under a local smoothness condition (more exactly, the spectrum is assumed to be well approximated by a piecewise constant function). Furthermore, it is shown that under the aforementioned condition the Thomson method (TM) and Daniell method (DM) for power spectral density (PSD) estimation can be interpreted as approximations of the maximum likelihood PSD estimator. Finally, the statistical efficiency of the TM and DM as nonparametric PSD estimators is examined and also compared to the CRB for autoregressive moving-average (ARMA)-based PSD estimation. In particular, for broadband signals, the TM and DM almost achieve the derived nonparametric performance bound and can therefore be considered to be nearly optimal.
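    The Daniell method itself is simple enough to sketch: average the raw periodogram over a small window of neighbouring frequencies, which is exactly what the piecewise-constant spectrum assumption licenses. The white-noise test signal, record length, and window half-width m below are illustrative (and the DFT is written out directly rather than via an FFT):

```python
import cmath
import math
import random

# Daniell-method sketch: smooth the raw periodogram by averaging
# 2m+1 neighbouring ordinates, consistent with a locally constant
# (piecewise flat) spectrum. White noise is used as a test signal.

random.seed(0)
n = 128
x = [random.gauss(0.0, 1.0) for _ in range(n)]

def periodogram(x):
    """Raw periodogram |X_k|^2 / n via a direct DFT (O(n^2), small n)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n)]

def daniell(pgram, m=3):
    """Average 2m+1 periodogram ordinates around each frequency."""
    n = len(pgram)
    return [sum(pgram[(k + j) % n] for j in range(-m, m + 1)) / (2 * m + 1)
            for k in range(n)]

raw = periodogram(x)
smooth = daniell(raw)
print(sum(raw) / n, sum(smooth) / n)  # smoothing preserves the average level
```

    The circular average leaves the total power unchanged while sharply reducing the ordinate-to-ordinate variance, which is the variance/resolution trade-off the efficiency analysis above quantifies.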
    Type of Medium: Electronic Resource
  • 87
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 149-168 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The issue of stability of higher-order, single-stage Sigma-Delta (ΣΔ) modulators is addressed using a method from nonlinear system theory. As a result, theoretical bounds for the quantizer input of the modulators are derived. A new method for stabilizing the ΣΔ modulators is then presented. It uses the quantizer input bound for possible instability detection. Upon detection of such a state, the highest-order integrator is cut off, effectively reducing the order of the modulator, and thus resulting in a stable system. The method is easily implemented and results in a very good signal-to-noise ratio (SNR) and fast return to normal operation compared to other stabilization methods.
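    The detect-and-cut-off idea can be sketched on a second-order single-stage modulator: watch the quantizer input against a bound, and when the bound is exceeded, cut off (here, reset) the highest-order integrator so the loop temporarily behaves like a lower-order, stable modulator. The topology coefficients, the bound, and the reset rule below are illustrative, not the ones derived in the paper:

```python
# Sketch of stabilizing a second-order single-stage Sigma-Delta
# modulator: if the quantizer input exceeds a bound (possible
# instability), the highest-order integrator is cut off, reducing
# the effective order. Coefficients and bound are illustrative.

def sigma_delta(u, bound=4.0):
    i1 = i2 = 0.0
    out = []
    for x in u:
        y = 1.0 if i2 >= 0.0 else -1.0     # 1-bit quantizer on i2
        i1 += x - y                         # first integrator
        i2 += i1 - y                        # second (highest-order) integrator
        if abs(i2) > bound:                 # instability detected:
            i2 = 0.0                        # cut off the 2nd integrator
        out.append(y)
    return out

bits = sigma_delta([0.4] * 512)             # constant (DC) input
print(sum(bits) / len(bits))                # close to the 0.4 DC input
```

    Because the first integrator accumulates x − y exactly, the mean of the bit stream tracks the input mean whenever the loop stays bounded; the reset only perturbs the noise shaping, which is why order reduction is an acceptable emergency measure.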
    Type of Medium: Electronic Resource
  • 88
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 225-239 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The uncorrelated component analysis (UCA) of a stationary random vector process consists of searching for a linear transformation that minimizes the temporal correlation between its components. Through a general analysis we show that under practically reasonable and mild conditions UCA is a solution for blind source separation. The theorems proposed in this paper for UCA provide useful insights for developing practical algorithms. UCA explores the temporal information of the signals, whereas independent component analysis (ICA) explores the spatial information; thus UCA can be applied for source separation in some cases where ICA cannot. For blind source separation, combining ICA and UCA may give improved performance because more information can be utilized. The concept of single UCA (SUCA) is also proposed, which leads to sequential source separation.
    Type of Medium: Electronic Resource
  • 89
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. i 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
  • 90
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 331-350 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract This paper analyzes the nonlinear phenomenon of rotating stall via the methods of projection [Elementary Stability and Bifurcation Theory, G. Iooss and D. D. Joseph, Springer-Verlag, 1980] and Lyapunov [J.-H. Fu, Math. Control Signal Systems, 7, 255–278, 1994]. A compressor model of Moore and Greitzer is adopted in which rotating stall dynamics are associated with Hopf bifurcations. Local stability for each pair of the critical modes is studied and characterized. It is shown that local stability of individual pairs of the critical modes collectively determines local stability of the compressor model. Explicit conditions are obtained for local stability of rotating stall, which offer new insight into the design and active control of axial flow compressors.
    Type of Medium: Electronic Resource
  • 91
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 407-429 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract We describe herein a new means of training dynamic multilayer nonlinear adaptive filters, or neural networks. We restrict our discussion to multilayer dynamic Volterra networks, which are structured so as to restrict their degrees of computational freedom, based on a priori knowledge about the dynamic operation to be emulated. The networks consist of linear dynamic filters together with nonlinear generalized single-layer subnets. We describe how a Newton-like optimization strategy can be applied to these dynamic architectures and detail a new modified Gauss-Newton optimization technique. The new training algorithm converges faster and to a smaller value of cost than backpropagation-through-time for a wide range of adaptive filtering applications. We apply the algorithm to modeling the inverse of a nonlinear dynamic tracking system. The superior performance of the algorithm over standard techniques is demonstrated.
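    The flavor of a damped Gauss-Newton update can be shown on a toy two-parameter nonlinear least-squares fit. This is a generic illustration with made-up data and a simple step-halving safeguard, not the paper's modified Gauss-Newton for dynamic Volterra networks:

```python
import math

# Toy damped Gauss-Newton fit of y = a * exp(b * t).
# Each step solves the 2x2 normal equations (J^T J) d = -J^T r,
# then backtracks (halves the step) until the cost decreases.

def cost(ts, ys, a, b):
    try:
        return sum((a * math.exp(b * t) - y) ** 2 for t, y in zip(ts, ys))
    except OverflowError:
        return float("inf")   # reject wildly overshooting trial steps

def gauss_newton(ts, ys, a, b, iters=50):
    for _ in range(iters):
        r = [a * math.exp(b * t) - y for t, y in zip(ts, ys)]
        J = [(math.exp(b * t), a * t * math.exp(b * t)) for t in ts]
        g11 = sum(j1 * j1 for j1, _ in J)
        g12 = sum(j1 * j2 for j1, j2 in J)
        g22 = sum(j2 * j2 for _, j2 in J)
        c1 = -sum(j1 * ri for (j1, _), ri in zip(J, r))
        c2 = -sum(j2 * ri for (_, j2), ri in zip(J, r))
        det = g11 * g22 - g12 * g12
        da = (c1 * g22 - c2 * g12) / det        # Cramer's rule, 2x2 system
        db = (g11 * c2 - g12 * c1) / det
        step = 1.0                               # backtracking line search
        while step > 1e-8 and cost(ts, ys, a + step * da,
                                   b + step * db) >= cost(ts, ys, a, b):
            step *= 0.5
        a, b = a + step * da, b + step * db
    return a, b

ts = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [2.0 * math.exp(0.7 * t) for t in ts]       # noise-free synthetic data
a, b = gauss_newton(ts, ys, a=1.0, b=0.3)
print(a, b)  # approaches the generating parameters (2.0, 0.7)
```

    On zero-residual problems like this one, Gauss-Newton converges nearly quadratically once close to the solution, which is the speed advantage over gradient-based backpropagation-through-time that the abstract reports.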
    Type of Medium: Electronic Resource
  • 92
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 445-455 
    ISSN: 1531-5878
    Keywords: Linear-phase digital filters ; FIR integrators
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In many signal processing situations, the desired (ideal) magnitude response of the filter is a rational function: $$\tilde H(\omega ) = |1/\omega |$$ (a digital integrator). The requirements of a linear phase response and guaranteed stable performance limit the design to a finite impulse response (FIR) structure. In many applications we require the FIR filter to yield a highly accurate magnitude response for a narrow band of frequencies with maximal flatness at an arbitrary frequency ω0 in the spectrum (0, π). No techniques for meeting such requirements with respect to the approximation of $$\tilde H(\omega )$$ are known in the literature. This paper suggests a design by which the linear-phase magnitude response $$|\tilde H(\omega )|$$ can be approximated by an FIR configuration giving a maximally flat (in the Butterworth sense) response at an arbitrary frequency ω0, 0 < ω0 < π. A technique to compute exact weights for the design is also given.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
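    The paper's exact-weight formulas are not reproduced in the abstract above. As a stand-in for the design goal — an FIR filter whose response tracks the ideal integrator 1/(jω) accurately near a chosen ω0 — the sketch below uses a plain least-squares fit over a narrow band around ω0; the tap count, band width, and delay are illustrative assumptions, and least squares is not the paper's maximally flat construction:

    ```python
    import numpy as np

    N = 21                       # number of FIR taps (assumed)
    M = (N - 1) / 2              # group delay for (approximately) linear phase
    w0 = 0.6 * np.pi             # design frequency of interest
    band = np.linspace(w0 - 0.1, w0 + 0.1, 400)  # narrow band around w0

    # Ideal delayed integrator response: e^{-j*omega*M} / (j*omega)
    D = np.exp(-1j * band * M) / (1j * band)

    # Least-squares fit of real taps h[n] to D over the band:
    # stack real and imaginary parts so the unknowns stay real
    E = np.exp(-1j * np.outer(band, np.arange(N)))   # matrix of e^{-j*w*n}
    A = np.vstack([E.real, E.imag])
    b = np.concatenate([D.real, D.imag])
    h, *_ = np.linalg.lstsq(A, b, rcond=None)

    # Check the magnitude response at w0 against the ideal |1/w0|
    H_w0 = np.sum(h * np.exp(-1j * w0 * np.arange(N)))
    ```

    Because the target is smooth and the band is narrow, 21 taps fit it essentially exactly; the paper's contribution is achieving flatness in the Butterworth sense (matched derivatives at ω0) with closed-form weights rather than a numerical fit.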
  • 93
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 524-524 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 94
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 525-537 
    ISSN: 1531-5878
    Keywords: Random sampling ; digital signal processing ; spectral estimation ; computational algorithm
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Random sampling is one of the methods used to achieve sub-Nyquist sampling. This paper proposes a novel algorithm to evaluate the circular autocorrelation of a randomly sampled sequence, from which its power density spectrum can be obtained. With uniform sampling, the size of each lag (the step size) for computing an autocorrelation of a sequence is the same as the sampling period. When random sampling is adopted, the step size should be chosen such that the highest-frequency component of interest contained in a sequence can be accommodated. To find overlaps between a time sequence and its shifted version, an appropriate window is opened in one of the time sequences. To speed up the process, a marker is set to limit the range of searching for overlaps. The proposed method of estimating the power spectrum via autocorrelation is comparable in terms of accuracy and signal-to-noise ratio (SNR) to the conventional point rule. The techniques introduced can also apply to other operations for randomly sampled sequences.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
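    The lag-binning and marker ideas described in the abstract above can be sketched as follows. This is a minimal reading of the approach, not the paper's algorithm: the function name, the tolerance choice (half the lag step), and the use of `searchsorted` as the "marker" that limits the search for overlapping pairs are all assumptions:

    ```python
    import numpy as np

    def randsamp_autocorr(t, x, step, n_lags, tol):
        """Estimate the autocorrelation of a randomly sampled signal x(t_i)
        at lags k*step, averaging products over pairs whose time spacing
        falls within +/- tol of each lag."""
        order = np.argsort(t)
        t, x = t[order], x[order]
        r = np.zeros(n_lags)
        for k in range(n_lags):
            lag = k * step
            # "marker": restrict the search window for overlaps via binary search
            lo = np.searchsorted(t, t + lag - tol)
            hi = np.searchsorted(t, t + lag + tol)
            acc, cnt = 0.0, 0
            for i in range(len(t)):
                if hi[i] > lo[i]:
                    acc += np.sum(x[i] * x[lo[i]:hi[i]])
                    cnt += hi[i] - lo[i]
            r[k] = acc / cnt if cnt else 0.0
        return r

    # Demo: a sinusoid observed at random sampling instants
    rng = np.random.default_rng(1)
    t = np.sort(rng.uniform(0, 200, 1500))
    f = 0.4                               # cycles per unit time
    x = np.cos(2 * np.pi * f * t)
    step = 0.25                           # lag step, chosen below 1/(2*f)
    r = randsamp_autocorr(t, x, step, n_lags=40, tol=step / 2)
    ```

    For this input the estimate should oscillate near 0.5·cos(2πf·kΔ), i.e., peak around lag zero and every period 1/f = 2.5, illustrating how the step size must accommodate the highest frequency of interest even though the samples themselves are irregular.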
  • 95
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 587-609 
    ISSN: 1531-5878
    Keywords: Transfinite graphs ; infinite electrical networks ; current flows through infinity
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Transfinite electrical networks of ranks larger than 1 have previously been defined by arbitrarily joining together various infinite extremities through transfinite nodes that are independent of the networks' resistance values. Thus, some or all of those transfinite nodes may remain ineffective in transmitting current “through infinity.” In this paper, transfinite nodes are defined in terms of the paths that permit currents to “reach infinity.” This is accomplished by defining a suitable metric d_v on the node set N_{S_v} of each v-section S_v, a v-section being a maximal subnetwork whose nodes are connected by two-ended paths of ranks no larger than v. Upon taking the completion of N_{S_v} under that metric d_v, we identify those extremities (now called v-terminals) that are accessible to current flows. These are used to define transfinite nodes that combine such extremities. The construction is recursive and is carried out through all the natural-number ranks, and then through the first arrow rank and the first limit-ordinal rank ω. The recursion can be carried still further. All this provides a more natural development of transfinite networks and indeed simplifies the theory of electrical behavior for such networks.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 96
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 97
    ISSN: 1539-6924
    Keywords: Cancer dose–response modeling ; multistage model ; two-stage model ; hazard functions ; carcinogenesis ; Benzene ; Dieldrin ; Ethylene Thiourea ; Trichloroethylene ; Vinyl Chloride
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract We compare the regulatory implications of applying the traditional (linearized) and exact two-stage dose–response models to animal carcinogenic data. We analyze dose–response data from six studies, representing five different substances, and we determine the “goodness-of-fit” of each model as well as the 95% confidence lower limit of the dose corresponding to a target excess risk of 10⁻⁵ (the target risk dose TRD). For the two concave datasets, we find that the exact model gives a substantially better fit to the data than the traditional model, and that the exact model gives a TRD that is an order of magnitude lower than that given by the traditional model. In the other cases, the exact model gives a fit equivalent to or better than the traditional model. We also show that although the exact two-stage model may exhibit dose–response concavity at moderate dose levels, it is always linear or sublinear, and never supralinear, in the low-dose limit. Because regulatory concern is almost always confined to extrapolation in the low-dose region, supralinear behavior seems not to be of regulatory concern in the exact two-stage model. Finally, we find that when performing this low-dose extrapolation in cases of dose–response concavity, extrapolating the model fit leads to a more conservative TRD than taking a linear extrapolation from 10% excess risk. We conclude with a set of recommendations.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
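    The abstract's closing comparison — direct model extrapolation versus linear extrapolation from the 10% excess-risk dose — can be illustrated numerically. The sketch below uses a one-stage exponential model as a simple concave stand-in (the paper's exact two-stage model and its fitted parameters are not reproduced here; `q1` is a hypothetical potency):

    ```python
    import math

    def extra_risk(d, q1):
        """Concave stand-in model: extra risk ER(d) = 1 - exp(-q1*d)."""
        return 1.0 - math.exp(-q1 * d)

    def trd_from_model(q1, target=1e-5):
        # Invert ER(d) = target exactly (extrapolating the model fit)
        return -math.log(1.0 - target) / q1

    def trd_linear_from_10pct(q1, target=1e-5):
        # Dose at 10% excess risk, then scale linearly down to the target
        d10 = -math.log(1.0 - 0.10) / q1
        return d10 * (target / 0.10)

    q1 = 2.0   # hypothetical fitted potency parameter
    trd_model = trd_from_model(q1)
    trd_lin = trd_linear_from_10pct(q1)
    ```

    For any concave ER(d), the chord from the origin to the 10% point lies below the curve, so the linear route assigns a larger dose to the same target risk; the direct model inversion therefore yields the smaller, more conservative TRD, consistent with the abstract's finding.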
  • 98
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 69-81 
    ISSN: 1539-6924
    Keywords: Parameters ; probability distributions ; validity
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract In any model the values of estimates for various parameters are obtained from different sources, each with its own level of uncertainty. When the probability distributions of the estimates are obtained, as opposed to point values only, the measurement uncertainties in the parameter estimates may be addressed. However, the sources used for obtaining the data and the models used to select appropriate distributions are of differing degrees of uncertainty. A hierarchy of different sources of uncertainty, based upon one's ability to validate data and models empirically, is presented. When model parameters are aggregated with different levels of the hierarchy represented, this implies distortion or degradation in the utility and validity of the models used. Means to identify and deal with such heterogeneous data sources are explored, and a number of approaches to addressing this problem are presented. One approach, using Range/Confidence Estimates coupled with an Information Value Analysis Process, is presented as an example.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 99
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1-2 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 100
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...