ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 101
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 269-290 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The controllability and observability properties of a singular system are extensively studied. The definitions of controllability, R-controllability, and impulse controllability are introduced via characteristics of the original state vector. Analogous definitions are presented for the case of observability. The criteria established for controllability and observability are simple rank criteria related to the Markov parameters from the inputs to the states and from the initial conditions to the outputs, respectively. The present results can be considered as the direct extension of Kalman's controllability and observability criteria to the case of singular systems. Finally, the controllability and observability subspaces are derived from the image and the kernel of the controllability and the observability matrices, respectively.
    Type of Medium: Electronic Resource
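The abstract above extends Kalman's rank criteria to singular systems. The classical test being generalized can be sketched as follows (a minimal NumPy illustration of the standard Kalman criterion, not the paper's singular-system criteria; the function name is ours):

```python
import numpy as np

def kalman_controllable(A, B):
    # Classical Kalman rank test for a state-space pair (A, B):
    # the system is controllable iff the controllability matrix
    # [B, AB, ..., A^(n-1)B] has full rank n.
    n = A.shape[0]
    ctrb = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(n)])
    return np.linalg.matrix_rank(ctrb) == n
```

For a double integrator with input on the second state, the test returns True; moving the input to the first state makes the pair uncontrollable.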
  • 102
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Given a stationary time series X and another stationary time series Y (with a different power spectral density), we describe an algorithm for constructing a stationary time series Z that contains exactly the same values as X, permuted in an order such that the power spectral density of Z closely resembles that of Y. We call this method spectral mimicry. We prove (under certain restrictions) that, if the univariate cumulative distribution function (CDF) of X is identical to the CDF of Y, then the power spectral density of Z equals the power spectral density of Y. We also show, for a class of examples, that when the CDFs of X and Y differ modestly, the power spectral density of Z closely approximates the power spectral density of Y. The algorithm, developed to design an experiment in microbial population dynamics, has a variety of other applications.
    Type of Medium: Electronic Resource
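The construction described in the abstract above admits a compact sketch (our reading of the permutation idea, not the authors' published code): sort the values of X and place them in the rank order of Y, so Z keeps X's amplitude distribution while inheriting Y's temporal ordering.

```python
import numpy as np

def spectral_mimicry(x, y):
    # Rearrange the values of x into the rank order of y. The result z
    # contains exactly x's values (same amplitude distribution) ordered
    # like y, so its power spectrum comes to resemble that of y.
    x, y = np.asarray(x), np.asarray(y)
    ranks = np.argsort(np.argsort(y))  # rank of each sample of y
    return np.sort(x)[ranks]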
  • 103
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 479-487 
    ISSN: 1531-5878
    Keywords: Cramér-Rao bounds ; direction-of-arrival estimation ; unknown noise
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The deterministic and stochastic direction estimation Cramér-Rao bounds (CRBs) are studied in the presence of one signal and spatially uncorrelated sensor noise with unknown nonequal variances in array sensors. The explicit CRB expressions are obtained, and their relationship is studied showing some typical properties inherent in the nonidentical noise case.
    Type of Medium: Electronic Resource
  • 104
    ISSN: 1531-5878
    Keywords: Nonlinear circuit theory ; co-content ; functional minimization ; image processing
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The solutions of many physical-mathematical problems can be obtained by minimizing proper functionals. In the literature, some methods for the synthesis of analog circuits (mainly cellular neural networks) are presented that find the solution of some of these problems by implementing the discretized Euler-Lagrange equations associated with the pertinent functionals. In this paper, we propose a method for defining analog circuits that directly minimize (in a parallel way) a class of discretized functionals in the frequently occurring case where the solution depends on two spatial variables. The method is a generalization of the one presented in Parodi et al., Internat. J. Circuit Theory Appl., 26, 477–498, 1998. The analog circuits consist of both a (nonlinear) resistive part and a set of linear capacitors, whose steady-state voltages represent the discrete solution to the problem. The method is based on the potential (co-content) functions associated with voltage-controlled resistive elements. As an example, we describe an application in the field of image processing: the restoration of color images corrupted by additive noise.
    Type of Medium: Electronic Resource
  • 105
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. 159-184 
    ISSN: 1531-5851
    Keywords: Primary 65T20 ; secondary 42C10 ; 33C55 ; spherical harmonics ; fast transforms ; associated Legendre functions
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract Spherical harmonics arise on the sphere S^2 in the same way that the (Fourier) exponential functions {e^{ikθ}}, k ∈ ℤ, arise on the circle. Spherical harmonic series have many of the same wonderful properties as Fourier series, but have lacked one important thing: a numerically stable fast transform analogous to the Fast Fourier Transform (FFT). Without a fast transform, evaluating (or expanding in) spherical harmonic series on the computer is slow; for large computations, prohibitively slow. This paper provides a fast transform. For a grid of O(N^2) points on the sphere, a direct calculation has computational complexity O(N^4), but a simple separation of variables and FFT reduce it to O(N^3) time. Here we present algorithms with times O(N^{5/2} log N) and O(N^2 (log N)^2). The problem quickly reduces to the fast application of matrices of associated Legendre functions of certain orders. The essential insight is that although these matrices are dense and oscillatory, locally they can be represented efficiently in trigonometric series.
    Type of Medium: Electronic Resource
  • 106
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. v 
    ISSN: 1531-5851
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Type of Medium: Electronic Resource
  • 107
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. i 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Type of Medium: Electronic Resource
  • 108
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. v 
    ISSN: 1531-5851
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Type of Medium: Electronic Resource
  • 109
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. 3-24 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract Weakly symmetric homogeneous spaces were introduced by A. Selberg in 1956. We prove that, for a real reductive algebraic group, they can be characterized as the spaces of real points of affine spherical homogeneous varieties of the complexified group. As an application, under the same assumption on the transitive group, we show that weakly symmetric spaces are precisely the homogeneous Riemannian manifolds with commutative algebra of invariant differential operators.
    Type of Medium: Electronic Resource
  • 110
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. 53-95 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract We study the modification A → A′ of an affine domain A which produces another affine domain A′ = A[I/f], where I is a nontrivial ideal of A and f is a nonzero element of I. It first appeared in passing in the basic paper of O. Zariski [Zar] and was further considered by E. D. Davis [Da]. In [Ka1] its geometric counterpart was applied to construct contractible smooth affine varieties non-isomorphic to Euclidean spaces. Here we provide certain conditions (more general than those in [Ka1]) which guarantee preservation of the topology under a modification. As an application, we show that the group of biregular automorphisms of the affine hypersurface X ⊂ C^{k+2}, given by the equation uv = p(x_1,...,x_k), where p ∈ C[x_1,...,x_k], k ≥ 2, acts m-transitively on the smooth part reg X of X for any m ∈ N. We present examples of such hypersurfaces diffeomorphic to Euclidean spaces.
    Type of Medium: Electronic Resource
  • 111
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. 273-300 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract The symmetric varieties considered in this paper are the quotients G/H, where G is an adjoint semi-simple group over a field k of characteristic ≠ 2, and H is the fixed point group of an involutorial automorphism of G which is defined over k. In the case k = C, De Concini and Procesi (1983) constructed a “wonderful” compactification of G/H. We prove the existence of such a compactification for arbitrary k. We also prove cohomology vanishing results for line bundles on the compactification.
    Type of Medium: Electronic Resource
  • 112
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. 303-327 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract The Yang-Baxter equation admits two classes of elliptic solutions, the vertex type and the face type. On the basis of these solutions, two types of elliptic quantum groups have been introduced (Foda et al. [FIJKMY1], Felder [Fe]). Frønsdal [Fr1, Fr2] made a penetrating observation that both of them are quasi-Hopf algebras, obtained by twisting the standard quantum affine algebra U_q(g). In this paper we present an explicit formula for the twistors in the form of an infinite product of the universal R matrix of U_q(g). We also prove the shifted cocycle condition for the twistors, thereby completing Frønsdal's findings. This construction entails that, for generic values of the deformation parameters, the representation theory for U_q(g) carries over to the elliptic algebras, including such objects as evaluation modules, highest weight modules and vertex operators. In particular, we confirm the conjectures of Foda et al. concerning the elliptic algebra A_{q,p}(\widehat{sl}_2).
    Type of Medium: Electronic Resource
  • 113
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. 375-404 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract In this survey we shall prove a convexity theorem for gradient actions of reductive Lie groups on Riemannian symmetric spaces. After studying general properties of gradient maps, this proof is established by (1) an explicit calculation on the hyperbolic plane followed by a transfer of the results to general reductive Lie groups, (2) a reduction to a problem on abelian spaces using Kostant's Convexity Theorem, (3) an application of Fenchel's Convexity Theorem. In the final section the theorem is applied to gradient actions on other homogeneous spaces and we show that Hilgert's Convexity Theorem for moment maps can be derived from the results.
    Type of Medium: Electronic Resource
  • 114
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. i 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
  • 115
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 111-130 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The cumulants defined in terms of moments are basic to the study of higher-order statistics (HOS) of a stationary stochastic process. This paper presents a concurrent systolic array system for the computation of higher-order moments. The system allows for the simultaneous computation of the second-, third-, and fourth-order moments. The architecture achieves good speedup through its excellent exploitation of parallelism, pipelining, and reusability of some intermediate results. The computational complexity and system performance issues related to the architecture are discussed. The concurrent system is designed with the CMOS VLSI technology and is capable of operating at 3.9 MHz.
    Type of Medium: Electronic Resource
  • 116
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 183-187 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Necessary and sufficient conditions for the invertibility of a (not necessarily linear) operator N between normed linear spaces are given. It is shown that N is invertible precisely if a certain operator associated with N is a contraction.
    Type of Medium: Electronic Resource
  • 117
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
  • 118
    ISSN: 1539-6924
    Keywords: Environment ; equity ; coke ; oil ; history ; risk
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Facility-specific information on pollution was obtained for 36 coke plants and 46 oil refineries in the United States and matched with information on populations surrounding these 82 facilities. These data were analyzed to determine whether environmental inequities were present, whether they were more economic or racial in nature, and whether the racial composition of nearby communities has changed significantly since plants began operations. The Census tracts near coke plants have a disproportionate share of poor and nonwhite residents. Multivariate analyses suggest that existing inequities are primarily economic in nature. The findings for oil refineries are not strongly supportive of the environmental inequity hypothesis. Rank ordering of facilities by race, poverty, and pollution produces limited (although not consistent) evidence that the more risky facilities tend to be operating in communities with above-median proportions of nonwhite residents (near coke plants) and Hispanic residents (near oil refineries). Over time, the racial makeup of many communities near facilities has changed significantly, particularly in the case of coke plants sited in the early 1900s. Further risk-oriented studies of multiple manufacturing facilities in various industrial sectors of the economy are recommended.
    Type of Medium: Electronic Resource
  • 119
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 231-247 
    ISSN: 1539-6924
    Keywords: Health risk assessment ; hazard characterization ; Acceptable Daily Intake ; Reference Dose ; paradigm ; practices ; cancer ; non-cancer ; Bayesian ; default options
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract We investigated the way results of human health risk assessments are used, and the theory used to describe those methods, sometimes called the “NAS paradigm.” Contrary to a key tenet of that theory, current methods have strictly limited utility. The characterizations now considered standard, Safety Indices such as “Acceptable Daily Intake,” “Reference Dose,” and so on, usefully inform only decisions that require a choice between two policy alternatives (e.g., approve a food additive or not), decided solely on the basis of a finding of safety. Risk is characterized as the quotient of one of these Safety Indices divided by an estimate of exposure: a quotient greater than one implies that the situation may be considered safe. Such decisions are very widespread, both in the U.S. federal government and elsewhere. No current method is universal; different policies lead to different practices, for example, in California's “Proposition 65,” where statutory provisions specify some practices. Further, an important kind of human health risk assessment is not recognized by this theory: this kind characterizes risk as likelihood of harm, given estimates of exposure consequent to various decision choices. Likelihood estimates are necessary whenever decision makers have many possible decision choices and must weigh more than two societal values, such as in EPA's implementation of “conventional air pollutants.” These estimates cannot be derived using current methods; different methods are needed. Our analysis suggests changes needed in both the theory and practice of human health risk assessment, and in how what is done is depicted.
    Type of Medium: Electronic Resource
  • 120
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 249-259 
    ISSN: 1539-6924
    Keywords: Reliability ; Monte Carlo simulation ; hazardous waste treatment ; safety factor ; packed tower ; activated sludge
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The reliability of a treatment process is addressed in terms of achieving a regulatory effluent concentration standard and the design safety factors associated with the treatment process. This methodology was then applied to two aqueous hazardous waste treatment processes: packed tower aeration and activated sludge (aerobic) biological treatment. The designs achieving 95 percent reliability were compared with those designs based on conventional practice to determine their patterns of conservatism. Scoping-level treatment costs were also related to reliability levels for these treatment processes. The results indicate that the reliability levels for the physical/chemical treatment process (packed tower aeration) based on the deterministic safety factors range from 80 percent to over 99 percent, whereas those for the biological treatment process range from near 0 percent to over 99 percent, depending on the compound evaluated. Increases in reliability per unit increase in treatment costs are most pronounced at lower reliability levels (less than about 80 percent) than at the higher reliability levels (greater than 90 percent), indicating a point of diminishing returns. Additional research focused on process parameters that presently contain large uncertainties may reduce those uncertainties, with attending increases in the reliability levels of the treatment processes.
    Type of Medium: Electronic Resource
  • 121
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 321-321 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
  • 122
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 295-308 
    ISSN: 1539-6924
    Keywords: Noncancer risk assessment ; uncertainty analysis ; systematic error ; calibration ; censoring ; relative potency ; safety factor
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The prominent role of animal bioassay evidence in environmental regulatory decisions compels a careful characterization of extrapolation uncertainties. In noncancer risk assessment, uncertainty factors are incorporated to account for each of several extrapolations required to convert a bioassay outcome into a putative subthreshold dose for humans. Measures of relative toxicity taken between different dosing regimens, different endpoints, or different species serve as a reference for establishing the uncertainty factors. Ratios of no observed adverse effect levels (NOAELs) have been used for this purpose; statistical summaries of such ratios across sets of chemicals are widely used to guide the setting of uncertainty factors. Given the poor statistical properties of NOAELs, the informativeness of these summary statistics is open to question. To evaluate this, we develop an approach to “calibrate” the ability of NOAEL ratios to reveal true properties of a specified distribution for relative toxicity. A priority of this analysis is to account for dependencies of NOAEL ratios on experimental design and other exogenous factors. Our analysis of NOAEL ratio summary statistics finds (1) that such dependencies are complex and produce pronounced systematic errors and (2) that sampling error associated with typical sample sizes (50 chemicals) is non-negligible. These uncertainties strongly suggest that NOAEL ratio summary statistics cannot be taken at face value; conclusions based on such ratios reported in well over a dozen published papers should be reconsidered.
    Type of Medium: Electronic Resource
  • 123
    ISSN: 1539-6924
    Keywords: Cancer risk ; in vivo doses ; linear multiplicative model ; ethylene oxide ; relative potency ; butadiene ; acrylamide
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract A mechanistic model and associated procedures are proposed for cancer risk assessment of genotoxic chemicals. As previously shown for ionizing radiation, a linear multiplicative model was found to be compatible with published experimental data for ethylene oxide, acrylamide, and butadiene. The validity of this model was anticipated in view of the multiplicative interaction of mutation with inherited and acquired growth-promoting conditions. Concurrent analysis led to rejection of an additive model (i.e., the model commonly applied for cancer risk assessment). A reanalysis of data for radiogenic cancer in mouse, dog and man shows that the relative risk coefficient is approximately the same (0.4 to 0.5 percent per rad) for tumours induced in the three species. Doses in vivo, defined as the time-integrated concentrations of ultimate mutagens, expressed in millimol × kg^{-1} × h (mMh), are, like radiation doses given in Gy or rad, proportional to frequencies of potentially mutagenic events. The radiation dose equivalents of chemical doses are calculated by multiplying chemical doses (in mMh) with the relative genotoxic potencies (in rad × mMh^{-1}) determined in vitro. In this way the relative cancer incidence increments in rats and mice exposed to ethylene oxide were shown to be about 0.4 percent per rad-equivalent, in agreement with the data for radiogenic cancer. Our analyses suggest that values of the relative risk coefficients for genotoxic chemicals are independent of species and that relative cancer risks determined in animal tests apply also to humans. If reliable animal test data are not available, cancer risks may be estimated by the relative potency. In both cases exposure dose/target dose relationships, the latter via macromolecule adducts, should be determined.
    Type of Medium: Electronic Resource
  • 124
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 585-598 
    ISSN: 1539-6924
    Keywords: uncertainty ; threatened plants ; risk ; conservation ; rule sets ; IUCN
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Australian state and federal agencies use a broad range of methods for setting conservation priorities for species at risk. Some of these are based on rule sets developed by the International Union for the Conservation of Nature, while others use point scoring protocols to assess threat. All of them ignore uncertainty in the data. In this study, we assessed the conservation status of 29 threatened vascular plants from Tasmania and New South Wales using a variety of methods including point scoring and rule-based approaches. In addition, several methods for dealing with uncertainty in the data were applied to each of the priority-setting schemes. The results indicate that the choice of a protocol for setting priorities and the choice of the way in which uncertainty is treated may make important differences to the resulting assessments of risk. The choice among methods needs to be rationalized within the management context in which it is to be applied. These methods are not a substitute for more formal risk assessment.
    Type of Medium: Electronic Resource
  • 125
    ISSN: 1539-6924
    Keywords: MeHg ; pharmacokinetics ; PBPK model ; variability ; risk assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract An analysis of the uncertainty in guidelines for the ingestion of methylmercury (MeHg) due to human pharmacokinetic variability was conducted using a physiologically based pharmacokinetic (PBPK) model that describes MeHg kinetics in the pregnant human and fetus. Two alternative derivations of an ingestion guideline for MeHg were considered: the U.S. Environmental Protection Agency reference dose (RfD) of 0.1 μg/kg/day derived from studies of an Iraqi grain poisoning episode, and the Agency for Toxic Substances and Disease Registry chronic oral minimal risk level (MRL) of 0.5 μg/kg/day based on studies of a fish-eating population in the Seychelles Islands. Calculation of an ingestion guideline for MeHg from either of these epidemiological studies requires calculation of a dose conversion factor (DCF) relating a hair mercury concentration to a chronic MeHg ingestion rate. To evaluate the uncertainty in this DCF across the population of U.S. women of child-bearing age, Monte Carlo analyses were performed in which distributions for each of the parameters in the PBPK model were randomly sampled 1000 times. The 1st and 5th percentiles of the resulting distribution of DCFs were a factor of 1.8 and 1.5 below the median, respectively. This estimate of variability is consistent with, but somewhat less than, previous analyses performed with empirical, one-compartment pharmacokinetic models. The use of a consistent factor in both guidelines of 1.5 for pharmacokinetic variability in the DCF, and keeping all other aspects of the derivations unchanged, would result in an RfD of 0.2 μg/kg/day and an MRL of 0.3 μg/kg/day.
    Type of Medium: Electronic Resource
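The Monte Carlo step described in the abstract above (sample the model's parameter distributions 1000 times, then compare low percentiles of the resulting dose-conversion-factor distribution to its median) can be sketched generically. The lognormal inputs below are illustrative stand-ins; the abstract does not list the actual PBPK parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000  # number of Monte Carlo samples, as in the study

# Hypothetical stand-ins for the pharmacokinetic inputs (illustrative only)
intake_to_blood = rng.lognormal(mean=0.0, sigma=0.25, size=n)
blood_to_hair = rng.lognormal(mean=0.0, sigma=0.20, size=n)

# Toy dose-conversion factor: hair mercury per unit chronic intake
dcf = intake_to_blood * blood_to_hair

median = np.median(dcf)
p1, p5 = np.percentile(dcf, [1, 5])
fold_below_p1 = median / p1  # cf. the reported factor of 1.8 at the 1st percentile
fold_below_p5 = median / p5  # cf. the reported factor of 1.5 at the 5th percentile
```

The ratio of the median to a low percentile is what motivates the single pharmacokinetic variability factor of 1.5 discussed in the abstract.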
  • 126
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 577-584 
    ISSN: 1539-6924
    Keywords: risk assessment ; exposure point concentration ; bootstrapping ; gamma distribution ; lognormal
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The U.S. Environmental Protection Agency (EPA) recommends the use of the one-sided 95% upper confidence limit of the arithmetic mean based on either a normal or lognormal distribution for the contaminant (or exposure point) concentration term in the Superfund risk assessment process. When the data are not normal or lognormal this recommended approach may overestimate the exposure point concentration (EPC) and may lead to unnecessary cleanup at a hazardous waste site. The EPA concentration term only seems to perform like alternative EPC methods when the data are well fit by a lognormal distribution. Several alternative methods for calculating the EPC are investigated and compared using soil data collected from three hazardous waste sites in Montana, Utah, and Colorado. For data sets that are well fit by a lognormal distribution, values for the Chebychev inequality or the EPA concentration term may be appropriate EPCs. For data sets where the soil concentration data are well fit by gamma distributions, Wong's method may be used for calculating EPCs. The studentized bootstrap-t and Hall's bootstrap-t transformation are recommended for EPC calculation when all distribution fits are poor. If a data set is well fit by a distribution, parametric bootstrap may provide a suitable EPC.
    Type of Medium: Electronic Resource
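One of the alternative EPC estimators named in the abstract above, the studentized bootstrap-t upper confidence limit on the mean, can be sketched as follows (a generic textbook version under our own naming, not the paper's exact procedure):

```python
import numpy as np

def bootstrap_t_ucl(data, alpha=0.05, n_boot=2000, seed=0):
    # One-sided (1 - alpha) upper confidence limit on the arithmetic mean
    # via the studentized bootstrap-t. For right-skewed concentration data
    # the low tail of the bootstrap t-statistics pulls the limit above the
    # normal-theory UCL, which is the behavior the comparison exploits.
    rng = np.random.default_rng(seed)
    x = np.asarray(data, dtype=float)
    n = x.size
    mean = x.mean()
    se = x.std(ddof=1) / np.sqrt(n)
    t_star = np.empty(n_boot)
    for b in range(n_boot):
        xb = rng.choice(x, size=n, replace=True)  # resample with replacement
        t_star[b] = (xb.mean() - mean) / (xb.std(ddof=1) / np.sqrt(n))
    # The alpha-quantile of t* (typically negative) sets the upper limit
    return mean - np.percentile(t_star, 100 * alpha) * se
```

Applied to a skewed (e.g., lognormal-looking) soil data set, the returned limit sits above the sample mean, as an upper confidence limit must.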
  • 127
    ISSN: 1539-6924
    Keywords: risk perception ; air quality ; environmental justice ; community health survey
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper describes a multi-stakeholder process designed to assess the potential health risks associated with adverse air quality in an urban industrial neighborhood. The paper briefly describes the quantitative health risk assessment conducted by scientific experts, with input by a grassroots community group concerned about the impacts of adverse air quality on their health and quality of life. In this case, rather than accept the views of the scientific experts, the community used their powers of perception to advantage by successfully advocating for a professionally conducted community health survey. This survey was designed to document, systematically and rigorously, the health risk perceptions community members associated with exposure to adverse air quality in their neighborhood. This paper describes the institutional and community contexts within which the research is situated as well as the design, administration, analysis, and results of the community health survey administered to 402 households living in an urban industrial neighborhood in Hamilton, Ontario, Canada. These survey results served to legitimate the community's concerns about air quality and to help broaden operational definitions of ‘health.’ In addition, the results of both health risk assessment exercises served to keep issues of air quality on the local political agenda. Implications of these findings for our understanding of the environmental justice process as well as the ability of communities to influence environmental health policy are discussed.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 128
    ISSN: 1539-6924
    Keywords: risk perception ; risk characteristics ; outrage factors ; rbGH ; ordered probit
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This study estimates the effect that risk characteristics, described as outrage factors by Hadden, have on consumers' risk perceptions toward the food-related biotechnology, recombinant bovine growth hormone (rbGH). The outrage factors applicable to milk from rbGH-treated herds are involuntary risk exposure, unfamiliarity with the product's production process, unnatural product characteristics, lack of trust in regulators' ability to protect consumers in the marketplace, and consumers' inability to distinguish milk from rbGH-treated herds from milk from untreated herds. An empirical analysis of data from a national survey of household food shoppers reveals that outrage factors mediate risk perceptions. The results support the inclusion of outrage factors into the risk perception model for the rbGH product, as they add significantly to the explanatory power of the model and therefore reduce bias compared to a simpler model of attitudinal and demographic factors. The study indicates that the outrage factors which have a significant impact on risk perceptions are the lack of trust in the FDA as a food-related information source, and perceiving no consumer benefits from farmers' use of rbGH. Communication strategies to reduce consumer risk perceptions therefore could utilize agencies perceived as more trustworthy and emphasize the benefits of rbGH use to consumers.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 129
    ISSN: 1539-6924
    Keywords: risk perceptions ; psychometric paradigm ; multilevel modeling ; random coefficient models
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Psychometric data on risk perceptions are often collected using the method developed by Slovic, Fischhoff, and Lichtenstein, where an array of risk issues are evaluated with respect to a number of risk characteristics, such as how dreadful, catastrophic or involuntary exposure to each risk is. The analysis of these data has often been carried out at an aggregate level, where mean scores for all respondents are compared between risk issues. However, this approach may conceal important variation between individuals, and individual analyses have also been performed for single risk issues. This paper presents a new methodological approach using a technique called multilevel modelling for analysing individual and aggregated responses simultaneously, to produce unconditional and unbiased results at both individual and aggregate levels of the data. Two examples are given using previously published data sets on risk perceptions collected by the authors, and results between the traditional and new approaches compared. The discussion focuses on the implications of and possibilities provided by the new methodology.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 130
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 739-749 
    ISSN: 1539-6924
    Keywords: probabilistic forecasting ; uncertainty quantification ; Bayesian method ; Monte-Carlo simulation ; decision making
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Rational decision making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of two Bayesian methods for producing a probabilistic forecast via any deterministic model. The Bayesian Processor of Forecast (BPF) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution. The BFS is compared with Monte Carlo simulation and the “ensemble forecasting” technique, neither of which can alone produce a probabilistic forecast that quantifies the total uncertainty, but each can serve as a component of the BFS.
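    The core idea of conditioning a predictand on an imperfect model output can be sketched with the simplest conjugate case: a Gaussian prior on the predictand and a model output treated as the predictand plus Gaussian error. This is only an illustration of the Bayesian update; the BPF in the abstract is more general, and every number below is hypothetical.

    ```python
    def normal_posterior(prior_mean, prior_var, model_output, error_var):
        """Posterior of the predictand given a deterministic model's
        output, assuming output = predictand + N(0, error_var) noise
        (conjugate normal-normal update; illustrative sketch)."""
        precision = 1.0 / prior_var + 1.0 / error_var
        post_var = 1.0 / precision
        post_mean = post_var * (prior_mean / prior_var
                                + model_output / error_var)
        return post_mean, post_var

    # Hypothetical numbers: climatological prior N(10, 9); the model
    # estimates 14 with error variance 4.
    mean, var = normal_posterior(10.0, 9.0, 14.0, 4.0)
    print(round(mean, 3), round(var, 3))
    ```

    The posterior variance is smaller than both the prior variance and the model error variance, which is exactly the sense in which conditioning on the model output reduces, but never eliminates, the total uncertainty.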
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 131
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 759-761 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 132
    ISSN: 1539-6924
    Keywords: regulation ; radioactive waste ; performance assessment ; risk assessment ; regulatory assessment ; bias evaluation ; international collaboration ; underground disposal ; quantitative risk analysis ; public debate ; decision process
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Much has been written about the development and application of quantitative methods for estimating under uncertainty the long-term radiological performance of underground disposal of radioactive wastes. Until recently, interest has been focused almost entirely on the technical challenges, regardless of the role of the organization responsible for these analyses. Now the dialogue between regulators, the repository developer or operator, and other interested parties in the decision-making process receives increasing attention, especially in view of some current difficulties in obtaining approvals to construct or operate deep facilities for intermediate or high-level wastes. Consequently, it is timely to consider the options for regulators' review and evaluation of safety submissions, at the various stages in the site selection to repository closure process, and to consider, especially, the role for performance assessment (PA) within the programs of a regulator both before and after delivery of such a submission. The origins and broad character of present regulations in the European Union (EU) and in the OECD countries are outlined and some regulatory PAs are reviewed. The issues raised are discussed, especially in regard to the interpretation of regulations, the dangers from the desire for simplicity in argument, the use of regulatory PA to review and challenge the PA in the safety case, and the effects of the relationship between proponent and regulator. Finally, a very limited analysis of the role of PA in public hearings is outlined and recommendations are made, together with proposals for improving the mechanisms for international collaboration on technical issues of regulatory concern.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 133
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 23-32 
    ISSN: 1539-6924
    Keywords: Software failures ; software hazard analysis ; safety-critical systems ; risk assessment ; context
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract As the use of digital computers for instrumentation and control of safety-critical systems has increased, there has been a growing debate over the issue of whether probabilistic risk assessment techniques can be applied to these systems. This debate has centered on the issue of whether software failures can be modeled probabilistically. This paper describes a “context-based” approach to software risk assessment that explicitly recognizes the fact that the behavior of software is not probabilistic. The source of the perceived uncertainty in its behavior results from both the input to the software as well as the application and environment in which the software is operating. Failures occur as the result of encountering some context for which the software was not properly designed, as opposed to the software simply failing “randomly.” The paper elaborates on the concept of “error-forcing context” as it applies to software. It also illustrates a methodology which utilizes event trees, fault trees, and the Dynamic Flowgraph Methodology (DFM) to identify “error-forcing contexts” for software in the form of fault tree prime implicants.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 134
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 47-68 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 135
    ISSN: 1539-6924
    Keywords: Variability ; uncertainty ; maximum likelihood ; bootstrap simulation ; Monte Carlo simulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Variability arises due to differences in the value of a quantity among different members of a population. Uncertainty arises due to lack of knowledge regarding the true value of a quantity for a given member of a population. We describe and evaluate two methods for quantifying both variability and uncertainty. These methods, bootstrap simulation and a likelihood-based method, are applied to three datasets. The datasets include a synthetic sample of 19 values from a lognormal distribution, a sample of nine values obtained from measurements of the PCB concentration in leafy produce, and a sample of five values for the partitioning of chromium in the flue gas desulfurization system of coal-fired power plants. For each of these datasets, we employ the two methods to characterize uncertainty in the arithmetic mean and standard deviation, cumulative distribution functions based upon fitted parametric distributions, the 95th percentile of variability, and the 63rd percentile of uncertainty for the 81st percentile of variability. The latter is intended to show that it is possible to describe any point within the uncertain frequency distribution by specifying an uncertainty percentile and a variability percentile. Using the bootstrap method, we compare results based upon use of the method of matching moments and the method of maximum likelihood for fitting distributions to data. Our results indicate that with only 5–19 data points, as in the datasets we have evaluated, there is substantial uncertainty based upon random sampling error. Both the bootstrap and likelihood-based approaches yield comparable uncertainty estimates in most cases.
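    The two-dimensional idea in this abstract — an uncertainty percentile attached to a variability percentile — can be sketched with bootstrap simulation and the method of matching moments for a lognormal fit. The function names and the synthetic sample below are hypothetical, not the paper's data or code.

    ```python
    import math
    import random
    import statistics

    def lognormal_p95(sample):
        """Fit a lognormal by matching moments; return its 95th
        percentile (a percentile of variability)."""
        m = statistics.mean(sample)
        v = statistics.variance(sample)
        sigma2 = math.log(1.0 + v / m**2)
        mu = math.log(m) - sigma2 / 2.0
        return math.exp(mu + 1.645 * math.sqrt(sigma2))

    def uncertainty_in_p95(sample, n_boot=1000, seed=7):
        """Bootstrap the data to obtain a distribution (uncertainty)
        for the 95th percentile of variability; report its own
        5th and 95th percentiles."""
        rng = random.Random(seed)
        n = len(sample)
        estimates = sorted(
            lognormal_p95([rng.choice(sample) for _ in range(n)])
            for _ in range(n_boot)
        )
        return estimates[int(0.05 * n_boot)], estimates[int(0.95 * n_boot)]

    # Synthetic positive sample standing in for, e.g., concentrations.
    data = [0.4, 0.7, 1.1, 0.5, 2.3, 0.9, 1.6, 0.6, 3.0]
    low, high = uncertainty_in_p95(data)
    print(round(low, 2), round(high, 2))
    ```

    With only nine data points the spread between the two reported values is wide, mirroring the abstract's point about substantial random sampling error in small samples.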
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 136
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 159-169 
    ISSN: 1539-6924
    Keywords: Trust ; geography ; personality ; environment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract A sample of 323 residents of New Jersey stratified by neighborhood quality (excellent, good, fair, poor) was gathered to determine if trust in science and technology to protect public health and environment at the societal scale was associated with trust of the local officials, such as the mayor, health officer, developers, mass media, and legislators who are guardians of the local environment. Societal (trust of science and technology) and neighborhood (mayor, health officer) dimensions of trust were found. These societal and neighborhood trust dimensions were weakly correlated. Respondents were divided into four trust-of-authority groups: high societal–high neighborhood, low societal–low neighborhood, high societal–low neighborhood, and low societal–high neighborhood. High societal–high neighborhood trust respondents were older, had lived in the neighborhoods for many years, were not troubled much by neighborhood or societal environmental threats, and had a strong sense of control over their environment. In strong contrast, low societal–low neighborhood trust respondents were relatively young, typically had lived in their present neighborhood for a short time, were troubled by numerous neighborhood and societal environmental threats, did not practice many personal public health practices, and felt little control over their environment.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 137
    ISSN: 1539-6924
    Keywords: Risk ; fishing ; ethnicity ; perception ; toxics ; consumption
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Recreational and subsistence angling are important aspects of urban culture for much of North America where people are concentrated near the coasts or major rivers. Yet there are fish and shellfish advisories for many estuaries, rivers, and lakes, and these are not always heeded. This paper examines fishing behavior, sources of information, perceptions, and compliance with fishing advisories as a function of ethnicity for people fishing in the Newark Bay Complex of the New York–New Jersey Harbor. We test the null hypothesis that there were no ethnic differences in sources of information, perceptions of the safety of fish consumption, and compliance with advisories. There were ethnic differences in consumption rates, sources of information about fishing, knowledge about the safety of the fish, awareness of fishing advisories or of the correct advisories, and knowledge about risks for increased cancer and to unborn and young children. In general, the knowledge base was much lower for Hispanics, was intermediate for blacks, and was greatest for whites. When presented with a statement about the potential risks from eating fish, there were no differences in their willingness to stop eating fish or to encourage pregnant women to stop. These results indicate a willingness to comply with advisories regardless of ethnicity, but a vast difference in the base knowledge necessary to make informed risk decisions about the safety of fish and shellfish. Although the overall median income level of the population was in the $25,000–34,999 income category, for Hispanics it was on the border between $15,000–24,999 and $25,000–34,999.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 138
    Electronic Resource
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. 545-562 
    ISSN: 1531-5851
    Keywords: 42C15 ; Weyl-Heisenberg frame ; dual window ; spline
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract We present a simple proof of Ron and Shen's frame bounds estimates for Gabor frames. The proof is based on Heil and Walnut's representation of the frame operator and shows that it can be decomposed into a continuous family of infinite matrices. The estimates then follow from a simple application of Gershgorin's theorem to each matrix. Next, we show that, if the window function has exponential decay, the dual function also has some exponential decay. Then, we describe a numerical method to compute the dual function and give an estimate of the error. Finally, we consider the spline of order 2; we investigate numerically the region of the time-frequency plane where it generates a frame, and we compute the dual function for some values of the parameters.
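    The key tool named in this abstract, Gershgorin's theorem, bounds every eigenvalue of a matrix by discs centered at the diagonal entries. A toy finite example (the 2×2 matrix below is hypothetical; the frame operator's matrices in the paper are infinite) shows how such disc bounds are computed:

    ```python
    def gershgorin_bounds(matrix):
        """Lower/upper bounds on all eigenvalues via Gershgorin discs:
        each eigenvalue lies in some disc centered at a diagonal entry
        with radius equal to the off-diagonal row sum."""
        lows, highs = [], []
        for i, row in enumerate(matrix):
            radius = sum(abs(x) for j, x in enumerate(row) if j != i)
            lows.append(row[i] - radius)
            highs.append(row[i] + radius)
        return min(lows), max(highs)

    # A toy symmetric positive matrix (hypothetical example).
    a = [[4.0, 1.0],
         [1.0, 3.0]]
    lo, hi = gershgorin_bounds(a)
    print(lo, hi)  # discs give the interval [2.0, 5.0]
    ```

    For a frame operator, a positive lower Gershgorin bound on each matrix in the decomposition yields a lower frame bound, which is the spirit of the estimate in the abstract.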
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 139
    Electronic Resource
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. 575-588 
    ISSN: 1531-5851
    Keywords: 42C15 ; 94A12 ; sampling ; multiresolution analysis ; Gibbs phenomenon
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract We deal with the maximum Gibbs ripple of the sampling wavelet series of a discontinuous function f at a point t∈ℝ, for all possible values of α satisfying f(t)=αf(t−0)+(1−α)f(t+0). For the Shannon wavelet series, we give a complete description of all ripples, for any α in [0,1]. We show that Meyer sampling series exhibit the Gibbs phenomenon for α<0.12495 and α>0.306853. We also give Meyer sampling formulas with maximum overshoots smaller than Shannon's for several α in [0,1].
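    The Gibbs overshoot quantified in this abstract can be observed numerically in the simplest classical setting, the Fourier partial sums of a square wave (a minimal sketch only; the Shannon and Meyer sampling series of the paper are not reproduced here):

    ```python
    import math

    def square_wave_partial_sum(x, n_terms):
        """Fourier partial sum of the unit square wave (odd harmonics)."""
        return (4.0 / math.pi) * sum(
            math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(n_terms)
        )

    # Near the jump at x = 0 the partial sum overshoots the function
    # value 1; the peak approaches (2/pi) * Si(pi) ~ 1.179 as the
    # number of terms grows (classical Gibbs phenomenon).
    n = 200
    peak = max(square_wave_partial_sum(i * 1e-4, n) for i in range(1, 500))
    print(round(peak, 3))
    ```

    The roughly 9% overshoot of the total jump persists no matter how many terms are taken; it only moves closer to the discontinuity.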
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 140
    Electronic Resource
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. v 
    ISSN: 1531-5851
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 141
    Electronic Resource
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. 127-157 
    ISSN: 1531-5851
    Keywords: Primary 30D10 ; 42C30 ; Secondary 40G99 ; 41A58 ; 94A12
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract It is well known that Gabor expansions generated by a lattice of Nyquist density are numerically unstable, in the sense that they do not constitute frame decompositions. In this paper, we clarify exactly how “bad” such Gabor expansions are, we make it clear precisely where the edge is between “enough” and “too little,” and we find a remedy for their shortcomings in terms of a certain summability method. This is done through an investigation of somewhat more general sequences of points in the time-frequency plane than lattices (all of Nyquist density), which in a sense yields information about the uncertainty principle on a finer scale than allowed by traditional density considerations. An important role is played by certain Hilbert scales of function spaces, most notably by what we call the Schwartz scale and the Bargmann scale, and the intrinsically interesting fact that the Bargmann transform provides a bounded invertible mapping between these two scales. This permits us to turn the problems into interpolation problems in spaces of entire functions, which we are able to treat.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 142
    Electronic Resource
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. 35-52 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract Let G be a classical algebraic group defined over an algebraically closed field. We classify all instances when a parabolic subgroup P of G acts on its unipotent radical P_u, or on p_u, the Lie algebra of P_u, with only a finite number of orbits. The proof proceeds in two parts. First we obtain a reduction to the case of general linear groups. In a second step, a solution for these is achieved by studying the representation theory of a particular quiver with certain relations. Furthermore, for general linear groups we obtain a combinatorial formula for the number of orbits in the finite cases.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 143
    Electronic Resource
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. 331-345 
    ISSN: 1531-5851
    Keywords: 26B35 ; 42B05 ; 42B99 ; chirp ; oscillating function
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract We show that the trace of an indefinitely oscillating function on a subspace of ℝ^d is not always indefinitely oscillating. In the periodic case, the number of oscillations of the trace depends on the regularity of the function. In the general case, we exhibit a definitive counter-example.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 144
    Electronic Resource
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. 385-408 
    ISSN: 1531-5851
    Keywords: Primary: 45E05, 47A10 ; Secondary: 35J25, 42B20 ; layer potentials ; spectral radius ; polyhedra ; Lipschitz domains
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract By producing an L2-convergent Neumann series, we prove the invertibility of the elastostatics and hydrostatics boundary layer potentials on arbitrary Lipschitz domains with small Lipschitz character and on 3D polyhedra with large dihedral angles.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 145
    Electronic Resource
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. 431-447 
    ISSN: 1531-5851
    Keywords: Eigenfunction expansions ; localization ; Primary: 42C14 ; Secondary: 42B08
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract We state a localization principle for expansions in eigenfunctions of a self-adjoint second order elliptic operator and we prove an equiconvergence result between eigenfunction expansions and trigonometric expansions. We then study the Gibbs phenomenon for eigenfunction expansions of piecewise smooth functions on two-dimensional manifolds.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 146
    Electronic Resource
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. 523-544 
    ISSN: 1531-5851
    Keywords: primary 62H05 ; 60E10 ; secondary 32E25 ; Cauchy type integral ; characteristic function ; complete monotonicity ; Liouville numbers ; Plemelj-Sokhotskii formula ; unimodality
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract The function $$\varphi _\alpha ^\theta (t) = \frac{1}{1 + e^{ - i\theta \operatorname{sgn} t} \left| t \right|^\alpha },\alpha \in (0,2),\theta \in ( - \pi ,\pi ]$$ is a characteristic function of a probability distribution iff $$\left| \theta \right| \leqslant \min (\tfrac{{\pi \alpha }}{2},\pi - \tfrac{{\pi \alpha }}{2})$$ . This distribution is absolutely continuous; for θ=0 it is symmetric. The latter case was introduced by Linnik in 1953 [13] and several applications were found later. The case θ≠0 was introduced by Klebanov, Maniya, and Melamed in 1984 [9], while some special cases were considered previously by Laha [12] and Pillai [18]. In 1994, Kotz, Ostrovskii and Hayfavi [10] carried out a detailed investigation of analytic and asymptotic properties of the density of the distribution for the symmetric case θ=0. We generalize their results to the non-symmetric case θ≠0. As in the symmetric case, the arithmetical nature of the parameter α plays an important role, but several new phenomena appear.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 147
    Electronic Resource
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. 589-597 
    ISSN: 1531-5851
    Keywords: 42C15 ; 39B99 ; multivariate ; nonhomogeneous ; refinement
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract We give necessary and sufficient conditions for the existence and uniqueness of compactly supported distribution solutions f=(f_1,...,f_r)^T of nonhomogeneous refinement equations of the form $$f(x) = h(x) + \sum\limits_{\alpha \in A} {c_\alpha f(2x - \alpha )} \quad (x \in R^s )$$ , where h=(h_1,...,h_r)^T is a compactly supported vector-valued multivariate distribution, A⊂Z_+^s has compact support, and the coefficients c_α are real-valued r×r matrices. In particular, we find a finite-dimensional matrix B, constructed from the coefficients c_α, and the equation (I−B)q=p, where the vector p depends on h. Our proofs proceed in the time domain and allow us to represent each solution regardless of the spectral radius of P(0):=2^{−s}∑c_α, which has been a difficulty in previous investigations of this nature.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 148
    Electronic Resource
    Electronic Resource
    Springer
    The journal of Fourier analysis and applications 5 (1999), S. 599-615 
    ISSN: 1531-5851
    Keywords: 41A25 ; 42C15 ; 47B35 ; 15A99 ; shift-invariant systems ; finite section method ; block Toeplitz matrices ; Laurent operator ; Gabor frame ; filter banks ; band matrices
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract A shift-invariant system is a collection of functions {g_{m,n}} of the form g_{m,n}(k)=g_m(k−an). Such systems play an important role in time-frequency analysis and digital signal processing. A principal problem is to find a dual system γ_{m,n}(k)=γ_m(k−an) such that each function f can be written as f=∑〈f, γ_{m,n}〉g_{m,n}. The mathematical theory usually addresses this problem in infinite dimensions (typically in L2(ℝ) or ℓ2(ℤ)), whereas numerical methods have to operate with a finite-dimensional model. Exploiting the link between the frame operator and Laurent operators with matrix-valued symbol, we apply the finite section method to show that the dual functions obtained by solving a finite-dimensional problem converge to the dual functions of the original infinite-dimensional problem in ℓ2(ℤ). For compactly supported g_{m,n} (FIR filter banks) we prove an exponential rate of convergence and derive explicit expressions for the constants involved. Further, we investigate under which conditions one can replace the discrete model of the finite section method by the periodic discrete model, which is used in many numerical procedures. Again we provide explicit estimates for the speed of convergence. Some remarks on tight frames complete the paper.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 149
    Electronic Resource
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. 25-34 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract We consider actions of compact real Lie groups K on complex spaces X such that the associated reduced K-space admits a semistable quotient, e.g., X is a Stein space. We show that there is a complex space X^c endowed with a holomorphic action of the universal complexification G of K that contains X as an open K-stable subset. As our main result, we prove that every coherent K-sheaf on X extends uniquely to a holomorphic G-sheaf on X^c.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 150
    Electronic Resource
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. 97-101 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract A characterization of the complexity of a homogeneous space $$\mathcal{O}$$ of a reductive group G is given in terms of the mutual position of the tangent Lie algebra of the stabilizer of a generic point of $$\mathcal{O}$$ and the (−1)-eigenspace of a Weyl involution of $$\mathcal{O}$$ .
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 151
    Electronic Resource
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. 119-125 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 152
    Electronic Resource
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. 219-272 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract In this paper we classify ℤ-graded transitive Lie superalgebras with prescribed nonpositive parts listed in [K2]. The classification of infinite-dimensional simple linearly compact Lie superalgebras given in [K2] is based on this result. We also study the structure of the exceptional ℤ-graded transitive Lie superalgebras and give their geometric realization.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 153
    Electronic Resource
    Electronic Resource
    Springer
    Transformation groups 4 (1999), S. 329-353 
    ISSN: 1531-586X
    Source: Springer Online Journal Archives 1860-2000
    Topics: Mathematics
    Notes: Abstract We determine the covolumes of all hyperbolic Coxeter simplex reflection groups. These groups exist up to dimension 9. The volume computations involve several different methods, according to the parity of the dimension, subgroup relations, and arithmeticity properties.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 154
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 315-329 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In earlier studies a multiclass vector quantization (MVQ)-based neural network design was explored for pattern classification. We reconsider that design here in the context of function emulation. With proper adjustment, the MVQ design demonstrates excellent performance. Moreover, the design algorithms sense discontinuities in the data and replicate them in the network.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 155
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 365-376 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In earlier studies the concept of a fast form was generalized to a format that unified such discrete transform examples as the FFT, the FCT, the FST, and the FHT. In this study we consider the approximation of arbitrary linear maps by fast forms. Using simulation we evaluate the approximation capabilities of the generalized fast form.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 156
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 377-393 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract On-line running spectral analysis is of considerable interest for many electrophysiological signals, such as the EEG (electroencephalogram). This paper presents a new method of implementing the fast Fourier transform (FFT) algorithm. Our “real-time FFT algorithm” efficiently utilizes computer time by performing the FFT computation while data acquisition proceeds, so that local butterfly modules are built from the data points that are already available. The real-time FFT algorithm is developed using the decimation-in-time split-radix FFT (DIT sr-FFT) butterfly structure. In order to demonstrate the synchronization ability of the proposed algorithm, the authors develop a method of evaluating the number of arithmetic operations that it requires. Both the derivation and the experimental results show that the real-time FFT algorithm is superior to the conventional whole-block FFT algorithm in synchronizing with the data acquisition process. Given an FFT size N = 2^r, real-time implementation of the FFT algorithm requires only 2/r of the computational time required by the whole-block FFT algorithm.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
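    The cost claim in the abstract above can be checked arithmetically. The sketch below (illustrative only, not the paper's implementation) tabulates the claimed residual fraction 2/r of whole-block FFT computation time for an FFT of size N = 2^r:

    ```python
    # Illustrative sketch: tabulating the abstract's claimed speedup for the
    # "real-time FFT" (post-acquisition work = 2/r of a whole-block FFT, N = 2**r).

    def realtime_fft_ratio(r: int) -> float:
        """Claimed fraction of whole-block FFT time remaining after acquisition."""
        if r < 2:
            raise ValueError("need r >= 2 for the ratio 2/r to be meaningful")
        return 2.0 / r

    for r in (4, 8, 10, 12):
        n = 2 ** r
        print(f"N = {n:5d} (r = {r:2d}): residual work = {realtime_fft_ratio(r):.3f}")
    ```

    For a 4096-point FFT (r = 12), only about a sixth of the whole-block computation would remain after the last sample arrives, which is the synchronization advantage the abstract describes.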
  • 157
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 523-523 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 158
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 539-551 
    ISSN: 1531-5878
    Keywords: H₂ deconvolution filter ; envelope-constrained filter ; finite impulse response filter ; linear matrix inequality
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In this paper, we consider an envelope-constrained (EC) H₂ optimal finite impulse response (FIR) filtering problem. Our aim is to design a filter such that the H₂ norm of the filtering error transfer function is minimized, subject to the constraint that the filter output with a given input to the signal system is contained or bounded by a prescribed envelope. The filter design problem is formulated as a standard optimization problem with linear matrix inequality (LMI) constraints. Furthermore, by relaxing the H₂ norm constraint, we propose a robust EC FIR filter design algorithm based on the LMI approach.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 159
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 565-585 
    ISSN: 1531-5878
    Keywords: Two-dimensional system ; model conversion ; Roesser's model
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract This paper presents a geometric-series method for finding two-dimensional (2D) discrete-time (continuous-time) state-space models from 2D continuous-time (discrete-time) systems. This method allows the use of well-developed theorems and algorithms in the 2D discrete-time (continuous-time) domain to indirectly carry out the analysis and design of hybrid 2D composite systems.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 160
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 505-521 
    ISSN: 1531-5878
    Keywords: Multirate systems ; multirate signal processing ; filter banks ; optimal design
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The design of general nonuniform filter banks is studied. Contrary to uniform filter banks, in nonuniform filter banks it may not be possible to achieve perfect reconstruction, but in some cases acceptable filter banks can be designed by using optimization techniques. Here, the initial finite impulse response (FIR) analysis filters are designed according to the characteristics of the input. By the design procedure, the FIR synthesis filters are found so that the H-norm of an error system is minimized over all synthesis filters that have a prespecified order. Then, the synthesis filters obtained in the previous step are fixed, and the analysis filters are found similarly. By iteration, the H-norm of the error system decreases until it converges to its final value. At each iteration, the coefficients of the analysis or synthesis filters are obtained by finding the least-squares solution of a system of linear equations. If necessary, the frequency characteristics of the filters can be altered by adding penalty terms to the objective function.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 161
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 617-617 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 162
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 43-57 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract We consider multirate digital control systems that consist of an interconnection of a continuous-time nonlinear plant (described by ordinary differential equations) and a digital lifted controller (described by ordinary difference equations). The input to the digital controller consists of the multirate sampled output of the plant, and the input to the continuous-time plant consists of the multirate hold output of the digital controller. In this paper we show that when quantizer nonlinearities are neglected, then under reasonable conditions (which exclude the critical cases), the stability properties (in the Lyapunov sense) of the trivial solution of the nonlinear multirate digital control system can be deduced from the stability properties of the trivial solution of its linearization. We also point out that certain results involving quantization effects and stabilizing controllers can be established which are in the spirit of some existing results.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 163
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 59-73 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Chaotic systems provide a simple means of generating deterministic signals that resemble white noise. It is this noise-like property that provides the potential for applying chaotic systems in communications. In this work, we report a detailed study of the logistic map for use in generating direct-sequence spread-spectrum (DS/SS) codes. The advantages of the chaotic DS/SS codes are the almost unlimited number of distinct sequences of arbitrary lengths, the ease of generating these sequences, and the increased privacy afforded by the noise-like appearance of these sequences. Some design criteria are derived from the correlation properties of these sequences, and bit-error rate (BER) results are generated by Monte Carlo simulations.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
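    A minimal sketch of the kind of construction the abstract above describes: iterate the logistic map in its fully chaotic regime and threshold the orbit into a ±1 chip sequence. The threshold rule, seeds, sequence length, and escape guard here are illustrative assumptions, not the authors' design.

    ```python
    # Illustrative sketch (not the paper's exact construction): derive a binary
    # DS/SS spreading code from the logistic map x_{k+1} = mu * x_k * (1 - x_k)
    # with mu = 4 (chaotic regime), thresholding the orbit at 0.5.

    def logistic_code(seed: float, length: int, mu: float = 4.0) -> list:
        """Return a +/-1 spreading sequence derived from a logistic-map orbit."""
        x = seed
        code = []
        for _ in range(length):
            x = mu * x * (1.0 - x)                # logistic map iteration
            if x <= 0.0 or x >= 1.0:              # numerical escape guard
                x = 0.123456789                   # (measure-zero event in theory)
            code.append(1 if x >= 0.5 else -1)    # threshold to a binary chip
        return code

    def normalized_correlation(a, b):
        """Normalized zero-lag correlation between two chip sequences."""
        return sum(x * y for x, y in zip(a, b)) / len(a)

    c1 = logistic_code(seed=0.3141, length=4096)
    c2 = logistic_code(seed=0.2718, length=4096)  # different seed -> distinct code
    print("balance of c1:", normalized_correlation(c1, [1] * len(c1)))
    print("cross-correlation c1*c2:", normalized_correlation(c1, c2))
    ```

    Both printed values should be close to zero: the code is nearly balanced (noise-like), and two codes from different seeds are nearly uncorrelated, which is the property the abstract exploits for distinct spreading sequences.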
  • 164
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 75-84 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The asymptotic behavior of block floating-point and floating-point digital filters is analyzed. As a result, mantissa wordlength conditions are derived guaranteeing the absence of limit cycles in the regular dynamic range. Explicitly, the requirements are given for block floating-point state space filters with different quantization formats. Although these conditions are only sufficient, examples are given in which they are also necessary. In most cases the conditions are easily satisfied.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 165
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 85-85 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 166
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 87-88 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 167
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 189-190 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 168
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 169-181 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In this paper the Cramér-Rao bound (CRB) for a general nonparametric spectral estimation problem is derived under a local smoothness condition (more exactly, the spectrum is assumed to be well approximated by a piecewise constant function). Furthermore, it is shown that under the aforementioned condition the Thomson method (TM) and Daniell method (DM) for power spectral density (PSD) estimation can be interpreted as approximations of the maximum likelihood PSD estimator. Finally, the statistical efficiency of the TM and DM as nonparametric PSD estimators is examined and compared to the CRB for autoregressive moving-average (ARMA)-based PSD estimation. In particular, for broadband signals, the TM and DM almost achieve the derived nonparametric performance bound and can therefore be considered nearly optimal.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
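    The Daniell method mentioned in the abstract above smooths the raw periodogram by averaging adjacent frequency bins, which is straightforward to sketch. In the self-contained example below, the test signal and the window half-width m are illustrative choices; the direct O(N²) DFT is used only to keep the sketch dependency-free.

    ```python
    # Illustrative sketch of the Daniell method: average the periodogram over
    # 2m+1 neighbouring frequency bins (circularly) to reduce its variance.
    import cmath
    import math

    def periodogram(x):
        """Plain periodogram via a direct DFT (O(N^2), fine for a sketch)."""
        n = len(x)
        out = []
        for k in range(n):
            s = sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            out.append(abs(s) ** 2 / n)
        return out

    def daniell_psd(x, m=2):
        """Daniell estimate: moving average of the periodogram, window 2m+1."""
        p = periodogram(x)
        n = len(p)
        return [sum(p[(k + j) % n] for j in range(-m, m + 1)) / (2 * m + 1)
                for k in range(n)]

    # Deterministic broadband-ish test input: a fast chirp spreads energy
    # across many bins, so smoothing should flatten the raw periodogram.
    x = [math.sin(1000.0 * t * t) for t in range(64)]
    raw = periodogram(x)
    smooth = daniell_psd(x, m=3)
    ```

    Circular averaging preserves the total (and hence mean) power exactly while strictly reducing the bin-to-bin variance of the estimate, which is the variance/resolution trade-off behind the efficiency results in the abstract.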
  • 169
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 149-168 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The issue of stability of higher-order, single-stage Sigma-Delta (ΣΔ) modulators is addressed using a method from nonlinear system theory. As a result, theoretical bounds for the quantizer input of the modulators are derived. A new method for stabilizing the ΣΔ modulators is then presented. It uses the quantizer input bound for possible instability detection. Upon detection of such a state, the highest-order integrator is cut off, effectively reducing the order of the modulator, and thus resulting in a stable system. The method is easily implemented and results in a very good signal-to-noise ratio (SNR) and fast return to normal operation compared to other stabilization methods.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 170
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 225-239 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract The uncorrelated component analysis (UCA) of a stationary random vector process consists of searching for a linear transformation that minimizes the temporal correlation between its components. Through a general analysis we show that under practically reasonable and mild conditions UCA is a solution for blind source separation. The theorems proposed in this paper for UCA provide useful insights for developing practical algorithms. UCA explores the temporal information of the signals, whereas independent component analysis (ICA) explores the spatial information; thus UCA can be applied for source separation in some cases where ICA cannot. For blind source separation, combining ICA and UCA may give improved performance because more information can be utilized. The concept of single UCA (SUCA) is also proposed, which leads to sequential source separation.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 171
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. i 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 172
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 331-350 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract This paper analyzes the nonlinear phenomenon of rotating stall via the methods of projection [Elementary Stability and Bifurcation Theory, G. Iooss and D. D. Joseph, Springer-Verlag, 1980] and Lyapunov [J.-H. Fu, Math. Control Signal Systems, 7, 255–278, 1994]. A compressor model of Moore and Greitzer is adopted in which rotating stall dynamics are associated with Hopf bifurcations. Local stability for each pair of the critical modes is studied and characterized. It is shown that the local stability of individual pairs of the critical modes collectively determines the local stability of the compressor model. Explicit conditions are obtained for local stability of rotating stall, which offer new insight into the design and active control of axial flow compressors.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 173
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 407-429 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract We describe herein a new means of training dynamic multilayer nonlinear adaptive filters, or neural networks. We restrict our discussion to multilayer dynamic Volterra networks, which are structured so as to restrict their degrees of computational freedom, based on a priori knowledge about the dynamic operation to be emulated. The networks consist of linear dynamic filters together with nonlinear generalized single-layer subnets. We describe how a Newton-like optimization strategy can be applied to these dynamic architectures and detail a new modified Gauss-Newton optimization technique. The new training algorithm converges faster, and to a smaller value of cost, than backpropagation-through-time for a wide range of adaptive filtering applications. We apply the algorithm to modeling the inverse of a nonlinear dynamic tracking system. The superior performance of the algorithm over standard techniques is demonstrated.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 174
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 445-455 
    ISSN: 1531-5878
    Keywords: Linear-phase digital filters ; FIR integrators
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract In many signal processing situations, the desired (ideal) magnitude response of the filter is a rational function: $$\tilde H(\omega ) = |1/\omega |$$ (a digital integrator). The requirements of a linear phase response and guaranteed stable performance limit the design to a finite impulse response (FIR) structure. In many applications we require the FIR filter to yield a highly accurate magnitude response over a narrow band of frequencies, with maximal flatness at an arbitrary frequency ω₀ in the spectrum (0, π). No techniques for meeting such requirements with respect to the approximation of $$\tilde H(\omega )$$ are known in the literature. This paper suggests a design by which the linear-phase magnitude response $$|\tilde H(\omega )|$$ can be approximated by an FIR configuration giving a maximally flat (in the Butterworth sense) response at an arbitrary frequency ω₀, 0 < ω₀ < π. A technique to compute exact weights for the design is also given.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 175
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 524-524 
    ISSN: 1531-5878
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 176
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 525-537 
    ISSN: 1531-5878
    Keywords: Random sampling ; digital signal processing ; spectral estimation ; computational algorithm
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Random sampling is one of the methods used to achieve sub-Nyquist sampling. This paper proposes a novel algorithm to evaluate the circular autocorrelation of a randomly sampled sequence, from which its power density spectrum can be obtained. With uniform sampling, the size of each lag (the step size) for computing an autocorrelation of a sequence is the same as the sampling period. When random sampling is adopted, the step size should be chosen such that the highest-frequency component of interest contained in a sequence can be accommodated. To find overlaps between a time sequence and its shifted version, an appropriate window is opened in one of the time sequences. To speed up the process, a marker is set to limit the range of searching for overlaps. The proposed method of estimating the power spectrum via autocorrelation is comparable, in terms of accuracy and signal-to-noise ratio (SNR), to the conventional point rule. The techniques introduced can also be applied to other operations on randomly sampled sequences.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
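    A sketch in the spirit of the windowed-overlap idea in the abstract above: pairs of irregularly spaced samples whose time difference falls within a half-step window of lag k·step contribute to the lag-k autocorrelation estimate. The acceptance tolerance, step size, and sinusoidal test signal are illustrative assumptions, not the paper's algorithm (which additionally uses a search marker for speed).

    ```python
    # Illustrative sketch: autocorrelation estimation from randomly sampled data
    # by binning sample pairs whose time difference is close to a multiple of
    # the chosen step size.
    import math
    import random

    def random_autocorr(times, values, step, max_lag, tol=None):
        """Estimate r[k] ~ E[x(t) * x(t + k*step)] from irregular samples."""
        tol = step / 2.0 if tol is None else tol   # acceptance window per lag
        sums = [0.0] * (max_lag + 1)
        counts = [0] * (max_lag + 1)
        for ti, vi in zip(times, values):
            for tj, vj in zip(times, values):
                d = tj - ti
                if d < -tol:                        # keep non-negative lags only
                    continue
                k = int(round(d / step))
                if 0 <= k <= max_lag and abs(d - k * step) <= tol:
                    sums[k] += vi * vj
                    counts[k] += 1
        return [s / c if c else 0.0 for s, c in zip(sums, counts)]

    random.seed(1)                                  # reproducible sample times
    freq, step = 0.05, 1.0                          # test tone: 0.05 cycles/unit
    times = sorted(random.uniform(0.0, 400.0) for _ in range(400))
    values = [math.cos(2.0 * math.pi * freq * t) for t in times]
    r = random_autocorr(times, values, step, max_lag=10)
    ```

    For a unit-amplitude cosine the true autocorrelation is 0.5·cos(2π·freq·k·step), so r[0] should be near 0.5 and r[10] near −0.5; the window tolerance smears these values only slightly at this frequency.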
  • 177
    Electronic Resource
    Electronic Resource
    Springer
    Circuits, systems and signal processing 18 (1999), S. 587-609 
    ISSN: 1531-5878
    Keywords: Transfinite graphs ; infinite electrical networks ; current flows through infinity
    Source: Springer Online Journal Archives 1860-2000
    Topics: Electrical Engineering, Measurement and Control Technology
    Notes: Abstract Transfinite electrical networks of ranks larger than 1 have previously been defined by arbitrarily joining together various infinite extremities through transfinite nodes that are independent of the networks' resistance values. Thus, some or all of those transfinite nodes may remain ineffective in transmitting current “through infinity.” In this paper, transfinite nodes are defined in terms of the paths that permit currents to “reach infinity.” This is accomplished by defining a suitable metric d_v on the node set N_{S_v} of each v-section S_v, a v-section being a maximal subnetwork whose nodes are connected by two-ended paths of ranks no larger than v. Upon taking the completion of N_{S_v} under that metric d_v, we identify those extremities (now called v-terminals) that are accessible to current flows. These are used to define transfinite nodes that combine such extremities. The construction is recursive and is carried out through all the natural-number ranks, and then through the first arrow rank and the first limit-ordinal rank ω. The recursion can be carried still further. All this provides a more natural development of transfinite networks and indeed simplifies the theory of electrical behavior for such networks.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 178
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 179
    ISSN: 1539-6924
    Keywords: Cancer dose–response modeling ; multistage model ; two-stage model ; hazard functions ; carcinogenesis ; Benzene ; Dieldrin ; Ethylene Thiourea ; Trichloroethylene ; Vinyl Chloride
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract We compare the regulatory implications of applying the traditional (linearized) and exact two-stage dose–response models to animal carcinogenicity data. We analyze dose–response data from six studies, representing five different substances, and we determine the “goodness-of-fit” of each model as well as the 95% confidence lower limit of the dose corresponding to a target excess risk of 10⁻⁵ (the target risk dose, TRD). For the two concave datasets, we find that the exact model gives a substantially better fit to the data than the traditional model, and that the exact model gives a TRD that is an order of magnitude lower than that given by the traditional model. In the other cases, the exact model gives a fit equivalent to or better than the traditional model. We also show that although the exact two-stage model may exhibit dose–response concavity at moderate dose levels, it is always linear or sublinear, and never supralinear, in the low-dose limit. Because regulatory concern is almost always confined to extrapolation in the low-dose region, supralinear behavior seems not to be of regulatory concern in the exact two-stage model. Finally, we find that when performing this low-dose extrapolation in cases of dose–response concavity, extrapolating the model fit leads to a more conservative TRD than taking a linear extrapolation from 10% excess risk. We conclude with a set of recommendations.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 180
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 69-81 
    ISSN: 1539-6924
    Keywords: Parameters ; probability distributions ; validity
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract In any model the values of estimates for various parameters are obtained from different sources each with its own level of uncertainty. When the probability distributions of the estimates are obtained as opposed to point values only, the measurement uncertainties in the parameter estimates may be addressed. However, the sources used for obtaining the data and the models used to select appropriate distributions are of differing degrees of uncertainty. A hierarchy of different sources of uncertainty based upon one's ability to validate data and models empirically is presented. When model parameters are aggregated with different levels of the hierarchy represented, this implies distortion or degradation in the utility and validity of the models used. Means to identify and deal with such heterogeneous data sources are explored, and a number of approaches to addressing this problem is presented. One approach, using Range/Confidence Estimates coupled with an Information Value Analysis Process, is presented as an example.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 181
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1-2 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 182
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 183
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 33-42 
    ISSN: 1539-6924
    Keywords: Uncertainty ; model uncertainty ; epistemic uncertainty ; integrated assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The characterization and treatment of uncertainty poses special challenges when modeling indeterminate or complex coupled systems such as those involved in the interactions between human activity, climate, and the ecosystem. Uncertainty about model structure may become as important as, or more important than, uncertainty about parameter values. When uncertainty grows so large that prediction or optimization no longer makes sense, it may still be possible to use the model as a “behavioral test bed” to examine the relative robustness of alternative observational and behavioral strategies. When models must be run into portions of their phase space that are not well understood, different submodels may become unreliable at different rates. A common example involves running a time-stepped model far into the future. Several strategies can be used to deal with such situations. The probability of model failure can be reported as a function of time. Possible alternative “surprises” can be assigned probabilities, modeled separately, and combined. Finally, through the use of subjective judgments, one may be able to combine, and over time shift between, models, moving from more detailed to progressively simpler order-of-magnitude models, and perhaps ultimately on to simple bounding analysis.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 184
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 187-203 
    ISSN: 1539-6924
    Keywords: Combining probabilities ; expert judgment ; probability assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper concerns the combination of experts' probability distributions in risk analysis, discussing a variety of combination methods and attempting to highlight the important conceptual and practical issues to be considered in designing a combination process in practice. The role of experts is important because their judgments can provide valuable information, particularly in view of the limited availability of “hard data” regarding many important uncertainties in risk analysis. Because uncertainties are represented in terms of probability distributions in probabilistic risk analysis (PRA), we consider expert information in terms of probability distributions. The motivation for the use of multiple experts is simply the desire to obtain as much information as possible. Combining experts' probability distributions summarizes the accumulated information for risk analysts and decision-makers. Procedures for combining probability distributions are often compartmentalized as mathematical aggregation methods or behavioral approaches, and we discuss both categories. However, an overall aggregation process could involve both mathematical and behavioral aspects, and no single process is best in all circumstances. An understanding of the pros and cons of different methods and the key issues to consider is valuable in the design of a combination process for a specific PRA. The output, a “combined probability distribution,” can ideally be viewed as representing a summary of the current state of expert opinion regarding the uncertainty of interest.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 185
    ISSN: 1539-6924
    Keywords: Hormesis ; U-shaped ; adaptive response ; low dose ; β-curve ; stimulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract From a comprehensive search of the literature, the hormesis phenomenon was found to occur over a wide range of chemicals, taxonomic groups, and endpoints. By use of computer searches and extensive cross-referencing, nearly 3000 potentially relevant articles were identified. Evidence of chemical and radiation hormesis was judged to have occurred in approximately 1000 of these by use of a priori criteria. These criteria included study design features (e.g., number of doses, dose range), dose–response relationship, statistical analysis, and reproducibility of results. Numerous biological endpoints were assessed, with growth responses the most prevalent, followed by metabolic effects, reproductive responses, longevity, and cancer. Hormetic responses were generally observed to be of limited magnitude with an average maximum stimulation of 30 to 60 percent over that of the controls. This maximum usually occurred 4- to 5-fold below the NOAEL for a particular endpoint. The present analysis suggests that hormesis is a reproducible and generalizable biological phenomenon and is a fundamental component of many, if not most, dose–response relationships. The relatively infrequent observation of hormesis in the literature is believed to be due primarily to experimental design considerations, especially with respect to the number and range of doses and endpoint selection. Because of regulatory considerations, most toxicologic studies have been carried out at high doses above the low-dose region where the hormesis phenomenon occurs.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 186
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 323-326 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 187
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 327-334 
    ISSN: 1539-6924
    Keywords: Biological introductions ; binucleate Rhizoctonia ; biocontrol ; risk assessment ; seedlings ; susceptibility
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This article describes an application of a method for assessing risks associated with the introduction of an organism into a new environment. The test organism was a binucleate Rhizoctonia fungal isolate that has potential for commercial development as a biological control agent for damping-off diseases in bedding plants. A test sample of host plant species was selected using the centrifugal phylogenetic host range principles, but with an emphasis on economic species. The effect of the fungus on the plant was measured for each species and expressed on a logarithmic scale. The effects on weights of shoots and roots per container were not normally distributed, nor were the effects on the number of plants standing (those which survived). Statements about the effect on the number standing and the shoot weight per container involved using the observed (empirical) distribution. This is illustrated with an example. Problems were encountered in defining the population of species at risk, and in deciding how this population should be formally sampled. The limitations of the method are discussed.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 188
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract This paper discusses a successful public involvement effort that addressed and resolved several highly controversial water management issues involving environmental and flood risks associated with an electrical generation facility in British Columbia. It begins with a discussion of concepts for designing public involvement, summarizing research that indicates why individuals and groups may find it difficult to make complex choices. Reasons for public involvement, and the range of current practices are discussed. Next, four principles for designing group decision process are outlined, emphasizing decision-aiding concepts that include “value-focused thinking” and “adaptive management.” The next sections discuss the Alouette River Stakeholder Committee process in terms of objectives, participation, process, methods for structuring values and creating alternatives, information sources, and results. Discussion and conclusions complete the paper.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 189
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 417-426 
    ISSN: 1539-6924
    Keywords: Pathway analysis ; radiological risk assessment ; dose assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Many different radionuclides have been released to the environment from the Savannah River Site (SRS) during the facility's operational history. However, as shown by this analysis, only a small number of the released radionuclides have been significant contributors to potential doses and risks to off-site people. This article documents the radiological critical contaminant/critical pathway analysis performed for SRS. If site missions and operations remain constant over the next 30 years, only tritium oxide releases are projected to exceed a maximally exposed individual (MEI) risk of 1.0E-06 for either the airborne or liquid pathways. The critical exposure pathways associated with site airborne releases are inhalation and vegetation consumption, whereas the critical exposure pathways associated with liquid releases are drinking water and fish consumption. For the SRS-specific, nontypical exposure pathways (i.e., recreational fishing and deer and hog hunting), cesium-137 is the critical radionuclide.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 190
    ISSN: 1539-6924
    Keywords: nuclear weapons sites ; accelerated cleanup ; economic impact ; Savannah River ; Rocky Flats ; Hanford ; INEEL ; Oak Ridge ; Los Alamos ; Sandia
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The regional economic impacts of the U.S. Department of Energy's accelerated environmental cleanup plan are estimated for the major nuclear weapons sites in Colorado, Idaho, New Mexico, South Carolina, Tennessee, and Washington. The analysis shows that the impact falls heavily on the three relatively rural regions around the Savannah River (SC), Hanford (WA), and Idaho National Engineering and Environmental Laboratory (ID) sites. A less aggressive phase-down of environmental management funds and separate funds to invest in education and infrastructure in the regions helps buffer the impacts on jobs, personal income, and gross regional product. Policy options open to the federal and state and local governments are discussed.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 191
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 649-659 
    ISSN: 1539-6924
    Keywords: risk perception ; risk communication ; intuitive toxicology ; mental models
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The concept of exposure is central to chemical risk assessment and plays an important role in communicating to the public about the potential health risks of chemicals. Research on chemical risk perception has found some indication that the model lay people use to judge chemical exposure differs from that of toxicologists, thereby leading to different conclusions about chemical safety. This paper presents the results of a series of studies directed toward developing a model for understanding how lay people interpret the concept of chemical exposure. The results indicate that people's beliefs about chemical exposure (and its risks) are based on two broad categories of inferences. One category of inferences relates to the nature in which contact with a chemical has taken place, including the amount of a chemical involved and its potential health consequences. A second category of inferences about chemical exposure relates to the pragmatics of language interpretation, leading to beliefs about the motives and purposes behind chemical risk communication. Risk communicators are encouraged to consider how alternative models of exposure and language interpretation can lead to conflicting conclusions on the part of the public about chemical safety.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 192
    ISSN: 1539-6924
    Keywords: Cancer model ; cell proliferation ; two-stage model ; approximate solution ; MVK model ; hazard rate
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The approximate solution of the two-stage clonal expansion model of cancer may substantially deviate from the exact solution, and may therefore lead to erroneous conclusions in particular applications. However, for time-varying parameters the exact solution (method of characteristics) is not easy to implement, hampering the accessibility of the model to nonmathematicians. Based on intuitive reasoning, Clewell et al. (1995) proposed an improved approximate solution that is easy to implement whatever time-varying behavior the parameters may have. Here we provide the mathematical foundation for the approximation suggested by Clewell et al. (1995) and show that, after a slight modification, it is in fact an exact solution for the case of time-constant parameters. We were not able to prove that it is an exact solution for time-varying parameters as well. However, several computer simulations showed that the numerical results do not differ from the exact solution as proposed by Moolgavkar and Luebeck (1990). The advantage of this alternative solution is that the hazard rate of the first malignant cell can be evaluated by numerically integrating a single differential equation.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
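To make the distinction between approximate and exact solutions concrete, the sketch below evaluates the widely used *approximate* hazard of the two-stage clonal expansion model for constant parameters — the kind of closed form the abstract warns can deviate from the exact solution. The parameter values are invented for demonstration.

```python
import math

# Approximate two-stage (MVK-type) hazard for constant parameters:
#   h(t) = mu1 * mu2 * X * (exp((alpha - beta) * t) - 1) / (alpha - beta)
# with X normal cells, mu1/mu2 the two mutation rates, and alpha/beta the
# division/death rates of intermediate cells. Parameter values are
# illustrative only.

def approx_two_stage_hazard(t, X, mu1, mu2, alpha, beta):
    g = alpha - beta  # net clonal growth rate of intermediate cells
    return mu1 * mu2 * X * (math.exp(g * t) - 1.0) / g

h = approx_two_stage_hazard(t=50.0, X=1e7, mu1=1e-7, mu2=1e-7,
                            alpha=0.1, beta=0.09)
```

The exact solution replaces this closed form with numerical integration of a differential equation, which is what makes an easy-to-implement yet exact formulation valuable.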
  • 193
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 43-46 
    ISSN: 1539-6924
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 194
    ISSN: 1539-6924
    Keywords: risk-tradeoff analysis ; building codes ; housing ; health effects ; QALY
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Residential building codes intended to promote health and safety may produce unintended countervailing risks by adding to the cost of construction. Higher construction costs increase the price of new homes and may increase health and safety risks through “income” and “stock” effects. The income effect arises because households that purchase a new home have less income remaining for spending on other goods that contribute to health and safety. The stock effect arises because suppression of new-home construction leads to slower replacement of less safe housing units. These countervailing risks are not presently considered in code debates. We demonstrate the feasibility of estimating the approximate magnitude of countervailing risks by combining the income effect with three relatively well understood and significant home-health risks. We estimate that a code change that increases the nationwide cost of constructing and maintaining homes by $150 (0.1% of the average cost to build a single-family home) would induce offsetting risks yielding between 2 and 60 premature fatalities or, including morbidity effects, between 20 and 800 lost quality-adjusted life years (both discounted at 3%) each year the code provision remains in effect. To provide a net health benefit, the code change would need to reduce risk by at least this amount. Future research should refine these estimates, incorporate quantitative uncertainty analysis, and apply a full risk-tradeoff approach to real-world case studies of proposed code changes.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 195
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1071-1076 
    ISSN: 1539-6924
    Keywords: upper confidence limit ; likelihood-based confidence limit ; multistage carcinogenesis model ; Monte Carlo simulation
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The estimation of health risks from exposure to a mixture of chemical carcinogens is generally based on the combination of information from several available single compound studies. The current practice of directly summing the upper bound risk estimates of individual carcinogenic components as an upper bound on the total risk of a mixture is known to be generally too conservative. Gaylor and Chen (1996, Risk Analysis) proposed a simple procedure to compute an upper bound on the total risk using only the upper confidence limits and central risk estimates of the individual carcinogens. The Gaylor-Chen procedure was derived under the assumption that the distributions of the individual risk estimates are normal. In this paper we evaluate the Gaylor-Chen approach in terms of its coverage probability, which depends on the coverages of the upper confidence limits on the true risks of the individual carcinogens. In general, if the coverage probabilities for the individual carcinogens are all approximately equal to the nominal level, then the Gaylor-Chen approach should perform well. However, it can be conservative or anti-conservative if some or all of the individual upper confidence limit estimates are conservative or anti-conservative.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
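The following sketch illustrates a Gaylor-Chen style combination under the normality assumption the abstract mentions: the upper bound on the summed risk adds the central estimates to the root-sum-of-squares of the individual confidence-limit margins, rather than summing the upper limits directly. The exact formula and all risk numbers here are illustrative assumptions, not quotations from the paper.

```python
import math

# Hypothetical Gaylor-Chen style upper bound on the total risk of a
# mixture, assuming approximately normal, independent risk estimates.
# central[i] / upper[i]: central estimate and upper confidence limit for
# carcinogen i. Values are invented for demonstration.

def combined_upper_limit(central, upper):
    margins = [u - c for c, u in zip(central, upper)]
    return sum(central) + math.sqrt(sum(m * m for m in margins))

central = [1e-6, 2e-6, 5e-7]
upper   = [3e-6, 5e-6, 2e-6]
u_total = combined_upper_limit(central, upper)
naive   = sum(upper)  # direct summation, generally too conservative
```

The combined limit falls between the sum of the central estimates and the naive sum of the upper limits, which is the behavior that motivates the procedure.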
  • 196
    ISSN: 1539-6924
    Keywords: dose-response ; models ; food-borne ; pathogens ; risk assessment
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Food-related illness in the United States is estimated to affect over six million people per year and cost the economy several billion dollars. These illnesses and costs could be reduced if minimum infectious doses were established and used as the basis of regulations and monitoring. However, standard methodologies for dose-response assessment are not yet formulated for microbial risk assessment. The objective of this study was to compare dose-response models for food-borne pathogens and determine which models were most appropriate for a range of pathogens. The statistical models proposed in the literature and chosen for comparison purposes were log-normal, log-logistic, exponential, β-Poisson and Weibull-Gamma. These were fit to four data sets also taken from published literature — Shigella flexneri, Shigella dysenteriae, Campylobacter jejuni, and Salmonella typhosa — using the method of maximum likelihood. The Weibull-Gamma, the only model with three parameters, was also the only model capable of fitting all the data sets examined. Infectious doses were also calculated using each model. Within any given data set, estimates of the dose infecting one percent of the population varied across models by as little as one order of magnitude and by as much as nine orders of magnitude, illustrating the differences in extrapolation of the dose-response models. More data are needed to compare models and examine extrapolation from high to low doses for food-borne pathogens.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
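To show the kind of fitting procedure the abstract describes, the sketch below fits the one-parameter exponential dose-response model P(d) = 1 − exp(−r·d) by maximum likelihood, using a simple grid search for clarity. The dose groups and infection counts are invented, not the published Shigella, Campylobacter, or Salmonella data.

```python
import math

# Maximum-likelihood fit of the exponential dose-response model
# P(d) = 1 - exp(-r * d) to binomial dose-group data (grid search over r
# for clarity). All data values are illustrative.

def neg_log_lik(r, data):
    nll = 0.0
    for dose, n, k in data:  # n subjects, k infected at this dose
        p = 1.0 - math.exp(-r * dose)
        p = min(max(p, 1e-12), 1.0 - 1e-12)  # guard against log(0)
        nll -= k * math.log(p) + (n - k) * math.log(1.0 - p)
    return nll

data = [(10, 10, 1), (100, 10, 4), (1000, 10, 9)]  # (dose, n, infected)
rs = [10 ** (e / 50.0) for e in range(-400, 0)]    # log-spaced grid over r
r_hat = min(rs, key=lambda r: neg_log_lik(r, data))
# Dose estimated to infect 1% of the population under the fitted model:
id01 = -math.log(1 - 0.01) / r_hat
```

The β-Poisson and Weibull-Gamma models add parameters (and flexibility) to the same likelihood machinery, which is why low-dose extrapolations can diverge by orders of magnitude across models.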
  • 197
    ISSN: 1539-6924
    Keywords: ethylene oxide ; risk assessment ; epidemiology ; cancer guidelines
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Ethylene oxide (EO) research has significantly increased since the 1980s, when regulatory risk assessments were last completed on the basis of the animal cancer chronic bioassays. In tandem with the new scientific understanding, there have been evolutionary changes in regulatory risk assessment guidelines, that encourage flexibility and greater use of scientific information. The results of an updated meta-analysis of the findings from 10 unique EO study cohorts from five countries, including nearly 33,000 workers, and over 800 cancers are presented, indicating that EO does not cause increased risk of cancers overall or of brain, stomach or pancreatic cancers. The findings for leukemia and non-Hodgkin's lymphoma (NHL) are inconclusive. Two studies with the requisite attributes of size, individual exposure estimates and follow up are the basis for dose-response modeling and added lifetime risk predictions under environmental and occupational exposure scenarios and a variety of plausible alternative assumptions. A point of departure analysis, with various margins of exposure, is also illustrated using human data. The two datasets produce remarkably similar leukemia added risk predictions, orders of magnitude lower than prior animal-based predictions under conservative, default assumptions, with risks on the order of 1 × 10−6 or lower for exposures in the low ppb range. Inconsistent results for “lymphoid” tumors, a non-standard grouping using histologic information from death certificates, are discussed. This assessment demonstrates the applicability of the current risk assessment paradigm to epidemiological data.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 198
    ISSN: 1539-6924
    Keywords: physiologically-based toxicokinetics ; empirical Bayes ; MAP estimation ; mathematical model ; toluene ; error analysis
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Physiologically-based toxicokinetic (PBTK) models are widely used to quantify whole-body kinetics of various substances. However, since they attempt to reproduce anatomical structures and physiological events, they have a high number of parameters. Their identification from kinetic data alone is often impossible, and other information about the parameters is needed to render the model identifiable. The most commonly used approach consists of independently measuring, or taking from literature sources, some of the parameters, fixing them in the kinetic model, and then performing model identification on a reduced number of less certain parameters. This results in a substantial reduction of the degrees of freedom of the model. In this study, we show that this method results in final estimates of the free parameters whose precision is overestimated. We then compared this approach with an empirical Bayes approach, which takes into account not only the mean value, but also the error associated with the independently determined parameters. Blood and breath 2H8-toluene washout curves, obtained in 17 subjects, were analyzed with a previously presented PBTK model suitable for person-specific dosimetry. Model parameters with the greatest effect on predicted levels were alveolar ventilation rate QPC, fat tissue fraction VFC, blood-air partition coefficient Kb, fraction of cardiac output to fat Qa/co and rate of extrahepatic metabolism Vmax-p. Differences in the measured and Bayesian-fitted values of QPC, VFC and Kb were significant (p < 0.05), and the precision of the fitted values Vmax-p and Qa/co went from 11 ± 5% to 75 ± 170% (NS) and from 8 ± 2% to 9 ± 2% (p < 0.05) respectively. The empirical Bayes approach did not result in less reliable parameter estimates: rather, it pointed out that the precision of parameter estimates can be overly optimistic when other parameters in the model, either directly measured or taken from literature sources, are treated as known without error. In conclusion, an empirical Bayes approach to parameter estimation resulted in a better model fit, different final parameter estimates, and more realistic parameter precisions.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
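The core idea the abstract contrasts — fixing an independently measured parameter versus penalizing deviation from its measured value — can be sketched with MAP (maximum a posteriori) estimation on a toy model. A one-parameter linear model stands in for the PBTK model here; all numbers are invented.

```python
# MAP estimation sketch: a Gaussian prior centered on the independently
# measured parameter value replaces treating that value as known without
# error. Toy model y = theta * x; all numbers are illustrative.

def map_objective(theta, data, prior_mean, prior_sd, noise_sd):
    """Negative log posterior (up to a constant)."""
    sse = sum((y - theta * x) ** 2 for x, y in data)
    return (sse / (2 * noise_sd ** 2)
            + (theta - prior_mean) ** 2 / (2 * prior_sd ** 2))

data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]
grid = [i / 1000.0 for i in range(1000, 3000)]
# Tight prior (measured value 1.8 +/- 0.05) pulls the estimate toward it:
theta_map = min(grid, key=lambda t: map_objective(t, data, 1.8, 0.05, 0.5))
# A nearly flat prior recovers the pure least-squares fit:
theta_mle = min(grid, key=lambda t: map_objective(t, data, 1.8, 1e6, 0.5))
```

With the tight prior, the posterior estimate is a precision-weighted compromise between the data and the measurement — the error in the measured parameter propagates into the fit instead of being ignored.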
  • 199
    ISSN: 1539-6924
    Keywords: computer animation ; probability plots ; mixture models ; LogNormal distributions
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract Risk assessors often use different probability plots as a way to assess the fit of a particular distribution or model by comparing the plotted points to a straight line and to obtain estimates of the parameters in parametric distributions or models. When empirical data do not fall in a sufficiently straight line on a probability plot, and when no other single parametric distribution provides an acceptable (graphical) fit to the data, the risk assessor may consider a mixture model with two component distributions. Animated probability plots are a way to visualize the possible behaviors of mixture models with two component distributions. When no single parametric distribution provides an adequate fit to an empirical dataset, animated probability plots can help an analyst pick some plausible mixture models for the data based on their qualitative fit. After using animations during exploratory data analysis, the analyst must then use other statistical tools, including but not limited to: Maximum Likelihood Estimation (MLE) to find the optimal parameters, Goodness of Fit (GoF) tests, and a variety of diagnostic plots to check the adequacy of the fit. Using a specific example with two LogNormal components, we illustrate the use of animated probability plots as a tool for exploring the suitability of a mixture model with two component distributions. Animations work well with other types of probability plots, and they may be extended to analyze mixture models with three or more component distributions.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
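The sketch below generates the kind of data a two-component LogNormal mixture describes, the scenario the abstract uses for its animated probability plots. Component parameters and the mixing weight are illustrative assumptions.

```python
import random
import statistics

# Samples from a two-component LogNormal mixture: each draw comes from
# component 1 with probability w, else from component 2. Parameters are
# invented for demonstration.

def sample_lognormal_mixture(n, w, mu1, sigma1, mu2, sigma2, seed=1):
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        if rng.random() < w:
            out.append(rng.lognormvariate(mu1, sigma1))
        else:
            out.append(rng.lognormvariate(mu2, sigma2))
    return out

xs = sample_lognormal_mixture(5000, w=0.7, mu1=0.0, sigma1=0.3,
                              mu2=2.0, sigma2=0.5)
# On a LogNormal probability plot such data bend away from a straight
# line; the sample median falls between the two component medians
# (exp(0) = 1 and exp(2) ~ 7.39).
med = statistics.median(xs)
```

Plotting sorted log-values against normal quantiles (the probability plot the abstract animates) would show the characteristic kink that no single LogNormal can fit.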
  • 200
    Electronic Resource
    Electronic Resource
    Springer
    Risk analysis 19 (1999), S. 1193-1204 
    ISSN: 1539-6924
    Keywords: multimedia modeling ; uncertainty ; variability ; exposure efficiency ; toxicity scoring ; toxics release inventory (TRI) ; life cycle assessment (LCA)
    Source: Springer Online Journal Archives 1860-2000
    Topics: Energy, Environment Protection, Nuclear Power Engineering
    Notes: Abstract The human toxicity potential, a weighting scheme used to evaluate toxic emissions for life cycle assessment and toxics release inventories, is based on potential dose calculations and toxicity factors. This paper evaluates the variance in potential dose calculations that can be attributed to the uncertainty in chemical-specific input parameters as well as the variability in exposure factors and landscape parameters. A knowledge of the uncertainty allows us to assess the robustness of a decision based on the toxicity potential; a knowledge of the sources of uncertainty allows us to focus our resources if we want to reduce the uncertainty. The potential dose of 236 chemicals was assessed. The chemicals were grouped by dominant exposure route, and a Monte Carlo analysis was conducted for one representative chemical in each group. The variance is typically one to two orders of magnitude. For comparison, the point estimates in potential dose for 236 chemicals span ten orders of magnitude. Most of the variance in the potential dose is due to chemical-specific input parameters, especially half-lives, although exposure factors such as fish intake and the source of drinking water can be important for chemicals whose dominant exposure is through indirect routes. Landscape characteristics are generally of minor importance.
    Type of Medium: Electronic Resource
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
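A toy Monte Carlo run can illustrate the variance-attribution idea in the abstract above: fix one input at its central value and see how much of the output variance disappears. The multiplicative dose expression and the lognormal input spreads below are invented placeholders, not the paper's multimedia model.

```python
import math
import random

# Toy Monte Carlo variance attribution for a multiplicative
# potential-dose expression. Input distributions are illustrative.

def simulate(n, sd_halflife, sd_intake, seed=7):
    rng = random.Random(seed)
    doses = []
    for _ in range(n):
        halflife = rng.lognormvariate(0.0, sd_halflife)  # chemical property
        intake = rng.lognormvariate(0.0, sd_intake)      # exposure factor
        doses.append(math.log10(halflife * intake))      # log potential dose
    return doses

def var(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

base = simulate(20000, sd_halflife=1.5, sd_intake=0.4)
# Fixing the uncertain half-life at its central value removes most of
# the output variance, mirroring the finding that chemical-specific
# parameters (especially half-lives) dominate:
fixed_halflife = simulate(20000, sd_halflife=0.0, sd_intake=0.4)
```

The same fix-one-input-at-a-time comparison, run over each input in turn, indicates where resources for uncertainty reduction are best spent.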