ALBERT

All Library Books, journals and Electronic Records Telegrafenberg


  • 1
    Publication Date: 2011-08-24
    Description: The Mayak Production Association was the first Russian site for the production and separation of plutonium. The extensive increase in plutonium production during 1948-1955, as well as the absence of reliable waste-management technology, resulted in significant releases of liquid radioactive effluent into the rather small Techa River. This resulted in chronic external and internal exposure of about 30,000 residents of riverside communities; these residents form the cohort of an epidemiologic investigation. Analysis of the available historical monitoring data indicates that the following reliable data sets can be used for reconstruction of doses received during the early periods of operation of the Mayak Production Association: Temporal pattern of specific beta activity of river water for several sites in the upper Techa region since July 1951; average annual values of specific beta activity of river water and bottom sediments as a function of downstream distance for the whole river since 1951; external gamma-exposure rates near the shoreline as a function of downstream distance for the whole Techa River since 1952; and external gamma-exposure rate as a function of distance from the shoreline for several sites in the upper and middle Techa since 1951.
    Keywords: Environment Pollution
    Type: Health physics (ISSN 0017-9078); Volume 76; 6; 605-18
    Format: text
  • 2
    Publication Date: 2011-08-24
    Description: The Techa River (Southern Urals, Russia) was contaminated in 1949-1956 by liquid radioactive wastes from the Mayak complex, the first Russian facility for the production of plutonium. The measurements of environmental contamination were started in 1951. A simple model describing radionuclide transport along the free-flowing river and the accumulation of radionuclides by bottom sediments is presented. This model successfully correlates the rates of radionuclide releases as reconstructed by the Mayak experts, hydrological data, and available environmental monitoring data for the early period of contamination (1949-1951). The model was developed to reconstruct doses for people who lived in the riverside communities during the period of the releases and who were chronically exposed to external and internal irradiation. The model fills the data gaps and permits reconstruction of external gamma-exposure rates in air on the river bank and radionuclide concentrations in river water used for drinking and other household needs in 1949-1951.
    Keywords: Environment Pollution
    Type: Health physics (ISSN 0017-9078); Volume 77; 2; 142-9
    Format: text
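The transport idea in the record above, steady releases advected down a free-flowing river with first-order removal of radionuclides to bottom sediments, can be sketched in a few lines. This is an illustrative simplification, not the authors' actual model; the function name and all numbers are hypothetical.

```python
import math

def downstream_concentration(c0, x_km, velocity_km_d, loss_rate_per_d):
    """Radionuclide activity after travelling x_km downstream, assuming a
    steady release and first-order removal to bottom sediments."""
    travel_time_d = x_km / velocity_km_d          # days in transit
    return c0 * math.exp(-loss_rate_per_d * travel_time_d)

# Illustrative numbers only (not measured Techa River values):
c_upper = 100.0                                   # relative activity at release point
print(downstream_concentration(c_upper, 50.0, 10.0, 0.2))
```

With a 10 km/day flow and a 0.2/day removal rate, activity 50 km downstream falls to about 37% of its upstream value; fitting such loss rates to monitoring data is the kind of gap-filling the model performs.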
  • 3
    Publication Date: 2009-05-04
    Description: This presentation looks at logic design from early in the US Space Program, examines faults in recent logic designs, and gives some examples from the analysis of new tools and techniques.
    Keywords: Computer Programming and Software
    Format: text
  • 4
    Publication Date: 2004-12-03
    Description: The Hydrodynamic Focusing Bioreactor (HDFB) technology is designed to provide a flow field with nearly uniform shear force throughout the vessel, which can provide the desired low-shear spatial environment to suspend three-dimensional cell aggregates while providing optimum mass transfer. The reactor vessel consists of a dome-shaped cell culture vessel, a viscous spinner, an access port, and a rotating base. The domed vessel face has a radius of R(o) and rotates at Omega(o) rpm, while the internal viscous spinner has a radius of R(i) and rotates at Omega(i) rpm. The culture vessel is completely filled with cell culture medium into which three-dimensional cellular structures are introduced. The HDFB domed vessel and spinner were driven by two independent step motors.
    Keywords: Instrumentation and Photography
    Type: KC-135 and Other Microgravity Simulations; 62-64; NASA/CR-1999-208922
    Format: text
  • 5
    Publication Date: 2004-12-03
    Description: Coherent Doppler lidar is a promising technique for the global measurements of winds using a space-based platform. Doppler lidar produces estimates of the radial component of the velocity vector averaged over the resolution volume of the measurement. Profiles of the horizontal vector winds are produced by scanning the lidar beam or stepping the lidar beam through a sequence of different angles (step-stare). The first design for space-based measurements proposed a conical scan which requires a high power laser to produce acceptable signal levels for every laser pulse. Performance is improved by fixing the laser beam and accumulating the signal from many lidar pulses for each range-gate. This also improves the spatial averaging of the wind estimates and reduces the threshold signal energy required for a good estimate. Coherent Doppler lidar performance for space-based operation is determined using computer simulations and including the wind variability over the measurement volume as well as the variations of the atmospheric aerosol backscatter.
    Keywords: Instrumentation and Photography
    Type: Tenth Biennial Coherent Laser Radar Technology and Applications Conference; 298-301; NASA/CP-1999-209758
    Format: text
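The accumulation step described above, fixing the beam and summing the return from many pulses per range gate, improves the velocity estimate because incoherently averaged spectra suppress noise. A minimal sketch of the standard square-root scaling, with the function name and numbers hypothetical:

```python
import math

def accumulated_snr(single_pulse_snr, n_pulses):
    """Incoherent accumulation of n_pulses range-gate spectra: the
    effective estimation SNR grows roughly as sqrt(n_pulses)."""
    return single_pulse_snr * math.sqrt(n_pulses)

# A weak single-shot return becomes usable after 100 pulses:
print(accumulated_snr(0.1, 100))
```

This is why accumulation reduces the threshold pulse energy: the same estimate quality can be reached with a lower-power laser firing many shots.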
  • 6
    Publication Date: 2004-12-03
    Description: A useful measure of sensor performance is the transceiver system efficiency eta(sub sys), which consists of the antenna efficiency eta(sub a) and optical and electronic losses. Typically, the lidar equation and the antenna efficiency are defined in terms of the telescope aperture area. However, during the assembly of a coherent transceiver, it is important to measure the system efficiency before the installation of the beam-expanding telescope (i.e., the untruncated-beam system efficiency). Therefore, to accommodate both truncated- and untruncated-beam efficiency measurements, we define the lidar equation and the antenna efficiency in terms of the beam area rather than the commonly used aperture-area-referenced definition. With a well-designed Gaussian-beam lidar, aperture-area-referenced system efficiencies of 15 to 20% (23-31% relative to the beam area) are readily achievable. In this paper we compare the differences between these efficiency definitions. We then describe techniques by which high efficiency can be achieved, followed by a discussion of several novel auto-alignment techniques developed to maintain high efficiency.
    Keywords: Instrumentation and Photography
    Type: Tenth Biennial Coherent Laser Radar Technology and Applications Conference; 247-250; NASA/CP-1999-209758
    Format: text
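The two efficiency definitions above differ only in the reference area used in the lidar equation, so converting between them is a single ratio. A sketch under that assumption (the 0.65 area ratio is illustrative, chosen to reproduce the 20% vs. ~31% figures quoted in the abstract, not a value from the paper):

```python
def beam_referenced_efficiency(eta_aperture, aperture_area, beam_area):
    """Convert an aperture-area-referenced system efficiency to the
    beam-area-referenced value: the collected power is unchanged, only
    the normalizing area in the lidar equation differs."""
    return eta_aperture * aperture_area / beam_area

# With an effective beam area ~0.65x the aperture area (illustrative),
# a 20% aperture-referenced efficiency reads as ~31% beam-referenced:
print(round(beam_referenced_efficiency(0.20, 1.0, 0.65), 2))
```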
  • 7
    Publication Date: 2004-12-03
    Description: Transmissive scanning elements for coherent laser radar systems are typically optical wedges, or prisms, which deflect the lidar beam at a specified angle and are then rotated about the instrument optical axis to produce a scan pattern. The wedge is placed in the lidar optical system after a beam-expanding telescope, implying that it has the largest diameter of any element in the system. The combination of the wedge diameter and asymmetric profile results in the element having a very large mass and, consequently, relatively large power consumption for scanning. These two parameters, mass and power consumption, are among the instrument requirements which need to be minimized when designing a lidar for a space-borne platform. Reducing the scanner contributions in these areas will have a significant effect on the overall instrument specifications. Replacing the optical wedge with a diffraction grating on the surface of a thin substrate is a straightforward approach with the potential to reduce the mass of the scanning element significantly. For example, the optical wedge that will be used for the SPAce Readiness Coherent Lidar Experiment (SPARCLE) is approximately 25 cm in diameter and is made from silicon, with a wedge angle designed for 30-degree deflection of a beam operating at approx. 2 micrometer wavelength. The mass of this element could be reduced by a factor of four by instead using a fused silica substrate, 1 cm thick, with a grating fabricated on one of the surfaces. For a grating to deflect a beam with a 2 micrometer wavelength by 30 degrees, a period of approximately 4 micrometers is required. This is small enough that fabrication of appropriate high-efficiency blazed or multi-phase-level diffractive optical gratings is prohibitively difficult. Moreover, bulk or stratified volume holographic approaches appear impractical due to materials limitations at 2 micrometers and the need to maintain adequate wavefront quality.
In order to avoid the difficulties encountered in these approaches, we have developed a new type of high-efficiency grating which we call a Stratified Volume Diffractive Optical Element (SVDOE). The features of the gratings in this approach can be easily fabricated using standard photolithography and etching techniques, and the materials used in the grating can be chosen specifically for a given application. In this paper we will briefly discuss the SVDOE technique and will present an example design of a lidar scanner using this approach. We will also discuss performance predictions for the example design.
    Keywords: Instrumentation and Photography
    Type: Tenth Biennial Coherent Laser Radar Technology and Applications Conference; 119-122; NASA/CP-1999-209758
    Format: text
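The ~4 micrometer period quoted above follows directly from the first-order grating equation at normal incidence, sin(theta) = lambda / d. A quick check (the function name is ours, not from the paper):

```python
import math

def grating_period(wavelength_um, deflection_deg):
    """First-order grating equation at normal incidence:
    sin(theta) = lambda / d  =>  d = lambda / sin(theta)."""
    return wavelength_um / math.sin(math.radians(deflection_deg))

# 2 micrometer beam deflected by 30 degrees -> period of ~4 micrometers,
# matching the value quoted in the abstract:
print(grating_period(2.0, 30.0))
```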
  • 8
    Publication Date: 2004-12-03
    Description: NASA's New Millennium Program (NMP) has been chartered to identify emerging, revolutionary technologies and validate them in space, enabling less costly, more capable future science missions. The program utilizes a unique blend of science guidance and industry partnering to ferret out technology solutions that enable science capabilities in space which are presently technically infeasible or unaffordable. Those technologies which present an unacceptably high risk to future science missions (whether small PI-led or operational) are bundled into technology validation missions. These missions seek to validate the technologies in a manner consistent with their future uses, thus reducing the associated risk to the first user, and obtaining meaningful science data as well. The Space Readiness Coherent Lidar Experiment (SPARCLE) was approved as the second NMP Earth Observing mission (EO2) in October 1997, and assigned to Marshall Space Flight Center for implementation. Leading up to mission confirmation, NMP sponsored a community workshop in March 1996 to draft Level-1 requirements for a Doppler wind lidar mission, as well as other space-based lidar missions (such as DIAL). Subsequently, a study group was formed and met twice to make recommendations on how to perform a comparison of coherent and direct-detection wind lidars in space. These recommendations have guided the science validation plan for the SPARCLE mission, and will ensure that future users will be able to confidently assess the risk profile of future Doppler wind missions utilizing EO2 technologies. The primary risks to be retired are: (1) maintenance of optical alignments through launch and operations on orbit, and (2) successful velocity-estimation compensation for the Doppler shift due to the platform motion and the earth's rotation. This includes the need to account for all sources of error associated with pointing control and knowledge.
The validation objectives are: (1) Demonstrate measurement of tropospheric winds from space using a scanning coherent Doppler lidar technique that scales to meet future research (e.g. ESSP) and operational (e.g. NPOESS) mission requirements. Specifically, produce and validate LOS wind data with single shot accuracy of 1-2 m/s in regions of high signal-to-noise ratio (SNR), and low atmospheric wind turbulence and wind shear, (2) Collect the atmospheric and instrument performance data in various scanning modes necessary to validate and improve instrument performance models that will enable the definition of future missions with greater confidence. Such data include aerosol backscatter data over much of the globe, and high SNR data such as that from surface returns, and (3) Produce a set of raw instrument data with which advanced signal processing techniques can be developed. This objective will permit future missions to better understand how to extract wind information from low backscatter regions of the atmosphere.
    Keywords: Instrumentation and Photography
    Type: Tenth Biennial Coherent Laser Radar Technology and Applications Conference; 38-39; NASA/CP-1999-209758
    Format: text
  • 9
    In:  CASI
    Publication Date: 2004-12-03
    Description: This presentation discusses the problem of local air quality as it is affected by modern aircraft engine exhaust, and the objective of this workshop. It begins with a discussion of the nature and sources of particulates and aerosols. The problems, and the technical considerations of how to regulate the aircraft emissions, are reviewed. There are no local (i.e., state or county) regulations of aircraft operations. Among the conclusions are: (1) there is an inadequate database of information regarding the emittants from aircraft; (2) the data that do exist represent older engines and aircraft, and are not representative of the advanced and future fleet.
    Keywords: Environment Pollution
    Type: Workshop on Aerosols and Particulates from Aircraft Gas Turbine Engines; 21-44; NASA/CP-1999-208918
    Format: text
  • 10
    Publication Date: 2004-12-03
    Description: This paper describes a method to determine the uncertainties of measured forces and moments from multi-component force balances used in wind tunnel tests. A multivariate regression technique is first employed to estimate the uncertainties of the six balance sensitivities and 156 interaction coefficients derived from established balance calibration procedures. These uncertainties are then employed to calculate the uncertainties of force-moment values computed from observed balance output readings obtained during tests. Confidence and prediction intervals are obtained for each computed force and moment as functions of the actual measurands. Techniques are discussed for separate estimation of balance bias and precision uncertainties.
    Keywords: Instrumentation and Photography
    Type: First International Symposium on Strain Gauge Balances; Pt. 1; 279-306; NASA/CP-1999-209101/PT1
    Format: text
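The regression step described above can be sketched for a toy single-component case: least-squares estimates plus standard errors from the diagonal of (X'X)^-1, which is the ingredient behind the paper's confidence intervals. This is an illustrative two-coefficient sketch, not the authors' 6-sensitivity, 156-interaction balance model; the function name and data are hypothetical.

```python
import math, random

def ols_2param(x1, x2, y):
    """Two-coefficient least squares (no intercept) via the normal
    equations; returns estimates and their standard errors."""
    n = len(y)
    s11 = sum(a * a for a in x1); s12 = sum(a * b for a, b in zip(x1, x2))
    s22 = sum(b * b for b in x2)
    t1 = sum(a * c for a, c in zip(x1, y)); t2 = sum(b * c for b, c in zip(x2, y))
    det = s11 * s22 - s12 * s12
    b1 = (s22 * t1 - s12 * t2) / det
    b2 = (s11 * t2 - s12 * t1) / det
    resid = [c - b1 * a - b2 * b for a, b, c in zip(x1, x2, y)]
    s2 = sum(r * r for r in resid) / (n - 2)     # residual variance
    se1 = math.sqrt(s2 * s22 / det)              # std. errors from the
    se2 = math.sqrt(s2 * s11 / det)              # diagonal of (X'X)^-1
    return (b1, b2), (se1, se2)

# Toy calibration data (made up): one sensitivity plus one interaction term.
random.seed(0)
load = [random.uniform(0, 100) for _ in range(30)]
quad = [v * v for v in load]
y = [2.0 * v + 0.01 * v * v + random.gauss(0, 0.5) for v in load]
coeffs, stderr = ols_2param(load, quad, y)
```

In the paper's setting the same machinery runs over all 162 coefficients at once, and the resulting covariance is propagated into confidence and prediction intervals on computed forces and moments.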
  • 11
    Publication Date: 2004-12-03
    Description: In this presentation we review the current ongoing research within George Mason University's (GMU) Center for Information Systems Integration and Evolution (CISE). We define characteristics of advanced information systems, discuss a family of agents for such systems, and show how GMU's domain modeling tools and techniques can be used to define a product-line architecture for configuring NASA missions. These concepts can be used to define Advanced Engineering Environments such as those envisioned for NASA's new initiative for intelligent design and synthesis environments.
    Keywords: Computer Programming and Software
    Type: Intelligent Agents and Their Potential for Future Design and Synthesis Environment; 21-38; NASA/CP-1999-208986
    Format: text
  • 12
    Publication Date: 2004-12-03
    Description: Direct measurements of forces and moments are some of the most important data acquired during aerodynamic testing. This paper deals with the force and strain measurement capabilities at the Langley Research Center (LaRC). It begins with a progressive history of LaRC force measurement developments beginning in the 1940's and ends with the center's current capabilities. Various types of force and moment transducers used at LaRC are discussed including six-component sting mounted balances, semi-span balances, hinge moment balances, flow-through balances, rotor balances, and many other unique transducers. Also discussed are some unique strain-gage applications, such as those used in extreme environments. The final topics deal with the LaRC's ability to perform custom calibrations and our current levels of effort in the area of force and strain measurement.
    Keywords: Instrumentation and Photography
    Type: First International Symposium on Strain Gauge Balances; Pt. 1; 105-114; NASA/CP-1999-209101/PT1
    Format: text
  • 13
    Publication Date: 2004-12-03
    Description: This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Third Annual Software Engineering Workshop; NASA/CP-1999-209236
    Format: text
  • 14
    In:  CASI
    Publication Date: 2004-12-03
    Description: Major advances must occur to protect astronauts from prolonged periods in near-zero gravity and the high radiation associated with extended space travel. The dangers of living in space must be thoroughly understood and methods developed to reverse those effects that cannot be avoided. Six of the seven research teams established by the National Space Biomedical Research Institute (NSBRI) are studying biomedical factors of prolonged space travel in order to deliver effective countermeasures. To develop effective countermeasures, each of these teams requires identification and quantitation of complex pharmacological, hormonal, and growth-factor compounds (biomarkers) in humans and in experimental animals, to build in-depth knowledge of the physiological changes associated with space travel. At present, identification of each biomarker requires a separate protocol and associated laboratory equipment, and many of these procedures are complicated. To carry all of this equipment and these chemicals on a spacecraft would require a complex clinical laboratory, and it would occupy much of the astronauts' time. What is needed is a small, efficient, broadband medical diagnostic instrument to rapidly identify important biomarkers for human space exploration. The Miniature Time-Of-Flight Mass Spectrometer Project in the Technology Development Team is developing a small, high-resolution, time-of-flight mass spectrometer (TOFMS) to quantitatively measure biomarkers for human space exploration. Virtues of the JHU/APL TOFMS technologies reside in the promise of a small (less than one cubic ft), lightweight (less than 5 kg), low-power (less than 50 watts), rugged device that can be used continuously with advanced signal-processing diagnostics.
To date, the JHU/APL program has demonstrated mass capability from under 100 to beyond 10,000 atomic mass units (amu) in a very small, low power prototype for biological analysis. Further, the electronic nature of the TOFMS output makes it ideal for rapid telemetry to earth for in-depth analysis by ground support teams.
    Keywords: Instrumentation and Photography
    Type: National Space Biomedical Research Institute; B-111 - B-113
    Format: text
  • 15
    Publication Date: 2004-12-03
    Description: The SPAce Readiness Coherent Lidar Experiment (SPARCLE) mission was proposed as a low-cost technology demonstration mission, using a 2-micron, 100-mJ, 6-Hz, 25-cm, coherent lidar system based on demonstrated technology. SPARCLE was selected in late October 1997 to be NASA's New Millennium Program (NMP) second earth-observing (EO-2) mission. To maximize the success probability of SPARCLE, NASA/MSFC desired expert guidance in the areas of coherent laser radar (CLR) theory, CLR wind measurement, fielding of CLR systems, CLR alignment validation, and space lidar experience. This led to the formation of the NASA/MSFC Coherent Lidar Technology Advisory Team (CLTAT) in December 1997. A threefold purpose for the advisory team was identified: 1) guidance to the SPARCLE mission, 2) advice regarding the roadmap of post-SPARCLE coherent Doppler wind lidar (CDWL) space missions and the desired matching technology development plan, and 3) general coherent lidar theory, simulation, hardware, and experiment information exchange. The current membership of the CLTAT is shown. Membership does not result in any NASA or other funding at this time. We envision the business of the CLTAT to be conducted mostly by email, teleconference, and occasional meetings. The three meetings of the CLTAT to date, in Jan. 1998, July 1998, and Jan. 1999, have all been collocated with previously scheduled meetings of the Working Group on Space-Based Lidar Winds. The meetings have been very productive. Topics discussed include the SPARCLE technology validation plan, including pre-launch end-to-end testing; the space-based wind mission roadmap beyond SPARCLE and its implications for the resultant technology development; the current values of, and proposed future advancement in, lidar system efficiency; and the difference between using single-mode fiber optical mixing vs. the traditional free-space optical mixing.
attitude information from lidar and non-lidar sensors, and pointing knowledge algorithms will meet this second requirement. The topic of this paper is the pre-launch demonstration of the first requirement, adequate sensitivity of the SPARCLE lidar.
    Keywords: Instrumentation and Photography
    Type: Tenth Biennial Coherent Laser Radar Technology and Applications Conference; 156-159; NASA/CP-1999-209758
    Format: text
  • 16
    Publication Date: 2004-12-03
    Description: Routine backscatter (beta) measurements by an airborne or space-based lidar from designated earth surfaces with known and fairly uniform beta properties can potentially offer lidar calibration opportunities. These can in turn be used to obtain accurate atmospheric aerosol and cloud beta measurements on large spatial scales. This is important because achieving a precise calibration factor for large pulsed lidars then need not rest solely on using a standard hard-target procedure. Furthermore, calibration from designated earth surfaces would provide an in-flight performance evaluation of the lidar. Hence, for active laser remote sensing with high-resolution data, calibration of a space-based lidar using the earth's surface would be extremely useful. The calibration methodology using the earth's surface initially requires measuring beta of various earth surfaces simulated in the laboratory using a focused continuous-wave (CW) CO2 Doppler lidar, and then using these beta measurements as standards for the earth-surface signal from airborne or space-based lidars. Since beta from the earth's surface may be retrieved at different angles of incidence, beta would also need to be measured at various angles of incidence for the different surfaces. In general, earth-surface reflectance measurements have been made in the infrared, but the use of lidars to characterize them, and in turn the use of the earth's surface to calibrate lidars, has not been made. The feasibility of this calibration methodology is demonstrated through a comparison of these laboratory measurements with actual earth-surface beta retrieved from the same lidar during the NASA Multi-center Airborne Coherent Atmospheric Wind Sensor (MACAWS) mission on NASA's DC-8 aircraft from 13-26 September 1995. For the selected earth surface from the airborne lidar data, an average beta for the surface was established and the statistics of lidar efficiency were determined.
This was compared with the actual lidar efficiency determined with the standard calibrating hard target.
    Keywords: Instrumentation and Photography
    Type: Tenth Biennial Coherent Laser Radar Technology and Applications Conference; 128-131; NASA/CP-1999-209758
    Format: text
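The surface-calibration idea above reduces to solving the lidar equation for the system constant once a surface of known beta is observed, then reusing that constant on atmospheric returns. A minimal sketch under a simplified lidar equation P = K * beta / R^2 (transmission and pulse terms omitted); all names and numbers are hypothetical, not from the paper:

```python
def lidar_calibration_constant(surface_signal, surface_beta, range_m):
    """Calibrate against a surface of known backscatter beta:
    P = K * beta / R^2  =>  K = P * R^2 / beta."""
    return surface_signal * range_m ** 2 / surface_beta

def retrieved_beta(signal, range_m, k):
    """Apply the calibration constant to recover absolute backscatter."""
    return signal * range_m ** 2 / k

# Hypothetical numbers: a lab-characterized surface fixes K, after which
# an atmospheric return at 3 km converts to an absolute beta value.
k = lidar_calibration_constant(2.0e-6, 1.0e-5, 1000.0)
beta_aerosol = retrieved_beta(5.0e-9, 3000.0, k)
```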
  • 17
    Publication Date: 2004-12-03
    Description: The GSFC IVS Technology Development Center (TDC) develops station software, including the Field System (FS) and scheduling software (SKED); hardware, including tools for station timing and meteorology; scheduling algorithms; and operational procedures. It also provides a pool of individuals to assist with station implementation, check-out, upgrades, and training.
    Keywords: Computer Programming and Software
    Type: International VLBI Service for Geodesy and Astrometry: 1999 Annual Report; 256-258; NASA/TP-1999-209243
    Format: text
  • 18
    In:  CASI
    Publication Date: 2004-12-03
    Description: The assignments and charges to the three workgroups are discussed. The three workgroups were: (1) Trace Chemistry, (2) Instrumentation, (3) Venues and procedures.
    Keywords: Environment Pollution
    Type: Workshop on Aerosols and Particulates from Aircraft Gas Turbine Engines; 163-176; NASA/CP-1999-208918
    Format: text
  • 19
    Publication Date: 2004-12-03
    Description: We presented results from the SASS Near-Field Interactions Flight (SNIF-III) Experiment which was conducted during May and June 1997 in collaboration with the Vermont and New Jersey Air National Guard Units. The project objectives were to quantify the fraction of fuel sulfur converted to S(VI) species by jet engines and to gain a better understanding of particle formation and growth processes within aircraft wakes. Size and volatility segregated aerosol measurements along with sulfur species measurements were recorded in the exhaust of F-16 aircraft equipped with F-100 engines burning fuels with a range of fuel S concentrations at different altitudes and engine power settings. A total of 10 missions were flown in which F-16 exhaust plumes were sampled by an instrumented T-39 Sabreliner aircraft. On six of the flights, measurements were obtained behind the same two aircraft, one burning standard JP-8 fuel and the other either approximately 28 ppm or 1100 ppm S fuel or an equal mixture of the two (approximately 560 ppm S). A pair of flights was conducted for each fuel mixture, one at 30,000 ft altitude and the other starting at 35,000 ft and climbing to higher altitudes if contrail conditions were not encountered at the initial flight level. In each flight, the F-16s were operated at two power settings, approx. 80% and full military power. Exhaust emissions were sampled behind both aircraft at each flight level, power setting, and fuel S concentration at an initial aircraft separation of 30 m, gradually widening to about 3 km. Analyses of the aerosol data in the cases where fuel S was varied suggest results were consistent with observations from project SUCCESS, i.e., a significant fraction of the fuel S was oxidized to form S(VI) species and volatile particle emission indices (EIs) in comparably aged plumes exhibited a nonlinear dependence upon the fuel S concentration. 
For the high-sulfur fuel, volatile particle EIs in 10-second-old plumes were 2 to 3 x 10(exp 17)/kg of fuel burned and exhibited no obvious trend with engine power setting or flight altitude. In contrast, about 8-fold fewer particles were observed in similarly aged plumes from the same aircraft burning fuel with 560 ppm S content, and EIs of 1 x 10(exp 15)/kg of fuel burned were observed in the 28 ppm S fuel case. Moreover, data recorded as a function of plume age indicate that formation and growth of the volatile particles proceed more slowly as the fuel S level is reduced. For example, ultrafine particle concentrations appear to stabilize within 5 seconds after emission in the 1100 ppm S cases but are still increasing in 20-second-old plumes produced from burning the 560 ppm S fuel.
    Keywords: Environment Pollution
    Type: Workshop on Aerosols and Particulates from Aircraft Gas Turbine Engines; 83-100; NASA/CP-1999-208918
    Format: text
  • 20
    Publication Date: 2004-12-03
    Description: The overall focus of our research is to document long-term elevation change of the Greenland ice sheet using satellite altimeter data. In addition, we are investigating seasonal and interannual variations in the ice-sheet elevations to place the long-term measurements in context. Specific objectives of this research include: 1) Developing new techniques to significantly improve the accuracy of elevation-change estimates derived from satellite altimetry. 2) Measuring the elevation change of the Greenland ice sheet over a 10-year time period using Seasat (1978) and Geosat GM (1985-86) and Geosat ERM (1986-88) altimeter data. 3) Quantifying seasonal/interannual variations in the elevation-change estimates using the continuous time series of surface elevations from the Geosat GM and ERM datasets. 4) Extending the long-term elevation change analysis to two decades by incorporating data from the ERS-1/2 missions (1991-99) and, if available, the Geosat-Follow On (GFO) mission (1998-??).
    Keywords: Environment Pollution
    Type: Program for Arctic Regional Climate Assessment (PARCA); 6-11; NASA/TM-1999-209205
    Format: text
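Objective 2 above, measuring elevation change from repeat altimetry, comes down to fitting a rate to elevations observed at different epochs. A minimal least-squares-slope sketch; the function name and the sample elevations are made up for illustration (they are not PARCA data), though the epochs echo the Seasat/Geosat era named in the record:

```python
def elevation_trend(times_yr, elevations_m):
    """Least-squares slope of surface elevation vs. time: the long-term
    elevation-change rate (m/yr) at one location from repeat altimetry."""
    n = len(times_yr)
    tm = sum(times_yr) / n
    em = sum(elevations_m) / n
    num = sum((t - tm) * (e - em) for t, e in zip(times_yr, elevations_m))
    den = sum((t - tm) ** 2 for t in times_yr)
    return num / den

# Hypothetical elevations at Seasat (1978) and Geosat (1985-88) epochs:
rate = elevation_trend([1978, 1985, 1986, 1987, 1988],
                       [100.0, 99.9, 99.85, 99.8, 99.9])
print(rate)   # metres per year; negative means thinning
```

Separating such a secular trend from the seasonal and interannual variability mentioned above is exactly why the continuous Geosat GM/ERM time series matters.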
  • 21
    Publication Date: 2004-12-03
    Description: Surface mounted strain gages and strain gage application techniques are as varied as they are versatile. There is an abundance of technical literature, available throughout the strain gage community, offering techniques for installing strain gages and methods of obtaining useful information from them. This paper, while providing more of the same, will focus its discussions on recent Langley developments for using strain gages reliably and accurately in very harsh environments. With Langley's extensive use of wind tunnel balances, its ongoing effort in materials development, and its currently focused activities in structural testing, the use of strain gages in unusual and demanding environments has led to several innovative improvements in the "how to gage it" department. Several of these innovations will be addressed that hopefully will provide some practical information for the strain gage user who is finding the test environment and (or) the materials to be tested too demanding for previously utilized strain gage application technology. Specifically, this paper will include discussions in the following three areas: (1) technical considerations when gaging cryogenic wind tunnel balances, including areas for improving accuracy and reliability; (2) addressing technical difficulties associated with gaging composite test articles and certain alloys for testing at temperatures approaching -450F, or elevated temperatures up to 350F, or both temperatures inclusive during the same test scenario; (3) gaging innovations for testing metal/matrix and carbon/carbon composites at temperatures above 700F.
    Keywords: Instrumentation and Photography
    Type: First International Symposium on Strain Gauge Balances; Pt. 1; 413-429; NASA/CP-1999-209101/PT1
    Format: text
  • 22
    Publication Date: 2004-12-03
    Description: This presentation gives a summary of intelligent agents for design synthesis environments. We'll start with the conclusions, and work backwards to justify them. First, an important assumption is that agents (whatever they are) are good for software engineering. This is especially true for software that operates in an uncertain, changing environment. The "real world" of physical artifacts is like that: uncertain in what we can measure, changing in that things are always breaking down, and we must interact with non-software entities. The second point is that software engineering techniques can contribute to good design. There may have been a time when we wanted to build simple artifacts containing little or no software. But modern aircraft and spacecraft are complex, and rely on a great deal of software. So better software engineering leads to better designed artifacts, especially when we are designing a series of related artifacts and can amortize the costs of software development. The third point is that agents are especially useful for design tasks, above and beyond their general usefulness for software engineering, and the usefulness of software engineering to design.
    Keywords: Computer Programming and Software
    Type: Intelligent Agents and Their Potential for Future Design and Synthesis Environment; 129-138; NASA/CP-1999-208986
    Format: text
  • 23
    In:  CASI
    Publication Date: 2004-12-03
    Description: The focus of the HPCC Earth and Space Sciences (ESS) Project is capability computing - pushing highly scalable computing testbeds to their performance limits. The drivers of this focus are the Grand Challenge problems in Earth and space science: those that could not be addressed in a capacity computing environment where large jobs must continually compete for resources. These Grand Challenge codes require a high degree of communication, large memory, and very large I/O (throughout the duration of the processing, not just in loading initial conditions and saving final results). This set of parameters led to the selection of an SGI/Cray T3E as the current ESS Computing Testbed. The T3E at the Goddard Space Flight Center is a unique computational resource within NASA. As such, it must be managed to effectively support the diverse research efforts across the NASA research community yet still enable the ESS Grand Challenge Investigator teams to achieve their performance milestones, for which the system was intended. To date, all Grand Challenge Investigator teams have achieved the 10 GFLOPS milestone, eight of nine have achieved the 50 GFLOPS milestone, and three have achieved the 100 GFLOPS milestone. In addition, many technical papers have been published highlighting results achieved on the NASA T3E, including some at this Workshop. The successes enabled by the NASA T3E computing environment are best illustrated by the 512 PE upgrade funded by the NASA Earth Science Enterprise earlier this year. Never before has an HPCC computing testbed been so well received by the general NASA science community that it was deemed critical to the success of a core NASA science effort. NASA looks forward to many more success stories before the conclusion of the NASA-SGI/Cray cooperative agreement in June 1999.
    Keywords: Computer Programming and Software
    Type: HPCCP/CAS Workshop Proceedings 1998; 53-58; NASA/CP-1999-208757
    Format: text
  • 24
    Publication Date: 2004-12-03
    Description: The Second Stratospheric Models and Measurements Workshop (M&M II) is the continuation of the effort begun in the first workshop (M&M I, Prather and Remsberg [1993]) held in 1992. As originally stated, the aim of M&M is to provide a foundation for establishing the credibility of stratospheric models used in environmental assessments of the ozone response to chlorofluorocarbons, aircraft emissions, and other climate-chemistry interactions. To accomplish this, a set of measurements of the present-day atmosphere was selected. The intent was that successful simulation of this set of measurements should become the prerequisite for accepting these models as reliable predictors of future ozone behavior. This section is divided into two parts: model experiments and model descriptions. For the model experiments, participants were charged with designing a number of experiments that would use observations to test whether models use the correct mechanisms to simulate the distributions of ozone and other trace gases in the atmosphere. The purpose is closely tied to the need to reduce the uncertainties in the model-predicted responses of stratospheric ozone to perturbations. The specifications for the experiments were sent out to the modeling community in June 1997. Twenty-eight modeling groups responded to the requests for input. The first part of this section discusses the different modeling groups, along with the experiments performed. The second part gives brief descriptions of each model as provided by the individual modeling groups.
    Keywords: Environment Pollution
    Type: Models and Measurements Intercomparison 2; 10-109; NASA/TM-1999-209554
    Format: text
  • 25
    Publication Date: 2004-12-03
    Description: It is critically important to be able to assess alterations in cardiovascular regulation during and after space flight. We propose to develop an instrument for the non-invasive assessment of such alterations that can be used on the ground and potentially during space flight. This instrumentation would be used by the Cardiovascular Alterations Team at multiple sites for the study of the effects of space flight on the cardiovascular system and the evaluation of countermeasures. In particular, the Cardiovascular Alterations Team will use this instrumentation in conjunction with ground-based human bed-rest studies and during application of acute stresses (e.g., tilt, lower body negative pressure, and exercise). In future studies, the Cardiovascular Alterations Team anticipates using this instrumentation to study astronauts before and after space flight and, ultimately, during space flight. The instrumentation may also be used by the Bone Demineralization/Calcium Metabolism Team, the Neurovestibular Team, and the Human Performance Factors, Sleep and Chronobiology Team to measure changes in autonomic nervous function. The instrumentation will be based on a powerful new technology, cardiovascular system identification (CSI), which has been developed in our laboratory. CSI provides a non-invasive approach for the study of alterations in cardiovascular regulation. This approach involves the analysis of second-to-second fluctuations in physiologic signals such as heart rate and non-invasively measured arterial blood pressure in order to characterize quantitatively the physiologic mechanisms responsible for the couplings between these signals. Through the characterization of multiple physiologic mechanisms, CSI provides a closed-loop model of the cardiovascular regulatory state in an individual subject.
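    The core CSI idea, quantifying the coupling between fluctuation signals, can be pictured with a deliberately simplified one-lag linear fit. The model form, names, and toy series below are illustrative assumptions, not the actual CSI algorithm.

```python
# Hedged sketch of system identification between beat-to-beat
# signals: fit hr[n] = gain * bp[n-1] + offset by least squares.
# The one-lag model and the toy data are illustrative only.

def fit_coupling(bp, hr):
    """Least-squares fit of hr[n] = gain * bp[n-1] + offset."""
    xs = bp[:-1]           # blood-pressure fluctuation, one beat earlier
    ys = hr[1:]            # heart-rate fluctuation, current beat
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    gain = sxy / sxx
    offset = my - gain * mx
    return gain, offset

# Toy fluctuation series (arbitrary units): hr tracks bp one beat
# later with gain 0.5.
bp = [0.0, 2.0, 4.0, 2.0, 0.0]
hr = [0.0, 0.0, 1.0, 2.0, 1.0]
gain, offset = fit_coupling(bp, hr)
```

A real CSI model would use many lags and multiple coupled signals to obtain the closed-loop characterization described above.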
    Keywords: Instrumentation and Photography
    Type: National Space Biomedical Research Institute; B-110
    Format: text
  • 26
    In:  CASI
    Publication Date: 2004-12-03
    Description: The purpose of the Dual Energy X-ray Absorptiometry (DEXA) project is to design, build, and test an advanced X-ray absorptiometry scanner capable of being used to monitor the deleterious effects of weightlessness on the human musculoskeletal system during prolonged spaceflight. The instrument is based on the principles of dual-energy X-ray absorptiometry and is designed not only to measure bone, muscle, and fat masses but also to generate structural information about these tissues so that the effects on mechanical integrity may be assessed using biomechanical principles. A skeletal strength assessment could be particularly important for an astronaut embarking on a mission to a remote planet, where the consequences of a fragility fracture could be catastrophic. The scanner will employ multiple projection images about the long axis of the scanned subject to provide geometric properties in three dimensions, suitable for a three-dimensional structural analysis of the scanned region. The instrument will employ advanced fabrication techniques to minimize the volume and mass (100 kg current target, with a long-term goal of 60 kg) of the scanner as appropriate for the space environment, while maintaining the mechanical stability required for high-precision measurement. The unit will have the precision required to detect changes in bone mass and geometry as small as 1% and changes in muscle mass as small as 5%. As the system evolves, advanced electronic fabrication technologies such as chip-on-board and multichip modules will be combined with commercial (off-the-shelf) parts to produce a reliable, integrated system which not only minimizes size and weight but, because of its simplicity, is also cost-effective to build and maintain. Additionally, the system is being designed to minimize power consumption. Methods of heat dissipation and mechanical stowage (for the unit when not in use) are being optimized for the space environment.
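    In its simplest textbook form, the dual-energy principle the scanner relies on reduces to solving two linear log-attenuation equations for two unknown areal densities. The coefficients and measurements below are invented for illustration, not instrument parameters.

```python
# Hedged sketch of the dual-energy decomposition behind DEXA:
# log-attenuations at two X-ray energies give two linear equations
# in bone and soft-tissue areal densities, solved with Cramer's
# rule. All numeric values are illustrative assumptions.

def dual_energy_solve(a_lo, a_hi, mu):
    """mu = ((mu_bone_lo, mu_soft_lo), (mu_bone_hi, mu_soft_hi));
    returns (bone, soft) areal densities in g/cm^2."""
    (bl, sl), (bh, sh) = mu
    det = bl * sh - sl * bh
    if abs(det) < 1e-12:
        raise ValueError("attenuation coefficients not independent")
    bone = (a_lo * sh - sl * a_hi) / det
    soft = (bl * a_hi - a_lo * bh) / det
    return bone, soft

# Toy mass-attenuation coefficients (cm^2/g) and synthetic
# measurements built from bone = 1.2 g/cm^2, soft = 20.0 g/cm^2:
mu = ((3.0, 0.25), (0.6, 0.18))
a_lo = 3.0 * 1.2 + 0.25 * 20.0
a_hi = 0.6 * 1.2 + 0.18 * 20.0
bone, soft = dual_energy_solve(a_lo, a_hi, mu)
```

The multiple-projection structural analysis the abstract describes builds on many such per-pixel decompositions.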
    Keywords: Instrumentation and Photography
    Type: National Space Biomedical Research Institute; B-108 - B-109
    Format: text
  • 27
    Publication Date: 2004-12-03
    Description: The objectives of this study are threefold: (1) Provide insight into water delivery in microgravity and determine optimal germination paper wetting for subsequent seed germination in microgravity; (2) Observe the behavior of water exposed to a strong localized magnetic field in microgravity; and (3) Simulate the flow of fixative (using water) through the hardware. The Magnetic Field Apparatus (MFA) is a new piece of hardware slated to fly on the Space Shuttle in early 2001. MFA is designed to expose plant tissue to magnets in a microgravity environment, deliver water to the plant tissue, record photographic images of plant tissue, and deliver fixative to the plant tissue.
    Keywords: Instrumentation and Photography
    Type: KC-135 and Other Microgravity Simulations; 142-146; NASA/CR-1999-208922
    Format: text
  • 28
    Publication Date: 2004-12-03
    Description: Sensors 2000! (S2K!) is a specialized, integrated projects team organized to provide focused, directed, advanced biosensor and bioinstrumentation systems technology support to NASA's spaceflight and ground-based research and development programs. Specific technology thrusts include telemetry-based sensor systems, chemical/biological sensors, medical and physiological sensors, miniaturized instrumentation architectures, and data and signal processing systems. A concurrent objective is to promote the mutual use, application, and transition of developed technology by collaborating in academic-commercial-government leveraging, joint research, technology utilization and commercialization, and strategic partnering alliances. Sensors 2000! is organized around three primary program elements: Technology and Product Development, Technology Infusion and Applications, and Collaborative Activities. Technology and Product Development involves development and demonstration of biosensor and biotelemetry systems for application to NASA Space Life Sciences Programs; production of fully certified spaceflight hardware and payload elements; and sensor/measurement systems development for NASA research and development activities. Technology Infusion and Applications provides technology and program agent support to identify available and applicable technologies from multiple sources for insertion into NASA's strategic enterprises and initiatives. Collaborative Activities involve leveraging of NASA technologies with those of other government agencies, academia, and industry to concurrently provide technology solutions and products of mutual benefit to participating members.
    Keywords: Instrumentation and Photography
    Type: Proceedings of the First Biennial Space Biomedical Investigators' Workshop; 578
    Format: text
  • 29
    Publication Date: 2004-12-03
    Description: The Instrumentation Working Group compiled a summary of measurement techniques applicable to gas turbine engine aerosol precursors and particulates. An assessment was made of the limits, accuracy, applicability, and technology readiness of the various techniques. Despite advances made in emissions characterization of aircraft engines, uncertainties still exist in the mechanisms by which aerosols and particulates are produced in the near-field engine exhaust. To adequately assess current understanding of the formation of sulfuric acid aerosols in the exhaust plumes of gas turbine engines, measurements are required to determine the degree and importance of sulfur oxidation in the turbine and at the engine exit. Ideally, concentrations of all sulfur species would be acquired, with emphasis on SO2 and SO3. Numerous options exist for extractive and non-extractive measurement of SO2 at the engine exit, most of which are well developed. SO2 measurements should be performed first to place an upper bound on the percentage of SO2 oxidation. If extractive and non-extractive techniques indicate that a large amount of the fuel sulfur is not detected as SO2, then efforts are needed to improve techniques for SO3 measurements. Additional work will be required to account for the fuel sulfur in the engine exhaust. Chemical Ionization Mass Spectrometry (CI-MS) measurements need to be pursued, although a careful assessment needs to be made of the sampling line impact on the extracted sample composition. Efforts should also be placed on implementing non-intrusive techniques and extending their capabilities by maximizing exhaust coverage for line-of-sight measurements, as well as development of 2-D techniques, where feasible. Recommendations were made to continue engine exit and combustor measurements of particulates. Particulate measurements should include particle size distribution, mass fraction, hydration properties, and volatile fraction. 
However, methods to ensure that unaltered samples are obtained need to be developed. Particulate speciation was also assigned a high priority for quantifying the fractions of carbon soot, PAH, refractory materials, metals, sulfates, and nitrates. High priority was also placed on performing a comparison of particle sizing instruments. Concern was expressed by the workshop attendees who routinely make particulate measurements about the variation in number density measured during in-flight tests by different instruments. In some cases, measurements performed by different groups of researchers during the same flight tests showed an order of magnitude variation. Second priority was assigned to measuring concentrations of odd hydrogen and oxidizing species. Since OH, HO2, H2O2, and O are extremely reactive, non-extractive measurements are recommended. A combination of absorption and fluorescence is anticipated to be effective for OH measurements in the combustor and at the engine exit. Extractive measurements of HO2 have been made in the stratosphere, where the ambient level of OH is relatively low. Use of techniques that convert HO2 to OH for combustor and engine exit measurements needs to be evaluated, since the ratio of HO2/OH may be 1% or less at both the combustor and engine exit. CI-MS might be a viable option for H2O2, subject to sampling line conversion issues. However, H2O2 is a low priority oxidizing species in the combustor and at the engine exit. Two candidates for atomic oxygen measurements are Resonance Enhanced Multi-Photon Ionization (REMPI) and Laser-Induced Fluorescence (LIF). Particulate measurement by simultaneous extractive and non-extractive techniques was given equal priority to the oxidizer measurements. Concern was expressed over the ability of typical ground test sampling lines to deliver an unaltered sample to a remotely located instrument. 
It was suggested that the sampling probe and line losses be checked out by attempting measurements using an optical or non-extractive technique immediately upstream of the sampling probe. This is a possible application for Laser Induced Incandescence (LII) as a check on the volume fraction of soot. Optical measurements of size distribution are not well developed for ultrafine particles less than about 20 nm in diameter, so a non-extractive technique for particulate size distribution cannot be recommended without further development. Carbon dioxide measurements need to be made to complement other extractive measurement techniques. CO2 measurements enable conversion of other species concentrations to emission indices. Carbon monoxide, which acts as a sink for oxidizing species, should be measured using non-extractive techniques. CO can be rapidly converted to CO2 in extractive probes, and a comparison between extractive and non-extractive measurements should be performed. Development of non-extractive techniques would help to assess the degree of CO conversion, and might be needed to improve the concentration measurement accuracy. Measurements of NO(x) will continue to be critical due to the role of NO and NO2 in atmospheric chemistry, and their influence on atmospheric ozone. Time-resolved measurements of temperature, velocity, and species concentrations were included on the list of desired measurements. Thermocouples are typically adequate for engine exit measurements. PIV and LDV are well established for obtaining velocity profiles. The techniques are listed in the accompanying table and are divided into extractive and non-extractive techniques. Efforts were made to include a measurement uncertainty for each technique. An assessment of the technology readiness was included.
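    The role of CO2 in converting species concentrations to emission indices can be sketched with the standard ratio method. The nominal EI_CO2 value for kerosene-type fuel and the example numbers are assumptions for illustration, not workshop measurements.

```python
# Hedged sketch: converting a background-corrected species mole
# fraction to an emission index (g species per kg fuel) by
# ratioing against CO2. EI_CO2 ~ 3160 g/kg is a nominal value
# assumed for kerosene-type fuel.

M_CO2 = 44.01  # g/mol

def emission_index(x_species, x_co2, molar_mass, ei_co2=3160.0):
    """EI (g/kg fuel) from background-corrected mole fractions."""
    return (x_species / x_co2) * (molar_mass / M_CO2) * ei_co2

# Example: 5 ppm NO measured against 3% CO2 in the exhaust.
ei_no = emission_index(5e-6, 0.03, molar_mass=30.01)
```

This is why the CO2 measurement must accompany every extractive species measurement: without it, concentrations cannot be referenced to the fuel burned.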
    Keywords: Instrumentation and Photography
    Type: Workshop on Aerosols and Particulates from Aircraft Gas Turbine Engines; 179-186; NASA/CP-1999-208918
    Format: text
  • 30
    Publication Date: 2004-12-03
    Description: Solid (soot) and liquid (presumed sulfate) particle emissions from aircraft engines may have serious impacts on the atmosphere. While the direct radiative impact of these particles is expected to be small relative to those from natural sources (Atmospheric Effects of Subsonic Aircraft: Interim Assessment of the Advanced Subsonic Technology Program, NASA Ref. Pub. 1400, 1997), their indirect effects on atmospheric chemistry and cloud formation may have a significant impact. The potential impacts of primary concern are the increase of sulfate surface area and accelerated heterogeneous chemical reactions, and the potential for either modified soot or sulfate particles to serve as cloud nuclei, which would change the frequency or radiative characteristics of clouds. Volatile (sulfate) particle concentrations measured behind the Concorde aircraft in flight in the stratosphere were much higher than expected from near-field model calculations of particle formation and growth. Global model calculations constrained by these data predict greater stratospheric ozone depletion from the proposed High-Speed Civil Transport (HSCT) fleet than calculations without particle emissions. Soot particles have also been proposed as important in heterogeneous chemistry, but this remains to be substantiated. Aircraft volatile particle production in the troposphere has been shown by measurements to depend strongly on fuel sulfur content. Sulfate particles of sufficient size are known to provide a good nucleating surface for cloud growth. Although pure carbon soot is hydrophobic, the solid particle surface may incorporate more suitable nucleating sites. The non-volatile (soot) particles also tend to occupy the large end of aircraft particle size spectra. 
A quantitative connection between aircraft particle emissions and cloud modification has not yet been established; however, even small changes in cloud amount or properties could have a significant effect on the radiative balance of the atmosphere.
    Keywords: Environment Pollution
    Type: Workshop on Aerosols and Particulates from Aircraft Gas Turbine Engines; 55-60; NASA/CP-1999-208918
    Format: text
  • 31
    In:  CASI
    Publication Date: 2004-12-03
    Description: The objectives of this work are to measure the ice discharge of the Greenland Ice Sheet close to the grounding line and/or calving front, and compare the results with mass accumulation and ablation in the interior to estimate the ice sheet mass balance.
    Keywords: Environment Pollution
    Type: Program for Arctic Regional Climate Assessment (PARCA); 12-15; NASA/TM-1999-209205
    Format: text
  • 32
    Publication Date: 2004-12-03
    Description: In this paper, an approach to increase the degree of autonomy of flight software is proposed. We describe an enhancement of the Attitude Determination and Control System by augmenting it with self-calibration capability. Conventional attitude estimation and control algorithms are combined with higher-level decision-making and machine learning algorithms in order to deal with the uncertainty and complexity of the problem.
    Keywords: Instrumentation and Photography
    Type: 1999 Flight Mechanics Symposium; 17-24; NASA/CP-1999-209235
    Format: text
  • 33
    Publication Date: 2004-12-03
    Description: The NASA Langley Research Center (LaRC) participated in a national cooperative evaluation of the Israel Aircraft Industries (IAI) automatic balance calibration machine at Microcraft, San Diego, in September 1995. A LaRC-designed six-component strain gauge balance was selected for test and calibration during LaRC's scheduled evaluation period. Eight calibrations were conducted using three selected experimental designs. Raw data were exported to LaRC facilities for reduction and statistical analysis using the techniques outlined in Tripp and Tcheng (1994). This report presents preliminary assessments of the results and compares IAI calibration results with manual calibration results obtained at the Modern Machine and Tool Co., Inc. (MM&T), Newport News, VA. A more comprehensive report is forthcoming.
    Keywords: Instrumentation and Photography
    Type: First International Symposium on Strain Gauge Balances; Pt. 1; 353-371; NASA/CP-1999-209101/PT1
    Format: text
  • 34
    Publication Date: 2004-12-03
    Description: This report describes an investigation into using model checking to assist validation of domain models for the HSTS planner. The planner models are specified using a qualitative temporal interval logic with quantitative duration constraints. We conducted several experiments to translate the domain modeling language into the SMV, Spin and Murphi model checkers. This allowed a direct comparison of how the different systems would support specific types of validation tasks. The preliminary results indicate that model checking is useful for finding faults in models that may not be easily identified by generating test plans.
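    The essence of what a model checker contributes here, an exhaustive search for a reachable state that violates a property, can be pictured with a toy breadth-first reachability check. The tiny transition system below is purely illustrative and unrelated to the actual HSTS domain models or to the internals of SMV, Spin, or Murphi.

```python
# Hedged sketch of invariant checking by exhaustive state-space
# search: return a counterexample path to a violating state, or
# None if the invariant holds on all reachable states.
from collections import deque

def find_violation(initial, transitions, invariant):
    """Breadth-first reachability over an explicit transition map."""
    frontier = deque([[initial]])
    seen = {initial}
    while frontier:
        path = frontier.popleft()
        state = path[-1]
        if not invariant(state):
            return path          # shortest counterexample path
        for nxt in transitions.get(state, ()):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(path + [nxt])
    return None

# Invariant "never reach 'error'" fails via idle -> busy -> error.
ts = {"idle": ["busy"], "busy": ["idle", "error"]}
cex = find_violation("idle", ts, lambda s: s != "error")
```

Real model checkers add symbolic or partial-order techniques, but the counterexample trace they return plays the same fault-localizing role described above.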
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Third Annual Software Engineering Workshop; NASA/CP-1999-209236
    Format: text
  • 35
    Publication Date: 2004-12-03
    Description: Thermal engineering has long been left out of the concurrent engineering environment dominated by CAD (computer aided design) and FEM (finite element method) software. Current tools attempt to force the thermal design process into an environment primarily created to support structural analysis, which results in inappropriate thermal models. As a result, many thermal engineers either build models "by hand" or use geometric user interfaces that are separate from and have little useful connection, if any, to CAD and FEM systems. This paper describes the development of a new thermal design environment called the Thermal Desktop. This system, while fully integrated into a neutral, low cost CAD system, and which utilizes both FEM and FD methods, does not compromise the needs of the thermal engineer. Rather, the features needed for concurrent thermal analysis are specifically addressed by combining traditional parametric surface based radiation and FD based conduction modeling with CAD and FEM methods. The use of flexible and familiar temperature solvers such as SINDA/FLUINT (Systems Improved Numerical Differencing Analyzer/Fluid Integrator) is retained.
    Keywords: Computer Programming and Software
    Type: Ninth Thermal and Fluids Analysis Workshop Proceedings; 217-232; NASA/CP-1999-208695
    Format: text
  • 36
    Publication Date: 2004-12-03
    Description: Integrated analysis methods have the potential to substantially decrease the time required for analysis modeling. Integration with computer aided design (CAD) software can also allow a model to be more accurate by facilitating import of exact design geometry. However, the integrated method utilized must sometimes be tailored to the specific modeling situation, in order to make the process most efficient. Two cases are presented here that illustrate different processes used for thermal analysis on two different models. These examples are used to illustrate how the requirements, available input, expected output, and tools available all affect the process selected by the analyst for the most efficient and effective analysis.
    Keywords: Computer Programming and Software
    Type: Ninth Thermal and Fluids Analysis Workshop Proceedings; 37-48; NASA/CP-1999-208695
    Format: text
  • 37
    In:  CASI
    Publication Date: 2004-12-03
    Description: The primary role of models in the assessment process is to predict changes to ozone. It is crucial, therefore, that the ability of the models to reproduce the actual distribution of ozone be tested. Historically, maps of the ozone column (latitude by month) have been used for this purpose. In M&M I a climatology was developed for the vertical distribution of ozone for 15-60 km, based on SBUV data for 1979-80. SBUV profiles are reported with a vertical resolution of approx. 5 km, but the true resolution is lower: approx. 8 km above the ozone maximum and approx. 15 km between 10 and 25 km. The climatology was considered valid to about 20-30% at 20 km and to 50% at 15 km. Comparisons were made with models in mixing ratio (ppm), which emphasizes the middle and upper stratosphere. A new ozone climatology was developed for the vertical distribution of ozone for M&M II. Our goal was to develop a product that could be used to evaluate models in the lower stratosphere, the region where most of the ozone column resides and where most of the ozone loss is occurring, as well as the middle and upper stratosphere.
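    The column amounts against which models are evaluated follow from integrating a mixing-ratio profile over pressure. A minimal sketch with toy layer values: the physical constants are standard, but the three-layer profile is invented for illustration.

```python
# Hedged sketch: ozone column in Dobson units from a profile of
# layer-mean mixing ratios over pressure. Each Pa of pressure
# corresponds to N_A / (M_AIR * G) air molecules per m^2.

N_A = 6.022e23      # molecules/mol
M_AIR = 0.028964    # kg/mol, mean molar mass of dry air
G = 9.80665         # m/s^2
DU = 2.687e20       # molecules/m^2 per Dobson unit

def ozone_column_du(layers):
    """layers: list of (mean mixing ratio in mol/mol, layer
    thickness in Pa); returns the column in Dobson units."""
    molecules = sum(x * dp for x, dp in layers) * N_A / (M_AIR * G)
    return molecules / DU

# Invented three-layer stratospheric profile (mol/mol, Pa):
layers = [(2e-6, 5000.0), (6e-6, 4000.0), (4e-6, 3000.0)]
col = ozone_column_du(layers)   # lands near typical ~300-400 DU
```

This is the kind of reduction that lets a 15-60 km climatology be compared both layer by layer and as a total column.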
    Keywords: Environment Pollution
    Type: Models and Measurements Intercomparison 2; 307-362; NASA/TM-1999-209554
    Format: text
  • 38
    Publication Date: 2004-12-03
    Description: The SPAce Readiness Coherent Lidar Experiment (SPARCLE) is the first demonstration of a coherent Doppler wind lidar in space. SPARCLE will be flown aboard a space shuttle in mid-2001 as a stepping stone toward the development and deployment of a long-lifetime operational instrument in the latter part of the next decade. SPARCLE is an ambitious project intended to evaluate the suitability of coherent lidar for wind measurements, demonstrate the maturity of the technology for space application, and provide a usable data set for model development and validation. This paper describes SPARCLE's optical system design, fabrication methods, assembly and alignment techniques, and anticipated operational characteristics. Coherent detection is highly sensitive to aberrations in the signal phase front and to the relative alignment between the signal and local-oscillator beams. Consequently, the performance of coherent lidars is usually limited by the optical quality of the transmitter/receiver optical system. Because SPARCLE has a relatively large aperture (25 cm) and a very long operating range (400 km) compared to previously developed 2-micron coherent lidars, its optical performance requirements are even more stringent. In addition to these stringent performance requirements, the physical and environmental constraints associated with the instrument further challenge the limits of optical fabrication technology.
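    The measurement principle underlying SPARCLE, recovering line-of-sight wind speed from the Doppler shift of the backscattered light, reduces to v = lambda * f_d / 2. A minimal sketch with nominal 2-micron parameters; the numbers are illustrative, not instrument specifications.

```python
# Hedged sketch of the coherent Doppler lidar relation between
# measured Doppler shift and line-of-sight wind speed. The
# wavelength is nominal for a 2-micron system like SPARCLE.

WAVELENGTH = 2.0e-6  # m

def los_wind_speed(doppler_shift_hz, wavelength=WAVELENGTH):
    """Line-of-sight wind speed (m/s) from the Doppler shift (Hz)
    of light backscattered from moving aerosols."""
    return wavelength * doppler_shift_hz / 2.0

# At 2 microns, a 10 MHz shift corresponds to 10 m/s along the beam.
v = los_wind_speed(10e6)
```

The factor of 2 reflects the round trip: the aerosol target both receives and re-emits Doppler-shifted light.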
    Keywords: Instrumentation and Photography
    Type: Tenth Biennial Coherent Laser Radar Technology and Applications Conference; 284-287; NASA/CP-1999-209758
    Format: text
  • 39
    Publication Date: 2004-12-03
    Description: The SPAce Readiness Coherent Lidar Experiment (SPARCLE) mission was proposed as a low-cost technology demonstration mission, using a 2-micron, 100-mJ, 6-Hz, 25-cm coherent lidar system based on demonstrated technology. SPARCLE was selected in late October 1997 to be NASA's New Millennium Program (NMP) second earth-observing (EO-2) mission. To maximize the success probability of SPARCLE, NASA/MSFC desired expert guidance in the areas of coherent laser radar (CLR) theory, CLR wind measurement, fielding of CLR systems, CLR alignment validation, and space lidar experience. This led to the formation of the NASA/MSFC Coherent Lidar Technology Advisory Team (CLTAT) in December 1997. A threefold purpose for the advisory team was identified: 1) guidance to the SPARCLE mission, 2) advice regarding the roadmap of post-SPARCLE coherent Doppler wind lidar (CDWL) space missions and the desired matching technology development plan, and 3) general coherent lidar theory, simulation, hardware, and experiment information exchange. The current membership of the CLTAT is shown. Membership does not result in any NASA or other funding at this time. We envision the business of the CLTAT to be conducted mostly by email, teleconference, and occasional meetings. The three meetings of the CLTAT to date, in Jan. 1998, July 1998, and Jan. 1999, have all been collocated with previously scheduled meetings of the Working Group on Space-Based Lidar Winds. The meetings have been very productive. Topics discussed include the SPARCLE technology validation plan, including pre-launch end-to-end testing; the space-based wind mission roadmap beyond SPARCLE and its implications for the resultant technology development; the current values of, and proposed future advancements in, lidar system efficiency; and the difference between using single-mode fiber optical mixing vs. traditional free-space optical mixing.
    Keywords: Instrumentation and Photography
    Type: Tenth Biennial Coherent Laser Radar Technology and Applications Conference; 153-155; NASA/CP-1999-209758
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 40
    Publication Date: 2004-12-03
    Description: The coherent Doppler lidar, when operated from an airborne platform, offers a unique measurement capability for study of atmospheric dynamical and physical properties. This is especially true for scientific objectives requiring measurements in optically-clear air, where other remote sensing technologies such as Doppler radar are at a disadvantage in terms of spatial resolution and coverage. Recent experience suggests airborne coherent Doppler lidar can yield unique wind measurements of--and during operation within--extreme weather phenomena. This paper presents the first airborne coherent Doppler lidar measurements of hurricane wind fields. The lidar atmospheric remote sensing groups of National Aeronautics and Space Administration (NASA) Marshall Space Flight Center, National Oceanic and Atmospheric Administration (NOAA) Environmental Technology Laboratory, and Jet Propulsion Laboratory jointly developed an airborne lidar system, the Multi-center Airborne Coherent Atmospheric Wind Sensor (MACAWS). The centerpiece of MACAWS is the lidar transmitter from the highly successful NOAA Windvan. Other field-tested lidar components have also been used, when feasible, to reduce costs and development time. The methodology for remotely sensing atmospheric wind fields with scanning coherent Doppler lidar was demonstrated in 1981; enhancements were made and the system was reflown in 1984. MACAWS has potentially greater scientific utility, compared to the original airborne scanning lidar system, owing to a factor of approx. 60 greater energy-per-pulse from the NOAA transmitter. MACAWS development was completed and the system was first flown in 1995. Following enhancements to improve performance, the system was re-flown in 1996 and 1998. The scientific motivation for MACAWS is three-fold: obtain fundamental measurements of subgrid scale (i.e., approx. 2-200 km) processes and features which may be used to improve parameterizations in hydrological, climate, and general/regional circulation models; obtain similar datasets to improve understanding and predictive capabilities for similarly-scaled processes and features; and simulate and validate the performance of prospective satellite Doppler lidars for global tropospheric wind measurement.
    Keywords: Instrumentation and Photography
    Type: Tenth Biennial Coherent Laser Radar Technology and Applications Conference; 29-32; NASA/CP-1999-209758
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 41
    Publication Date: 2004-12-03
    Description: The AIRPLANE process starts with an aircraft geometry stored in a CAD system. The surface is modeled with a mesh of triangles and then the flow solver produces pressures at surface points which may be integrated to find forces and moments. The biggest advantage is that the grid generation bottleneck of the CFD process is eliminated when an unstructured tetrahedral mesh is used. MESH3D is the key to turning around the first analysis of a CAD geometry in days instead of weeks. The flow solver part of AIRPLANE has proven to be robust and accurate over a decade of use at NASA. It has been extensively validated with experimental data and compares well with other Euler flow solvers. AIRPLANE has been applied to all the HSR geometries treated at Ames over the course of the HSR program in order to verify the accuracy of other flow solvers. The unstructured approach makes handling complete and complex geometries very simple because only the surface of the aircraft needs to be discretized, i.e. covered with triangles. The volume mesh is created automatically by MESH3D. AIRPLANE runs well on multiple platforms. Vectorization on the Cray Y-MP is reasonable for a code that uses indirect addressing. Massively parallel computers such as the IBM SP2, SGI Origin 2000, and the Cray T3E have been used with an MPI version of the flow solver and the code scales very well on these systems. AIRPLANE can run on a desktop computer as well. AIRPLANE has a future. The unstructured technologies developed as part of the HSR program are now targeting high Reynolds number viscous flow simulation. The pacing item in this effort is Navier-Stokes mesh generation.
    Keywords: Computer Programming and Software
    Type: 1999 NASA High-Speed Research Program Aerodynamic Performance Workshop; Volume 1; Part 1; 213-252; NASA/CP-1999-209704/VOL1/PT1
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 42
    Publication Date: 2004-12-03
    Description: An orbiting coherent Doppler lidar for measuring winds is required to provide two basic pieces of data to the user community. The first is the line of sight wind velocity and the second is knowledge of the position at which the measurement was made. In order to provide this information in regions of interest the instrument is also required to have a certain backscatter sensitivity level. This paper outlines some of the considerations necessary in designing a coherent Doppler lidar for this purpose.
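The line-of-sight velocity requirement above reduces to a simple round-trip Doppler relation. A minimal sketch, assuming a 2-micron wavelength typical of coherent Doppler lidar designs of this era (the record does not state the instrument's wavelength):

```python
# Round-trip Doppler relation for a coherent Doppler lidar:
# the backscattered light is shifted by delta_f = 2 * v_LOS / lambda,
# so the line-of-sight wind follows as v_LOS = lambda * delta_f / 2.

def los_wind_speed(doppler_shift_hz: float, wavelength_m: float) -> float:
    """Line-of-sight wind speed (m/s) from the measured Doppler shift."""
    return wavelength_m * doppler_shift_hz / 2.0

# At a 2-micron wavelength, a 1 MHz shift corresponds to 1 m/s along the beam.
v = los_wind_speed(1.0e6, 2.0e-6)
```

The second required piece of data, the measurement position, comes from spacecraft ephemeris and pointing knowledge and is not modeled here.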
    Keywords: Instrumentation and Photography
    Type: Tenth Biennial Coherent Laser Radar Technology and Applications Conference; 302-305; NASA/CP-1999-209758
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 43
    Publication Date: 2004-12-03
    Description: The Sampling Procedures and Venues Workgroup discussed the potential venues available and issues associated with obtaining measurements. Some of the issues included Incoming Air Quality, Sampling Locations, Probes and Sample Systems. The following is a summary of the discussion of the issues and venues. The influence of inlet air to the measurement of exhaust species, especially trace chemical species, must be considered. Analysis procedures for current engine exhaust emissions regulatory measurements require adjustments for air inlet humidity. As a matter of course in scientific investigations, it is recommended that "background" measurements for any species, particulate or chemical, be performed during inlet air flow before initiation of combustion, if possible, and during the engine test period as feasible and practical. For current regulatory measurements, this would be equivalent to setting the "zero" level for conventional gas analyzers. As a minimum, it is recommended that measurements of the humidity and particulates in the incoming air be taken at the start and end of each test run. Additional measurement points taken during the run are desirable if they can be practically obtained. It was felt that the presence of trace gases in the incoming air is not a significant problem. However, investigators should consider the ambient levels and influences of local air pollution for species of interest. Desired measurement locations depend upon the investigation requirements. A complete investigation of phenomenology of particulate formation and growth requires measurements at a number of locations both within the engine and in the exhaust field downstream of the nozzle exit plane. 
Desirable locations for both extractive and in situ measurements include: (1) Combustion Zone (Multiple axial locations); (2) Combustor Exit (Multiple radial locations for annular combustors); (3) Turbine Stage (Inlet and exit of the stage); (4) Exit Nozzle (Multiple axial locations downstream of the nozzle). Actual locations with potential for extractive or non-intrusive measurements depend upon the test article and test configuration. Committee members expressed the importance of making investigators aware of various ports that could allow access to various stages of the existing engines. Port locations are engine specific and might allow extractive sampling or innovative hybrid optical-probe access. The turbine stage region was one of the most desirable locations for obtaining samples and might be accessed through boroscope ports available in some engine designs. Discussions of probes and sampling systems quickly identified issues dependent on particular measurement quantities. With general consensus, the group recommends SAE procedures for measurements and data analyses of currently regulated exhaust species (CO2, CO, THC, NO(x)) using conventional gas sampling techniques. Special procedures following sound scientific practices must be developed as required for species and/or measurement conditions not covered by SAE standards. Several issues arose concerning short-lived radicals and highly reactive species. For conventional sampling, there are concerns of perturbing the sample during extraction, line losses, line-wall reactions, and chemical reactions during sample transport to the analyzers. Sample lines coated with quartz or other materials should be investigated to minimize such effects. The group advocates the development of innovative probe techniques and non-intrusive optical techniques for measurement of short-lived radicals and highly reactive species that cannot otherwise be sampled accurately. Two innovative probe concepts were discussed. 
One concept uses specially designed probes to transfer optical beams to and from a region of flow inaccessible by traditional ports or windows. The probe can perturb the flow field but must have a negligible impact on the region to be optically sampled. Such probes are referred to as hybrid probes and are under development at AEDC for measurement in the high-pressure, high-temperature environment of a combustor under development for power generation. The other concept consists of coupling an instrument directly to the probe. The probe would isolate a representative sample stream, freeze chemical reactions, and direct the sample into the analyzer portion of the probe. Thus, the measurement would be performed in situ without sample line losses due either to reactions or to binding at the wall surfaces. This concept was used to develop a fast, in situ, time-of-flight mass spectrometer measurement system for temporal quantification of NO in the IMPULSE facility at AEDC. Additional work is required in this area to determine the best probe and sampling technique for each species measurement requirement identified by the Trace Chemistry Working Group. A partial list of venues was used as a baseline for discussion. Additional venues were added to the list, and the list was broken out into the following categories: (1) Engines (a) Sea Level Test Stands, (b) Altitude Chambers; (2) Annular Combustor Test Stands; (3) Sector Flametube Test Stands; (4) Fundamentals Rigs/Experiments.
    Keywords: Environment Pollution
    Type: Workshop on Aerosols and Particulates from Aircraft Gas Turbine Engines; 187-237; NASA/CP-1999-208918
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 44
    In:  CASI
    Publication Date: 2004-12-03
    Description: Although the importance of aerosols and their precursors is now well recognized, the characterization of current subsonic engines for these emissions is far from complete. Furthermore, since the relationship of engine operating parameters to aerosol emissions is not known, extrapolation to untested and unbuilt engines necessarily remains highly uncertain. The 1997 NASA LaRC engine test, as well as the parallel 1997 NASA LaRC flight measurement, attempts to address both issues by expanding measurements of aerosols and aerosol precursors with fuels containing different levels of fuel sulfur content. The specific objective of the 1997 engine test is to obtain a database of sulfur oxides emissions as well as the non-volatile particulate emission properties as a function of fuel sulfur and engine operating conditions. Four diagnostic systems, extractive and non-intrusive (optical), will be assembled for the gaseous and particulate emissions characterization measurements study. NASA is responsible for the extractive gaseous emissions measurement system, which contains an array of analyzers dedicated to examining the concentrations of specific gases (NO, NO(x), CO, CO2, O2, THC, SO2) and the smoke number. University of Missouri-Rolla uses the Mobile Aerosol Sampling System to measure aerosol/particulate total concentration, size distribution, volatility, and hydration property. Air Force Research Laboratory uses the Chemical Ionization Mass Spectrometer to measure SO2, SO3/H2SO4, and HNO3. Aerodyne Research, Inc. uses an Infrared Tunable Diode Laser system to measure SO2, SO3, NO, H2O, and CO2.
    Keywords: Environment Pollution
    Type: Workshop on Aerosols and Particulates from Aircraft Gas Turbine Engines; 123-134; NASA/CP-1999-208918
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 45
    In:  CASI
    Publication Date: 2004-12-03
    Description: The goals of the trace chemistry group were to identify the processes relevant to aerosol and aerosol precursor formation occurring within aircraft gas turbine engines; that is, within the combustor, turbine, and nozzle. The topics of discussion focused on whether the chemistry of aerosol formation is homogeneous or heterogeneous; what species are important for aerosol and aerosol precursor formation; what modeling/theoretical activities to pursue; what experiments to carry out that both support modeling activities and elucidate fundamental processes; and the role of particulates in aerosol and aerosol precursor formation. The consensus of the group was that attention should be focused on SO2, SO3, and aerosols. Of immediate concern is the measurement of the concentration of the species SO3, SO2, H2SO4, OH, HO2, H2O2, O, NO, NO2, HONO, HNO3, CO, and CO2, and of particulates, in various engines, both those currently in use and those in development. The recommendation was that concentration measurements should be made at both the combustor exit and the engine exit. At each location the above species were classified into one of four categories of decreasing importance, Priority I through IV, as follows: Combustor exit: Priority I species - SO3:SO2 ratio, SO3, SO2, and particulates; Priority II species - OH and O; Priority III species - NO and NO2; and Priority IV species - CO and CO2. Engine exit: Priority I species - SO3:SO2 ratio, SO3, SO2, H2SO4, and particulates; Priority II species - OH, HO2, H2O2, and O; Priority III species - NO, NO2, HONO, and HNO3; and Priority IV species - CO and CO2. Table I summarizes the anticipated concentration range of each of these species. For particulate matter, the quantities of interest are the number density, size distribution, and composition. 
In order to provide data for validating multidimensional reacting flow models, it would be desirable to make 2-D, time-resolved measurements of the concentrations of the above species and, in addition, of the pressure, temperature, and velocity. A near-term goal of the experimental program should be to confirm the nonlinear effects of sulfur speciation and, if present, to provide an explanation for them. It is also desirable to examine whether the particulate matter retains any sulfur. The recommendation is to examine the effects on SOx production of variations in fuel-bound sulfur and aromatic content (which may affect the amount of particulates formed). These experiments should help us to understand if there is a coupling between particulate formation and SOx concentration. Similarly, any coupling with NOx can be examined either by introducing NOx into the combustion air or by using fuel-bound nitrogen. Also of immediate urgency is the need to establish and validate a detailed mechanism for sulfur oxidation/aerosol formation, whose chemistry is concluded to be homogeneous because there is not enough surface area for heterogeneous effects. It is envisaged that this work will involve both experimental and theoretical programs. The experimental work will require, in addition to the measurements described above, fundamental studies in devices such as flow reactors and shock tubes. Complementing this effort should be modeling and theoretical activities. One impediment to the successful modeling of sulfur oxidation is the lack of reliable data for thermodynamic and transport properties for several species, such as aqueous nitric acid, sulfur oxides, and sulfuric acid. Quantum mechanical calculations are recommended as a convenient means of deriving values for these properties. Such calculations would also help establish rate constants for several important reactions for which experimental measurements are inherently fraught with uncertainty. 
Efforts to implement sufficiently detailed chemistry into computational fluid dynamic codes should be continued. Zero- and one-dimensional flow models are also useful vehicles for elucidating the minimal set of species and reactions that must be included in two- and three-dimensional modeling studies.
    Keywords: Environment Pollution
    Type: Workshop on Aerosols and Particulates from Aircraft Gas Turbine Engines; 177-178; NASA/CP-1999-208918
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 46
    Publication Date: 2004-12-03
    Description: This paper reviews the relationships of the programs and projects and reviews the purpose of the Engine Exhaust Trace Chemistry (EETC) Committee. The charges of the Committee are: (1) to prioritize the engine trace constituents for assessing impacts of aircraft; (2) to assess both extractive and in situ measurement techniques; and (3) to determine the best venues for performing the necessary measurements. A synopsis of evidence supporting and questions concerning the role(s) of aerosol/particulates was presented. The presentation also reviewed how sulfur oxidation kinetics interactions in the hot section and nozzle play a role in the formation of aerosol precursors. The objective of the workshop and its organization are reviewed.
    Keywords: Environment Pollution
    Type: Workshop on Aerosols and Particulates from Aircraft Gas Turbine Engines; 5-19; NASA/CP-1999-208918
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 47
    In:  CASI
    Publication Date: 2004-12-03
    Description: A general overview of the Information Systems Center (ISC) at NASA's Goddard Space Flight Center is presented. The background of the center, its reorganization, as well as the applied engineering and technology directorate and its organization are outlined. The ISC's mission and vision as well as a breakdown of the branch are reviewed. Finally, the role of the Software Engineering Laboratory (SEL) within the ISC is reported, it's short and long term goals, and current activities discussed.
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Third Annual Software Engineering Workshop; NASA/CP-1999-209236
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 48
    Publication Date: 2011-08-23
    Description: The theory of special relativity is used to analyze some of the physical phenomena associated with space-based coherent Doppler lidars aimed at Earth and the atmosphere. Two important cases of diffuse scattering and retroreflection by lidar targets are treated. For the case of diffuse scattering, we show that for a coaligned transmitter and receiver on the moving satellite, there is no angle between transmitted and returned radiation. However, the ray that enters the receiver does not correspond to a retroreflected ray by the target. For the retroreflection case there is misalignment between the transmitted ray and the received ray. In addition, the Doppler shift in the frequency and the amount of tip for the receiver aperture, when needed, are calculated. The error in estimating wind because of the Doppler shift in the frequency due to special relativity effects is examined. The results are then applied to a proposed space-based pulsed coherent Doppler lidar at NASA's Marshall Space Flight Center for wind and aerosol backscatter measurements. The lidar uses an orbiting spacecraft with a pulsed laser source and measures the Doppler shift between the transmitted and the received frequencies to determine the atmospheric wind velocities. We show that the special relativity effects are small for the proposed system.
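The scale of the special-relativity effects discussed in this abstract can be gauged from the Lorentz factor at orbital speed. A minimal sketch (the 7.5 km/s orbital speed and 2-micron wavelength are illustrative assumptions, not parameters taken from the paper):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def doppler_shift(v_los: float, wavelength: float) -> float:
    """First-order round-trip Doppler shift (Hz) for line-of-sight speed v_los."""
    return 2.0 * v_los / wavelength

def lorentz_factor(v: float) -> float:
    """gamma = 1/sqrt(1 - v^2/c^2); gamma - 1 sets the size of the
    special-relativity correction to the measured frequency."""
    beta = v / C
    return 1.0 / math.sqrt(1.0 - beta * beta)

v_orbit = 7500.0                  # assumed low-Earth-orbit speed, m/s
gamma = lorentz_factor(v_orbit)   # gamma - 1 is of order 3e-10
```

A fractional frequency correction of order 1e-10 is far smaller than the shift produced by a 1 m/s wind at 2 microns (about 1 MHz against a carrier of roughly 1.5e14 Hz), consistent with the paper's conclusion that the effects are small.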
    Keywords: Instrumentation and Photography
    Type: Applied Optics (ISSN 0003-6935); Volume 38; No. 30; 6374-6381
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 49
    Publication Date: 2011-08-23
    Description: By rigorously extending modern communication theory to the assessment of sampled imaging systems, we develop the formulations that are required to optimize the performance of these systems within the critical constraints of image gathering, data transmission, and image display. The goal of this optimization is to produce images with the best possible visual quality for the wide range of statistical properties of the radiance field of natural scenes that one normally encounters. Extensive computational results are presented to assess the performance of sampled imaging systems in terms of information rate, theoretical minimum data rate, and fidelity. Comparisons of this assessment with perceptual and measurable performance demonstrate that (1) the information rate that a sampled imaging system conveys from the captured radiance field to the observer is closely correlated with the fidelity, sharpness and clarity with which the observed images can be restored and (2) the associated theoretical minimum data rate is closely correlated with the lowest data rate with which the acquired signal can be encoded for efficient transmission.
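The "information rate" assessed above is a Shannon-type quantity. As a generic illustration only (the paper's formulation accounts for image gathering, transmission, and display, which this one-line Gaussian-channel sketch does not):

```python
import math

def info_rate_bits_per_sample(snr: float) -> float:
    """Shannon information rate of a Gaussian channel, 0.5*log2(1 + SNR),
    in bits per sample."""
    return 0.5 * math.log2(1.0 + snr)

r = info_rate_bits_per_sample(255.0)  # 4 bits per sample at SNR = 255
```

The paper's theoretical minimum data rate plays the complementary role: the fewest bits per sample with which the acquired signal can be encoded without losing the conveyed information.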
    Keywords: Computer Programming and Software
    Type: Optical Engineering (ISSN 0091-3286); Volume 38; No. 5; 742-762
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 50
    Publication Date: 2011-08-23
    Description: The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.
    Keywords: Computer Programming and Software
    Type: Aeronautical Journal of the Royal Aeronautical Society; No. 2451; 373-382
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 51
    Publication Date: 2011-08-23
    Description: Analyses of satellite, ground-based, and balloon measurements allow updated estimates of trends in the vertical profile of ozone since 1979. The results show overall consistency among several independent measurement systems, particularly for northern hemisphere midlatitudes where most balloon and ground-based measurements are made. Combined trend estimates over these latitudes for the period 1979-96 show statistically significant negative trends at all altitudes between 10 and 45 km, with two local extremes: -7.4 +/- 2.0% per decade at 40 km and -7.3 +/- 4.6% per decade at 15 km altitude. There is a strong seasonal variation in trends over northern midlatitudes in the altitude range of 10 to 18 km, with the largest ozone loss during winter and spring. The profile trends are in quantitative agreement with independently measured trends in column ozone, the amount of ozone in a column above the surface. The vertical profiles of ozone trends provide a fingerprint for the mechanisms of ozone depletion over the last two decades.
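Trends quoted in percent per decade, such as the -7.4 +/- 2.0% per decade at 40 km above, are slopes of a least-squares fit to an annual series. A minimal sketch on synthetic data (the series below is fabricated for illustration and is not the ozone record):

```python
# Ordinary least-squares slope of an annual anomaly series, rescaled from
# percent per year to percent per decade.

def trend_per_decade(years, values_pct):
    n = len(years)
    mean_y = sum(years) / n
    mean_v = sum(values_pct) / n
    num = sum((y - mean_y) * (v - mean_v) for y, v in zip(years, values_pct))
    den = sum((y - mean_y) ** 2 for y in years)
    return 10.0 * num / den

years = list(range(1979, 1997))
# Synthetic, noise-free series declining 0.74% per year.
values = [-0.74 * (y - 1979) for y in years]
slope = trend_per_decade(years, values)  # -7.4 percent per decade
```

The quoted uncertainties (e.g. +/- 2.0% per decade) come from the regression error statistics and measurement-system intercomparison, which this sketch omits.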
    Keywords: Environment Pollution
    Type: Science; Volume 285; 1689-1692
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    Publication Date: 2011-08-23
    Description: We take advantage of the May 1998 biomass burning event in Southern Mexico to test the global applicability of a smoke aerosol size model developed from data observed in South America. The Mexican event is a unique opportunity to observe well-aged, residual smoke. Observations of smoke aerosol size distribution made from vertical profiles of airborne in situ measurements show an inverse relationship between concentration and particle size that suggests the aging process continues more than a week after the smoke is separated from its fire sources. The ground-based radiometer retrievals show that the column-averaged, aged, Mexican smoke particles are larger (diameter = 0.28 - 0.33 micrometers) than the mean smoke particles in South America (diameter = 0.22 - 0.30 micrometers). However, the difference (delta = 0.06 micrometer) translates into differences in backscattering coefficient of only 4-7% and an increase of direct radiative forcing of only 10%.
    Keywords: Environment Pollution
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    Publication Date: 2011-08-23
    Description: Toxic gases produced by the combustion or thermo-oxidative degradation of materials such as wire insulation, foam, plastics, or electronic circuit boards in space shuttle or space station crew cabins may pose a significant hazard to the flight crew. Toxic gas sensors are routinely evaluated in pure gas standard mixtures, but the possible interferences from polymer combustion products are not routinely evaluated. The NASA White Sands Test Facility (WSTF) has developed a test system that provides atmospheres containing predetermined quantities of target gases combined with the coincidental combustion products of common spacecraft materials. The target gases are quantitated in real time by infrared (IR) spectroscopy and verified by grab samples. The sensor responses are recorded in real time and are compared to the IR and validation analyses. Target gases such as carbon monoxide, hydrogen cyanide, hydrogen chloride, and hydrogen fluoride can be generated by the combustion of poly(vinyl chloride), polyimide-fluoropolymer wire insulation, polyurethane foam, or electronic circuit board materials. The kinetics and product identifications for the combustion of the various materials were determined by thermogravimetric-IR spectroscopic studies. These data were then scaled to provide the required levels of target gases in the sensor evaluation system. Multisensor toxic gas monitors from two manufacturers were evaluated using this system. In general, the sensor responses satisfactorily tracked the real-time concentrations of toxic gases in a dynamic mixture. Interferences from a number of organic combustion products including acetaldehyde and bisphenol-A were minimal. Hydrogen bromide in the products of circuit board combustion registered as hydrogen chloride. The use of actual polymer combustion atmospheres for the evaluation of sensors can provide additional confidence in the reliability of the sensor response.
    Keywords: Instrumentation and Photography
    Type: JANNAF 28th Propellant Development and Characterization Subcommittee and 17th Safety and Environmental Protection Subcommitte Joint Meeting; Volume 1; 127-136; CPIA-Publ-687-Vol-1
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 54
    Publication Date: 2011-08-23
    Description: The SASS (Subsonic Assessment) Ozone and NO(x) Experiment (SONEX) was an airborne field campaign conducted in October-November 1997 in the vicinity of the North Atlantic Flight Corridor to study the impact of aircraft emissions on NO(x) and ozone (O3). A fully instrumented NASA DC-8 aircraft was used as the primary SONEX platform. SONEX activities were closely coordinated with the European POLINAT-2 (Pollution from Aircraft Emissions in the North Atlantic Flight Corridor) program, which used a Falcon-20 aircraft and an instrumented in-service Swissair B-747. Both campaigns focused on the upper troposphere/"lowermost" stratosphere (UT/LS) as the region of greatest interest. Specific sampling goals were achieved with the aid of a state-of-the-art modeling and meteorological support system, which allowed targeted sampling of air parcels with desired characteristics. A substantial impact of aircraft emissions on NO(x) and O3 in the UT/LS of the study region is shown to be present. It is further shown that the NO(x)-HO(x)-O3 relationships are highly nonlinear and must be accurately simulated to make meaningful future predictions with global models. SONEX/POLINAT-2 results are being published in Special Sections of GRL and JGR. Here we provide a brief overview of SONEX design, implementation, and expected results to provide a context within which these publications can be understood.
    Keywords: Environment Pollution
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 55
    Publication Date: 2011-08-23
    Description: XRS is the microcalorimeter X-ray detector aboard the US-Japanese ASTRO-E observatory, which is scheduled to be launched in early 2000. XRS is a high resolution spectrometer, with less than 9 eV resolution at 3 keV and better than 14 eV resolution over its bandpass ranging from about 0.3 keV to 15 keV. Here we present the results of our first calibration of the XRS instrument. We describe the methods used to extract detailed information about the detection efficiency and spectral redistribution of the instrument. We also present comparisons of simulations and real data to test our detector models.
    Keywords: Instrumentation and Photography
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    Publication Date: 2011-08-23
    Description: We describe the signal processing system of the Astro-E XRS Instrument. The Calorimeter Analog Processor (CAP) provides bias and power for the detectors and amplifies the detector signals by a factor of 20,000. The Calorimeter Digital Processor (CDP) performs the digital processing of the calorimeter signals, detecting X-ray pulses and analyzing them by optimal filtering. We describe the operation of pulse detection, pulse height analysis, and risetime determination. We also discuss performance, including the three event grades (hi-res, mid-res, and low-res), anticoincidence detection, counting rate dependence, and noise rejection.
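The optimal filtering step described above can be illustrated with a toy template fit. For white noise the frequency-domain optimal filter reduces to the least-squares projection below; the pulse shape and time constants are assumed for illustration and are not XRS values:

```python
import math

def optimal_amplitude(pulse, template):
    """Least-squares pulse-height estimate: <pulse, template> / <template, template>.
    This is the white-noise limit of the optimal (noise-weighted) filter."""
    num = sum(p * s for p, s in zip(pulse, template))
    den = sum(s * s for s in template)
    return num / den

# Toy calorimeter pulse: fast rise, slow decay (assumed time constants).
template = [math.exp(-t / 50.0) - math.exp(-t / 5.0) for t in range(512)]
pulse = [3.0 * s for s in template]        # noiseless pulse, amplitude 3
amp = optimal_amplitude(pulse, template)   # recovers an amplitude of 3
```

In the real CDP, the noise spectrum is not white, the filtering is done per event grade (hi-res, mid-res, low-res), and pulse detection and risetime determination precede this amplitude estimate.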
    Keywords: Instrumentation and Photography
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
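    The pulse-height analysis by optimal filtering that the CDP description above names can be sketched in a few lines. This is an illustrative reconstruction of the general technique, not the flight algorithm: the pulse template, noise model, amplitude, and record length below are all invented for the example.

    ```python
    import numpy as np

    def optimal_filter_amplitude(record, template, noise_psd):
        """Estimate pulse amplitude by frequency-domain optimal filtering:
        weight each frequency bin by template power over noise power, then
        normalize so a pulse identical to the template returns 1.0."""
        r = np.fft.rfft(record)
        s = np.fft.rfft(template)
        num = np.sum(np.conj(s) * r / noise_psd).real
        den = np.sum(np.abs(s) ** 2 / noise_psd).real
        return num / den

    n = 1024
    t = np.arange(n)
    template = np.exp(-t / 200.0) - np.exp(-t / 20.0)  # idealized pulse shape
    noise_psd = np.ones(n // 2 + 1)                    # white-noise assumption
    rng = np.random.default_rng(0)
    record = 3.5 * template + 0.01 * rng.standard_normal(n)

    amp = optimal_filter_amplitude(record, template, noise_psd)
    # amp recovers the true amplitude (3.5 here) to within the noise level
    ```

    With a measured (non-white) noise PSD, the same expression down-weights noisy frequency bins, which is what makes the filter "optimal" in the least-squares sense.
    
    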
  • 57
    Publication Date: 2011-08-23
    Description: To estimate the effect of subsonic and supersonic aircraft exhaust on the stratospheric concentration of NO(y), we employ a trajectory model initialized with air parcels based on the standard release scenarios. The supersonic exhaust simulations are in good agreement with 2D and 3D model results and show a perturbation of about 1-2 ppbv of NO(y) in the stratosphere. The subsonic simulations show that subsonic emissions are almost entirely trapped below the 380 K potential temperature surface. Our subsonic results contradict those from most other models, which show exhaust products penetrating above 380 K. The disagreement can likely be attributed to excessive vertical diffusion, in most models, of the strong vertical gradient in NO(y) that forms at the boundary between the emission zone and the stratosphere above 380 K. Our results suggest that previous assessments of the impact of subsonic exhaust emissions on the stratospheric region above 380 K should be considered an upper bound.
    Keywords: Environment Pollution
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    Publication Date: 2013-08-31
    Description: Twenty years of progress in 200 GHz receivers for spaceborne remote sensing has yielded a maturing 180-220 GHz technology, as evidenced by the increasing availability of relevant hardware, paralleled by further refinement of receiver performance requirements in this spectral band. The 177-207 GHz superheterodyne receiver for the Earth Observing System (EOS) Microwave Limb Sounder (MLS) effectively illustrates these technology developments. This MLS receiver simultaneously detects six different signals, located in sidebands below and above its 191.95 GHz local oscillator (LO). The paper describes the MLS 177-207 GHz receiver front-end (RFE) and provides measured data for its lower and upper sidebands. Sideband ratio data are provided as a function of IF frequency, at different LO drive powers, and for variations in ambient temperature.
    Keywords: Instrumentation and Photography
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    Publication Date: 2013-08-31
    Description: The Terrestrial Planet Finder (TPF) is a space-based infrared interferometer that will combine high sensitivity and spatial resolution to detect and characterize planetary systems within 15 pc of our sun. TPF is a key element in NASA's Origins Program and is currently under study in its Pre-Project Phase. We review some of the interferometer designs that have been considered for starlight nulling, with particular attention to the architecture and subsystems of the central beam-combiner.
    Keywords: Instrumentation and Photography
    Type: Optical and IR Interferometry from Ground and Space; 207-212
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Publication Date: 2013-08-31
    Description: We describe an optical amplifier designed to amplify a spatially sampled component of an optical wavefront to kilowatt average power. The goal is a means of implementing a strategy of spatially segmenting a large-aperture wavefront, amplifying the individual segments, maintaining the phase coherence of the segments by active means, and imaging the resultant amplified coherent field. Applications of interest are the transmission of space solar power over multi-megameter distances, as to distant spacecraft or to remote sites with no preexisting power grid.
    Keywords: Instrumentation and Photography
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 61
    Publication Date: 2013-08-31
    Description: To retrieve temperature and humidity profiles from SSM/T and AMSU, it is important to quantify the contribution of the Earth surface emission. So far, no global estimates of the land surface emissivities are available at SSM/T and AMSU frequencies and scanning conditions. The land surface emissivities have previously been calculated for the globe from the SSM/I conical scanner between 19 and 85 GHz. To analyze the feasibility of deriving SSM/T and AMSU land surface emissivities from SSM/I emissivities, the spectral and angular variations of the emissivities are studied with the help of ground-based measurements, models, and satellite estimates. Up to 100 GHz, for snow- and ice-free areas, the SSM/T and AMSU emissivities can be derived with useful accuracy from the SSM/I emissivities: the emissivities can be linearly interpolated in frequency. Based on ground-based emissivity measurements of various surface types, a simple model is proposed to estimate SSM/T and AMSU emissivities for all zenith angles, knowing only the emissivities for the vertical and horizontal polarizations at 53 deg zenith angle. The method is tested on the SSM/T-2 91.655 GHz channels. The mean difference between the SSM/T-2 and SSM/I-derived emissivities is less than or equal to 0.01 for all zenith angles, with an r.m.s. difference of approximately 0.02. Above 100 GHz, preliminary results are presented at 150 GHz, based on SSM/T-2 observations, and are compared with the very few estimates available in the literature.
    Keywords: Environment Pollution
    Type: IEEE Transactions on Geoscience and Remote Sensing; Volume 20
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
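    The frequency interpolation described in the abstract above can be sketched directly: estimate a land emissivity at an AMSU frequency by linear interpolation between the two bracketing SSM/I channel emissivities. The channel frequencies below are the real SSM/I values, but the emissivity numbers are invented placeholders.

    ```python
    def interp_emissivity(freq_ghz, channels):
        """Linearly interpolate an emissivity at freq_ghz from a sorted list
        of (frequency_GHz, emissivity) pairs, as the paper proposes for
        deriving SSM/T and AMSU emissivities from SSM/I ones below 100 GHz."""
        for (f0, e0), (f1, e1) in zip(channels, channels[1:]):
            if f0 <= freq_ghz <= f1:
                w = (freq_ghz - f0) / (f1 - f0)
                return e0 + w * (e1 - e0)
        raise ValueError("frequency outside the SSM/I channel range")

    # SSM/I channel frequencies with hypothetical emissivities for one scene:
    ssmi = [(19.35, 0.950), (22.235, 0.952), (37.0, 0.957), (85.5, 0.963)]
    e_50ghz = interp_emissivity(50.3, ssmi)  # AMSU-A 50.3 GHz channel
    ```

    The angular model in the abstract would be layered on top of this, mapping the 53 deg V/H-polarization emissivities to other zenith angles.
    
    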
  • 62
    Publication Date: 2013-08-31
    Description: This century, and especially its last few decades, has been marked by intense study of and concern about our environment and how we affect it. Scientific studies show that the level of carbon dioxide in the atmosphere is rising, the ocean's productivity is changing, and average global temperatures have risen by about 0.5 C. What we do not completely understand is: What fraction of this variation is due to human interference with the environment? What fraction is due to natural phenomena? How do these changes correlate with each other? In order to obtain a better understanding of how land, atmosphere, and ocean interact to produce changes in Earth's climate, and how human intervention affects these changes, NASA started planning for the Earth Observing System (EOS) in the early 1980's. As a result, a series of satellites will be sent into orbit to monitor the Earth for the next 18 years, providing scientists with the data needed to help answer these questions.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 63
    In:  CASI
    Publication Date: 2016-06-07
    Description: The Space Experiment Module (SEM) Program is an education initiative sponsored by the National Aeronautics and Space Administration (NASA) Shuttle Small Payloads Project. The program provides nationwide educational access to space for Kindergarten through University level students. The SEM program focuses on the science of zero-gravity and microgravity. Within the program, NASA provides small containers or "modules" for students to fly experiments on the Space Shuttle. The experiments are created, designed, built, and implemented by students with teacher and/or mentor guidance. Student experiment modules are flown in a "carrier" which resides in the cargo bay of the Space Shuttle. The carrier supplies power to, and the means to control and collect data from each experiment.
    Keywords: Instrumentation and Photography
    Type: 1999 Shuttle Small Payloads Symposium; 25-26; NASA/CP-1999-209476
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    Publication Date: 2016-06-07
    Description: The first International Symposium on Strain Gauge Balances was sponsored under the auspices of the NASA Langley Research Center (LaRC), Hampton, Virginia during October 22-25, 1996. Held at the LaRC Reid Conference Center, the Symposium provided an open international forum for presentation, discussion, and exchange of technical information among wind tunnel test technique specialists and strain gauge balance designers. The Symposium also served to initiate organized professional activities among the participating and relevant international technical communities. The program included a panel discussion, technical paper sessions, tours of local facilities, and vendor exhibits. Over 130 delegates were in attendance from 15 countries. A steering committee was formed to plan a second international balance symposium tentatively scheduled to be hosted in the United Kingdom in 1998 or 1999. The Balance Symposium was followed by the half-day Workshop on Angle of Attack and Model Deformation on the afternoon of October 25. The thrust of the Workshop was to assess the state of the art in angle of attack (AoA) and model deformation measurement techniques and to discuss future developments.
    Keywords: Instrumentation and Photography
    Type: First International Symposium on Strain Gauge Balances; Pt. 2; 727-738; NASA/CP-1999-209101/PT2
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 65
    Publication Date: 2016-06-07
    Description: This paper will cover the standard force balance calibration and data reduction techniques used at Langley Research Center. It will cover balance axes definition, balance type, calibration instrumentation, traceability of standards to NIST, calibration loading procedures, balance calibration mathematical model, calibration data reduction techniques, balance accuracy reporting, and calibration frequency.
    Keywords: Instrumentation and Photography
    Type: First International Symposium on Strain Gauge Balances; Pt. 2; 565-572; NASA/CP-1999-209101/PT2
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 66
    In:  CASI
    Publication Date: 2016-06-07
    Description: The NASA Langley Research Center (LaRC) has been designing strain-gage balances for more than fifty years. These balances have been utilized in Langley's wind tunnels, which span a wide variety of aerodynamic test regimes, as well as in other ground-based test facilities and in space flight applications. As a result, the designs encompass a large array of sizes, loads, and environmental effects. Currently Langley has more than 300 balances available for its researchers. This paper will focus on the design concepts for internal sting-mounted strain-gage balances; however, these techniques can be applied to all force measurement design applications. Strain-gage balance concepts that have been developed over the years, including material selection, sting and model interfaces, measuring sections, fabrication, strain-gaging, and calibration, will be discussed.
    Keywords: Instrumentation and Photography
    Type: First International Symposium on Strain Gauge Balances; Pt. 2; 525-541; NASA/CP-1999-209101/PT2
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 67
    Publication Date: 2013-08-29
    Description: Measurements of NO(x) and ozone performed during the NOXAR project are compared with results from the coupled chemistry-climate models ECHAM4.L39(DLR)/CHEM and the GISS model. The measurements are based on flights between Europe and the east coast of America and between Europe and the Far East, in the latitude range 40 deg N to 65 deg N. The comparison concentrates on tropopause altitudes and reveals strong longitudinal variations of seasonal mean NO(x) of 200 pptv. Both models reproduce the strong variations 3 km below the tropopause but not at it, indicating a strong missing NO(x) or NO(y) sink over remote areas, e.g. NO(x)-to-HNO3 conversion by OH from additional OH sources, or HNO3 wash-out. Vertical profiles show maximum NO(x) values 2-3 km below the tropopause, with a strong seasonal cycle. ECHAM4.L39(DLR)/CHEM reproduces a maximum, although located at the tropopause and with a less pronounced seasonal cycle, whereas the GISS model reproduces the seasonal cycle but not the profile's shape, due to its coarser vertical resolution. A comparison of NO(x) frequency distributions reveals that both models are capable of reproducing the observed variability, except that ECHAM4.L39(DLR)/CHEM shows no very high NO(x) mixing ratios. Ozone mean values, vertical profiles, and frequency distributions are much better reproduced in both models, indicating that the NO(x) frequency distribution, namely the most frequent NO(x) mixing ratio, is more important for tropospheric photochemical ozone production than the mean value. Both models show that, among all sources, NO(x) from lightning contributes most to the seasonal cycle of NO(x) at tropopause altitudes. The impact of lightning on upper-tropospheric NO(x) does not vary strongly with altitude, whereas the impact of surface emissions decreases with altitude. However, the models show significant differences in lightning-induced NO(x) concentrations, especially in winter, which may be related to their different treatment of the coupling between dynamics and chemistry in the lower stratosphere.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    Publication Date: 2013-08-29
    Description: Bayesian inference has been used successfully for many problems where the aim is to infer the parameters of a model of interest. In this paper we formulate the three-dimensional reconstruction problem as the problem of inferring the parameters of a surface model from image data, and show how Bayesian methods can be used to estimate the parameters of this model given the image data. Thus we recover the three-dimensional description of the scene. This approach also gives great flexibility: we can specify the geometrical properties of the model to suit our purpose, and can also use different models for how the surface reflects the light incident upon it. In common with other Bayesian inference problems, the estimation methodology requires that we can simulate the data that would have been recorded for any values of the model parameters. In this application, this means that if we have image data we must be able to render the surface model. However, it also means that we can infer the parameters of a model whose resolution can be chosen irrespective of the resolution of the images, and may be super-resolved. We present results of the inference of surface models from simulated aerial photographs for the case of super-resolution, where many surface elements project into a single pixel in the low-resolution images.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
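    The render-and-compare formulation in the abstract above can be illustrated with a toy problem: a 1-D "surface" at twice the image resolution (so two surface elements project into each pixel, the super-resolution case), a renderer that averages element pairs into pixels, and a crude MAP estimate of the surface from a noisy image. Every shape, prior weight, and step size here is invented for the sketch; the paper's actual models are far richer.

    ```python
    import numpy as np

    def render(surface):
        """Simulate the data for given parameters: average each pair of
        surface elements into one low-resolution pixel."""
        return surface.reshape(-1, 2).mean(axis=1)

    def neg_log_posterior(surface, image, sigma=0.05, smooth=10.0):
        """Gaussian likelihood on rendered pixels plus a smoothness prior."""
        likelihood = np.sum((render(surface) - image) ** 2) / (2 * sigma ** 2)
        prior = smooth * np.sum(np.diff(surface) ** 2)
        return likelihood + prior

    rng = np.random.default_rng(1)
    true_surface = np.sin(np.linspace(0.0, np.pi, 16))
    image = render(true_surface) + 0.01 * rng.standard_normal(8)

    # Crude MAP estimation by fixed-step coordinate descent over elements.
    est = np.zeros(16)
    for _ in range(200):
        for i in range(16):
            candidates = est[i] + np.array([-0.02, 0.0, 0.02])
            scores = []
            for c in candidates:
                trial = est.copy()
                trial[i] = c
                scores.append(neg_log_posterior(trial, image))
            est[i] = candidates[int(np.argmin(scores))]
    # est now holds 16 surface elements inferred from only 8 pixels
    ```

    The point of the sketch is the structure of the inference, not the optimizer: because we can simulate (render) data for any parameter values, the surface resolution is chosen independently of the image resolution.
    
    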
  • 69
    Publication Date: 2013-08-29
    Description: Annual zonal averages of ozone amounts from Nimbus-7/TOMS (Total Ozone Mapping Spectrometer) (1979 to 1992) are used to estimate the interannual variability of ozone and UVB (290 - 315 nm) irradiance between plus or minus 60 deg. latitude. Clear-sky interannual ozone and UVB changes are mainly caused by the Quasi Biennial Oscillation (QBO) of stratospheric winds, and can amount to plus or minus 15% at 300 nm and plus or minus 5% at 310 nm (or erythemal irradiance) at the equator and at middle latitudes. Near the equator, the interannual variability of ozone amounts and UV irradiance caused by the combination of the 2.3 year QBO and annual cycles implies that there is about a 5-year periodicity in UVB variability. At higher latitudes, the appearance of the interannual UVB maximum is predicted by the QBO, but without the regular periodicity. The 5-year periodic QBO effects on UVB irradiance are larger than the currently evaluated long-term changes caused by the decrease in ozone amounts.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 70
    Publication Date: 2013-08-29
    Description: Sea level has been rising for the past century, and inhabitants of the Earth's coastal regions will want to understand and predict future sea level changes. In this study we present results from new simulations of the Goddard Institute for Space Studies (GISS) global atmosphere-ocean model from 1950 to 2099. Model results are compared with observed sea level changes during the past 40 years at 17 coastal stations around the world. Using observed levels of greenhouse gases between 1950 and 1990 and a compounded 0.5% annual increase in CO2 after 1990, model projections show that global sea level measured from 1950 will rise by 61 mm in the year 2000, by 212 mm in 2050, and by 408 mm in 2089. By 2089, two thirds of the global sea level rise will be due to thermal expansion and one third will be due to ocean mass changes. The spatial distribution of sea level rise is different from that projected by rigid-lid ocean models.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
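    The forcing scenario in the abstract above, a compounded 0.5% annual CO2 increase after 1990, is a one-line schedule. The 1990 baseline of 354 ppm below is an approximate observed value chosen for the sketch, not a number taken from the paper.

    ```python
    def co2_ppm(year, base_year=1990, base_ppm=354.0, rate=0.005):
        """CO2 concentration under a compounded annual growth rate:
        base_ppm * (1 + rate) ** (years elapsed)."""
        return base_ppm * (1.0 + rate) ** (year - base_year)

    c2050 = co2_ppm(2050)  # roughly 477 ppm under this idealized scenario
    ```

    Compounding matters: 60 years at 0.5% gives a factor of about 1.35, noticeably more than the 1.30 a linear 0.5%-of-baseline increase would give.
    
    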
  • 71
    Publication Date: 2013-08-29
    Description: An algorithm is presented for retrieving vertical profiles of O3 concentration using measurements of UV and visible light scattered from the limb of the atmosphere. The UV measurements provide information about the O3 profile in the upper and middle stratosphere, while only visible wavelengths are capable of probing the lower stratospheric O3 profile. Sensitivity to the underlying scene reflectance is greatly reduced by normalizing measurements at a tangent height high in the atmosphere (approximately 55 km), and relating measurements taken at lower altitudes to this normalization point. To decrease the effect of scattering by thin aerosols/clouds that may be present in the field of view, these normalized measurements are then combined by pairing wavelengths with strong and weak O3 absorption. We conclude that limb scatter can be used to measure O3 between 15 km and 50 km with 2-3 km vertical resolution and better than 10% accuracy.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
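    The two measurement combinations described in the abstract above can be sketched schematically: (1) normalize each limb radiance profile by its value at a high reference tangent height (about 55 km), which removes the dependence on the underlying scene reflectance, and (2) pair a strongly and a weakly O3-absorbing wavelength so that a common aerosol/cloud scattering factor largely cancels. The radiance arrays below are invented placeholders, not retrieval inputs.

    ```python
    import numpy as np

    def limb_measurement(radiance_strong, radiance_weak, heights, h_ref=55.0):
        """Normalize both wavelength profiles at the reference tangent height,
        then take their ratio (strong/weak O3 absorption pairing)."""
        i_ref = int(np.argmin(np.abs(heights - h_ref)))
        norm_strong = radiance_strong / radiance_strong[i_ref]
        norm_weak = radiance_weak / radiance_weak[i_ref]
        return norm_strong / norm_weak  # sensitive to O3, not to aerosol

    heights = np.array([20.0, 30.0, 40.0, 55.0])  # tangent heights, km
    # A common aerosol factor multiplies both wavelengths and cancels:
    aerosol = np.array([1.3, 1.2, 1.1, 1.0])
    strong = aerosol * np.array([0.2, 0.4, 0.7, 1.0])   # strong O3 absorption
    weak = aerosol * np.array([0.8, 0.9, 0.95, 1.0])    # weak O3 absorption
    y = limb_measurement(strong, weak, heights)
    ```

    In the sketch, y is identical with or without the aerosol factor, which is the cancellation the wavelength pairing is designed to achieve.
    
    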
  • 72
    Publication Date: 2013-08-29
    Description: The stretched-grid approach provides efficient downscaling and consistent interactions between global and regional scales by using one variable-resolution model for the integrations. It is a workable alternative to the widely used nested-grid approach, introduced over a decade ago as a pioneering step in regional climate modeling. A variable-resolution General Circulation Model (GCM) employing a stretched grid, with enhanced resolution over the US as the area of interest, is used to simulate two anomalous regional climate events: the US summer drought of 1988 and flood of 1993. A special mode of integration using the stretched-grid GCM and a data assimilation system is developed that allows the nested-grid framework to be imitated; this mode is useful for inter-comparison purposes and for underlining the differences between the two approaches. The 1988 and 1993 integrations are performed for a two-month period starting from mid-May. The regional resolution used in most of the experiments is 60 km. The major goal, and the result, of the study is efficient downscaling over the area of interest. The monthly mean prognostic regional fields for the stretched-grid integrations are remarkably close to those of the verifying analyses, and simulated precipitation patterns are successfully verified against gauge precipitation observations. The impact of a finer 40 km regional resolution is investigated for the 1993 integration, and an example of recovering subregional precipitation is presented. The results show that the global variable-resolution stretched-grid approach is a viable candidate for regional and subregional climate studies and applications.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 73
    Publication Date: 2013-08-29
    Description: Two instruments were flown on shuttle flight STS-87 to test a new technique for inferring the ozone vertical profile using measurements of scattered sunlight from the Earth's limb. The instruments were an ultraviolet imaging spectrometer designed to measure ozone between 30 and 50 km, and a multi-filter imaging photometer that uses 600 nm radiances to measure ozone between 15 km and 35 km. Two orbits of limb data were obtained on December 2, 1997. For the scans analyzed the ozone profile was measured from 15 km to 50 km with approximately 3 km vertical resolution. Comparisons with a profile from an ozonesonde launched from Ascension Island showed agreement mostly within +/- 5%. The tropopause at 15 km was clearly detected.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 74
    Publication Date: 2013-08-29
    Description: During the Aerosols99 trans-Atlantic cruise from Norfolk, VA, to Cape Town, South Africa, daily ozonesondes were launched from the NOAA R/V Ronald H. Brown between 17 January and 6 February 1999. A composite of tropospheric ozone profiles along the latitudinal transect shows four zones, which are interpreted using correlative shipboard ozone, CO, water vapor, and overhead aerosol optical thickness measurements. Elevated ozone associated with biomass burning north of the ITCZ (Intertropical Convergence Zone) is prominent at 3-5 km from 10 deg N to 0 deg, but even higher ozone (100 ppbv, 7-10 km) occurred south of the ITCZ, where there was no burning. Column-integrated tropospheric ozone was 44 Dobson Units (DU) in one sounding, 10 DU lower than the maximum in a January-February 1993 Atlantic cruise with ozonesondes [Weller et al., 1996]. TOMS tropospheric ozone shows elevated ozone extending throughout the tropical Atlantic in January 1999. Several explanations are considered. Back trajectories, satellite aerosol observations, and shipboard tracers suggest a combination of convection and interhemispheric transport of ozone and/or ozone precursors, probably amplified by a lightning NO source over Africa.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 75
    Publication Date: 2013-08-29
    Description: The Goddard trajectory chemistry model was used with ER-2 aircraft data to test our current knowledge of radical photochemistry during the POLARIS (Polar Ozone Loss in the Arctic Region In Summer) campaign. The results of the model with and without trajectories are used to identify cases where steady state does not accurately describe the measurements. Over the entire mission, using trajectory chemistry reduces the variability in the modeled NO(x) comparisons to data by 25% with respect to the same model simulating steady state. Although the variability is reduced, NO(x)/NO(y) trajectory model results were found to be systematically low relative to the observations by 20-30%, as seen in previous studies. Using new rate constants for reactions important in NO(y) partitioning improves the agreement of NO(x)/NO(y) with the observations, but a 5-10% bias still exists. OH and HO2 are individually underpredicted by 15% by the standard steady-state model, and the agreement worsens with the new rate constants. Trajectory chemistry model results for OH/HO2 were systematically low by 10-20% but improve using the new rate constants because of the explicit dependence on NO. This suggests that, for the POLARIS regime in the lower stratosphere, our understanding of NO(x) chemistry is accurate to the 20% level or better, and of HO(x) chemistry to the 30% level or better. The behavior of the NO(x) and HO(x) comparisons to data, using steady state versus trajectory chemistry and with updated rate coefficients, is discussed in terms of known chemical mechanisms and lifetimes.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 76
    Publication Date: 2013-08-29
    Description: Using state-of-the-art satellite-gauge monthly rainfall estimates and optimally interpolated sea surface temperature (SST) data, we have assessed the 1997-98 AA-monsoon anomalies in terms of three basic causal factors: basin-scale SST, regional coupling, and internal variability. Singular Value Decomposition (SVD) analyses of rainfall and SST are carried out globally over the entire tropics and regionally over the AA-monsoon domain. Contributions to monsoon rainfall predictability by the various factors are evaluated from the cumulative anomaly correlation with the dominant regional SVD modes. Results reveal a dominant, large-scale monsoon-El Nino coupled mode with well-defined centers of action in the near-equatorial monsoon regions during the boreal summer and winter, respectively. The observed 1997-98 AA-monsoon anomalies are found to be very complex, with approximately 34% of the anomalies of the Asian (boreal) summer monsoon and 74% of the Australian (austral) monsoon attributable to basin-scale SST influence associated with El Nino. Regional coupled processes contribute an additional 19% and 10%, leaving about 47% and 16% due to internal dynamics for the boreal and austral monsoon, respectively. For the boreal summer monsoon, it is noted that the highest monsoon predictability is not necessarily associated with major El Nino events (e.g. 1997, 1982) but rather with non-El Nino years (e.g. 1980, 1988), when contributions from the regional coupled modes far exceed those from the basin-scale SST. The results suggest that in order to improve monsoon seasonal-to-interannual predictability, there is a need to exploit not only the monsoon-El Nino relationship, but also intrinsic monsoon regional coupled processes.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
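    The SVD analysis named in the abstract above has a compact core: take the singular value decomposition of the temporal cross-covariance between two anomaly fields, and the leading singular vectors give the dominant coupled mode. The synthetic "rainfall" and "SST" fields below, their sizes, and the shared signal are all invented for the sketch.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.arange(120)                       # 120 months of anomalies
    signal = np.sin(2 * np.pi * t / 40.0)    # shared low-frequency coupled mode

    # Two fields sharing the signal with different spatial patterns plus noise:
    rain = np.outer(signal, rng.standard_normal(50)) \
        + 0.3 * rng.standard_normal((120, 50))
    sst = np.outer(signal, rng.standard_normal(80)) \
        + 0.3 * rng.standard_normal((120, 80))

    cov = rain.T @ sst / len(t)              # cross-covariance matrix (50 x 80)
    u, s, vt = np.linalg.svd(cov, full_matrices=False)
    frac = s[0] ** 2 / np.sum(s ** 2)        # squared-covariance fraction, mode 1

    # Expansion coefficients (time series) of the leading coupled mode:
    pc_rain = rain @ u[:, 0]
    pc_sst = sst @ vt[0]
    ```

    The squared-covariance fraction `frac` is the usual measure of how dominant the leading coupled mode is, and the correlation between the two expansion coefficients measures how tightly the fields are coupled in that mode.
    
    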
  • 77
    Publication Date: 2013-08-29
    Description: This chapter presents the science of "COllective INtelligence" (COIN). A COIN is a large multi-agent system in which: i) the agents each run reinforcement learning (RL) algorithms; ii) there is little to no centralized communication or control; iii) there is a provided world utility function that rates the possible histories of the full system. The conventional approach to designing large distributed systems to optimize a world utility does not use agents running RL algorithms. Rather, that approach begins with explicit modeling of the overall system's dynamics, followed by detailed hand-tuning of the interactions between the components to ensure that they "cooperate" as far as the world utility is concerned. This approach is labor-intensive, often results in highly non-robust systems, and usually yields design techniques that have limited applicability. In contrast, with COINs we wish to solve the system design problems implicitly, via the 'adaptive' character of the RL algorithms of each of the agents. This COIN approach introduces an entirely new, profound design problem: assuming the RL algorithms are able to achieve high rewards, what reward functions for the individual agents will, when pursued by those agents, result in high world utility? In other words, what reward functions will best ensure that we do not have phenomena like the tragedy of the commons or Braess's paradox? Although still very young, the science of COINs has already produced successes in artificial domains, in particular in packet-routing, the leader-follower problem, and variants of Arthur's "El Farol bar problem". It is expected that as it matures, COIN science will not only greatly expand the range of tasks addressable by human engineers, but will also provide much insight into established scientific fields such as economics, game theory, and population biology.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 78
    Publication Date: 2013-08-29
    Description: The impact of aircraft emissions on reactive nitrogen in the upper troposphere (UT) and lowermost stratosphere (LS) was estimated using the NO(y)-O3 correlation obtained during the SASS Ozone and NO(x) Experiment (SONEX), carried out over the US continent and the North Atlantic Flight Corridor (NAFC) region in October and November 1997. To evaluate the large-scale impact, we constructed a reference NO(y)-O3 relationship from air masses on which aircraft emissions were considered to have had little impact. For this purpose, the integrated input of NO(x) from aircraft into an air mass along a 10-d back trajectory (DELTA-NO(y)) was calculated based on the ANCAT/EC2 emission inventory. The excess NO(y) (dNO(y)) was calculated from the observed NO(y) and the reference NO(y)-O3 relationship. As a result, a weak positive correlation was found between the dNO(y) and DELTA-NO(y) values, and between the dNO(y) and NO(x)/NO(y) values, while no positive correlation was found between the dNO(y) and CO values, suggesting that dNO(y) can be used as a measure of the NO(x) input from aircraft emissions. The excess NO(y) values calculated from another NO(y)-O3 reference relationship, constructed using in-situ CN data, also agreed with these dNO(y) values within the uncertainties. In the NAFC region (45 N - 60 N), the median value of dNO(y) in the troposphere increased with altitude above 9 km and reached 70 pptv (20% of NO(y)) at 11 km. The excess NO(x) was estimated to be about half of the dNO(y) value, corresponding to 30% of the observed NO(x) level. Higher dNO(y) values were generally found in air masses where O3 = 75-125 ppbv, suggesting a more pronounced effect around the tropopause. The median value of dNO(y) in the stratosphere in the NAFC region at 8.5-11.5 km was about 120 pptv. The higher dNO(y) values in the LS were probably due to the accumulated effect of aircraft emissions, given the long residence time of affected air in the LS. Similar dNO(y) values were also obtained in air masses sampled over the US continent.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 79
    Publication Date: 2013-08-29
    Description: For the past decade, the NASA Atmospheric Effects of Aviation Project (AEAP) has been the U.S. focal point for research on aircraft effects. In conjunction with U.S. basic research programs, AEAP and concurrent European research programs have driven remarkable progress, documented in two reports released in 1999 [IPCC, 1999; Kawa et al., 1999]. The former report primarily focuses on aircraft effects in the upper troposphere, with some discussion of stratospheric impacts; the latter focuses entirely on the stratosphere. The current status of research regarding aviation effects on stratospheric ozone and climate, as embodied by the findings of these reports, is reviewed. The following topics are addressed: Aircraft Emissions, Pollution Transport, Atmospheric Chemistry, Polar Processes, Climate Impacts of Supersonic Aircraft, Subsonic Aircraft Effects on the Stratosphere, and Calculations of the Supersonic Impact on Ozone and Sensitivity to Input Conditions.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 80
    Publication Date: 2013-08-29
    Description: Chemical data from flight 8 of NASA's Subsonic Assessment (SASS) Ozone and Nitrogen Oxide Experiment (SONEX) exhibited signatures consistent with aircraft emissions, stratospheric air, and surface-based pollution. These signatures are examined in detail, focusing on the broad aircraft emission signatures that are several hundred kilometers in length. A mesoscale meteorological model provides high resolution wind data that are used to calculate backward trajectories arriving at locations along the flight track. These trajectories are compared to aircraft locations in the North Atlantic Flight Corridor over a 27-33 hour period. Time series of flight level NO and the number of trajectory/aircraft encounters within the NAFC show excellent agreement. Trajectories arriving within the stratospheric and surface-based pollution regions are found to experience very few aircraft encounters. Conversely, there are many trajectory/aircraft encounters within the two chemical signatures corresponding to aircraft emissions. Even many detailed fluctuations of NO within the two aircraft signature regions correspond to similar fluctuations in aircraft encountered during the previous 27-33 hours. Results indicate that high resolution meteorological modeling, when coupled with detailed aircraft location data, is useful for understanding chemical signatures from aircraft emissions at scales of several hundred kilometers.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 81
    Publication Date: 2013-08-29
    Description: Previous modeling of the performance of spaceborne direct-detection Doppler lidar systems has assumed extremely idealized atmospheric models. Here we develop a technique for modeling the performance of these systems in a more realistic atmosphere, based on actual airborne lidar observations. The resulting atmospheric model contains cloud and aerosol variability that is absent in other simulations of spaceborne Doppler lidar instruments. To produce a realistic simulation of daytime performance, we include solar radiance values that are based on actual measurements and are allowed to vary as the viewing scene changes. Simulations are performed for two types of direct-detection Doppler lidar systems: the double-edge and the multi-channel techniques. Both systems were optimized to measure winds from Rayleigh backscatter at 355 nm. Simulations show that the measurement uncertainty during daytime is degraded by only about 10-20% compared to nighttime performance, provided a proper solar filter is included in the instrument design.
    Keywords: Instrumentation and Photography
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 82
    Publication Date: 2013-08-29
    Description: Flight 10 of NASA's Subsonic Assessment (SASS) Ozone and Nitrogen Oxide Experiment (SONEX) extended southwest of Lajes, Azores. A variety of chemical signatures were encountered. These signatures are examined in detail, relating them to meteorological data from a high resolution numerical model having horizontal grid spacing of 30 and 90 km and 26 vertical levels. The meteorological output at hourly intervals is used to create backward trajectories from the locations of the chemical signatures. Four major categories of chemical signatures are discussed: stratospheric, lightning, continental pollution, and a transition layer. The strong stratospheric signal is encountered just south of the Azores in a region of depressed tropopause height. Three chemical signatures at different altitudes in the upper troposphere are attributed to lightning. Backward trajectories arriving at locations of these signatures are related to locations of cloud-to-ground lightning. Results show that the trajectories pass through regions of lightning 1-2 days earlier over the eastern Gulf of Mexico and off the southeast coast of the United States. The lowest leg of the flight exhibits a chemical signature consistent with continental pollution. Trajectories arriving at this signature are found to pass over the highly populated Northeast Corridor of the United States. Surface-based pollution apparently is lofted to the altitudes of the trajectories by convective clouds along the East Coast that did not contain lightning. Finally, a chemical transition layer is described. Its chemical signature is intermediate to those of lightning and continental pollution. Trajectories arriving in this layer pass between the trajectories of the lightning and pollution signatures. Thus, they probably are impacted by both sources.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 83
    Publication Date: 2013-08-29
    Description: As the demand for higher performance computers for the processing of remote sensing science algorithms increases, the need to investigate new computing paradigms is justified. Field Programmable Gate Arrays (FPGAs) enable the implementation of algorithms at the hardware gate level, leading to orders of magnitude performance increase over microprocessor-based systems. The automatic classification of spaceborne multispectral images is an example of a computation-intensive application that can benefit from implementation on an FPGA-based custom computing machine (adaptive or reconfigurable computer). A probabilistic neural network is used here to classify pixels of a multispectral LANDSAT-2 image. The implementation described utilizes Java client/server application programs to access the adaptive computer from a remote site. Results verify that a remote hardware version of the algorithm (implemented on an adaptive computer) is significantly faster than a local software version of the same algorithm implemented on a typical general-purpose computer.
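For readers unfamiliar with the classifier named above: a probabilistic neural network is essentially a Parzen-window classifier, scoring each class by the average of Gaussian kernels centered on its training samples and picking the highest-scoring class. The sketch below illustrates the idea in Python; the data, feature values, and smoothing parameter are invented for illustration and are not from the paper's LANDSAT implementation.

```python
import math

# Illustrative probabilistic neural network (Parzen-window) classifier for
# per-pixel multispectral classification. Training data and sigma are
# hypothetical, not taken from the paper.
def pnn_classify(pixel, training, sigma=1.0):
    """training: dict mapping class label -> list of feature vectors.
    Returns the class whose kernel-density estimate at `pixel` is largest."""
    best, best_score = None, -1.0
    for label, samples in training.items():
        score = sum(
            math.exp(-sum((p - s) ** 2 for p, s in zip(pixel, v))
                     / (2 * sigma ** 2))
            for v in samples
        ) / len(samples)
        if score > best_score:
            best, best_score = label, score
    return best

# Two toy classes in a two-band feature space (reflectances).
train = {"water": [[0.10, 0.20], [0.15, 0.25]],
         "soil":  [[0.80, 0.70], [0.75, 0.65]]}
label = pnn_classify([0.12, 0.22], train)
```

The pattern layer of a hardware PNN evaluates all kernels in parallel, which is why the algorithm maps well onto an FPGA.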
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 84
    In:  CASI
    Publication Date: 2013-08-29
    Description: This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter the focus is on experimental data on low-dropout voltage regulators to support mixed 5 V and 3.3 V systems. A discussion of the Small Explorer WIRE spacecraft will also be given. Lastly, we take a first look at robust state machines in the VHSIC Hardware Description Language (VHDL) and their use in critical systems. If you have information that you would like to submit, or an area you would like discussed or researched, please give me a call or e-mail.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 85
    Publication Date: 2013-08-29
    Description: We present a study of the distribution of ozone in the lowermost stratosphere with the goal of characterizing the observed variability. The air in the lowermost stratosphere is divided into two population groups based on Ertel's potential vorticity at 300 hPa. High (low) potential vorticity at 300 hPa indicates that the tropopause is low (high), and the identification of these two groups is made to account for the dynamic variability. Conditional probability distribution functions are used to define the statistics of the ozone distribution from both observations and a three-dimensional model simulation using winds from the Goddard Earth Observing System Data Assimilation System for transport. Ozone data sets include ozonesonde observations from northern midlatitude stations (1991-96) and midlatitude observations made by the Halogen Occultation Experiment (HALOE) on the Upper Atmosphere Research Satellite (UARS) (1994-1998). The conditional probability distribution functions are calculated at a series of potential temperature surfaces spanning the domain from the midlatitude tropopause to surfaces higher than the mean tropical tropopause (approximately 380 K). The probability distribution functions are similar for the two data sources, despite differences in horizontal and vertical resolution and spatial and temporal sampling. Comparisons with the model demonstrate that the model maintains a mix of air in the lowermost stratosphere similar to the observations. The model also simulates a realistic annual cycle. Results show that during summer, much of the observed variability is explained by the height of the tropopause. During the winter and spring, when the tropopause fluctuations are larger, less of the variability is explained by tropopause height. This suggests that more mixing occurs during these seasons. 
During all seasons, there is a transition zone near the tropopause that contains air characteristic of both the troposphere and the stratosphere. The relevance of the results to the assessment of the environmental impact of aircraft effluence is also discussed.
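The conditioning step this abstract describes can be illustrated compactly: partition ozone samples into high- and low-tropopause populations by potential vorticity at 300 hPa, then build a normalized histogram (an empirical conditional PDF) for each group. The code below is a schematic sketch with invented numbers, not the study's analysis pipeline; the threshold and bin edges are hypothetical.

```python
# Sketch of conditional ozone PDFs: split samples by PV at 300 hPa
# (high PV = low tropopause), then histogram each group separately.
def conditional_pdf(ozone, pv, pv_threshold, bins):
    groups = {"high_pv": [], "low_pv": []}
    for o, p in zip(ozone, pv):
        groups["high_pv" if p >= pv_threshold else "low_pv"].append(o)
    pdfs = {}
    for name, vals in groups.items():
        counts = [0] * (len(bins) - 1)
        for v in vals:
            for i in range(len(bins) - 1):
                if bins[i] <= v < bins[i + 1]:
                    counts[i] += 1
                    break
        total = sum(counts) or 1          # avoid division by zero
        pdfs[name] = [c / total for c in counts]
    return pdfs

# Toy data: ozone in ppbv, PV in PVU; threshold and bins are illustrative.
pdfs = conditional_pdf([60, 350, 420, 80], [1.0, 4.0, 5.0, 1.5],
                       pv_threshold=3.0, bins=[0, 100, 500])
```

Separating the two populations first, as here, removes the tropopause-height signal before the residual variability is interpreted as mixing.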
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 86
    Publication Date: 2013-08-29
    Description: An SRAM (static random access memory)-based reprogrammable FPGA (field programmable gate array) is investigated for space applications. A new commercial prototype, named the RS family, was used as an example for the investigation. The device is fabricated in a 0.25 micrometers CMOS technology. Its architecture is reviewed to provide a better understanding of the impact of single event upset (SEU) on the device during operation. The SEU effect of different memories available on the device is evaluated. Heavy ion test data and SPICE simulations are used integrally to extract the threshold LET (linear energy transfer). Together with the saturation cross-section measurement from the layout, a rate prediction is done on each memory type. The SEU in the configuration SRAM is identified as the dominant failure mode and is discussed in detail. The single event transient error in combinational logic is also investigated and simulated by SPICE. SEU mitigation by hardening the memories and employing EDAC (error detection and correction) at the device level are presented. For the configuration SRAM (CSRAM) cell, the trade-off between resistor de-coupling and redundancy hardening techniques is investigated with interesting results. Preliminary heavy ion test data show no sign of SEL (single event latch-up). With regard to ionizing radiation effects, the increase in static leakage current (static I(sub CC)) measured indicates a device tolerance of approximately 50krad(Si).
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 87
    Publication Date: 2013-08-29
    Description: Architecture and process, combined, significantly affect the hardness of programmable technologies. The effects of high energy ions, ferroelectric memory architectures, and shallow trench isolation are investigated. A detailed single event latchup (SEL) study has been performed.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 88
    Publication Date: 2013-08-29
    Description: Dragonfly: NASA and the Crisis Aboard MIR (New York: HarperCollins Publishers) tells the story of the Russian-American misadventures on MIR. An expose with almost embarrassing detail about the inner workings of Johnson Space Center in Houston, this book is best read with the JSC organization chart in hand. Here's the real world of engineering and life in extreme environments. It makes most other accounts of "requirements analysis" appear glib and simplistic. The book vividly portrays the sometimes harrowing experiences of the American astronauts in the web of Russian interpersonal relations and, literally, in the web of MIR's wiring. Burrough's exposition reveals how handling bureaucratic procedures and bulky facilities is as much a matter of moxie and goodwill as technical capability. Lessons from MIR showed NASA that getting to Mars required a different view of knowledge and improvisation: long-duration missions are not at all like the scripted and pre-engineered flights of Apollo or the Space Shuttle.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 89
    Publication Date: 2013-08-29
    Description: We describe the current GISS analysis of surface temperature change based primarily on meteorological station measurements. The global surface temperature in 1998 was the warmest in the period of instrumental data. The rate of temperature change is higher in the past 25 years than at any previous time in the period of instrumental data. The warmth of 1998 is too large and pervasive to be fully accounted for by the recent El Nino, suggesting that global temperature may have moved to a higher level, analogous to the increase that occurred in the late 1970s. The warming in the United States over the past 50 years is smaller than in most of the world, and over that period there is a slight cooling trend in the Eastern United States and the neighboring Atlantic ocean. The spatial and temporal patterns of the temperature change suggest that more than one mechanism is involved in this regional cooling.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 90
    Publication Date: 2013-08-29
    Description: In the framework of the project POLINAT 2 (Pollution in the North Atlantic Flight Corridor) we measured NO(x) (NO and NO2) and ozone on 98 flights through the North Atlantic Flight Corridor (NAFC) with a fully automated system permanently installed aboard an in-service Swissair B-747 airliner in the period of August to November 1997. The averaged NO(x) concentrations both in the NAFC and at the U.S. east coast were similar to those measured in autumn 1995 with the same system. The patchy occurrence of NO(x) enhancements of up to 3000 pptv over several hundred kilometers (plumes), predominantly found over the U.S. east coast, leads to a log-normal NO(x) probability density function. In three case studies we examine the origins of such plumes by combining back-trajectories with brightness-temperature-enhanced (IR) satellite imagery, with lightning observations from the U.S. National Lightning Detection Network (NLDN), or with the Optical Transient Detector (OTD) satellite. For frontal activity above the continental U.S., we demonstrate that the location of NO(x) plumes can be well explained with maps of convective influence. For another case we show that the number of lightning flashes in a cluster of marine thunderstorms is proportional to the NO(x) concentrations observed several hundred kilometers downwind of the anvil outflows and suggest that lightning was the dominant source. From the fact that in autumn the NO(x) maximum was found several hundred kilometers off the U.S. east coast, it can be inferred that thunderstorms triggered over the warm Gulf Stream current are an important source for the regional upper tropospheric NO(x) budget in autumn.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 91
    Publication Date: 2013-08-29
    Description: Key questions to which SONEX was directed were the following: Can aircraft corridors be detected? Is there a unique tracer for aircraft NO(x)? Can a "background" NO(x) (or NO(y)) be defined? What fraction of NO(x) measured during SONEX was from aircraft? How representative was SONEX of the North Atlantic in 1997 and how typical of other years? We attempt to answer these questions through species-species correlations, probability distribution functions (PDFs), and meteorological history. There is no unique aircraft tracer, largely due to the high variability of air mass origins and tracer ratios, which renders "average" quantities meaningless. The greatest NO and NO(y) signals were associated with lightning and convective NO sources. Well-defined background CO, NO(y), and NO(y)/ozone ratios appear in subsets of two cross-track flights with subtropical origins and five flights with predominantly mid-latitude air. Forty percent of the observations on these seven flights showed NO(y)/ozone to be above background, evidently due to unreacted NO(x). This NO(x) is a combination of aircraft, lightning, and surface pollution injected by convection. The strongly subtropical signatures in SONEX observations, confirmed by pv (potential vorticity) values along flight tracks, argue for most of the unreacted NO(x) originating from lightning. Potential vorticity statistics along SONEX flight tracks in 1992-1998, and for the North Atlantic as a whole, show the SONEX meteorological environment to be representative of the North Atlantic flight corridor in the October-November period.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 92
    Publication Date: 2013-08-29
    Description: Airborne measurements of NO(x), total reactive nitrogen (NO(y)), O3, and condensation nuclei (CN) were made within air traffic corridors over the U.S. and North Atlantic regions (35-60 deg N) in the fall of 1997. NO(x) and NO(y) data obtained in the lowermost stratosphere (LS) were examined using the calculated increase in NO(y) ((delta)NO(y)) along five-day back trajectories as a parameter to identify possible effects of aircraft on reactive nitrogen. It is very likely that aircraft emissions had a significant impact on the NO(x) levels in the LS, inasmuch as the NO(x) mixing ratios at 8.5-12 km were significantly correlated with the independent parameters of aircraft emissions, i.e., (delta)NO(y) levels and CN values. In order to estimate quantitatively the impact of aircraft emissions on NO(x) and CN, the background levels of CN and NO(x) at O3 = 100-200 ppbv were derived from the correlations of these quantities with (delta)NO(y). On average, the aircraft emissions are estimated to have increased the NO(x) and CN values by 130 pptv and 400 cm(exp -3) STP, respectively, which corresponds to 70 +/- 30% and 30 +/- 20% of the observed median values.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 93
    Publication Date: 2013-08-29
    Description: The American Geophysical Union (AGU), as a scientific organization devoted to research on the Earth and space sciences, provides current scientific information to the public on issues pertinent to geophysics. The Council of the AGU approved a position statement on Climate Change and Greenhouse Gases in December 1998. The statement, together with a short summary of the procedures that were followed in its preparation, review, and adoption, was published in the February 2, 1999 issue of Eos [AGU, 1999]. The present article reviews scientific understanding of this issue as presented in peer-reviewed publications that serve as the underlying basis of the position statement.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 94
    Publication Date: 2013-08-29
    Description: The role of naturally varying vegetation in influencing the climate variability in the Sahel is explored in a coupled atmosphere-land-vegetation model. The Sahel rainfall variability is influenced by sea surface temperature (SST) variations in the oceans. Land-surface feedback is found to increase this variability both on interannual and interdecadal time scales. Interactive vegetation enhances the interdecadal variation significantly, but can reduce year-to-year variability due to a phase lag introduced by the relatively slow vegetation adjustment time. Variations in vegetation accompany the changes in rainfall, in particular the multi-decadal drying trend from the 1950s to the 1980s.
    Keywords: Environment Pollution
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 95
    In:  CASI
    Publication Date: 2013-08-31
    Description: The onboard control software for spacecraft such as Mars Pathfinder and Cassini is composed of many subsystems including executive control, navigation, attitude control, imaging, data management, and telecommunications. The software in all of these subsystems needs to be instrumented for several purposes: to report required telemetry data, to report warning and error events, to verify internal behavior during system testing, and to provide ground operators with detailed data when investigating in-flight anomalies. Events can range in importance from purely informational events to major errors. It is desirable to provide a uniform mechanism for reporting such events and controlling their subsequent processing. Since radiation-hardened flight processors are several years behind the speed and memory of their commercial cousins, and since most subsystems require real-time control, and since downlink rates to earth can be very low from deep space, there are limits to how much of the data can be saved and transmitted. Some kinds of events are more important than others and should therefore be preferentially retained when memory is low. Some faults can cause an event to recur at a high rate, but this must not be allowed to consume the memory pool. Some event occurrences may be of low importance when reported but suddenly become more important when a subsequent error event gets reported. Some events may be so low-level that they need not be saved and reported unless specifically requested by ground operators.
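The reporting policy this abstract outlines (events with differing importance, a bounded memory pool, preferential retention of severe events, protection against recurring-event floods) can be sketched as a small priority buffer. This is a minimal illustration of the idea, not the actual flight software; the class and its severity scale are invented.

```python
import heapq
import itertools

# Minimal sketch of a bounded, severity-prioritized event buffer: when the
# pool is full, the lowest-severity (then oldest) retained event is evicted,
# so a flood of informational events cannot consume the memory pool.
class EventLog:
    def __init__(self, capacity):
        self.capacity = capacity
        self.heap = []                  # min-heap of (severity, seq, message)
        self.seq = itertools.count()    # tie-break: equal-severity evicts oldest

    def report(self, severity, message):
        entry = (severity, next(self.seq), message)
        if len(self.heap) < self.capacity:
            heapq.heappush(self.heap, entry)
        elif entry > self.heap[0]:      # new event outranks weakest kept one
            heapq.heapreplace(self.heap, entry)
        # else: drop the new event entirely

    def drain(self):
        """Return retained events for downlink, most severe first."""
        return sorted(self.heap, reverse=True)

log = EventLog(capacity=2)
log.report(1, "info: heater cycled")
log.report(3, "error: bus reset")
log.report(2, "warning: low battery")   # evicts the informational event
```

A real system would add per-event-type rate limiting and ground-commandable filtering, as the abstract suggests; this sketch shows only the core retention policy.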
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 96
    Publication Date: 2011-08-23
    Description: Analyses of satellite, ground-based, and balloon measurements allow updated estimates of trends in the vertical profile of ozone since 1979. The results show overall consistency among several independent measurement systems, particularly for northern hemisphere midlatitudes where most balloon and ground-based measurements are made. Combined trend estimates over these latitudes for the period 1979-96 show statistically significant negative trends at all altitudes between 10 and 45 km, with two local extremes: -7.4 plus or minus 2.0% per decade at 40 km and -7.3 plus or minus 4.6% per decade at 15 km altitude. There is a strong seasonal variation in trends over northern midlatitudes in the altitude range of 10 to 18 km, with the largest ozone loss during winter and spring. The profile trends are in quantitative agreement with independently measured trends in column ozone, the amount of ozone in a column above the surface. The vertical profiles of ozone trends provide a fingerprint for the mechanisms of ozone depletion over the last two decades.
    Keywords: Environment Pollution
    Type: Science; Volume 285; 1689-1692
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 97
    Publication Date: 2011-08-23
    Description: Leaks in the hydrazine supply system of the Shuttle APU can result in hydrazine ignition and fire in the aft compartment of the Shuttle. Indication of the location of a leak could provide valuable information required for operational decisions. WSTF has developed a small, single use sensor for detection of hydrazine leaks. The sensor is composed of a thermistor bead coated with copper(II) oxide (CuO) dispersed in a clay or alumina binder. The CuO-coated thermistor is one of a pair of closely located thermistors, the other being a reference. On exposure to hydrazine the CuO reacts exothermically with the hydrazine and increases the temperature of the coated-thermistor by several degrees. The temperature rise is sensed by a resistive bridge circuit and an alarm registered by data acquisition software. Responses of this sensor to humidity changes, hydrazine concentration, binder characteristics, distance from a liquid leak, and ambient pressure levels as well as application of this sensor concept to other fluids are presented.
    Keywords: Instrumentation and Photography
    Type: JANNAF 28th Propellant Development and Characterization Subcommittee and 17th Safety and Environmental Protection Subcommitte Joint Meeting; Volume 1; 137-144; CPIA-Publ-687-Vol-1
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 98
    Publication Date: 2011-08-23
    Description: We report results from a systematic study of breakdown limits for novel high-rate gaseous detectors: MICROMEGAS, CAT and GEM, together with more conventional devices such as thin-gap parallel-mesh chambers and high-rate wire chambers. It was found that for all these detectors, the maximum achievable gain, before breakdown appears, drops dramatically with incident flux, and is sometimes inversely proportional to it. Further, in the presence of alpha particles, typical of the backgrounds in high-energy experiments, additional gain drops of 1-2 orders of magnitude were observed for many detectors. It was found that breakdowns at high rates occur through what we have termed an "accumulative" mechanism, which does not seem to have been previously reported in the literature. Results of these studies may help in choosing the optimum detector for given experimental conditions.
    Keywords: Instrumentation and Photography
    Type: Nuclear Instruments and Methods in Physics Research A (ISSN 0168-9002); Volume 422; 300-304
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 99
    Publication Date: 2016-02-04
    Description: Results are presented for six simulations of the Goddard Institute for Space Studies (GISS) global atmosphere-ocean model for the years 1950 to 2099. There are two control simulations with constant 1950 atmospheric composition from different initial states, two GHG experiments with observed greenhouse gases up to 1990 and compounded 0.5% annual CO2 increases thereafter, and two GHG+SO4 experiments with the same varying greenhouse gases plus varying tropospheric sulfate aerosols. Surface air temperature trends in the two GHG experiments are compared between themselves and with the observed temperature record from 1960 to 1998. All comparisons show high positive spatial correlation in the northern hemisphere except in summer, when the greenhouse signal is weakest. The GHG+SO4 experiments show weaker correlations. In the southern hemisphere, correlations are either weak or negative, which is in part due to the model's unrealistic interannual variability of southern sea ice cover. The model results imply that temperature changes due to forcing by increased greenhouse gases have risen above the level of regional interannual temperature variability in the northern hemisphere over the past 40 years. This period is thus an important test of reliability of coupled climate models.
    Keywords: Environment Pollution
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 100
    Publication Date: 2011-08-23
    Description: We discuss the methodology of interpreting channel 1 and 2 AVHRR radiance data to retrieve tropospheric aerosol properties over the ocean and describe a detailed analysis of the sensitivity of monthly average retrievals to the assumed aerosol models. We use real AVHRR data and accurate numerical techniques for computing single and multiple scattering and spectral absorption of light in the vertically inhomogeneous atmosphere-ocean system. Our analysis shows that two-channel algorithms can provide significantly more accurate retrievals of the aerosol optical thickness than one-channel algorithms and that imperfect cloud screening is the largest source of errors in the retrieved optical thickness. Both underestimating and overestimating aerosol absorption as well as strong variability of the aerosol refractive index may lead to regional and/or seasonal biases in optical thickness retrievals. The Angstrom exponent appears to be the most invariant aerosol size characteristic and should be retrieved along with optical thickness as the second aerosol parameter.
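The Angstrom exponent named in this abstract follows from the power-law dependence of aerosol optical thickness on wavelength, tau(lambda) proportional to lambda^(-alpha), so two channels suffice to solve for alpha. The sketch below states that relation in code; the optical thicknesses are invented, and the channel wavelengths are nominal AVHRR values assumed for illustration rather than taken from the paper.

```python
import math

# Angstrom exponent from two-channel optical thickness:
#   alpha = -ln(tau1 / tau2) / ln(lambda1 / lambda2)
# Wavelengths below are nominal AVHRR channel-1/channel-2 values (microns),
# assumed for illustration.
def angstrom_exponent(tau1, tau2, lam1=0.63, lam2=0.83):
    return -math.log(tau1 / tau2) / math.log(lam1 / lam2)

# Hypothetical retrieval: larger tau at the shorter wavelength implies
# small particles (alpha near 1 or above).
alpha = angstrom_exponent(0.20, 0.15)
```

Retrieving alpha alongside the optical thickness, as the abstract recommends, is attractive precisely because this ratio-based quantity is less sensitive to absolute calibration than either channel's optical thickness alone.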
    Keywords: Environment Pollution
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...