ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Publication Date: 2004-12-03
    Description: We have completed a new generation of water vapor radiometers (WVRs), the A-series, in order to support radio science experiments with the Cassini spacecraft. These new instruments sense three frequencies in the vicinity of the 22 GHz emission line of atmospheric water vapor within a 1 degree beamwidth from a clear aperture antenna that is co-pointed with the radio telescope down to 10 degrees elevation. The radiometer electronics features almost an order-of-magnitude improvement in temperature stability compared with earlier WVR designs. For many radio science experiments, the error budget is likely to be dominated by path delay fluctuations due to variable atmospheric water vapor along the line-of-sight to the spacecraft. In order to demonstrate the performance of these new WVRs we are attempting to calibrate the delay fluctuations as seen by a radio interferometer operating over a 21 km baseline with a WVR near each antenna. The characteristics of these new WVRs will be described and the results of our preliminary analysis will be presented, indicating an accuracy of 0.2 to 0.5 mm in tracking path delay fluctuations over time scales of 10 to 10,000 seconds.
    Keywords: Meteorology and Climatology
    Type: International VLBI Service for Geodesy and Astrometry: 2000 General Meeting Proceedings; 274-279; NASA/CP-2000-209893
    Format: text
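    A minimal sketch, on synthetic data, of how the quoted tracking accuracy could be quantified: the RMS of the WVR-minus-reference residual delay as a function of time lag (a residual structure function). The noise levels and names are illustrative, not the authors' pipeline.

    ```python
    import numpy as np

    def rms_tracking_error(wvr_delay, ref_delay, dt, lags_s):
        """RMS of the residual (WVR minus reference) delay difference over a set
        of time lags: sqrt of the structure function D(tau) = <[r(t+tau)-r(t)]^2>."""
        resid = np.asarray(wvr_delay) - np.asarray(ref_delay)
        out = {}
        for tau in lags_s:
            k = int(round(tau / dt))            # lag expressed in samples
            d = resid[k:] - resid[:-k]
            out[tau] = np.sqrt(np.mean(d ** 2))
        return out

    # Synthetic demo: 10,000 s of 1 Hz data, a random-walk wet delay (mm) plus
    # 0.3 mm of white WVR measurement noise (made-up numbers).
    rng = np.random.default_rng(0)
    t = np.arange(10000)
    atmos = np.cumsum(rng.normal(0.0, 0.05, t.size))
    wvr = atmos + rng.normal(0.0, 0.3, t.size)
    print(rms_tracking_error(wvr, atmos, dt=1.0, lags_s=[10, 100, 1000]))
    ```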
  • 2
    Publication Date: 2004-12-03
    Description: Science education is taking the teaching of science from a traditional (lecture) approach to a multidimensional sense-making approach that allows teachers to support students by providing exploratory experiences. Using projects is one way of providing students with opportunities to observe and participate in sense-making activity. We created a learning environment that fostered inquiry-based learning. Students were engaged in a variety of inquiry activities that enabled them to work in cooperative planning teams where respect for each other was encouraged and their ability to grasp, transform, and transfer information was enhanced. Summer 1998: An air pollution workshop was conducted for high school students in the Medgar Evers College/Middle College High School Liberty Partnership Summer Program. Students learned the basics of meteorology: the structure and composition of the atmosphere and the processes that cause weather. The highlight of this workshop was the building of hand-held sun photometers, which measure the intensity of the sunlight striking the Earth. Summer 1999: High school students conducted a research project that measured the mass and size of ambient particulates and enhanced our ability to observe, through land-based measurements, changes in the optical depth of ambient aerosols over Brooklyn. Students used hand-held sun photometers to collect data over a two-week period and entered the data into the NASA GISS database by way of the Internet.
    Keywords: Meteorology and Climatology
    Type: Materials Presented at the MU-SPIN Ninth Annual Users' Conference; 33-36; NASA/CP-2000-209970
    Format: text
  • 3
    Publication Date: 2011-08-24
    Description: Rapid climate change characterizes numerous terrestrial sediment records during and since the last glaciation. Vegetational response is best expressed in terrestrial records near ecotones, where sensitivity to climate change is greatest, and response times are as short as decades.
    Keywords: Meteorology and Climatology
    Type: Proceedings of the National Academy of Sciences of the United States of America (ISSN 0027-8424); Volume 97; 4; 1359-61
    Format: text
  • 4
    Publication Date: 2011-08-24
    Description: Geological, geophysical, and geochemical data support a theory that Earth experienced several intervals of intense, global glaciation ("snowball Earth" conditions) during Precambrian time. This snowball model predicts that postglacial, greenhouse-induced warming would lead to the deposition of banded iron formations and cap carbonates. Although global glaciation would have drastically curtailed biological productivity, melting of the oceanic ice would also have induced a cyanobacterial bloom, leading to an oxygen spike in the euphotic zone and to the oxidative precipitation of iron and manganese. A Paleoproterozoic snowball Earth at 2.4 Giga-annum before present (Ga) immediately precedes the Kalahari Manganese Field in southern Africa, suggesting that this rapid and massive change in global climate was responsible for its deposition. As large quantities of O2 are needed to precipitate this Mn, photosystem II and oxygen radical protection mechanisms must have evolved before 2.4 Ga. This geochemical event may have triggered a compensatory evolutionary branching in the Fe/Mn superoxide dismutase enzyme, providing a Paleoproterozoic calibration point for studies of molecular evolution.
    Keywords: Meteorology and Climatology
    Type: Proceedings of the National Academy of Sciences of the United States of America (ISSN 0027-8424); Volume 97; 4; 1400-5
    Format: text
  • 5
    Publication Date: 2011-08-23
    Description: The Texas A&M monthly total oceanic rainfall retrieval algorithm is based on radiative transfer models and can only be modified on a physically sound basis. Within this constraint we have examined some improvements to the algorithm and it appears that it can be made significantly better. In particular, it appears that by proper use of the range of frequencies available on TMI (TRMM Microwave Imager) and AMSR that the need for the log-normal fit can be eliminated.
    Keywords: Meteorology and Climatology
    Type: Microwave Remote Sensing of the Atmosphere and Environment II; Volume 4152; 235-242
    Format: text
  • 6
    Publication Date: 2011-08-23
    Description: We evaluated the performance of the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) at-launch algorithm for monthly oceanic rain rate using two years (January 1998 - December 1999) of TMI data. The TMI at-launch algorithm is based on Wilheit et al.'s technique for estimating monthly oceanic rainfall, which relies on histograms of multichannel microwave measurements. Comparisons with oceanic monthly rain rates derived from the Defense Meteorological Satellite Program (DMSP) F-13 and F-14 Special Sensor Microwave Imager (SSM/I) data show the average rain rates over the TRMM region (between 40°S and 40°N) are 3.0, 2.85, and 2.89 mm/day for F-13, F-14, and TMI, respectively. Based on the latest version of TB data (version 5), both rain rate and freezing height derived from TMI are similar to those from the F-13 and F-14 SSM/I data. However, regionally the differences are statistically significant at the 95% confidence level. Monthly rain rates are also computed from 3-hourly TB histograms to examine the diurnal cycle of precipitation. Over most of the oceanic TRMM area, a distinct early-morning rainfall peak is found. A harmonic analysis shows that the amplitude of the 12 h harmonic is significant and comparable to that of the 24 h harmonic.
    Keywords: Meteorology and Climatology
    Type: Microwave Remote Sensing of the Atmosphere and Environment II; Volume 4152; 198-207
    Format: text
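    As an illustration of the harmonic analysis step mentioned above, the sketch below projects a made-up 3-hourly diurnal cycle of rain rate onto its 24 h and 12 h harmonics; the data are placeholders, not TMI histograms.

    ```python
    import numpy as np

    # Hypothetical 3-hourly climatological rain rates (mm/day) centred at
    # 00, 03, ..., 21 local time, with an early-morning peak.
    hours = np.arange(0, 24, 3)
    rain = np.array([3.2, 3.6, 3.4, 2.9, 2.6, 2.5, 2.7, 3.0])

    def harmonic(y, hours, period_h):
        """Amplitude and phase of one harmonic, obtained by projecting the
        (evenly sampled) diurnal cycle onto cos/sin of the given period."""
        w = 2.0 * np.pi * hours / period_h
        a = 2.0 * np.mean(y * np.cos(w))
        b = 2.0 * np.mean(y * np.sin(w))
        return np.hypot(a, b), np.degrees(np.arctan2(b, a))

    for period in (24.0, 12.0):
        amp, phase = harmonic(rain, hours, period)
        print(f"{period:4.0f} h harmonic: amplitude {amp:.2f} mm/day, phase {phase:6.1f} deg")
    ```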
  • 7
    Publication Date: 2011-08-24
    Description: A one-week in situ intercomparison campaign was completed on the Rice University campus for measuring HCHO using three different techniques, including a novel optical sensor based on difference frequency generation (DFG) operating at room temperature. Two chemical derivatization methods, 2,4-dinitrophenylhydrazine (DNPH) and o-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine (PFBHA), were deployed during the daylight hours for three- to four-hour time-integrated samples. A real-time optical sensor based on laser absorption spectroscopy was operated simultaneously, including nighttime hours. This tunable spectroscopic source, based on difference frequency mixing of two fiber-amplified diode lasers in periodically poled LiNbO3 (PPLN), was operated at 3.5315 micrometers (2831.64 cm^-1) to access a strong HCHO ro-vibrational transition free of interferences from other species. The results showed a bias of -1.7 and -1.2 ppbv and a gross error of 2.6 and 1.5 ppbv for the DNPH and PFBHA measurements, respectively, compared with the DFG measurements. These results validate the DFG sensor for time-resolved measurements of HCHO in urban areas.
    Keywords: Meteorology and Climatology
    Type: Geophysical research letters (ISSN 0094-8276); Volume 27; 14; 2093-6
    Format: text
  • 8
    Publication Date: 2013-08-31
    Description: The process of designing and analyzing a multiple-reflector system has traditionally been time-intensive, requiring large amounts of both computational and human time. At many frequencies, a discrete approximation of the radiation integral may be used to model the system. The code which implements this physical optics (PO) algorithm was developed at the Jet Propulsion Laboratory. It analyzes systems of antennas in pairs, and for each pair, the analysis can be computationally time-consuming. Additionally, the antennas must be described using a local coordinate system for each antenna, which makes it difficult to integrate the design into a multi-disciplinary framework in which there is traditionally one global coordinate system, even before considering deforming the antenna as prescribed by external structural and/or thermal factors. Finally, setting up the code to correctly analyze all the antenna pairs in the system can take a fair amount of time, and introduces possible human error. The use of parallel computing to reduce the computational time required for the analysis of a given pair of antennas has been previously discussed. This paper focuses on the other problems mentioned above. It will present a methodology and examples of use of an automated tool that performs the analysis of a complete multiple-reflector system in an integrated multi-disciplinary environment (including CAD modeling, and structural and thermal analysis) at the click of a button. This tool, named MOD Tool (Millimeter-wave Optics Design Tool), has been designed and implemented as a distributed tool, with a client that runs almost identically on Unix, Mac, and Windows platforms, and a server that runs primarily on a Unix workstation and can interact with parallel supercomputers with simple instruction from the user interacting with the client.
    Keywords: Computer Programming and Software
    Format: application/pdf
  • 9
    Publication Date: 2013-08-31
    Description: Submillimeter-wave cloud ice radiometry is an innovative technique for determining the amount of ice present in cirrus clouds, measuring median crystal size, and constraining crystal shape. The radiometer described in this poster is being developed to acquire data to validate radiometric retrievals of cloud ice at submillimeter wavelengths. The goal of this effort is to develop a technique to enable spaceborne characterization of cirrus, meeting key climate modeling and NASA measurement needs.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 10
    Publication Date: 2013-08-31
    Description: A summary is presented of basic lightning characteristics/criteria applicable to current and future aerospace vehicles. The paper provides estimates of the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reentry/landing, and return to launch site. A literature search was conducted for previous work concerning the occurrence and measurement of peak lightning currents, modeling, and estimating the probabilities of launch vehicles/objects being struck by lightning. This paper presents a summary of these results.
    Keywords: Meteorology and Climatology
    Format: application/pdf
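    Peak return-stroke currents are commonly modeled as lognormal. Below is a hedged sketch of how an exceedance probability for a 200 kA peak could be combined with a per-phase strike probability; the median, log-standard deviation, and strike probability are illustrative placeholders, not values from this paper.

    ```python
    import math

    def p_exceed(i_peak_kA, median_kA=31.0, sigma_ln=0.55):
        """P(peak return-stroke current > i_peak) under a lognormal model.
        The median and log-standard deviation are illustrative placeholders."""
        z = (math.log(i_peak_kA) - math.log(median_kA)) / sigma_ln
        return 0.5 * math.erfc(z / math.sqrt(2.0))

    # Combine with a hypothetical per-phase strike probability.
    p_strike = 1e-3                      # chance the vehicle is struck at all
    print(p_exceed(200.0))               # P(peak > 200 kA | struck)
    print(p_strike * p_exceed(200.0))    # P(struck by a > 200 kA flash)
    ```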
  • 11
    Publication Date: 2016-06-07
    Description: In this project we continued the development of a visual editor in the Java programming language to create screens on which to display real-time data. The data comes from the numerous systems monitoring the operation of the space shuttle while on the ground and in space, and from the many tests of subsystems. The data can be displayed on any computer platform running a Java-enabled World Wide Web (WWW) browser and connected to the Internet. Previously, a special-purpose program had been written to display data on emulations of the character-based display screens used for many years at NASA. The goal now is to display bit-mapped screens created by a visual editor. We report here on the visual editor that creates the display screens. This project continues our previous work, in which we had followed the design of the 'beanbox,' a prototype visual editor created by Sun Microsystems. We abandoned this approach and implemented a prototype using a more direct approach. In addition, our prototype is based on the newly released Java 2 graphical user interface (GUI) libraries. The result has been a visually more appealing appearance and a more robust application.
    Keywords: Computer Programming and Software
    Type: 1999 Research Reports: NASA/ASEE Summer Faculty Fellowship Program; 189-195; NASA/CR-1999-208586
    Format: application/pdf
  • 12
    Publication Date: 2016-06-07
    Description: This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal methods tools and the development cycle used by software developers. The Java PathFinder tool, which directly translates from Java to PROMELA, was developed as part of this research, as were automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one that was the focus of the first verification effort. A second quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. First, the paper demonstrates that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that eventually will enable formal methods tools to be inserted directly into the aerospace software development cycle.
    Keywords: Computer Programming and Software
    Type: Lfm2000: Fifth NASA Langley Formal Methods Workshop; NASA/CP-2000-210100
    Format: application/pdf
  • 13
    Publication Date: 2014-10-07
    Description: The goals of this study are the evaluation of current fast radiative transfer models (RTMs) and line-by-line (LBL) models. The intercomparison focuses on the modeling of 11 representative sounding channels routinely used at numerical weather prediction centers: seven HIRS (High-resolution Infrared Sounder) and four AMSU (Advanced Microwave Sounding Unit) channels. Interest in this topic was evidenced by the participation of 24 scientists from 16 institutions. An ensemble of 42 diverse atmospheres was used, and results were compiled for 19 infrared models and 10 microwave models, including several LBL RTMs. For the first time, not only radiances, but also Jacobians (of temperature, water vapor, and ozone) were compared to various LBL models for many channels. In the infrared, LBL models typically agree to within 0.05-0.15 K (standard deviation) in terms of top-of-the-atmosphere brightness temperature (BT). Individual differences up to 0.5 K still exist, systematic in some channels, and linked to the type of atmosphere in others. The best fast models emulate LBL BTs to within 0.25 K, but no model achieves this desirable level of success for all channels. The ozone modeling is particularly challenging. In the microwave, fast models generally do quite well against the LBL model to which they were tuned. However, significant differences were noted among LBL models. Extending the intercomparison to the Jacobians proved very useful in detecting subtle and more obvious modeling errors. In addition, total and single gas optical depths were calculated, which provided additional insight into the nature of differences. Recommendations for future intercomparisons are suggested.
    Keywords: Meteorology and Climatology
    Format: application/pdf
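    Two operations described above are easy to sketch: ensemble statistics of fast-model-minus-LBL brightness temperatures, and finite-difference Jacobians. The "RTM" below is a toy weighted mean standing in for a real radiative transfer model; all names and numbers are assumptions for illustration.

    ```python
    import numpy as np

    def bt_stats(bt_fast, bt_lbl):
        """Bias and standard deviation (K) of fast-model minus LBL brightness
        temperatures over an ensemble of atmospheres."""
        d = np.asarray(bt_fast, dtype=float) - np.asarray(bt_lbl, dtype=float)
        return d.mean(), d.std(ddof=1)

    def jacobian_fd(model, profile, eps=0.1):
        """Finite-difference Jacobian dBT/dx_k, perturbing one level at a time;
        `model` maps a profile (e.g. temperature) to a brightness temperature."""
        base = model(profile)
        jac = np.empty_like(profile)
        for k in range(profile.size):
            p = profile.copy()
            p[k] += eps
            jac[k] = (model(p) - base) / eps
        return jac

    # Toy stand-in "RTM": BT as a weighted mean of the temperature profile.
    w = np.linspace(0.0, 1.0, 40); w /= w.sum()
    toy_rtm = lambda T: float(w @ T)
    T = np.linspace(220.0, 290.0, 40)
    print(bt_stats([toy_rtm(T) + 0.10, toy_rtm(T) - 0.05],
                   [toy_rtm(T), toy_rtm(T)]))
    print(jacobian_fd(toy_rtm, T)[:4])
    ```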
  • 14
    Publication Date: 2013-08-29
    Description: Nearly three years of Tropical Rainfall Measuring Mission (TRMM) satellite monthly estimates of tropical surface rainfall are analyzed to document and understand the differences among the TRMM-based estimates and how these differences relate to the pre-TRMM estimates and current operational analyses. Variation among the TRMM estimates is shown to be considerably smaller than among a pre-TRMM collection of passive microwave-based products. Use of both passive and active microwave techniques in TRMM should lead to increased confidence in converged estimates. Current TRMM estimates are shown to have a range of about 20% for the tropical ocean as a whole, with variations in the heavily raining ocean areas of the ITCZ and SPCZ showing differences of over 30%. In mid-latitude ocean areas the differences are smaller. Over land there is a distinct difference between the tropics and mid-latitudes, with a reversal between some of the products as to which tends to be relatively high or low. Comparisons of TRMM estimates with ocean atoll and land gauge information point to products that might have significant regional biases. The radar-based product is significantly low-biased compared with atoll rain gauge data, while the passive microwave product is significantly high-biased compared with rain gauge data in the deep tropics. The evolution of rainfall patterns during the recent change from an intense El Nino to a long period of La Nina, and then a gradual return to near-neutral conditions, is described using TRMM. The time history of integrated rainfall over the tropical oceans (and land) during this period differs among the passive and active microwave TRMM estimates.
    Keywords: Meteorology and Climatology
    Type: Symposium on Cloud Systems, Hurricanes and TRMM; Unknown
    Format: application/pdf
  • 15
    Publication Date: 2013-08-29
    Description: Observations made by the Precipitation Radar (PR) and the Microwave Imager (TMI) radiometer on board the Tropical Rainfall Measuring Mission (TRMM) satellite help us to show the significance of the 85 GHz polarization difference, PD85, measured by TMI. Rain type, convective or stratiform, deduced from the PR allows us to infer that PD85 is generally positive in stratiform rain clouds, while PD85 can be markedly negative in deep convective rain clouds. Furthermore, PD85 increases in a gross manner as stratiform rain rate increases. On the contrary, in a crude fashion PD85 decreases as convective rain rate increases. From the observations of TMI and PR, we find that PD85 is a weak indicator of rain rate. Utilizing information from existing polarimetric radar studies, we infer that negative values of PD85 are likely associated with vertically-oriented small oblate or wet hail that are found in deep convective updrafts.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 16
    Publication Date: 2013-08-29
    Description: Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis on model-based autonomy, model-checking techniques, and concrete experiments at NASA.
    Keywords: Computer Programming and Software
    Format: application/pdf
  • 17
    Publication Date: 2013-08-29
    Description: Major droughts and floods over the U.S. continent may be related to a far-field energy source in the Asian Pacific. This is illustrated by two climate patterns associated with summertime rainfall over the U.S. and large-scale circulation on interannual timescales. The first shows an opposite variation between the drought/flood over the Midwest and that over the eastern and southeastern U.S., coupled to a coherent wave pattern spanning the entire East Asia-North Pacific-North America region related to the East Asian jetstream. The second shows a continental-scale drought/flood in the central U.S., coupled to a wavetrain linking the Asian/Pacific monsoon region to North America.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 18
    Publication Date: 2013-08-29
    Description: A set of global, monthly rainfall products has been intercompared to understand the quality and utility of the estimates. The products include 25 observational (satellite-based), four model, and two climatological products. The results of the intercomparison indicate a very large range (a factor of two or three) of values when all products are considered. The range of values is reduced considerably when the set of observational products is limited to those considered quasi-standard. The model products perform significantly worse in the tropics, but are competitive with satellite-based fields in mid-latitudes over land. Over ocean, products are compared to the frequency of precipitation from ship observations. The evaluation of the observational products points to merged data products (including rain gauge information) as providing the overall best results.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 19
    Publication Date: 2013-08-29
    Description: We describe an event-based, publish-and-subscribe mechanism that uses 'smart subscriptions' to recognize weakly structured events. We present a hierarchy of subscription languages (propositional, predicate, temporal, and agent) and algorithms for efficiently recognizing event matches. This mechanism has been applied to the management of distributed applications.
    Keywords: Computer Programming and Software
    Type: Distributed Objects in Computational Science; Unknown
    Format: application/pdf
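    A minimal sketch of the propositional/predicate level of such a mechanism, with hypothetical names (EventBus, subscribe, publish) rather than the paper's actual subscription-language API: each subscriber registers an arbitrary predicate over an event's attributes.

    ```python
    from typing import Callable, Dict, List, Tuple

    Event = Dict[str, object]
    Predicate = Callable[[Event], bool]
    Handler = Callable[[Event], None]

    class EventBus:
        """Tiny publish-and-subscribe bus: subscriptions are predicates over a
        weakly structured event (an attribute dictionary), not topic strings."""
        def __init__(self) -> None:
            self._subs: List[Tuple[Predicate, Handler]] = []

        def subscribe(self, pred: Predicate, handler: Handler) -> None:
            self._subs.append((pred, handler))

        def publish(self, event: Event) -> None:
            for pred, handler in self._subs:
                if pred(event):          # predicate-level match
                    handler(event)

    bus = EventBus()
    bus.subscribe(lambda e: e.get("type") == "heartbeat" and e.get("latency_ms", 0) > 500,
                  lambda e: print("slow heartbeat from", e["host"]))
    bus.publish({"type": "heartbeat", "host": "node7", "latency_ms": 812})
    ```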
  • 20
    Publication Date: 2013-08-29
    Description: Observational and modeling studies have described the relationships between the convective/stratiform rain proportion and the vertical distributions of vertical motion, latent heating, and moistening in mesoscale convective systems. Therefore, remote sensing techniques that can quantify the relative areal proportion of convective and stratiform rainfall can provide useful information regarding the dynamic and thermodynamic processes in these systems. In the present study, two methods for deducing the convective/stratiform areal extent of precipitation from satellite passive microwave radiometer measurements are combined to yield an improved method. If sufficient microwave scattering by ice-phase precipitating hydrometeors is detected, the method relies mainly on the degree of polarization in oblique-view, 85.5 GHz radiances to estimate the area fraction of convective rain within the radiometer footprint. In situations where ice scattering is minimal, the method draws mostly on texture information in radiometer imagery at lower microwave frequencies to estimate the convective area fraction. Based upon observations of ten convective systems over ocean and nine systems over land, instantaneous 0.5 degree resolution estimates of convective area fraction from the Tropical Rainfall Measuring Mission Microwave Imager (TRMM TMI) are compared to nearly coincident estimates from the TRMM Precipitation Radar (TRMM PR). The TMI convective area fraction estimates are slightly low-biased with respect to the PR, with TMI-PR correlations of 0.78 and 0.84 over ocean and land backgrounds, respectively. TMI monthly-average convective area percentages in the tropics and subtropics from February 1998 exhibit the greatest values along the ITCZ and in continental regions of the summer (southern) hemisphere. Although convective area percentages from the TMI are systematically lower than those from the PR, monthly rain patterns derived from the TMI and PR rain algorithms are very similar. TMI rain depths are significantly higher than corresponding rain depths from the PR in the ITCZ, but are similar in magnitude elsewhere.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 21
    Publication Date: 2013-08-29
    Description: We built a direct-detection Doppler lidar based on the double-edge molecular technique and made the first molecular-based wind measurements using the eye-safe 355 nm wavelength. Three etalon bandpasses are obtained with step etalons on a single pair of etalon plates. Long-term frequency drift of the laser and the capacitively stabilized etalon is removed by locking the etalon to the laser frequency. We use a low-angle design to avoid polarization effects. Wind measurements of 1 to 2 m/s accuracy are obtained to 10 km altitude with 5 mJ of laser energy, a 750 s integration, and a 25 cm telescope. Good agreement is obtained between the lidar and rawinsonde measurements.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 22
    Publication Date: 2013-08-29
    Description: We use clear-sky heating rates to show that convective outflow in the tropics decreases rapidly with height between the 350 K and 360 K potential temperature surfaces (or between roughly 13 and 15 km). There is also a rapid fall-off in the pseudoequivalent potential temperature probability distribution of near-surface air parcels between 350 K and 360 K. This suggests that the vertical variation of convective outflow in the upper tropical troposphere is to a large degree determined by the distribution of sub-cloud layer entropy.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 23
    Publication Date: 2013-08-29
    Description: A symposium celebrating the first 50 years of Dr. Joanne Simpson's career took place at the NASA/Goddard Space Flight Center from December 1 - 3, 1999. This symposium consisted of presentations that focused on: historical and personal points of view concerning Dr. Simpson's research career, her interactions with the American Meteorological Society, and her leadership in TRMM; scientific interactions with Dr. Simpson that influenced personal research; research related to observations and modeling of clouds, cloud systems and hurricanes; and research related to the Tropical Rainfall Measuring Mission (TRMM). There were a total of 36 presentations and 103 participants from the US, Japan and Australia. The specific presentations during the symposium are summarized in this paper.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 24
    Publication Date: 2013-08-29
    Description: We present results of simulations of the distribution of 1809 keV radiation from the decay of Al-26 in the Galaxy. Recent observations of this emission line using the Gamma Ray Imaging Spectrometer (GRIS) have indicated that the bulk of the Al-26 must have a velocity of approximately 500 km/s. We have previously shown that a velocity this large could be maintained over the 10^6 year lifetime of the Al-26 if it is trapped in dust grains that are reaccelerated periodically in the ISM. Here we investigate whether a dust grain velocity of approximately 500 km/s will produce a distribution of 1809 keV emission in latitude that is consistent with the narrow distribution seen by COMPTEL. We find that dust grain velocities in the range 275-1000 km/s are able to reproduce the COMPTEL 1809 keV emission maps reconstructed using the Richardson-Lucy and Maximum Entropy image reconstruction methods, while the emission map reconstructed using the Multiresolution Regularized Expectation Maximization algorithm is not well fit by any of our models. The Al-26 production rate needed to reproduce the observed 1809 keV intensity yields a Galactic mass of Al-26 of approximately 1.5-2 solar masses, which is in good agreement with both other observations and theoretical production rates.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 25
    Publication Date: 2013-08-29
    Description: This study examines the uncertainty in forecasts of the January-February-March (JFM) mean extratropical circulation, and how that uncertainty is modulated by the El Nino/Southern Oscillation (ENSO). The analysis is based on ensembles of hindcasts made with an Atmospheric General Circulation Model (AGCM) forced with sea surface temperatures observed during the 1983 El Nino and 1989 La Nina events. The AGCM produces pronounced interannual differences in the magnitude of the extratropical seasonal mean noise (intra-ensemble variability). The North Pacific, in particular, shows extensive regions where the 1989 seasonal mean noise kinetic energy (SKE), which is dominated by a "PNA-like" spatial structure, is more than twice that of the 1983 forecasts. The larger SKE in 1989 is associated with a larger than normal barotropic conversion of kinetic energy from the mean Pacific jet to the seasonal mean noise. The generation of SKE due to sub-monthly transients also shows substantial interannual differences, though these are much smaller than the differences in the mean flow conversions. An analysis of the generation of monthly mean noise kinetic energy (NIKE) and its variability suggests that the seasonal mean noise is predominantly a statistical residue of variability resulting from dynamical processes operating on monthly and shorter time scales. A stochastically forced barotropic model (linearized about the AGCM's 1983 and 1989 base states) is used to further assess the roles of the basic state, submonthly transients, and tropical forcing in modulating the uncertainties in the seasonal AGCM forecasts. When forced globally with spatially white noise, the linear model generates much larger variance for the 1989 base state, consistent with the AGCM results. The extratropical variability for the 1989 base state is dominated by a single eigenmode and is strongly coupled with forcing over the tropical western Pacific and the Indian Ocean, again consistent with the AGCM results. Linear calculations that include forcing from the AGCM variance of the tropical forcing and submonthly transients show a small impact on the variability over the Pacific/North American region compared with that of the base state differences.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 26
    Publication Date: 2013-08-29
    Description: Using global rainfall and sea surface temperature (SST) data for the past two decades (1979-1998), we have investigated the intrinsic modes of Asian summer monsoon (ASM) and ENSO co-variability. Three recurring ASM rainfall-SST coupled modes were identified. The first is a basin-scale mode that features SST and rainfall variability over the entire tropics (including the ASM region), identifiable with those occurring during El Nino or La Nina. This mode is further characterized by a pronounced biennial variation in ASM rainfall and SST associated with fluctuations of the anomalous Walker circulation that occur during El Nino/La Nina transitions. The second mode comprises mixed regional and basin-scale rainfall and SST signals, with pronounced intraseasonal and interannual variability. This mode features an SST pattern associated with a developing La Nina, with a pronounced low-level anticyclone in the subtropics of the western Pacific off the coast of East Asia. The third mode depicts an east-west rainfall and SST dipole across the southern equatorial Indian Ocean, most likely stemming from coupled ocean-atmosphere processes within the ASM region. This mode also possesses a decadal time scale and a linear trend, which are not associated with El Nino/La Nina variability. Possible causes of year-to-year rainfall variability over the ASM region and sub-regions have been evaluated from a reconstruction of the observed rainfall from singular eigenvectors of the coupled modes. It is found that while basin-scale SST can account for portions of ASM rainfall variability during ENSO events (up to 60% in 1998), regional processes can account for up to 20-25% of the rainfall variability in typical non-ENSO years. A stronger monsoon-ENSO relationship tends to occur in the boreal summer immediately preceding a pronounced La Nina, i.e., 1998, 1988, and 1983. Based on these results, we discuss the possible impacts of the ASM on ENSO variability via the west Pacific anticyclone and articulate a hypothesis that anomalous wind forcing derived from the anticyclone may be instrumental in inducing a strong biennial modulation of natural ENSO cycles.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 27
    Publication Date: 2013-08-29
    Description: In general, there are two broad scientific objectives when using cloud-resolving models (CRMs, or cloud ensemble models, CEMs) to study tropical convection. The first is to use them as physics-resolving models to understand the dynamic and microphysical processes associated with the tropical water and energy cycles and their role in the climate system. The second is to use CRMs to improve the representation of moist processes and their interaction with radiation in large-scale models. In order to improve the credibility of CRMs and achieve the above goals, CRMs using identical initial conditions and large-scale influences need to produce very similar results. Two CRMs produced different statistical equilibrium (SE) states even though both used the same initial thermodynamic and wind conditions. Sensitivity tests to identify the major physical processes that determine the SE states for the different CRM simulations were performed. The results indicated that the atmospheric horizontal wind is treated quite differently in these two CRMs. The model that had stronger surface winds, and consequently larger latent and sensible heat fluxes from the ocean, produced a warmer and more humid modeled thermodynamic SE state. In addition, the domain-mean thermodynamic state is more unstable for those experiments that produced a warmer and more humid SE state. Their simulated wet (warm and humid) SE states are thermally more stable in the lower troposphere (from the surface to 4-5 km in altitude). The large-scale horizontal advective effects on temperature and water vapor mixing ratio are needed when using CRMs to perform long-term integrations to study convective feedback under specified large-scale environments. In addition, it is suggested that the dry and cold SE state simulated was caused by enhanced precipitation without sufficient surface evaporation. We find some problems with the interpretation of these three phenomena.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 28
    Publication Date: 2013-08-29
    Description: The effectiveness of techniques for creating "bogus" vortices in numerical simulations of hurricanes is examined by using the Penn State/NCAR nonhydrostatic mesoscale model (MM5) and its adjoint system. A series of four-dimensional variational data assimilation (4-D VAR) experiments is conducted to generate an initial vortex for Hurricane Georges (1998) in the Atlantic Ocean by assimilating bogus sea-level pressure and surface wind information into the mesoscale numerical model. Several different strategies are tested for improving the vortex representation. The initial vortices produced by the 4-D VAR technique are able to reproduce many of the structural features of mature hurricanes. The vortices also result in significant improvements to the hurricane forecasts in terms of both intensity and track. In particular, with assimilation of only bogus sea-level pressure information, the response in the wind field is contained largely within the divergent component, with strong convergence leading to strong upward motion near the center. Although the intensity of the initial vortex seems to be well represented, a dramatic spin-down of the storm occurs within the first 6 h of the forecast. With assimilation of bogus surface wind data only, an expected dominance of the rotational component of the wind field is generated, but the minimum pressure is adjusted inadequately compared with the actual hurricane minimum pressure. Only when both the bogus surface pressure and wind information are assimilated together does the model produce a vortex that represents the actual intensity of the hurricane and results in significant improvements to forecasts of both hurricane intensity and track.
    Keywords: Meteorology and Climatology
    Format: application/pdf
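    The 4-D VAR step described above minimizes a cost function penalizing departure from a background state plus misfit to the bogus observations accumulated through the assimilation window. A generic, minimal sketch of such a cost function, with a toy linear model and observation operator (not MM5 or its adjoint):

    ```python
    import numpy as np

    def fourdvar_cost(x0, xb, B_inv, obs, model_step, H, R_inv):
        """Generic 4-D VAR cost: background term plus misfit to (bogus)
        observations as the model is stepped through the window."""
        J = 0.5 * (x0 - xb) @ B_inv @ (x0 - xb)
        x = x0
        for y in obs:                      # one observation batch per step
            x = model_step(x)
            d = H @ x - y
            J += 0.5 * d @ R_inv @ d
        return J

    # Toy demo with made-up dimensions and operators.
    n, m = 4, 2
    rng = np.random.default_rng(2)
    xb, B_inv, R_inv = np.zeros(n), np.eye(n), np.eye(m)
    M = 0.95 * np.eye(n)                   # toy linear "forecast model"
    H = rng.normal(size=(m, n))            # toy observation operator
    obs = [rng.normal(size=m) for _ in range(3)]
    print(fourdvar_cost(rng.normal(size=n), xb, B_inv, obs, lambda x: M @ x, H, R_inv))
    ```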
  • 29
    Publication Date: 2013-08-29
    Description: This paper represents the first attempt to use TRMM rainfall information to estimate the four-dimensional latent heating structure over the global tropics for February 1998. The mean latent heating profiles over six oceanic regions (TOGA COARE IFA, Central Pacific, S. Pacific Convergence Zone, East Pacific, Indian Ocean, and Atlantic Ocean) and three continental regions (S. America, Central Africa, and Australia) are estimated and studied. Heating profiles obtained from diagnostic budget studies over a broad range of geographic locations are used to provide comparisons and indirect validation for the algorithm-estimated heating profiles. Three different latent heating algorithms, the Goddard Convective-Stratiform (CSH) heating, the Goddard Profiling (GPROF) heating, and the Hydrometeor heating (HH), are used and their results intercompared. The horizontal distributions, or patterns, of latent heat release from the three different heating retrieval methods are quite similar. All can identify the areas of major convective activity (i.e., a well-defined ITCZ in the Pacific, a distinct SPCZ) in the global tropics. The magnitudes of their estimated latent heat release also agree reasonably well with each other and with those determined from diagnostic budget studies. However, the major difference among these three heating retrieval algorithms is the altitude of the maximum heating level. The CSH algorithm's estimated heating profiles show only one maximum heating level, and that level varies with the convective activity at different geographic locations. These features are in good agreement with diagnostic budget studies. By contrast, two maximum heating levels were found using the GPROF heating and HH algorithms. The latent heating profiles estimated from all three methods cannot show cooling between active convective events. We also examined the impact of different TMI (Multi-channel Passive Microwave Sensor) and PR (Precipitation Radar) rainfall information on latent heating structures.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 30
    Publication Date: 2013-08-29
    Description: The NASA/GSFC Scanning Raman Lidar (SRL) was stationed on Andros Island in the Bahamas during August - September 1998 as a part of the third Convection and Moisture Experiment (CAMEX-3), which focused on hurricane development and tracking. During the period August 21-24, hurricane Bonnie passed near Andros Island and influenced the water vapor and cirrus cloud measurements acquired by the SRL. Two drying signatures related to the hurricane were recorded by the SRL and other sensors. Cirrus cloud optical depths (at 351 nm) were also measured during this period; values ranged from approximately 0.01 to 1.4. The influence of multiple scattering on these optical depth measurements was studied, with the conclusion that the measured values of optical depth are less than the actual values by up to 20%. The UV/IR cirrus cloud optical depth ratio was estimated based on a comparison of lidar and GOES measurements. Simple radiative transfer model calculations compared with GOES satellite brightness temperatures indicate that satellite radiances are significantly affected by the presence of cirrus clouds if IR optical depths are approximately 0.02 or greater. This has implications for satellite cirrus detection requirements.
    Keywords: Meteorology and Climatology
    Format: application/pdf
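    One standard way to derive a cloud optical depth from lidar is from the two-way transmission across the layer; the sketch below also applies the quoted upper bound (about 20%) on the multiple-scattering low bias as a simple correction factor. This illustrates the general method under stated assumptions, not necessarily the SRL retrieval itself.

    ```python
    import math

    def cirrus_optical_depth(S_below, S_above, ms_underestimate=0.0):
        """Cloud optical depth from two-way transmission of the lidar signal:
        T^2 = S_above / S_below, so tau = -0.5 * ln(S_above / S_below).
        `ms_underestimate` is the fractional low bias from multiple scattering
        (the abstract bounds it at up to ~20%, i.e. 0.2)."""
        tau_meas = -0.5 * math.log(S_above / S_below)
        return tau_meas / (1.0 - ms_underestimate)

    # Molecular-normalized signals just below and above the cloud (made up).
    print(cirrus_optical_depth(1.00, 0.55))          # apparent optical depth
    print(cirrus_optical_depth(1.00, 0.55, 0.20))    # corrected upward by 20%
    ```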
  • 31
    Publication Date: 2013-08-29
    Description: Central Florida is the ideal test laboratory for studying convergence zone-induced convection. The region regularly experiences sea breeze fronts and rainfall-induced outflow boundaries. The focus of this study is the common yet poorly studied convergence zone established by the interaction of the sea breeze front and an outflow boundary. Previous studies have investigated mechanisms primarily affecting storm initiation by such convergence zones. Few have focused on rainfall morphology, yet these storms contribute a significant amount of precipitation to the annual rainfall budget. Low-level convergence and mid-tropospheric moisture have both been shown to correlate with rainfall amounts in Florida. Using 2D and 3D numerical simulations, the roles of low-level convergence and mid-tropospheric moisture in rainfall evolution are examined. The results indicate that time-averaged vertical moisture flux (VMF) at the sea breeze front/outflow convergence zone is directly and linearly proportional to initial condensation rates. This proportionality establishes a similar relationship between VMF and initial rainfall. Vertical moisture flux, which encompasses the depth and magnitude of convergence, is better correlated with initial rainfall production than surface moisture convergence. This extends early observational studies, which linked rainfall in Florida to surface moisture convergence. The amount and distribution of mid-tropospheric moisture determines how rainfall associated with secondary cells develops. Rainfall amount and efficiency varied significantly over an observable range of relative humidities in the 850-500 mb layer, even though rainfall evolution was similar during the initial or "first-cell" period. The rainfall variability was attributed to drier mid-tropospheric environments inhibiting secondary cell development through entrainment effects. Observationally, the 850-500 mb moisture structure exhibits wider variability than lower-level moisture, which is virtually always present in Florida. A likely consequence of the variability in 850-500 mb moisture is a stronger statistical correlation with rainfall, which observational studies have noted. The study indicates that vertical moisture flux forcing at convergence zones is critical in determining rainfall in the initial stage of development but plays a decreasing role in rainfall evolution as the system matures, while the mid-tropospheric moisture (i.e., the environment) plays an increasing role. This suggests the need to improve measurements of the magnitude/depth of convergence and the mid-tropospheric moisture distribution. It also highlights the need for better parameterization of entrainment and vertical moisture distribution in larger-scale models.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 32
    Publication Date: 2013-08-29
    Description: This paper shows that if one is provided with a loss function, it can be used in a natural way to specify a distance measure quantifying the similarity of any two supervised learning algorithms, even non-parametric algorithms. Intuitively, this measure gives the fraction of targets and training sets for which the expected performance of the two algorithms differs significantly. Bounds on the value of this distance are calculated for the case of binary outputs and 0-1 loss, indicating that any two learning algorithms are almost exactly identical for such scenarios. As an example, for any two algorithms A and B, even for small input spaces and training sets, for less than 2e(-50) of all targets will the difference between A's and B's generalization performance exceed 1%. In particular, this is true if B is bagging applied to A, or boosting applied to A. These bounds can be viewed alternatively as telling us, for example, that the simple English phrase 'I expect that algorithm A will generalize from the training set with an accuracy of at least 75% on the rest of the target' conveys 20,000 bytes of information concerning the target. The paper ends by discussing some of the subtleties of extending the distance measure to give a full (non-parametric) differential geometry of the manifold of learning algorithms.
    Keywords: Computer Programming and Software
    Format: application/pdf
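    The distance measure described above (the fraction of targets and training sets on which two algorithms' generalization performance differs) can be estimated empirically on a toy problem. The sketch below uses two hypothetical learners and exhausts all boolean targets on a 3-bit input space; it illustrates the idea, not the paper's formal construction.

    ```python
    import itertools, random

    # Tiny input space: 3 boolean inputs -> 8 points; a target is any boolean
    # labelling of those points (2^8 = 256 targets in total).
    X = list(itertools.product([0, 1], repeat=3))

    def majority_learner(train):
        """Predict the majority label seen in training, everywhere."""
        guess = 1 if 2 * sum(y for _, y in train) >= len(train) else 0
        return lambda x: guess

    def memorize_learner(train):
        """Recall training labels exactly; guess 0 off the training set."""
        table = {x: y for x, y in train}
        return lambda x: table.get(x, 0)

    def gen_error(h, target, train_x):
        """Off-training-set error rate, as in the paper's 0-1 loss setting."""
        off = [x for x in X if x not in train_x]
        return sum(h(x) != target[x] for x in off) / len(off)

    random.seed(1)
    diffs = trials = 0
    for labels in itertools.product([0, 1], repeat=len(X)):   # every target
        target = dict(zip(X, labels))
        train_x = random.sample(X, 4)
        train = [(x, target[x]) for x in train_x]
        e1 = gen_error(majority_learner(train), target, train_x)
        e2 = gen_error(memorize_learner(train), target, train_x)
        trials += 1
        diffs += abs(e1 - e2) > 0.01
    print(f"algorithms differ on {diffs / trials:.1%} of target/training-set draws")
    ```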
  • 33
    Publication Date: 2013-08-29
    Description: In this paper, we discuss our approach to making the behavior of planetary rovers more robust for the purpose of increased productivity. Due to the inherent uncertainty in rover exploration, the traditional approach to rover control is conservative, limiting the autonomous operation of the rover and sacrificing performance for safety. Our objective is to increase the science productivity possible within a single uplink by allowing the rover's behavior to be specified with flexible, contingent plans and by employing dynamic plan adaptation during execution. We have deployed a system exhibiting flexible, contingent execution; this paper concentrates on our ongoing efforts on plan adaptation. Plans can be revised in two ways: plan steps may be deleted, with execution continuing with the plan suffix; or the current plan may be merged with an "alternate plan" from an on-board library. The plan revision action is chosen to maximize the expected utility of the plan. Plan merging and action deletion constitute a more conservative alternative to a general-purpose planning system; in return, our approach is more efficient and more easily verified, two important criteria for deployed rovers.
    Keywords: Computer Programming and Software
    Format: application/pdf
  • 34
    Publication Date: 2013-08-29
    Description: This paper proposes that the distinguishing characteristic of Aspect-Oriented Programming (AOP) systems is that they allow programming by making quantified programmatic assertions over programs written by programmers oblivious to such assertions. Thus, AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the actions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are expressive enough to allow programming an AOP system within them. A corollary is that while AOP can be applied to Object-Oriented Programming, it is an independent concept applicable to other programming styles.
    Keywords: Computer Programming and Software
    Format: application/pdf
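    A toy illustration of a quantified programmatic assertion over an oblivious base program: the "aspect" below quantifies over every function whose name matches a predicate and weaves in logging advice, while the base functions never mention the aspect. All names are hypothetical, and this is only one of many possible weaving mechanisms.

    ```python
    import functools

    def quantify(predicate, advice):
        """Apply `advice` to every callable in a namespace whose name satisfies
        `predicate` -- a crude stand-in for a quantified assertion."""
        def weave(namespace):
            for name, obj in list(namespace.items()):
                if callable(obj) and predicate(name):
                    namespace[name] = advice(obj)
        return weave

    def log_calls(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            print(f"-> {fn.__name__}{args}")   # asserted action
            return fn(*args, **kwargs)
        return wrapper

    # Oblivious base program.
    def get_total(xs): return sum(xs)
    def set_limit(n): return n

    # The aspect quantifies over "all getters and setters".
    quantify(lambda n: n.startswith(("get_", "set_")), log_calls)(globals())
    print(get_total([1, 2, 3]))
    ```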
  • 35
    Publication Date: 2013-08-29
    Description: In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out any computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, which is a 'physical computation analogue' of algorithmic information complexity. It is proven in that second paper that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.
    Keywords: Computer Programming and Software
    Format: application/pdf
  • 36
    Publication Date: 2013-08-29
    Description: Chao's numerical and theoretical work on multiple quasi-equilibria of the intertropical convergence zone (ITCZ) and the origin of monsoon onset is extended to solve two additional puzzles. One is the highly nonlinear dependence on latitude of the "force" acting on the ITCZ due to the earth's rotation, which makes the multiple quasi-equilibria of the ITCZ and monsoon onset possible. The other is the dramatic difference in this dependence when different cumulus parameterization schemes are used in a model. Such a difference can lead to a switch between a single ITCZ at the equator and a double ITCZ when a different cumulus parameterization scheme is used. Sometimes one of the double ITCZs diminishes and only the other remains, but even then the single ITCZ can end up at different latitudes. A single idea solves both puzzles: the ITCZ is subject to two off-equator attractors, due to the earth's rotation and symmetric with respect to the equator, and the strength and size of these attractors depend on the cumulus parameterization scheme. The origin of these rotational attractors, explained in Part I, is further discussed. The "force" acting on the ITCZ due to the earth's rotation is the sum of the "forces" of the two attractors. Each attractor exerts on the ITCZ a "force" of simple shape in latitude, but the sum gives a shape that varies strongly with latitude. Also, the strength and the domain of influence of each attractor vary when the cumulus parameterization is changed. This gives rise to the high sensitivity of the "force" shape to the cumulus parameterization. Numerical results from experiments using Goddard's GEOS general circulation model, supporting this idea, are presented. It is also found that the model results are sensitive to changes outside of the cumulus parameterization. The significance of this study for El Nino forecasting and tropical forecasting in general is discussed.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 37
    Publication Date: 2013-08-29
    Description: This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter begins a series of notes concentrating on analysis techniques, with this issue's section discussing worst-case analysis requirements.
    Keywords: Computer Programming and Software
    Format: application/pdf
  • 38
    Publication Date: 2013-08-29
    Description: The 1997-1999 ENSO period was very powerful, but also well observed. Multiple satellite rainfall estimates combined with gauge observations allow for a quantitative analysis of precipitation anomalies in the tropics and elsewhere accompanying the 1997-99 ENSO cycle. An examination of the evolution of the El Nino and accompanying precipitation anomalies revealed that a dry Maritime Continent preceded the formation of positive SST anomalies in the eastern Pacific Ocean. 30-60 day oscillations in the winter of 1996/97 may have contributed to this lag relationship. Furthermore, westerly wind burst events may have maintained the drought over the Maritime Continent. The warming of the equatorial Pacific was then followed by an increase in convection. A rapid transition from El Nino to La Nina occurred in May 1998, but as early as October-November 1997 precipitation indices captured substantial changes in Pacific rainfall anomalies. The global precipitation patterns for this event were in good agreement with the strong consistent ENSO-related precipitation signals identified in earlier studies. Differences included a shift in precipitation anomalies over Africa during the 1997-98 El Nino and unusually wet conditions over northeast Australia during the later stages of the El Nino. Also, the typically wet region in the north tropical Pacific was mostly dry during the 1998-99 La Nina. Reanalysis precipitation was compared to observations during this time period and substantial differences were noted. In particular, the model had a bias towards positive precipitation anomalies and the magnitudes of the anomalies in the equatorial Pacific were small compared to the observations. Also, the evolution of the precipitation field, including the drying of the Maritime Continent and eastward progression of rainfall in the equatorial Pacific was less pronounced for the model compared to the observations.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 39
    Publication Date: 2013-08-29
    Description: This paper describes a translator called JAVA PATHFINDER from JAVA to PROMELA, the "programming language" of the SPIN model checker. The purpose is to establish a framework for verification and debugging of JAVA programs based on model checking. This work should be seen as part of a broader attempt to make formal methods applicable "in the loop" of programming within NASA areas such as space, aviation, and robotics. Our main goal is to create automated formal methods such that programmers themselves can apply these in their daily work (in the loop) without the need for specialists to manually reformulate a program into a different notation in order to analyze the program. This work is a continuation of an effort to formally verify, using SPIN, a multi-threaded operating system programmed in Lisp for the Deep-Space 1 spacecraft, and of previous work in applying existing model checkers and theorem provers to real applications.
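    For readers unfamiliar with what SPIN-style verification buys, the toy sketch below (our illustration, written in Python for compactness; it is not the JAVA PATHFINDER translator) does what an explicit-state model checker does at heart: it enumerates every interleaving of two unsynchronized "threads" that each increment a shared counter, and finds the lost-update state that an ordinary test run can easily miss.
```python
from collections import deque

def successors(state):
    # One-step interleavings of two unsynchronized increment "threads".
    # Each thread runs:  t = x  (pc 0 -> 1);  x = t + 1  (pc 1 -> 2).
    pcs, temps, x = state
    for i in (0, 1):
        if pcs[i] == 0:                      # load shared x into local temp
            new_pcs = list(pcs); new_pcs[i] = 1
            new_temps = list(temps); new_temps[i] = x
            yield (tuple(new_pcs), tuple(new_temps), x)
        elif pcs[i] == 1:                    # store temp + 1 back to x
            new_pcs = list(pcs); new_pcs[i] = 2
            yield (tuple(new_pcs), temps, temps[i] + 1)

def explore(initial):
    # Breadth-first enumeration of all reachable program states.
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        for nxt in successors(s):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

finals = {s for s in explore(((0, 0), (0, 0), 0)) if s[0] == (2, 2)}
print(finals)  # includes a final state with x == 1: the lost-update error
```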
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 40
    Publication Date: 2013-08-29
    Description: The predictability of the 1997 and 1998 south Asian summer monsoon winds is examined from an ensemble of 10 Atmospheric General Circulation Model (AGCM) simulations with prescribed sea surface temperatures (SSTs) and soil moisture. The simulations are started in September 1996 so that they have lost all memory of the atmospheric initial conditions for the periods of interest. The model simulations show that the 1998 monsoon is considerably more predictable than the 1997 monsoon. During May and June of 1998 the predictability of the low-level wind anomalies is largely associated with a local response to anomalously warm Indian Ocean SSTs. Predictability increases late in the season (July and August) as a result of the strengthening of the anomalous Walker circulation and the associated development of easterly low-level wind anomalies that extend westward across India and the Arabian Sea. During these months the model is also the most skillful, with the observations showing a similar late-season westward extension of the easterly wind anomalies. The model shows little predictability or skill in the low-level winds over southeast Asia during 1997. Predictable wind anomalies do occur over the western Indian Ocean and Indonesia; however, over the Indian Ocean they are a response to SST anomalies that were wind driven, and they show no skill. The reduced predictability in the low-level winds during 1997 appears to be the result of a weaker (compared with 1998) simulated anomalous Walker circulation, while the reduced skill is associated with pronounced intraseasonal activity that is not well captured by the model. Remarkably, the model does produce an ensemble mean Madden-Julian Oscillation (MJO) response that is approximately in phase with (though weaker than) the observed MJO anomalies. This is consistent with the idea that SST coupling may play an important role in the MJO.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 41
    Publication Date: 2013-08-29
    Description: This article is partly a review and partly a new research paper on the monsoon-ENSO relationship. The paper begins with a discussion of the basic relationship between the Indian monsoon and ENSO, dating back to the work of Sir Gilbert Walker, up to research results in more recent years. Various factors that may affect the monsoon-ENSO relationship, including regional coupled ocean-atmosphere processes, Eurasian snow cover, land-atmosphere hydrologic feedback, intraseasonal oscillation, biennial variability and inter-decadal variations, are discussed. The extremely complex and highly nonlinear nature of the monsoon-ENSO relationship is stressed. We find that for regional impacts on the monsoon, El Nino and La Nina are far from simply mirror images of each other. These two polarities of ENSO can have strong or no impacts on monsoon anomalies depending on the strength of the intraseasonal oscillations and the phases of the inter-decadal variations. For the Asian-Australian monsoon (AAM) as a whole, the ENSO impact is effected through an east-west shift in the Walker Circulation. For rainfall anomalies over specific monsoon areas, regional processes play important roles in addition to the shift in the Walker Circulation. One of the key regional processes identified for the boreal summer monsoon is the anomalous West Pacific Anticyclone (WPA). This regional feature has similar signatures on interannual and intraseasonal time scales and appears to determine whether the monsoon-ENSO relationship is strong or weak in a given year. Another important regional feature is a rainfall and SST dipole across the Indian Ocean, which may have a strong impact on the austral summer monsoon. Results are shown indicating that monsoon surface wind forcing may induce a strong biennial signal in ENSO and that strong monsoon-ENSO coupling may translate into pronounced biennial variability in ENSO. Finally, a new paradigm is proposed for the study of monsoon variability. This paradigm provides a unified framework in which monsoon predictability, the role of regional vs. basin-scale processes, the monsoon's relationship with different climate subsystems, and causes of secular changes in the monsoon-ENSO relationship can be investigated.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 42
    Publication Date: 2013-08-29
    Description: We describe the Object Infrastructure Framework, a system that seeks to simplify the creation of distributed applications by injecting behavior on the communication paths between components. We touch on some of the ilities and services that can be achieved with injector technology, and then focus on the uses of redirecting injectors, injectors that take requests directed at a particular server and generate requests directed at others. We close by noting that OIF is an Aspect-Oriented Programming system, and comparing OIF to related work.
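    As a concrete picture of a redirecting injector, consider the Python sketch below. It is purely illustrative: the class names, the busy flag, and the redirection policy are our assumptions, not OIF's API. It shows the essential move of injector technology: behavior inserted on the communication path, so that a request aimed at one server can be transparently handled by another.
```python
class Server:
    def __init__(self, name, busy=False):
        self.name, self.busy = name, busy

    def handle(self, request):
        return f"{self.name} handled {request!r}"

class RedirectingInjector:
    # Toy injector sitting on the path to `primary`: it may redirect a
    # request to an alternate server (e.g., for failover or load balancing).
    def __init__(self, primary, alternates):
        self.primary = primary
        self.alternates = alternates

    def choose(self, request):
        # Trivial policy: redirect when the primary reports itself busy.
        return self.alternates[0] if self.primary.busy else self.primary

    def send(self, request):
        return self.choose(request).handle(request)

proxy = RedirectingInjector(Server("A", busy=True), [Server("B")])
print(proxy.send("getQuote"))   # -> "B handled 'getQuote'"
```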
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 43
    Publication Date: 2013-08-29
    Description: FutureFlight Central will permit integration of tomorrow's technologies in a risk-free simulation of any airport, airfield, and tower cab environment. The facility provides an opportunity for airlines to mitigate passenger delays by fine tuning airport hub operations, gate management and ramp movement procedures. It also allows airport managers an opportunity to study effects of various improvements at their airports. Finally, it enables air traffic controllers to provide feedback and to become familiar with new airport operations and technologies before final installation.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 44
    Publication Date: 2013-08-29
    Description: Surface profiles were generated by a fractal algorithm and haptically rendered on a force feedback joystick. Subjects were asked to use the joystick to explore pairs of surfaces and report to the experimenter which of the surfaces they felt was rougher. Surfaces were characterized by their root mean square (RMS) amplitude and their fractal dimension. The most important factor affecting the perceived roughness of the fractal surfaces was the RMS amplitude of the surface. When comparing surfaces of fractal dimension 1.2-1.35 it was found that the fractal dimension was negatively correlated with perceived roughness.
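    One standard way to generate such profiles, sketched below, is spectral synthesis: give the profile a power-law spectrum P(f) ~ f**(-beta), with beta = 5 - 2D for a 1-D profile of fractal dimension D, then rescale to the desired RMS amplitude. This is a common construction we supply for illustration; the paper does not specify its generator, and the parameter values here are ours.
```python
import numpy as np

def fractal_profile(n, dim, rms, seed=0):
    # 1-D fractal profile by spectral synthesis: amplitude ~ f**(-beta/2)
    # with beta = 5 - 2*dim, random phases, then rescaled to target RMS.
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n, d=1.0)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-(5 - 2 * dim) / 2)
    spectrum = amp * np.exp(1j * rng.uniform(0, 2 * np.pi, len(freqs)))
    profile = np.fft.irfft(spectrum, n)
    profile -= profile.mean()
    return profile * (rms / np.sqrt(np.mean(profile ** 2)))

for d in (1.2, 1.35):                       # the range compared in the study
    p = fractal_profile(4096, d, rms=1.0)
    print(d, np.sqrt(np.mean(p ** 2)))      # RMS amplitude held fixed at 1.0
```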
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 45
    Publication Date: 2013-08-29
    Description: A 1999 study reports an advancement of spring in Europe by 0.2 days per year in the 30 years since 1960. Our analysis indicates that this trend results directly from a change in the late-winter surface winds over the eastern North Atlantic: the southwesterly direction became more dominant, and the speed of these southwesterlies increased slightly. Splitting the 52-year NCEP reanalysis dataset into the First Half, FH (1948-1973), and the Second Half, SH (1974-1999), we analyze the wind direction for the February mean at three sites at 45N: site A at 30W, site B at 20W, and site C at 10W. The incidence (number of years) of southwesterlies at these sites increased in SH relative to FH as follows: 24 (18), 19 (12), and 14 (11); whereas the incidence of northeasterlies decreased: 0 (2), 1 (2), and 1 (6). When the February mean wind is southwesterly, the monthly mean sensible heat flux from the ocean at these sites takes zero or slightly negative values; that is, the surface air is warmer than the ocean. Analyzing the scenario in the warm late winter of 1990, we observe that the sensible heat flux from the ocean surface in February 1990 shows a "tongue" of negative values extending southwest from southern England to 7N. This indicates that the source of the maritime air advected into Europe lies to the south of the "tongue." Streamline analysis suggests that the southwestern or south-central North Atlantic is the source. For February 1990, we find strong ascending motions over Europe at 700 mb, up to -0.4 Pa/s as monthly averages. Associated with the unstable low levels of the troposphere are positive rain and cloud anomalies. Thus, positive in situ feedback over land in late winter (when shortwave absorption is not significant) apparently further enhances the surface temperature through an increase in the greenhouse effect due to increased water vapor and cloudiness.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 46
    Publication Date: 2013-08-29
    Description: It is a long-held fundamental belief that the basic cause of a monsoon is land-sea thermal contrast on the continental scale. Through general circulation model experiments we demonstrate that this belief should be changed. The Asian and Australian summer monsoon circulations are largely intact in an experiment in which Asia, the maritime continent, and Australia are replaced by ocean. It is also shown that the change resulting from such replacement is in general due more to the removal of topography than to the removal of land-sea contrast. Therefore, land-sea contrast plays only a minor modifying role in the Asian and Australian summer monsoons. The same holds for the Central American summer monsoon. However, the same thing cannot be said of the African and South American summer monsoons. In the Asian and Australian winter monsoons land-sea contrast also plays only a minor role. Our interpretation of the origin of the monsoon is that the summer monsoon is the result of the peak of the ITCZ (intertropical convergence zone) being substantially (more than 10 degrees) away from the equator. The origin of the ITCZ has been previously interpreted by Chao. The circulation around the ITCZ so located, previously interpreted by Chao and Chen through the modified Gill solution and briefly described in this paper, explains the monsoon circulation. The longitudinal location of the ITCZs is determined by the distribution of surface conditions. ITCZs favor locations of higher SST, as in the western Pacific and Indian Ocean, or tropical landmass, due to land-sea contrast, as in tropical Africa and South America. Thus, the role of landmass in the origin of the monsoon can be replaced by ocean of sufficiently high SST. Furthermore, the ITCZ circulation extends into the tropics of the other hemisphere to give rise to the winter monsoon circulation there. Also, through the equivalence of land-sea contrast and higher SST, it is argued that the basic monsoon onset mechanism proposed by Chao is valid for all monsoons.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 47
    Publication Date: 2013-08-29
    Description: It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 48
    Publication Date: 2013-08-29
    Description: We evaluated the impact of several newly available sources of meteorological data on mesoscale model forecasts of precipitation produced by the extra-tropical cyclone that struck Florida on February 2, 1998. Precipitation distributions of convective rainfall events were derived from Special Sensor Microwave Imager (SSM/I) and TRMM Microwave Imager (TMI) radiometric data by means of the Goddard PROFiling (GPROF) algorithm. Continuous lightning distributions were obtained from sferics measurements made by a network of VLF radio receivers. Histograms of coincident sferics frequency distributions were matched to those of precipitation to derive bogus convective rainfall rates from the continuously available sferics measurements. SSM/I and TMI microwave data were used to derive Integrated Precipitable Water (IPW) distributions. The TMI also provided sea surface temperatures (SSTs) of the Loop Current and Gulf Stream with improved structural detail. A series of experiments assimilated IPW and latent heating from the bogus convective rainfall for six hours in the MM5 mesoscale forecast model to produce nine-hour forecasts of all rainfall as well as other weather parameters. Although continuously assimilating latent heating only slightly improved the surface pressure distribution forecast, it significantly improved the precipitation forecasts. Correctly locating convective rainfall was found to be critical for assimilating latent heating in the forecast model, but measurement of the rainfall intensity proved to be less important. The improved SSTs also had a positive impact on rainfall forecasts for this case. Assimilating bogus rainfall in the model produced nine-hour forecasts of radar reflectivity distributions that agreed well with coincident observations from the TRMM spaceborne precipitation radar, ground-based radar and spaceborne microwave measurements.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 49
    Publication Date: 2013-08-29
    Description: In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task, a bound similar to the "encoding" bound governing how much the algorithmic information complexity of a Turing machine calculation can differ for two reference universal Turing machines. Finally, it is proven that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 50
    Publication Date: 2013-08-29
    Description: We consider the design of multi-agent systems so as to optimize an overall world utility function when (1) those systems lack centralized communication and control, and (2) each agent runs a distinct Reinforcement Learning (RL) algorithm. A crucial issue in such design problems is how to initialize/update each agent's private utility function so as to induce the best possible world utility. Traditional 'team game' solutions to this problem sidestep this issue and simply assign to each agent the world utility as its private utility function. In previous work we used the 'Collective Intelligence' framework to derive a better choice of private utility functions, one that results in world utility performance up to orders of magnitude superior to that ensuing from use of the team game utility. In this paper we extend these results. We derive the general class of private utility functions that both are easy for the individual agents to learn and that, if learned well, result in high world utility. We demonstrate experimentally that using these new utility functions can result in significantly improved performance over that of our previously proposed utility, over and above that previous utility's superiority to the conventional team game utility.
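    The flavor of private utility studied in the Collective Intelligence work can be conveyed with a toy difference utility: score an agent by how much the world utility would drop if its action were clamped to a null value. The sketch below is our illustration of that idea only; the world utility, the clamping convention, and the names are assumptions, and the class of utilities derived in the paper is more general.
```python
def world_utility(actions):
    # Toy world utility G: number of distinct tasks covered (None = no-op).
    return len({a for a in actions if a is not None})

def difference_utility(actions, agent, clamp=None):
    # Difference-style private utility for one agent:
    #   G(z) - G(z with the agent's action clamped to a null value).
    # Aligned with G, but far more sensitive to the agent's own action
    # than G itself, which is what makes it learnable.
    clamped = list(actions)
    clamped[agent] = clamp
    return world_utility(actions) - world_utility(clamped)

actions = ["taskA", "taskA", "taskB"]
for i in range(len(actions)):
    # Agents 0 and 1 are redundant (utility 0); agent 2 is essential (1).
    print(i, difference_utility(actions, i))
```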
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 51
    Publication Date: 2013-08-29
    Description: The NASA/GSFC Scanning Raman Lidar (SRL) was stationed on Andros Island in the Bahamas during August - September, 1998 as a part of the third Convection and Moisture Experiment (CAMEX-3) which focussed on hurricane development and tracking. During the period August 21 - 24, hurricane Bonnie passed near Andros Island and influenced the water vapor and cirrus cloud measurements acquired by the SRL. Two drying signatures related to the hurricane were recorded by the SRL and other sensors. Cirrus cloud optical depths (at 351 nm) were also measured during this period. Optical depth values ranged from less than 0.01 to 1.5. The influence of multiple scattering on these optical depth measurements was studied. A correction technique is presented which minimizes the influences of multiple scattering and derives information about cirrus cloud optical and physical properties. The UV/IR cirrus cloud optical depth ratio was estimated based on a comparison of lidar and GOES measurements. Simple radiative transfer model calculations compared with GOES satellite brightness temperatures indicate that satellite radiances are significantly affected by the presence of cirrus clouds if IR optical depths are approximately 0.005 or greater. Using the ISCCP detection threshold for cirrus clouds on the GOES data presented here, a high bias of up to 40% in the GOES precipitable water retrieval was found.
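    For context, a common way to estimate a cirrus optical depth from lidar data, sketched below, uses the two-way attenuation of the molecular backscatter signal measured above the cloud relative to the clear-air return expected there. Multiple scattering biases this apparent value low, which is what a correction technique such as the one presented must account for. The numbers in the sketch are illustrative only, not values from the paper.
```python
import numpy as np

def cloud_optical_depth(measured_above, expected_above):
    # Two-way transmission through the cloud: measured = expected * exp(-2*tau),
    # so tau = -0.5 * ln(measured / expected). `expected_above` is the
    # molecular return predicted from a clear-air density profile.
    return -0.5 * np.log(measured_above / expected_above)

print(cloud_optical_depth(measured_above=0.74, expected_above=1.00))  # ~0.15
```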
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    Publication Date: 2013-08-29
    Description: Quantitative use of satellite-derived maps of monthly rainfall requires some measure of the accuracy of the satellite estimates. The rainfall estimate for a given map grid box is subject to both remote-sensing error and, in the case of low-orbiting satellites, sampling error due to the limited number of observations of the grid box provided by the satellite. A simple model of rain behavior predicts that root-mean-square (RMS) random error in grid-box averages should depend in a simple way on the local average rain rate, and the predicted behavior has been seen in simulations using surface rain-gauge and radar data. This relationship was examined using satellite SSM/I data obtained over the western equatorial Pacific during TOGA COARE. RMS error inferred directly from SSM/I rainfall estimates was found to be larger than predicted from surface data, and to depend less on local rain rate than was predicted. Preliminary examination of TRMM microwave estimates shows better agreement with surface data. A simple method of estimating RMS error in satellite rainfall estimates is suggested, based on quantities that can be directly computed from the satellite data.
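    A generic version of such an error-rain-rate relationship, sketched below, fits a power law rms = a * R**b by linear regression in log-log space. Both the functional form and the numbers are illustrative assumptions, not the paper's fitted relationship.
```python
import numpy as np

def fit_error_model(rain_rate, rms_error):
    # Fit rms = a * R**b by least squares in log-log space; np.polyfit
    # returns [slope, intercept] = [b, log(a)].
    b, log_a = np.polyfit(np.log(rain_rate), np.log(rms_error), 1)
    return np.exp(log_a), b

# Illustrative numbers only.
R = np.array([0.5, 1.0, 2.0, 4.0, 8.0])    # local mean rain rate (mm/day)
E = np.array([0.4, 0.55, 0.8, 1.1, 1.6])   # RMS error of grid-box average
a, b = fit_error_model(R, E)
print(f"rms ~ {a:.2f} * R**{b:.2f}")
```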
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    Publication Date: 2013-08-29
    Description: The tropical cyclone rainfall climatology study that was performed for the North Pacific was extended to the North Atlantic. Similar to the North Pacific tropical cyclone study, mean monthly rainfall within 444 km of the center of North Atlantic tropical cyclones (i.e., those that reached storm stage and greater) was estimated from passive microwave satellite observations during an eleven-year period. These satellite-observed rainfall estimates were used to assess the impact of tropical cyclone rainfall in altering the geographical, seasonal, and inter-annual distribution of the North Atlantic total rainfall during June-November, when tropical cyclones were most abundant. The main results from this study indicate: 1) that tropical cyclones contribute, respectively, 4%, 3%, and 4% of the rainfall in the western, eastern, and entire North Atlantic; 2) similar to that observed in the North Pacific, the maximum in North Atlantic tropical cyclone rainfall is approximately 5-10 deg poleward (depending on longitude) of the maximum non-tropical cyclone rainfall; 3) tropical cyclones contribute regionally a maximum of 30% of the total rainfall northeast of Puerto Rico, within a region near 15 deg N 55 deg W, and off the west coast of Africa; 4) there is no lag between the months with maximum tropical cyclone rainfall and non-tropical cyclone rainfall in the western North Atlantic, while in the eastern North Atlantic maximum tropical cyclone rainfall precedes maximum non-tropical cyclone rainfall; 5) like the North Pacific, North Atlantic tropical cyclones of hurricane intensity generate the greatest amount of rainfall in the higher latitudes; and 6) warm ENSO events inhibit tropical cyclone rainfall.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 54
    Publication Date: 2013-08-29
    Description: The mechanism of the quasi-biennial tendency in the El Nino Southern Oscillation (ENSO)-monsoon coupled system is investigated using an intermediate coupled model. The monsoon wind forcing is prescribed as a function of Sea Surface Temperature (SST) anomalies, based on the relationship of zonal wind anomalies over the western Pacific to sea level change in the equatorial eastern Pacific. The key mechanism of the quasi-biennial tendency in El Nino evolution is found to be the strong coupling of ENSO to monsoon wind forcing over the western Pacific. Strong boreal summer monsoon wind forcing, which lags the maximum SST anomaly in the equatorial eastern Pacific by approximately 6 months, tends to generate Kelvin waves of the opposite sign to anomalies in the eastern Pacific and initiates the turnabout in the eastern Pacific. Boreal winter monsoon forcing, which has zero lag with the maximum SST in the equatorial eastern Pacific, tends to damp the ENSO oscillations.
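    The lagged-forcing mechanism can be caricatured by a delayed-action oscillator: if the opposing forcing lags the SST anomaly by d ~ 6 months, the natural period of dT/dt = -b T(t - d) is 4d = 24 months, i.e., a biennial tendency. The sketch below integrates this schematic equation. It is our caricature of the mechanism, not the paper's intermediate coupled model, and all coefficients are assumptions.
```python
import numpy as np

# Schematic delayed oscillator: dT/dt = -b * T(t - d), where T is the
# eastern Pacific SST anomaly and the delayed term stands in for monsoon
# wind forcing that lags the SST peak by d months and excites Kelvin waves
# of the opposite sign. With b = pi/(2*d), the period is 4*d = 24 months.
dt, months, d = 0.01, 120.0, 6.0
b = np.pi / (2 * d)
n, lag = int(months / dt), int(d / dt)
T = np.ones(n)                                  # constant initial history
for i in range(1, n):
    delayed = T[i - 1 - lag] if i - 1 - lag >= 0 else T[0]
    T[i] = T[i - 1] - dt * b * delayed          # forward Euler step
peaks = [i * dt for i in range(1, n - 1) if T[i - 1] < T[i] > T[i + 1]]
print(np.diff(peaks))   # peak spacing ~24 months: the quasi-biennial tendency
```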
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 55
    Publication Date: 2013-08-29
    Description: Idealized numerical simulations are performed with a coupled atmosphere/land-surface model to identify the roles of initial soil moisture, coastline curvature, and land breeze circulations in sea-breeze-initiated precipitation. Data collected on 27 July 1991 during the Convection and Precipitation Electrification Experiment (CAPE) in central Florida are used. The 3D Goddard Cumulus Ensemble (GCE) cloud-resolving model is coupled with the Goddard Parameterization for Land-Atmosphere-Cloud Exchange (PLACE) land surface model, thus providing a tool to simulate more realistically land-surface/atmosphere interaction and convective initiation. Eight simulations are conducted with either straight or curved coastlines, initially homogeneous or initially variable soil moisture, and initially homogeneous or initially variable horizontal winds (land breezes). All model simulations capture the diurnal evolution and general distribution of sea-breeze-initiated precipitation over central Florida. The distribution of initial soil moisture influences the timing, intensity and location of subsequent precipitation. Soil moisture acts as a moisture source for the atmosphere, increases the convective available potential energy, and thus preferentially focuses heavy precipitation over existing wet soil. Strong soil moisture-induced mesoscale circulations are not evident in these simulations. Coastline curvature has a major impact on the timing and location of precipitation. Earlier low-level convergence occurs inland of convex coastlines, and subsequent precipitation occurs earlier in simulations with curved coastlines. The presence of initial land breezes alone has little impact on subsequent precipitation; however, simulations with both coastline curvature and initial land breezes produce significantly larger peak rain rates due to nonlinear interactions.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    Publication Date: 2013-08-31
    Description: The numerical simulation of precipitation helps scientists understand the complex mechanisms that determine how and why rainfall is distributed across the globe. Simulation aids in the development of forecasting efforts that inform policies regarding the management of water resources. Precipitation modeling also provides short-term warnings for emergencies such as flash floods and mudslides. Just as precipitation modeling can warn of an impending abundance of rainfall, it can help anticipate the absence of rainfall in drought. What constitutes a drought? A meteorological drought simply means that an area is getting a significantly lower amount of rain than usual over a prolonged period of time; an agricultural drought is based on the level of soil moisture.
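    The meteorological definition lends itself to a very simple check, sketched below: flag a drought when precipitation stays persistently below a fraction of the long-term normal. The threshold and window are illustrative choices of ours, not an official definition.
```python
def meteorological_drought(precip, normal, threshold=0.75, months=6):
    # Flag a meteorological drought: every one of the most recent `months`
    # totals falls below `threshold` times the long-term normal.
    recent = precip[-months:]
    return all(p < threshold * normal for p in recent)

# Six months of totals (mm) against a 40 mm/month normal -> drought.
print(meteorological_drought([28, 25, 28, 20, 22, 18], normal=40))  # True
```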
    Keywords: Meteorology and Climatology
    Type: 2000 NCCS Highlights: Enabling NASA Earth and Space Sciences; 56-65
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 57
    In:  CASI
    Publication Date: 2013-08-31
    Description: This paper presents aspects of programming language transformations that were unknown in the early 1980s.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    Publication Date: 2013-08-31
    Description: An aerosol is any small particle of matter suspended in the atmosphere. Natural sources, such as deserts, create some aerosols; consumption of fossil fuels and industrial activity create others. All the microscopic aerosol particles add up to a large amount of material floating in the atmosphere. You can see the particles in the haze that floats over polluted cities. Beyond this visible effect, aerosols can actually lower temperatures. They do this by blocking, or scattering, a portion of the sun's energy from reaching the surface. Because of this influence, scientists study the physical properties of atmospheric aerosols. Reliable numerical models of atmospheric aerosols play an important role in this research.
    Keywords: Meteorology and Climatology
    Type: 2000 NCCS Highlights: Enabling NASA Earth and Space Science; 38-45
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    In:  CASI
    Publication Date: 2013-08-31
    Description: How will scientists make use of the Internet several years from now? This is a case study of a leading-edge experiment in building a 'virtual institute' -- using electronic communication tools to foster collaboration among geographically dispersed scientists. Our experience suggests that scientists will want to use web-based document management systems; that there will be a demand for Internet-enabled meeting support tools; that while Internet videoconferencing will have limited value for scientists, webcams will be in great demand as a tool for transmitting pictures of objects and settings rather than "talking heads"; and that a significant share of scientists who do fieldwork will embrace mobile voice, data and video communication tools. The setting for these findings is a research consortium called the NASA Astrobiology Institute.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Publication Date: 2013-08-31
    Description: The purpose of this paper is to describe the NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open, interoperable systems software development and software reuse. It addresses what is meant by the term 'object component software', gives an overview of the component-based development approach and how it relates to infrastructure support of software architectures and promotes reuse, enumerates the benefits of this approach, and gives examples of application prototypes demonstrating its usage and advantages. The object-oriented component technology approach to system development and software reuse will apply to several areas within JPL, and possibly across other NASA Centers.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 61
    Publication Date: 2011-08-23
    Description: During the TEFLUN-B (Texas-Florida under-flights for TRMM) field experiment of August-September, 1998, a number of ER-2 aircraft flights with a host of microwave instruments were conducted over many convective storms, including some hurricanes, in the coastal region of Florida and Texas. These instruments include MIR (Millimeter-wave Imaging Radiometer), AMPR (Advanced Microwave Precipitation Radiometer), and EDOP (ER-2 Doppler Radar). EDOP is operated at the frequency of 9.7 GHz, while the AMPR and the MIR together give eleven channels of radiometric measurements in the frequency range of 10-340 GHz. The concurrent measurements from these instruments provide unique data sets for studying the details of the microphysics of hydrometeors. Preliminary examination of these data sets shows features that are generally well understood; i.e., radiometric measurements at frequencies less than or equal to 37 GHz mainly respond to rain, while those at frequencies greater than or equal to 150 GHz, to ice particles above the freezing level. Model calculations of brightness temperature and radar reflectivity are performed and results compared with these measurements. For simplicity the analysis is limited to the anvil region of the storms where hydrometeors are predominantly frozen. Only one ice particle size distribution is examined in the calculations of brightness temperature and radar reflectivity in this initial study. Estimation of ice water path is made based on the best agreement between the measurements and calculations of brightness temperature and reflectivity. Problems associated with these analyses and measurement accuracy will be discussed.
    Keywords: Meteorology and Climatology
    Type: Microwave Remote Sensing of the Atmosphere and Environment II; Volume 4152; 25-32
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 62
    Publication Date: 2011-08-23
    Description: GLOW (Goddard Lidar Observatory for Winds) is a mobile Doppler lidar system which uses direct detection Doppler lidar techniques to measure wind profiles from the surface into the lower stratosphere. The system is contained in a modified van to allow deployment in field operations. The lidar system uses a Nd:YAG laser transmitter to measure winds using either aerosol backscatter at 1064 nm or molecular backscatter at 355 nm. The receiver telescope is a 45 cm Dall-Kirkham which is fiber coupled to separate Doppler receivers, one optimized for the aerosol backscatter wind measurement and another optimized for the molecular backscatter wind measurement. The receivers are implementations of the 'double edge' technique and use high spectral resolution Fabry-Perot etalons to measure the Doppler shift. A 45 cm aperture azimuth-over-elevation scanner is mounted on the roof of the van to allow full sky access and a variety of scanning options. GLOW is intended to be used as a deployable field system for studying atmospheric dynamics and transport and can also serve as a testbed to evaluate candidate technologies developed for use in future spaceborne systems. In addition, it can be used for calibration/validation activities following launch of spaceborne wind lidar systems. A description of the mobile system is presented along with examples of lidar wind profiles obtained with the system.
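    The scale of the measurement problem follows from the Doppler relation delta_f = 2*v/lambda for backscattered light; the short sketch below works the numbers for the 355 nm molecular channel. The values are back-of-envelope illustrations, not system specifications.
```python
# Line-of-sight wind speed maps to a Doppler frequency shift of the return:
#     delta_f = 2 * v_los / wavelength
# At 355 nm, a 10 m/s line-of-sight wind shifts the backscatter by ~56 MHz,
# the size of shift the 'double edge' Fabry-Perot pair must resolve.
wavelength = 355e-9          # m
v_los = 10.0                 # m/s
delta_f = 2 * v_los / wavelength
print(f"{delta_f / 1e6:.0f} MHz")   # -> 56 MHz
```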
    Keywords: Meteorology and Climatology
    Type: Lidar Remote Sensing for Industry and Environment Monitoring; Volume 4153; 314-320
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 63
    Publication Date: 2016-06-07
    Description: The ability to exchange information between different engineering software packages (e.g., CAD, CAE, CAM) is necessary to aid collaborative engineering. There are a number of different ways to accomplish this goal. One popular method is to transfer data via different file formats. However, this method can lose data and becomes complex as more file formats are added. Another method is to use a standard protocol. STEP is one such standard. This paper gives an overview of STEP, provides a list of where to access more information, and develops guidelines to aid the reader in deciding if STEP is appropriate for his/her use.
    Keywords: Computer Programming and Software
    Type: 1999 Research Reports: NASA/ASEE Summer Faculty Fellowship Program; 23-32; NASA/CR-1999-208586
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    Publication Date: 2017-09-27
    Description: Historically NASA has trained teams of astronauts by bringing them to the Johnson Space Center in Houston to undergo generic training, followed by mission-specific training. This latter training begins after a crew has been selected for a mission (perhaps two years before the launch of that mission). While some Space Shuttle flights have included an astronaut from a foreign country, the International Space Station will be consistently crewed by teams comprised of astronauts from two or more of the partner nations. The cost of training these international teams continues to grow in both monetary and personal terms. Thus, NASA has been seeking alternative training approaches for the International Space Station program. Since 1994 we have been developing, testing, and refining shared virtual environments for astronaut team training, including the use of virtual environments for use while in or in transit to the task location. In parallel with this effort, we have also been preparing applications for training teams of military personnel engaged in peacekeeping missions. This paper will describe the applications developed to date, some of the technological challenges that have been overcome in their development, and the research performed to guide the development and to measure the efficacy of these shared environments as training tools.
    Keywords: Computer Programming and Software
    Type: The Capability of Virtual Reality to Meet Military Requirements; 22-1 - 22-6; RTO-MP-54
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 65
    Publication Date: 2017-09-27
    Description: An integrally stiffened graphite/epoxy composite rotorcraft structure is evaluated via computational simulation. A computer code that scales up constituent micromechanics level material properties to the structure level and accounts for all possible failure modes is used for the simulation of composite degradation under loading. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulation. Design implications with regard to defect and damage tolerance of integrally stiffened composite structures are examined. A procedure is outlined regarding the use of this type of information for setting quality acceptance criteria, design allowables, damage tolerance, and retirement-for-cause criteria.
    Keywords: Computer Programming and Software
    Type: Application of Damage Tolerance Principles for Improved Airworthiness of Rotorcraft; 12 - 1 - 12 - 13; RTO-MP-24
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 66
    Publication Date: 2017-10-04
    Description: Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique that are guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly-generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency issues in the GA, it is possible to have idle processors. However, as long as the load at each processing node is similar, the processors are kept busy nearly all of the time. In applying GAs to circuit design, a suitable genetic representation is that of a circuit-construction program. We discuss one such circuit-construction programming language and show how evolution can generate useful analog circuit designs. This language has the desirable property that virtually all sets of combinations of primitives result in valid circuit graphs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. Using a parallel genetic algorithm and circuit simulation software, we present experimental results as applied to three analog filter and two amplifier design tasks. For example, a figure shows an 85 dB amplifier design evolved by our system, and another figure shows the performance of that circuit (gain and frequency response). In all tasks, our system is able to generate circuits that achieve the target specifications.
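    The master-slave (processor farm) structure can be shown compactly. The sketch below is our minimal illustration, not the paper's system: worker processes stand in for slave nodes, a bit-counting function stands in for the expensive circuit-simulation fitness, and the master performs truncation selection, one-point recombination, and mutation. All parameters are arbitrary.
```python
import random
from multiprocessing import Pool

def fitness(genome):
    # Expensive evaluation farmed out to worker processes. Stand-in here:
    # count of 1 bits; the paper's version would run a circuit simulation.
    return sum(genome)

def evolve(pop_size=32, genome_len=64, generations=20, workers=4):
    rng = random.Random(0)
    pop = [[rng.randint(0, 1) for _ in range(genome_len)]
           for _ in range(pop_size)]
    best = 0
    with Pool(workers) as pool:                  # the "processor farm"
        for _ in range(generations):
            scores = pool.map(fitness, pop)      # master farms out fitness only
            best = max(best, max(scores))
            # Master applies the genetic operators itself.
            ranked = [g for _, g in sorted(zip(scores, pop),
                                           key=lambda t: -t[0])]
            parents = ranked[: pop_size // 2]    # truncation selection
            pop = []
            while len(pop) < pop_size:
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, genome_len)
                child = a[:cut] + b[cut:]        # one-point recombination
                if rng.random() < 0.1:           # small random mutation
                    child[rng.randrange(genome_len)] ^= 1
                pop.append(child)
    return best

if __name__ == "__main__":
    print(evolve())
```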
    Keywords: Computer Programming and Software
    Type: Welcome to the NASA High Performance Computing and Communications Computational Aerosciences (CAS) Workshop 2000; D-000001
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 67
    Publication Date: 2017-10-04
    Description: Within NASA's High Performance Computing and Communication (HPCC) program, the NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. To this end, NPSS integrates multiple disciplines such as aerodynamics, structures, and heat transfer and supports "numerical zooming" between O-dimensional to 1-, 2-, and 3-dimensional component engine codes. In order to facilitate the timely and cost-effective capture of complex physical processes, NPSS uses object-oriented technologies such as C++ objects to encapsulate individual engine components and CORBA ORBs for object communication and deployment across heterogeneous computing platforms. Recently, the HPCC program has initiated a concept called the Information Power Grid (IPG), a virtual computing environment that integrates computers and other resources at different sites. IPG implements a range of Grid services such as resource discovery, scheduling, security, instrumentation, and data access, many of which are provided by the Globus toolkit. IPG facilities have the potential to benefit NPSS considerably. For example, NPSS should in principle be able to use Grid services to discover dynamically and then co-schedule the resources required for a particular engine simulation, rather than relying on manual placement of ORBs as at present. Grid services can also be used to initiate simulation components on parallel computers (MPPs) and to address inter-site security issues that currently hinder the coupling of components across multiple sites. These considerations led NASA Glenn and Globus project personnel to formulate a collaborative project designed to evaluate whether and how benefits such as those just listed can be achieved in practice. This project involves firstly development of the basic techniques required to achieve co-existence of commodity object technologies and Grid technologies; and secondly the evaluation of these techniques in the context of NPSS-oriented challenge problems. The work on basic techniques seeks to understand how "commodity" technologies (CORBA, DCOM, Excel, etc.) can be used in concert with specialized "Grid" technologies (for security, MPP scheduling, etc.). In principle, this coordinated use should be straightforward because of the Globus and IPG philosophy of providing low-level Grid mechanisms that can be used to implement a wide variety of application-level programming models. (Globus technologies have previously been used to implement Grid-enabled message-passing libraries, collaborative environments, and parameter study tools, among others.) Results obtained to date are encouraging: we have successfully demonstrated a CORBA to Globus resource manager gateway that allows the use of CORBA RPCs to control submission and execution of programs on workstations and MPPs; a gateway from the CORBA Trader service to the Grid information service; and a preliminary integration of CORBA and Grid security mechanisms. The two challenge problems that we consider are the following: 1) Desktop-controlled parameter study. Here, an Excel spreadsheet is used to define and control a CFD parameter study, via a CORBA interface to a high throughput broker that runs individual cases on different IPG resources. 2) Aviation safety. 
Here, about 100 near-real-time jobs running NPSS need to be submitted and run, with data returned, in near real time. Evaluation will address such issues as time to port, execution time, potential scalability of simulation, and reliability of resources. The full paper will present the following information: 1. A detailed analysis of the requirements that NPSS applications place on IPG. 2. A description of the techniques used to meet these requirements via the coordinated use of CORBA and Globus. 3. A description of results obtained to date in the first two challenge problems.
    Keywords: Computer Programming and Software
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    In:  CASI
    Publication Date: 2017-10-04
    Description: The solution data computed from large-scale simulations are sometimes too big for main memory, for local disks, and possibly even for a remote storage disk, creating tremendous processing time as well as technical difficulties in analyzing the data. The excessive storage demands a corresponding huge penalty in I/O time, rendering time and transmission time between different computer systems. In this paper, a multiresolution scheme is proposed to compress field simulation or experimental data without much loss of important information in the representation. Originally, the wavelet-based multiresolution scheme was introduced in image processing for the purposes of data compression and feature extraction. Unlike photographic image data, which has a rather simple setting, computational field simulation data needs more careful treatment when applying the multiresolution technique. While image data sits on a regularly spaced grid, simulation data usually resides on a structured curvilinear grid or an unstructured grid. In addition to the irregularity in grid spacing, the other difficulty is that the solutions consist of vectors instead of scalar values. The data characteristics demand more restrictive conditions. In general, photographic images have very little inherent smoothness, with discontinuities almost everywhere. On the other hand, numerical solutions have smoothness almost everywhere and discontinuities in local areas (shocks, vortices, and shear layers). The wavelet bases should be amenable to the solution of the problem at hand and applicable to constraints such as numerical accuracy and boundary conditions. In choosing a suitable wavelet basis for simulation data among a variety of wavelet families, the supercompact wavelets designed by Beam and Warming provide one of the most effective multiresolution schemes. Supercompact multi-wavelets retain the compactness of Haar wavelets, are piecewise polynomial and orthogonal, and can have arbitrary order of approximation. The advantages of the multiresolution algorithm are that no special treatment is required at the boundaries of the interval, and that the application to functions which are only piecewise continuous (internal boundaries) can be efficiently implemented. In this presentation, Beam's supercompact wavelets are generalized to higher dimensions using multidimensional scaling and wavelet functions rather than alternating the directions as in the 1D version. As a demonstration of actual 3D data compression, supercompact wavelet transforms are applied to a 3D data set of wing tip vortex flow solutions (2.5 million grid points). It is shown that a high data compression ratio (around 50:1) can be achieved in both the vector and scalar data sets.
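    The mechanics of wavelet thresholding can be seen with the simplest member of this family, the Haar wavelet (the order-0 piecewise polynomial case; the Beam-Warming supercompact multi-wavelets generalize it to higher order on the same two-cell support). The sketch below is our minimal illustration, not the paper's scheme: it transforms a smooth field containing one "shock", discards small coefficients, and reconstructs with small error. The signal and threshold are arbitrary choices.
```python
import numpy as np

def haar_forward(x):
    # Full multiresolution Haar transform (length must be a power of 2):
    # repeatedly split into scaled averages (coarse) and differences (detail).
    x = x.astype(float)
    n = len(x)
    while n > 1:
        a = (x[0:n:2] + x[1:n:2]) / np.sqrt(2)
        d = (x[0:n:2] - x[1:n:2]) / np.sqrt(2)
        x[: n // 2], x[n // 2 : n] = a, d
        n //= 2
    return x

def haar_inverse(c):
    # Invert the transform by re-interleaving averages and details.
    c = c.copy()
    n = 1
    while n < len(c):
        a, d = c[:n].copy(), c[n : 2 * n].copy()
        c[0 : 2 * n : 2] = (a + d) / np.sqrt(2)
        c[1 : 2 * n : 2] = (a - d) / np.sqrt(2)
        n *= 2
    return c

# Compress a smooth field with a local discontinuity by thresholding.
x = np.linspace(0, 1, 1024)
f = np.sin(2 * np.pi * x) + (x > 0.7)        # smooth + "shock"
c = haar_forward(f)
c[np.abs(c) < 0.01] = 0.0                    # discard small coefficients
print("kept:", np.count_nonzero(c), "of", len(c))
print("max error:", np.abs(haar_inverse(c) - f).max())
```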
    Keywords: Computer Programming and Software
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 69
    Publication Date: 2017-10-02
    Description: The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance, early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure customer software risks are identified and/or that requirements are met and/or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: ensuring the safety of NASA missions, ensuring requirements are met, minimizing programmatic and technological risks of software development and operations, improving software quality, reducing costs and time to delivery, and improving the science of software engineering.
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Fourth Annual Software Engineering Workshop; NASA/CP-2000-209890
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 70
    Publication Date: 2017-10-02
    Description: This paper discusses the following topics: (1) Autonomy for Future Missions- Mars Outposts, Titan Aerobot, and Europa Cryobot / Hydrobot; (2) Emergence of Autonomy- Remote Agent Architecture, Closing Loops Onboard, and New Millennium Flight Experiment; and (3) Software Engineering Challenges- Influence of Remote Agent, Scalable Autonomy, Autonomy Software Validation, Analytic Verification Technology, and Autonomy and Software Software Engineering.
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Fourth Annual Software Engineering Workshop; NASA/CP-2000-209890
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 71
    Publication Date: 2017-10-02
    Description: Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser-known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object-oriented programs, a methodology is proposed for identifying risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.
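    The core idea fits in a few lines: score each class by an estimate of failure likelihood (here, a complexity measure) weighted by impact, and spend test effort in rank order. The metric, class names, and numbers below are our illustrative assumptions, not the paper's proposed methodology.
```python
def risk_score(complexity, usage_weight):
    # Toy risk metric: failure likelihood grows with code complexity,
    # impact with how heavily the class is exercised.
    return complexity * usage_weight

classes = {
    "TelemetryParser": (41, 0.9),   # (cyclomatic complexity, usage weight)
    "ConfigLoader":    (12, 0.3),
    "CommandQueue":    (27, 0.8),
}
ranked = sorted(classes, key=lambda c: risk_score(*classes[c]), reverse=True)
print(ranked)   # test the riskiest classes first
```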
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Fourth Annual Software Engineering Workshop; NASA/CP-2000-209890
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 72
    Publication Date: 2017-10-02
    Description: This paper contains the following sections: GSFC Space Missions of the 21st Century, Information Technology Challenges, Components of a GSFC Solution, and Conclusions.
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Fourth Annual Software Engineering Workshop; NASA/CP-2000-209890
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 73
    facet.materialart.
    Unknown
    In:  CASI
    Publication Date: 2018-06-09
    Description: NASA's need to trace mistakes to their source and eliminate them in the future has resulted in software known as Root Cause Analysis (RoCA). Fair, Isaac & Co., Inc. has applied RoCA software, originally developed under an SBIR contract with Kennedy, to its predictive software technology. RoCA can generate graphic reports to make the analysis of problems easier and more efficient.
    Keywords: Computer Programming and Software
    Type: Spinoff 2000; 65; NASA/NP-2000-08-257-HQ
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 74
    facet.materialart.
    Unknown
    In:  CASI
    Publication Date: 2018-06-09
    Description: AgentBuilder is a software component developed under an SBIR contract between Reticular Systems, Inc., and Goddard Space Flight Center. AgentBuilder allows software developers without experience in intelligent agent technologies to easily build software applications using intelligent agents. Agents are components of software that will perform tasks automatically, with no intervention or command from a user. AgentBuilder reduces the time and cost of developing agent systems and provides a simple mechanism for implementing high-performance agent systems.
    Keywords: Computer Programming and Software
    Type: Spinoff 2000; 59; NASA/NP-2000-08-257-HQ
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 75
    Publication Date: 2018-06-08
    Description: The 2nd GSFC-JPL QMSW workshop brought together 56 participants mostly from GSFC and JPL to focus on critical challenges for mission software.
    Keywords: Computer Programming and Software
    Type: Quality Mission Software (QMSW) Workshop; Fallbrook, CA; United States
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 76
    facet.materialart.
    Unknown
    In:  Other Sources
    Publication Date: 2018-06-08
    Keywords: Meteorology and Climatology
    Type: IEEE/EIA International Frequency Control Symposium and Exhibition, 2000
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 77
    Publication Date: 2018-06-08
    Keywords: Meteorology and Climatology
    Type: Committee on Space Research (COSPAR) & Seminar on Space Weather; Torun; Poland
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 78
    facet.materialart.
    Unknown
    In:  CASI
    Publication Date: 2018-06-09
    Description: Using NASA SBIR funding, CFD Research Corporation has developed CFD-GEOM, an extension of traditional computer-aided design (CAD) software. CFD-GEOM provides geometry modeling and interactive mesh generation for computational fluid dynamics (CFD), and allows a grid to be quickly and easily updated in response to changes in the CAD model.
    Keywords: Computer Programming and Software
    Type: Spinoff 2000; 73; NASA/NP-2000-08-257-HQ
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 79
    Publication Date: 2018-06-08
    Description: We describe and test a software approach to overcoming radiation-induced errors in spaceborne applications running on commercial off-the-shelf components.
    Keywords: Computer Programming and Software
    Type: Fault Tolerant Computing Symposium; New York, NY; United States
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 80
    Publication Date: 2018-06-08
    Description: The Tropospheric Emission Spectrometer (TES) is a Fourier transform spectrometer slated for launch in December 2002.
    Keywords: Computer Programming and Software
    Type: IEEE Aerospace Conference 2000; Big Sky, MT; United States
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 81
    Publication Date: 2018-06-05
    Description: A new software design and development effort has produced a Java (Sun Microsystems, Inc.) version of the award-winning Tempest software (refs. 1 and 2). In 1999, the Embedded Web Technology (EWT) team received a prestigious R&D 100 Award for Tempest, Java Version. In this article, "Tempest" will refer to the Java version of Tempest, a World Wide Web server for desktop or embedded systems. Tempest was designed at the NASA Glenn Research Center at Lewis Field to run on any platform for which a Java Virtual Machine (JVM, Sun Microsystems, Inc.) exists. The JVM acts as a translator between the native code of the platform and the byte code of Tempest, which is compiled in Java. These byte code files are Java executables with a ".class" extension. Multiple byte code files can be zipped together as a ".jar" file for more efficient transmission over the Internet. Today's popular browsers, such as Netscape (Netscape Communications Corporation) and Internet Explorer (Microsoft Corporation), have built-in Virtual Machines to display Java applets.
    Keywords: Computer Programming and Software
    Type: Research and Technology 1999; NASA/TM-2000-209639
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 82
    facet.materialart.
    Unknown
    In:  CASI
    Publication Date: 2018-06-05
    Description: SHDOM is a general purpose, publicly available, three-dimensional atmospheric radiative transfer model. SHDOM is an explicit method, which means it solves for the whole radiation field, as distinct from Monte Carlo methods which solve for particular radiative outputs. SHDOM is particularly well suited for remote sensing applications, where it can compute outgoing radiances at many angles from a cloud field at virtually no extra cost. SHDOM is not appropriate for calculating domain average quantities for which Monte Carlo methods excel. The I3RC intercomparison offers an opportunity to explore the pros and cons of SHDOM and Monte Carlo models on some real world inhomogeneous cloud fields. Specifically, we wish to determine the computer resources required to achieve a particular accuracy for a certain number of outputs using SHDOM and Monte Carlo models. This will help guide modelers on the appropriate choice of SHDOM or Monte Carlo for their applications. To emphasize the importance of this accuracy versus CPU time tradeoff, we are submitting two SHDOM entries (low and high resolution) in the I3RC.
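    The accuracy-versus-CPU-time tradeoff described above hinges on Monte Carlo error shrinking only as 1/sqrt(N): each extra digit of accuracy costs roughly 100x the samples. A toy estimator (not a radiative transfer code) makes the scaling visible:

        #include <math.h>
        #include <stdio.h>
        #include <stdlib.h>

        /* Monte Carlo estimate of pi from the unit square. The point is the
           error column: it shrinks like 1/sqrt(N), so each extra digit of
           accuracy costs ~100x the CPU time -- the kind of tradeoff that
           decides between an explicit solver and a Monte Carlo model. */
        int main(void) {
            const double pi = 3.14159265358979323846;
            srand(12345);
            for (long n = 1000; n <= 10000000L; n *= 10) {
                long hits = 0;
                for (long i = 0; i < n; i++) {
                    double x = rand() / (double)RAND_MAX;
                    double y = rand() / (double)RAND_MAX;
                    if (x * x + y * y < 1.0) hits++;
                }
                double est = 4.0 * (double)hits / (double)n;
                printf("N=%8ld  estimate=%.6f  |error|=%.6f\n",
                       n, est, fabs(est - pi));
            }
            return 0;
        }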
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 83
    Publication Date: 2018-06-05
    Description: The Students' Cloud Observations On-Line (S'COOL) Project involved students in K-16 as ground truth observers for a NASA Earth-Observing satellite instrument. The Clouds and Earth's Radiant Energy System (CERES) instrument allows scientists to study the Earth's energy budget and how clouds affect it. Student reports of cloud conditions help scientists verify their algorithms and allow students to be involved in obtaining and analyzing real scientific data. The presentation contains 23 slides.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 84
    Publication Date: 2017-10-04
    Description: In the year 2001, NASA will launch the satellite TRIANA, the first Earth-observing mission to provide a continuous, full-disk view of the sunlit Earth. As a part of the HPCC Program at NASA GSFC, we have started a project whose objectives are to develop and implement a 3D cloud data assimilation system, combining TRIANA measurements with model simulation, and to produce accurate statistics of global cloud coverage as an important element of the Earth's climate. For simulation of the atmosphere within this project we are using the NCEP/NOAA operational Eta model. In order to compare TRIANA and Eta model data on approximately the same grid without significant downscaling, the Eta model will be integrated at a resolution of about 15 km. The integration domain (from -70 to +70 deg in latitude and 150 deg in longitude) will cover most of the sunlit Earth disc and will continuously rotate around the globe following TRIANA. The cloud data assimilation is intended to run and produce 3D clouds on a near real-time basis. Such a numerical setup and integration design is very ambitious and computationally demanding. Thus, although the Eta model code has been very carefully developed and its computational efficiency systematically polished during years of operational use at NCEP, the current MPI version may still have memory and efficiency problems for the TRIANA simulations. Within this work, we optimize a parallel version of the Eta model code on a Cray T3E and a network of PCs (theHIVE) in order to improve its overall efficiency. Our optimization procedure consists of introducing dynamically allocated arrays to reduce the size of static memory, and of optimizing on a single processor by splitting loops to limit the number of memory streams. All the presented results are derived using an integration domain centered at the equator, with a size of 60 x 60 deg and horizontal resolutions of 1/2 and 1/3 deg. In accompanying charts we report the elapsed time, speedup, and Mflops as a function of the number of processors for the non-optimized version of the code on the T3E and theHIVE. The large amount of communication required for model integration explains its poor performance on theHIVE. Our initial implementation of dynamic memory allocation reduced memory use by about 12% but introduced a 3% overhead in computing time; this overhead was removed by splitting loops in some of the most demanding subroutines. When the Eta code is fully optimized to meet the memory requirements of the TRIANA simulations, a non-negligible overhead may appear that could seriously affect the code's efficiency. To alleviate this problem, we are considering a new, computationally cheaper algorithm for horizontal advection, as well as a new approach to marching in time.
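    The loop-splitting optimization mentioned above can be sketched generically in C (the Eta model itself is Fortran; the arrays and arithmetic here are placeholders):

        #include <stdio.h>

        #define N 100000
        /* Hypothetical prognostic fields; the real model has many more. */
        static double t[N], q[N], u[N], v[N], p[N], out1[N], out2[N];

        /* Fused: one loop touches seven arrays at once. On machines that
           sustain only a few concurrent memory streams (e.g. the T3E's
           stream buffers), this defeats hardware prefetching. */
        static void fused(void) {
            for (int i = 0; i < N; i++) {
                out1[i] = t[i] * p[i] + q[i];
                out2[i] = u[i] * p[i] + v[i];
            }
        }

        /* Split: two loops with at most four streams each, at the cost of
           reading p[] twice. Whether this wins depends on the machine,
           which is why it is applied selectively to demanding routines. */
        static void split(void) {
            for (int i = 0; i < N; i++)
                out1[i] = t[i] * p[i] + q[i];
            for (int i = 0; i < N; i++)
                out2[i] = u[i] * p[i] + v[i];
        }

        int main(void) {
            fused();
            split();
            printf("%g %g\n", out1[0], out2[0]);
            return 0;
        }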
    Keywords: Computer Programming and Software
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 85
    Publication Date: 2017-10-04
    Description: Porting applications to high-performance parallel computers is always a challenging task. It is time consuming and costly. With the rapid progress in hardware architectures and the increasing complexity of real applications in recent years, the problem has become even more severe. Today, scalability and high performance mostly require handwritten parallel programs using message-passing libraries (e.g., MPI). However, this process is very difficult and often error-prone. The recent reemergence of shared memory parallel (SMP) architectures, such as the cache coherent Non-Uniform Memory Access (ccNUMA) architecture used in the SGI Origin 2000, shows good prospects for scaling beyond hundreds of processors. Programming on an SMP is simplified by working in a globally accessible address space. The user can supply compiler directives, such as OpenMP, to parallelize the code. As an industry standard for the portable implementation of parallel programs on SMPs, OpenMP is a set of compiler directives and callable runtime library routines that extend Fortran, C, and C++ to express shared memory parallelism. It promises an incremental path for the parallel conversion of existing software, as well as scalability and performance for a complete rewrite or an entirely new development. Perhaps the main disadvantage of programming with directives is that inserted directives do not necessarily enhance performance; in the worst cases they can produce erroneous results. While vendors have provided tools for error checking and profiling, automation of directive insertion has been very limited and often fails on large programs, primarily for lack of a thorough enough data dependence analysis. To overcome this deficiency, we have developed a toolkit, CAPO, to automatically insert OpenMP directives in Fortran programs and apply certain degrees of optimization. CAPO takes advantage of the detailed interprocedural dependence analysis provided by CAPTools, developed by the University of Greenwich, to reduce potential errors made by users. Earlier tests on the NAS Benchmarks and ARC3D demonstrated good success of this tool. In this study, we have applied CAPO to parallelize three large applications in the area of computational fluid dynamics (CFD): OVERFLOW, TLNS3D, and INS3D. These codes are widely used for solving the Navier-Stokes equations with complicated boundary conditions and turbulence models in multiple zones, and each comprises from 50K to 100K lines of FORTRAN77. As an example, CAPO took 77 hours to complete the data dependence analysis of OVERFLOW on a workstation (SGI, 175 MHz R10K processor). A fair amount of effort was spent on correcting false dependences arising from a lack of necessary knowledge during the analysis. Even so, CAPO provides an easy way for the user to interact with the parallelization process, and the OpenMP version was generated within a day after the analysis was completed. Because of the sequential algorithms involved, code sections in TLNS3D and INS3D had to be restructured by hand to produce more efficient parallel codes. An included figure shows preliminary test results of the generated OVERFLOW with several single-zone test cases; the MPI data points for the small test case were taken from a hand-coded MPI version. CAPO's version achieved an 18-fold speedup on 32 nodes of the SGI O2K and, for the small test case, outperformed the MPI version. These results are very encouraging, but further work is needed. For example, although CAPO attempts to place directives on the outermost parallel loops in an interprocedural framework, it does not insert directives based on the best manual strategy; in particular, it lacks support for parallelization at the multi-zone level. Future work will emphasize the development of a methodology to work at the multi-zone level and with a hybrid approach. Development of tools to perform more complicated code transformations is also needed.
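    For readers unfamiliar with directive-based parallelization, the kind of annotation a tool like CAPO inserts looks roughly as follows (a generic C/OpenMP sketch; CAPO itself targets Fortran, and the loop is hypothetical):

        /* Compile with, e.g., cc -fopenmp capo_sketch.c */
        #include <omp.h>
        #include <stdio.h>

        #define N 1024
        static double a[N][N], b[N][N];

        int main(void) {
            double sum = 0.0;

            /* The directive is the entire "parallelization": the loop body
               is untouched. A tool such as CAPO must first prove that the
               iterations carry no dependence on a or b -- exactly the
               interprocedural analysis the abstract describes. */
            #pragma omp parallel for reduction(+:sum)
            for (int i = 0; i < N; i++)
                for (int j = 0; j < N; j++) {
                    a[i][j] = 0.25 * (b[i][j] + 3.0);
                    sum += a[i][j];
                }

            printf("sum = %f on up to %d threads\n",
                   sum, omp_get_max_threads());
            return 0;
        }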
    Keywords: Computer Programming and Software
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 86
    Publication Date: 2017-10-04
    Description: The aerodynamic computer code OVERFLOW, with its multi-zone overset grid capability, has been parallelized to enhance its performance on distributed and shared memory paradigms. Practical application benchmarks have been set to assess the efficiency of the code's parallelism on high-performance architectures. The code's performance has also been tested in the context of the distributed computing paradigm on distant computer resources using the Information Power Grid (IPG) toolkit, Globus. Two parallel versions of the code, OVERFLOW-MPI and -MLP, have been developed around the natural coarse-grained parallelism inherent in a multi-zonal domain decomposition paradigm. The algorithm forms a number of groups, each consisting of a zone, a cluster of zones, and/or a partition of a large zone. Each group can be thought of as a process with one or more threads assigned to it, and all groups run in parallel. The -MPI version of the code uses explicit message passing based on the standard MPI library to send and receive interzonal boundary data across processors. The -MLP version employs no message-passing paradigm; the boundary data is transferred through shared memory. The -MPI code is suited for both distributed and shared memory architectures, while the -MLP code can only be used on shared memory platforms. The IPG applications are implemented by the -MPI code using the Globus toolkit: while a computational task is distributed across multiple computer resources, the parallelism can be exploited on each resource alone. Performance studies are carried out on practical aerodynamic problems with complex geometries, consisting of 2.5 up to 33 million grid points and a large number of zonal blocks. The computations were executed primarily on SGI Origin 2000 multiprocessors and on the Cray T3E. OVERFLOW's IPG applications are carried out on NASA homogeneous metacomputing machines located at three sites: Ames, Langley, and Glenn. Future plans will exploit distributed parallel computing on various homogeneous and heterogeneous resources and large-scale benchmarks. Alternative IPG toolkits will be used, along with sophisticated zonal grouping strategies, to minimize the communication time across the computer resources.
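    A miniature of the interzonal boundary exchange in the -MPI version might look as follows (a generic one-dimensional halo swap; the array layout and names are illustrative, not OVERFLOW's):

        #include <mpi.h>
        #include <stdio.h>

        #define NLOC 64   /* interior points owned by each rank ("zone") */

        int main(int argc, char **argv) {
            int rank, size;
            MPI_Init(&argc, &argv);
            MPI_Comm_rank(MPI_COMM_WORLD, &rank);
            MPI_Comm_size(MPI_COMM_WORLD, &size);

            /* Local 1D zone with one ghost cell at each end. */
            double u[NLOC + 2];
            for (int i = 1; i <= NLOC; i++) u[i] = (double)rank;

            int left  = (rank == 0)        ? MPI_PROC_NULL : rank - 1;
            int right = (rank == size - 1) ? MPI_PROC_NULL : rank + 1;

            /* Exchange interzonal boundary data: send edge interior values,
               receive the neighbors' into the ghost cells. MPI_Sendrecv
               avoids the deadlock paired blocking sends could cause. */
            MPI_Sendrecv(&u[NLOC], 1, MPI_DOUBLE, right, 0,
                         &u[0],    1, MPI_DOUBLE, left,  0,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);
            MPI_Sendrecv(&u[1],        1, MPI_DOUBLE, left,  1,
                         &u[NLOC + 1], 1, MPI_DOUBLE, right, 1,
                         MPI_COMM_WORLD, MPI_STATUS_IGNORE);

            printf("rank %d ghosts: left=%g right=%g\n",
                   rank, u[0], u[NLOC + 1]);
            MPI_Finalize();
            return 0;
        }

    In the -MLP variant, by contrast, all zones' arrays live in shared memory, so the same ghost values would simply be read directly, with no messages at all.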
    Keywords: Computer Programming and Software
    Type: Welcome to the NASA High Performance Computing and Communications Computational Aerosciences (CAS) Workshop 2000; D-000001
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 87
    Publication Date: 2017-10-04
    Description: The shared memory Multi-Level Parallelism (MLP) technique, developed last year at NASA Ames, has been very successful in dramatically improving the performance of important NASA CFD codes. This new and very simple parallel programming technique was first inserted into the OVERFLOW production CFD code in FY 1998. The OVERFLOW-MLP code's parallel performance scaled linearly to 256 CPUs on the NASA Ames 256-CPU Origin 2000 system (steger). Overall performance exceeded 20.1 GFLOP/s, or about 4.5x the performance of a dedicated 16-CPU C90 system. All of this was achieved without any major modification to the original vector-based code. The OVERFLOW-MLP code is now in production on the in-house Origin systems and is also used offsite at commercial aerospace companies. Partially as a result of this work, NASA Ames has purchased a new 512-CPU Origin 2000 system to further test the limits of parallel performance for NASA codes of interest. This paper presents the performance obtained from the latest optimization efforts on this machine for the LAURA-MLP and OVERFLOW-MLP codes. The Langley Aerothermodynamics Upwind Relaxation Algorithm (LAURA) code is a key simulation tool in the development of the next-generation shuttle, interplanetary reentry vehicles, and nearly all "X" plane development. This code sustains about 4-5 GFLOP/s on a dedicated 16-CPU C90. At this rate, expected workloads would require over 100 C90 CPU-years of computing over the next few calendar years. It is not feasible to expect that this would be affordable or available to the user community; dramatic performance gains on cheaper systems are needed. This code is expected to be perhaps the largest consumer of NASA Ames compute cycles per run in the coming year. The OVERFLOW CFD code is extensively used in the government and commercial aerospace communities to evaluate new aircraft designs. It is one of the largest consumers of NASA supercomputing cycles, and large simulations of highly resolved full aircraft are routinely undertaken. Typical large problems might require hundreds of Cray C90 CPU hours to complete. The dramatic performance gains with the 256-CPU steger system are exciting: obtaining results in hours instead of months is revolutionizing the way aircraft manufacturers are looking at future aircraft simulation work. Figure 2 is a current state-of-the-art plot of OVERFLOW-MLP performance on the 512-CPU Lomax system; it indicates that OVERFLOW-MLP continues to scale linearly with CPU count up to 512 CPUs on a large 35-million-point full-aircraft RANS simulation. At this point performance is such that a fully converged simulation of 2500 time steps is completed in less than 2 hours of elapsed time. Further work over the next few weeks will improve the performance of this code even further. The LAURA code has been converted to the MLP format as well and is currently being optimized for the 512-CPU system. Performance statistics indicate that the goal of 100 GFLOP/s will be achieved by year's end. This amounts to 20x the 16-CPU C90 result and strongly demonstrates the viability of the new parallel systems for rapidly solving very large simulations in a production environment.
    Keywords: Computer Programming and Software
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 88
    Publication Date: 2017-10-04
    Description: On the path from inanimate to animate matter, a key step was the self-organization of molecules into protocells - the earliest ancestors of contemporary cells. Studies of the properties of protocells and of the mechanisms by which they maintained themselves and reproduced are an important part of astrobiology; they also have the potential to greatly impact research in nanotechnology and computer science. Previous studies of protocells have focused on self-replication, in which Darwinian evolution occurs through a series of small alterations to functional molecules whose identities are stored. Protocells, however, may have been incapable of such storage. We hypothesize that under such conditions, the replication of functions and their interrelationships, rather than of the precise identities of the functional molecules, is sufficient for survival and evolution. This process is called non-genomic evolution. Recent breakthroughs in experimental protein chemistry have opened the gates to experimental tests of non-genomic evolution, and on the basis of these achievements we have developed a stochastic model for examining the evolutionary potential of non-genomic systems. In this model, the formation and destruction (hydrolysis) of bonds joining amino acids in proteins occur through catalyzed, albeit possibly inefficient, pathways. Each protein can act as a substrate for polymerization or hydrolysis, or as a catalyst of these chemical reactions. When a protein is hydrolyzed to form two new proteins, or two proteins are joined into one, the catalytic abilities of the products are related to those of the reactants. We demonstrate that the catalytic capabilities of such a system can increase. Its evolutionary potential depends on the competition between the formation of bond-forming and bond-cutting catalysts; the degree to which hydrolysis preferentially affects bonds in less efficient, and therefore less well-ordered, peptides is also critical to the evolution of a non-genomic system. Based on these results, a new computational object called a "molnet" is defined. Like a neural network, it is formed of interconnected units that send "signals" to each other; like a molecule, it has a specific function once its structure is defined. The difference between a molnet and a traditional neural network is that input to a molnet is not simply passed along and processed from input to output units, but rather is used to form and break connections (bonds), and thus to form new structures. Molnets represent a powerful tool for understanding the conditions under which chemical systems can form large molecules, such as proteins, and display ever more complex functions. This has direct applications, for example, to the design of smart, synthetic fabrics. Additional information is contained in the original.
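    A toy rendition of such a stochastic model may help fix ideas. Everything below (the population size, the single-number efficiency bookkeeping, the event rates) is an assumption for illustration, not the paper's model:

        #include <stdio.h>
        #include <stdlib.h>

        #define NP    64       /* fixed population of "proteins" */
        #define STEPS 200000

        static int    len[NP]; /* chain length in amino acids    */
        static double eff[NP]; /* catalytic efficiency, 0..1     */

        static double urand(void) { return rand() / (double)RAND_MAX; }

        int main(void) {
            srand(7);
            for (int i = 0; i < NP; i++) { len[i] = 2; eff[i] = 0.05 * urand(); }

            for (int s = 0; s < STEPS; s++) {
                int cat = rand() % NP;            /* acting catalyst    */
                int sub = rand() % NP;            /* acting substrate   */
                if (urand() > eff[cat]) continue; /* catalysis may fail */

                if (urand() < 0.5) {
                    /* Polymerization: the product's efficiency is related to
                       the reactants' -- here, their average plus noise. */
                    len[sub] += 1;
                    eff[sub] = 0.5 * (eff[sub] + eff[cat])
                             + 0.02 * (urand() - 0.5);
                } else if (len[sub] > 2 && urand() > eff[sub]) {
                    /* Hydrolysis, biased toward poorly ordered (low-eff)
                       chains -- the bias the abstract calls critical. */
                    len[sub] /= 2;
                    eff[sub] *= 0.9;
                }
                if (eff[sub] < 0.0) eff[sub] = 0.0;
                if (eff[sub] > 1.0) eff[sub] = 1.0;

                if (s % 40000 == 0) {
                    double m = 0.0;
                    for (int i = 0; i < NP; i++) m += eff[i];
                    printf("step %6d  mean efficiency %.3f\n", s, m / NP);
                }
            }
            return 0;
        }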
    Keywords: Computer Programming and Software
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 89
    Publication Date: 2017-10-04
    Description: Recent progress in distributed object technology has enabled software applications to be developed and deployed easily, such that objects or components can work together across the boundaries of the network, different operating systems, and different languages. A distributed object is not necessarily a complete application but rather a reusable, self-contained piece of software that cooperates with other objects in a plug-and-play fashion via a well-defined interface. The Common Object Request Broker Architecture (CORBA), a middleware standard defined by the Object Management Group (OMG), uses the Interface Definition Language (IDL) to specify such an interface for transparent communication between distributed objects. Since IDL can be mapped to any programming language, such as C++, Java, or Smalltalk, existing applications can be integrated into a new application, and the tasks of code rewriting and software maintenance can be reduced. Many scientific applications in aerodynamics and solid mechanics are written in Fortran, and refitting these legacy Fortran codes with CORBA objects can increase their reusability. For example, scientists could link their scientific applications to vintage Fortran programs, such as Partial Differential Equation (PDE) solvers, in a plug-and-play fashion. Unfortunately, a CORBA IDL-to-Fortran mapping has not been proposed, and there seems to be no direct method of generating CORBA objects from Fortran without resorting to manually written C/C++ wrappers. In this paper, we present an efficient methodology for integrating Fortran legacy programs into a distributed object framework, and we discuss issues and strategies for converting and decomposing Fortran codes into CORBA objects. A diagram in the original shows the proposed conversion and decomposition mechanism. Our goal is to keep the Fortran codes unmodified. The conversion-aid tool takes the Fortran application program as input and helps programmers generate the C/C++ header file and the IDL file for wrapping the Fortran code. Programmers must determine for themselves how to decompose the legacy application into several reusable components, based on the cohesion and coupling among the functions and subroutines; even so, programming effort is greatly reduced because function headings and types have already been converted to C++ and IDL styles. Most Fortran applications use the COMMON block to facilitate the transfer of large numbers of variables among several functions; the COMMON block plays a role similar to that of global variables in C. In a CORBA-compliant programming environment, global variables cannot be used to pass values between objects. One approach to this problem is to put the COMMON variables into the parameter list; we do not adopt it because it requires modification of the Fortran source code, which violates our design consideration. Our approach is instead to extract the COMMON blocks and convert them into a structure-typed attribute in C++. Through attributes, each component can initialize the variables and return the computation results to the client. We have successfully tested the proposed conversion methodology based on the f2c converter. Since f2c only translates Fortran to C, we still needed to edit the converted code to meet C++ and IDL syntax; for example, C++/IDL requires a tag in the structure type, while C does not. In this paper, we identify the changes to the f2c converter necessary to directly generate the C++ header and the IDL file. Our future work is to add a GUI interface to ease the decomposition task by simply dragging and dropping icons.
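    The COMMON-to-attribute mapping can be sketched concretely. For a hypothetical block COMMON /FLOW/ MACH, ALPHA, f2c emits a C struct whose external name is the block name plus an underscore, and wrapper functions can then expose it as a structure-typed attribute (a self-contained C sketch; all names are invented):

        #include <stdio.h>

        /* Fortran source (hypothetical):
         *       COMMON /FLOW/ MACH, ALPHA
         *       REAL MACH, ALPHA
         * f2c maps the named COMMON block onto roughly this: */
        struct flow_common { float mach, alpha; };
        struct flow_common flow_;  /* normally defined in the f2c output */

        /* The CORBA object exposes the block as a structure-typed
           attribute; the mirror type carries a tag, the kind of edit the
           paper says plain f2c output required for C++/IDL. */
        struct FlowState { float mach, alpha; };

        static void set_flow(struct FlowState s) {
            flow_.mach  = s.mach;
            flow_.alpha = s.alpha;
        }
        static struct FlowState get_flow(void) {
            struct FlowState s = { flow_.mach, flow_.alpha };
            return s;
        }

        int main(void) {
            struct FlowState in = { 0.8f, 2.5f };
            set_flow(in);                      /* client seeds the "COMMON" */
            struct FlowState out = get_flow(); /* and reads results back    */
            printf("mach=%g alpha=%g\n", out.mach, out.alpha);
            return 0;
        }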
    Keywords: Computer Programming and Software
    Type: Welcome to the NASA High Performance Computing and Communications Computational Aerosciences (CAS) Workshop 2000; D-000001
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 90
    Publication Date: 2017-10-04
    Description: A script system was developed to automate and streamline portions of the CFD process. The system was designed to facilitate the use of CFD flow solvers on supercomputer and workstation platforms within a parametric design event. Integrating solver pre- and postprocessing phases, the fully automated ADTT script system marshalled the required input data, submitted the jobs to available computational resources, and processed the resulting output data. A number of codes were incorporated into the script system, which itself was part of a larger integrated design environment software package. The IDE and scripts were used in a design event involving a wind tunnel test. This experience highlighted the need for efficient data and resource management in all parts of the CFD process. To facilitate the use of CFD methods to perform parametric design studies, the script system was developed using UNIX shell and Perl languages. The goal of the work was to minimize the user interaction required to generate the data necessary to fill a parametric design space. The scripts wrote out the required input files for the user-specified flow solver, transferred all necessary input files to the computational resource, submitted and tracked the jobs using the resource queuing structure, and retrieved and post-processed the resulting dataset. For computational resources that did not run queueing software, the script system established its own simple first-in-first-out queueing structure to manage the workload. A variety of flow solvers were incorporated in the script system, including INS2D, PMARC, TIGER and GASP. Adapting the script system to a new flow solver was made easier through the use of object-oriented programming methods. The script system was incorporated into an ADTT integrated design environment and evaluated as part of a wind tunnel experiment. The system successfully generated the data required to fill the desired parametric design space. This stressed the computational resources required to compute and store the information. The scripts were continually modified to improve the utilization of the computational resources and reduce the likelihood of data loss due to failures. An ad-hoc file server was created to manage the large amount of data being generated as part of the design event. Files were stored and retrieved as needed to create new jobs and analyze the results. Additional information is contained in the original.
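    The paper's scripts were written in UNIX shell and Perl; purely to illustrate the fallback first-in-first-out queue they describe, here is a minimal sketch (in C for consistency with the other sketches in this list; the job commands are hypothetical):

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        /* Minimal first-in-first-out job queue: jobs are shell commands run
           one at a time, in submission order, standing in for a host with
           no queueing software. */
        struct job { char cmd[256]; struct job *next; };
        static struct job *head = NULL, *tail = NULL;

        static void submit(const char *cmd) {
            struct job *j = malloc(sizeof *j);
            strncpy(j->cmd, cmd, sizeof j->cmd - 1);
            j->cmd[sizeof j->cmd - 1] = '\0';
            j->next = NULL;
            if (tail) tail->next = j; else head = j;
            tail = j;
        }

        static void run_all(void) {
            while (head) {
                struct job *j = head;
                head = j->next;
                if (!head) tail = NULL;
                printf("running: %s\n", j->cmd);
                system(j->cmd);   /* blocks: one job at a time */
                free(j);
            }
        }

        int main(void) {
            /* Hypothetical solver runs for three points in a design space. */
            submit("echo ins2d -in case_alpha00.inp");
            submit("echo ins2d -in case_alpha05.inp");
            submit("echo ins2d -in case_alpha10.inp");
            run_all();
            return 0;
        }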
    Keywords: Computer Programming and Software
    Type: Welcome to the NASA High Performance Computing and Communications Computational Aerosciences (CAS) Workshop 2000; D-000001
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 91
    Publication Date: 2018-06-02
    Description: SmaggIce (Surface Modeling and Grid Generation for Iced Airfoils), which is being developed at the NASA Glenn Research Center at Lewis Field, is an interactive software system for data probing, boundary smoothing, domain decomposition, and structured grid generation and refinement. All these steps are required for aerodynamic performance prediction using structured, grid-based computational fluid dynamics (CFD), as illustrated in the following figure. SmaggIce provides the underlying computations to perform these functions, as well as a graphical user interface to control and interact with them, and graphics to display the results.
    Keywords: Computer Programming and Software
    Type: Research and Technology 1999; NASA/TM-2000-209639
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 92
    Publication Date: 2018-06-08
    Keywords: Computer Programming and Software
    Type: 25th Annual Software Engineering Workshop; Greenbelt, MD; United States
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 93
    Publication Date: 2018-06-08
    Description: Responses of the nightside magnetosphere and auroral zone to interplanetary shocks are studied using WIND solar wind data and POLAR UV imaging data.
    Keywords: Meteorology and Climatology
    Type: Proceedings on Space Weather|Committee on Space Research (COSPAR) Colloquium; Green Bay; Taiwan
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 94
    Publication Date: 2018-06-08
    Description: The Cassini spacecraft flew past the Earth in a trajectory almost along the Sun-Earth line, giving a unique perspective of low frequency waves in geospace.
    Keywords: Meteorology and Climatology
    Type: Committee on Space Research (COSPAR) Colloquium; Green Bay; Taiwan|Proceedings on Space Weather
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 95
    facet.materialart.
    Unknown
    In:  Other Sources
    Publication Date: 2018-06-08
    Description: The Mission Execution and Automation Section, Information Technologies and Software Systems Division at the Jet Propulsion Laboratory, recently delivered an animated software training module for the TMOD UPLINK Consolidation Task for operator training at the Deep Space Network.
    Keywords: Computer Programming and Software
    Type: 2001 IEEE Aerospace Conference; Big Sky, MT; United States
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 96
    Publication Date: 2018-06-08
    Description: The purpose of this research and study paper is to provide a summary description and results of rapid development accomplishments at NASA/JPL in the area of advanced distributed computing technology, using a Commercial-Off-The-Shelf (COTS)-based, object-oriented component approach to open, interoperable software development and software reuse.
    Keywords: Computer Programming and Software
    Type: 2001 IEEE/Aerospace Conference; Big Sky, MT; United States
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 97
    Publication Date: 2018-06-08
    Description: The turbulent latent and sensible heat fluxes are necessary to study the heat budget of the upper ocean or to initialize ocean general circulation models. To retrieve the latent heat flux from satellite observations, authors mostly use a bulk approximation of the flux whose parameters are derived from different instruments. In this paper, an approach based on artificial neural networks is proposed and compared to the bulk method on one global data set and three local data sets.
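    A network of the kind compared against the bulk method might be a single hidden layer mapping near-surface observables to a flux; the architecture, inputs, and weights below are placeholders, not the paper's trained model:

        #include <math.h>
        #include <stdio.h>

        #define NIN  3  /* wind speed, air-sea humidity difference, SST */
        #define NHID 4

        /* Placeholder weights: a real retrieval fits these to matched
           satellite and in situ flux data. */
        static const double W1[NHID][NIN] = {
            { 0.8, -1.2,  0.1}, {-0.3,  0.7,  0.4},
            { 1.1,  0.2, -0.5}, { 0.5, -0.6,  0.9},
        };
        static const double b1[NHID] = { 0.1, -0.2, 0.0, 0.3 };
        static const double W2[NHID] = { 1.5, -0.8, 0.6, 1.1 };
        static const double b2 = 10.0;

        /* One-hidden-layer forward pass: tanh units, linear output. */
        static double latent_heat_flux(const double x[NIN]) {
            double y = b2;
            for (int h = 0; h < NHID; h++) {
                double a = b1[h];
                for (int i = 0; i < NIN; i++) a += W1[h][i] * x[i];
                y += W2[h] * tanh(a);
            }
            return y;  /* W/m^2 in a trained network */
        }

        int main(void) {
            double obs[NIN] = { 7.5, 3.2e-3, 28.0 }; /* m/s, kg/kg, deg C */
            printf("estimated LHF: %.1f W/m^2\n", latent_heat_flux(obs));
            return 0;
        }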
    Keywords: Meteorology and Climatology
    Type: ANS 11th Symposium of Meteorological Observations and Instrumentation; Albuquerque, NM; United States
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 98
    facet.materialart.
    Unknown
    In:  Other Sources
    Publication Date: 2018-06-08
    Description: Atmospheric retrieval consists of a series of scientific algorithms performed to retrieve the actual state of the atmosphere in terms of its temperature and chemical constituents.
    Keywords: Computer Programming and Software
    Type: Aerospace Conference; Big Sky, MT; United States
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 99
    facet.materialart.
    Unknown
    In:  Other Sources
    Publication Date: 2018-06-08
    Keywords: Computer Programming and Software
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 100
    Publication Date: 2018-06-08
    Description: This paper describes a coherent approach and accompanying tool support that addresses the challenges of large software efforts.
    Keywords: Computer Programming and Software
    Type: 4th International Software and Internet Quality Week; Brussels; Belgium
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...