ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    In:  CASI
    Publication Date: 2009-11-17
    Description: It's spring at Martian Outpost 3, the year 2025. The Universe Cup's on later today, and next week little Suzie celebrates her fourth birthday. Fortunately, this football fan and parent will be able to participate in both of these activities, albeit at a slight time delay, due completely to the sophisticated, high-speed quasi-real-time multimedia/navigation MarsNet surrounding Mars and tying it to Earth. Capable of moving gigabits a second in either direction, the network supports not only the multiple manned and robotic science needs of teams and devices encircling Mars, but also the very real human need for communication.
    Keywords: Communications and Radar
    Type: Concepts and Approaches for Mars Exploration; Part 1; 1; LPI-Contrib-1062
    Format: text
  • 2
    In:  CASI
    Publication Date: 2004-12-03
    Description: Over the next decade, international plans and commitments are underway to develop an infrastructure at Mars to support future exploration of the red planet. The purpose of this infrastructure is to provide reliable global communication and navigation coverage for on-approach, landed, roving, and in-flight assets at Mars. The claim is that this infrastructure will: 1) eliminate the need for these assets to carry Direct to Earth (DTE) communications equipment, 2) significantly increase data return and connectivity, 3) enable small-mission exploration of Mars without DTE equipment, 4) provide precision navigation, i.e., 10 to 100 m position resolution, and 5) supply a timing reference accurate to 10 ms. This paper focuses in particular on two CCSDS recommendations for that infrastructure: the CCSDS Proximity-1 Space Link Protocol and the CCSDS File Delivery Protocol (CFDP). A key aspect of Mars exploration will be the ability of future missions to interoperate. These protocols establish a framework for interoperability by providing standard communication, navigation, and timing services. In addition, these services include strategies to recover gracefully from communication interruptions and interference while ensuring backward compatibility with missions from previous phases of exploration.
    Keywords: Communications and Radar
    Type: Concepts and Approaches for Mars Exploration; Part 1; 172-173; LPI-Contrib-1062-Pt-1
    Format: text
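    The protocols named above are defined by CCSDS; the sketch that follows is only a toy illustration of the general idea of reliable, NAK-driven file delivery over an interruptible link (the kind of graceful recovery the abstract mentions), not an implementation of Proximity-1 or CFDP. The segment size, the lossy-channel model, and the function names are all invented for this Python sketch.

    import random

    SEG_SIZE = 64  # bytes per segment (illustrative only)

    def segment(data: bytes):
        """Split a file into (offset, chunk) segments."""
        return [(i, data[i:i + SEG_SIZE]) for i in range(0, len(data), SEG_SIZE)]

    def lossy_send(segments, loss_rate=0.3):
        """Simulate a link that drops some segments (e.g. during an outage)."""
        return [s for s in segments if random.random() > loss_rate]

    def nak_based_transfer(data: bytes, loss_rate=0.3):
        """Deliver data reliably: send all segments, then retransmit whatever the receiver NAKs."""
        outstanding = segment(data)
        received = {}
        while True:
            for off, chunk in lossy_send(outstanding, loss_rate):
                received[off] = chunk                              # receiver stores what arrived
            missing = [off for off, _ in segment(data) if off not in received]
            if not missing:                                        # receiver reports completion
                break
            outstanding = [(off, data[off:off + SEG_SIZE]) for off in missing]  # NAK-driven resend
        return b"".join(received[off] for off in sorted(received))

    if __name__ == "__main__":
        payload = bytes(range(256)) * 8
        assert nak_based_transfer(payload) == payload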
  • 3
    Publication Date: 2004-12-03
    Description: Telecommunications is a critical component for any mission at Mars, as it is an enabling function that provides connectivity back to Earth and provides a means for conducting science. New developments in telecommunications, specifically in software-configurable radios, expand the possible approaches for science missions at Mars. These radios provide a flexible and re-configurable platform that can evolve with the mission and that provides an integrated approach to communications and science data processing. Deep space telecommunication faces challenges not normally faced by terrestrial and near-Earth communications. Radiation, thermal constraints, highly constrained mass, volume, packaging, and reliability are all significant issues. Additionally, once the spacecraft leaves Earth, there is no way to go out and upgrade or replace radio components. The reconfigurable software radio is an effort to provide not only a product that is immediately usable in the harsh space environment but also to develop a radio that will stay current as the years pass and technologies evolve.
    Keywords: Communications and Radar
    Type: Concepts and Approaches for Mars Exploration; Part 1; 150-151; LPI-Contrib-1062-Pt-1
    Format: text
  • 4
    Publication Date: 2004-12-03
    Description: We have completed a new generation of water vapor radiometers (WVR), the A- series, in order to support radio science experiments with the Cassini spacecraft. These new instruments sense three frequencies in the vicinity of the 22 GHz emission line of atmospheric water vapor within a 1 degree beamwidth from a clear aperture antenna that is co-pointed with the radio telescope down to 10 degree elevation. The radiometer electronics features almost an order of magnitude improvement in temperature stability compared with earlier WVR designs. For many radio science experiments, the error budget is likely to be dominated by path delay fluctuations due to variable atmospheric water vapor along the line-of-sight to the spacecraft. In order to demonstrate the performance of these new WVRs we are attempting to calibrate the delay fluctuations as seen by a radio interferometer operating over a 21 km baseline with a WVR near each antenna. The characteristics of these new WVRs will be described and the results of our preliminary analysis will be presented indicating an accuracy of 0.2 to 0.5 mm in tracking path delay fluctuations over time scales of 10 to 10,000 seconds.
    Keywords: Meteorology and Climatology
    Type: International VLBI Service for Geodesy and Astrometry: 2000 General Meeting Proceedings; 274-279; NASA/CP-2000-209893
    Format: text
  • 5
    Publication Date: 2004-12-03
    Description: Science education is taking the teaching of science from a traditional (lecture) approach to a multidimensional sense-making approach which allows teachers to support students by providing exploratory experiences. Using projects is one way of providing students with opportunities to observe and participate in sense-making activity. We created a learning environment that fostered inquiry-based learning. Students were engaged in a variety of inquiry activities that enabled them to work in cooperative planning teams where respect for each other was encouraged and their ability to grasp, transform and transfer information was enhanced. Summer, 1998: An air pollution workshop was conducted for high school students in the Medgar Evers College/Middle College High School Liberty Partnership Summer Program. Students learned the basics of meteorology: structure and composition of the atmosphere and the processes that cause weather. The highlight of this workshop was the building of hand-held sunphotometers, which measure the intensity of the sunlight striking the Earth. Summer, 1999: High school students conducted a research project which measured the mass and size of ambient particulates and enhanced our ability to observe, through land-based measurements, changes in the optical depth of ambient aerosols over Brooklyn. Students used hand-held sunphotometers to collect data over a two-week period and entered it into the NASA GISS database by way of the internet.
    Keywords: Meteorology and Climatology
    Type: Materials Presented at the MU-SPIN Ninth Annual Users' Conference; 33-36; NASA/CP-2000-209970
    Format: text
  • 6
    Publication Date: 2011-08-24
    Description: Rapid climate change characterizes numerous terrestrial sediment records during and since the last glaciation. Vegetational response is best expressed in terrestrial records near ecotones, where sensitivity to climate change is greatest, and response times are as short as decades.
    Keywords: Meteorology and Climatology
    Type: Proceedings of the National Academy of Sciences of the United States of America (ISSN 0027-8424); Volume 97; 4; 1359-61
    Format: text
  • 7
    Publication Date: 2011-08-24
    Description: Geological, geophysical, and geochemical data support a theory that Earth experienced several intervals of intense, global glaciation ("snowball Earth" conditions) during Precambrian time. This snowball model predicts that postglacial, greenhouse-induced warming would lead to the deposition of banded iron formations and cap carbonates. Although global glaciation would have drastically curtailed biological productivity, melting of the oceanic ice would also have induced a cyanobacterial bloom, leading to an oxygen spike in the euphotic zone and to the oxidative precipitation of iron and manganese. A Paleoproterozoic snowball Earth at 2.4 Giga-annum before present (Ga) immediately precedes the Kalahari Manganese Field in southern Africa, suggesting that this rapid and massive change in global climate was responsible for its deposition. As large quantities of O(2) are needed to precipitate this Mn, photosystem II and oxygen radical protection mechanisms must have evolved before 2.4 Ga. This geochemical event may have triggered a compensatory evolutionary branching in the Fe/Mn superoxide dismutase enzyme, providing a Paleoproterozoic calibration point for studies of molecular evolution.
    Keywords: Meteorology and Climatology
    Type: Proceedings of the National Academy of Sciences of the United States of America (ISSN 0027-8424); Volume 97; 4; 1400-5
    Format: text
  • 8
    Publication Date: 2011-08-23
    Description: The Texas A&M monthly total oceanic rainfall retrieval algorithm is based on radiative transfer models and can only be modified on a physically sound basis. Within this constraint we have examined some improvements to the algorithm and it appears that it can be made significantly better. In particular, it appears that by proper use of the range of frequencies available on TMI (TRMM Microwave Imager) and AMSR that the need for the log-normal fit can be eliminated.
    Keywords: Meteorology and Climatology
    Type: Microwave Remote Sensing of the Atmosphere and Environment II; Volume 4152; 235-242
    Format: text
  • 9
    Publication Date: 2011-08-23
    Description: Results of two lidar measurement campaigns are presented, HOLO-1 (Utah, March 1999) and HOLO-2 (New Hampshire, June 1999). These tests demonstrate the ability of lidars utilizing holographic optical elements (HOEs) to determine tropospheric wind velocity and direction at cloud altitude. Several instruments were employed. HOLO-1 used the 1064 nm transmission-HOE lidar (HARLIE, Goddard Space Flight Center), a zenith-staring 532 nm lidar (AROL-2, Utah State University), and a wide-field video camera (SkyCam) for imagery of clouds overhead. HOLO-2 included these instruments plus the 532 nm reflection-HOE lidar (PHASERS, St. Anselm College). HARLIE and PHASERS scan the sky at constant cone angles of 45 deg. and 42 deg. from normal, respectively. The progress of clouds and entire cloud fields across the sky is tracked by the repetitive conical scans of the HOE lidars. AROL-2 provides the altitude information enabling the SkyCam cloud images to be analyzed for independent data on cloud motion. Data from the HOE lidars are reduced by means of correlations, visualization by animation techniques, and kinematic diagrams of cloud feature motion. Excellent agreement is observed between the HOE lidar results and those obtained with video imagery and lidar ranging.
    Keywords: Communications and Radar
    Type: Lidar Remote Sensing for Industry and Environment Monitoring; Volume 4153; 63-68
    Format: text
  • 10
    Publication Date: 2011-08-23
    Description: We evaluated the performance of the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI) at-launch algorithm for monthly oceanic rain rate using two years (January 1998 - December 1999) of TMI data. The TMI at-launch algorithm is based on Wilheit et al.'s technique for estimating monthly oceanic rainfall that relies on histograms of multichannel microwave measurements. Comparisons with oceanic monthly rain rates derived from the Defense Meteorological Satellite Program (DMSP) F-13 and F-14 Special Sensor Microwave Imager (SSM/I) data show the average rain rates over the TRMM region (between 40 deg S and 40 deg N) are 3.0, 2.85 and 2.89 mm/day, respectively, for F-13, F-14 and TMI. Based on the latest version of the TB data (version 5), both the rain rate and freezing height derived from TMI are similar to those from the F-13 and F-14 SSM/I data. However, regionally the differences are statistically significant at the 95% confidence level. Three-hourly monthly rain rates are also computed from 3-hourly TB histograms to examine the diurnal cycle of precipitation. Over most of the oceanic TRMM area, a distinct early morning rainfall peak is found. A harmonic analysis shows that the amplitude of the 12h harmonic is significant and comparable to that of the 24h harmonic.
    Keywords: Meteorology and Climatology
    Type: Microwave Remote Sensing of the Atmosphere and Environment II; Volume 4152; 198-207
    Format: text
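    The diurnal-cycle result quoted above rests on fitting 24 h and 12 h harmonics to 3-hourly mean rain rates. A minimal least-squares sketch of that kind of harmonic analysis follows; the eight sample rain rates are invented for illustration and are not TMI data.

    import numpy as np

    # Eight 3-hourly mean rain rates (mm/day) over a diurnal cycle; values are illustrative only.
    rain = np.array([3.4, 3.6, 3.1, 2.7, 2.5, 2.6, 2.9, 3.2])
    t = np.arange(8) * 3.0  # local hour at the centre of each bin

    def harmonic_fit(y, t, periods=(24.0, 12.0)):
        """Least-squares fit of a mean plus one cosine/sine pair per period (hours)."""
        cols = [np.ones_like(t)]
        for p in periods:
            w = 2.0 * np.pi / p
            cols += [np.cos(w * t), np.sin(w * t)]
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        amps = {p: np.hypot(coef[1 + 2 * i], coef[2 + 2 * i]) for i, p in enumerate(periods)}
        return coef, amps

    coef, amps = harmonic_fit(rain, t)
    print("mean =", coef[0], "harmonic amplitudes:", amps)  # compare the 24 h and 12 h amplitudes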
  • 11
    Publication Date: 2011-08-24
    Description: A one-week in situ intercomparison campaign was completed on the Rice University campus for measuring HCHO using three different techniques, including a novel optical sensor based on difference frequency generation (DFG) operating at room temperature. Two chemical derivatization methods, 2,4-dinitrophenylhydrazine (DNPH) and o-(2,3,4,5,6-pentafluorobenzyl) hydroxylamine (PFBHA), were deployed during the daylight hours for three- to four-hour time-integrated samples. A real-time optical sensor based on laser absorption spectroscopy was operated simultaneously, including nighttime hours. This tunable spectroscopic source based on difference frequency mixing of two fiber-amplified diode lasers in periodically poled LiNbO3 (PPLN) was operated at 3.5315 micrometers (2831.64 cm-1) to access a strong HCHO ro-vibrational transition free of interferences from other species. The results showed a bias of -1.7 and -1.2 ppbv and a gross error of 2.6 and 1.5 ppbv for DNPH and PFBHA measurements, respectively, compared with DFG measurements. These results validate the DFG sensor for time-resolved measurements of HCHO in urban areas.
    Keywords: Meteorology and Climatology
    Type: Geophysical research letters (ISSN 0094-8276); Volume 27; 14; 2093-6
    Format: text
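    The bias and gross-error figures quoted above are simple paired-difference statistics. A minimal sketch follows, assuming "gross error" is taken as the mean absolute difference; the sample concentrations are invented, not campaign data.

    import numpy as np

    # Paired HCHO measurements (ppbv); values are invented for illustration.
    dnph = np.array([3.1, 4.0, 2.2, 5.5, 4.8])   # derivatization method
    dfg  = np.array([4.6, 5.9, 3.8, 7.4, 6.2])   # reference optical (DFG) sensor

    diff = dnph - dfg
    bias = diff.mean()                 # systematic offset relative to the DFG sensor
    gross_error = np.abs(diff).mean()  # one common definition of gross error
    print(f"bias = {bias:+.2f} ppbv, gross error = {gross_error:.2f} ppbv")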
  • 12
    Publication Date: 2013-08-31
    Description: Determination of the readiness of a spacecraft for launch is a critical requirement. The final assembly of all subsystems must be verified. Testing of a communications system can mostly be done using closed circuits (cabling to/from test ports), but the final connections to the antenna require radiation tests. The Tropical Rainfall Measuring Mission (TRMM) Project used a readily available 'near-field on-axis' equation to predict the values to be used for comparison with those obtained in a test program. Tests were performed in a 'clean room' environment at both Goddard Space Flight Center (GSFC) and in Japan at the Tanegashima Space Center (TnSC) launch facilities. Most of the measured values agreed with the predicted values to within 0.5 dB. This demonstrates that sometimes relatively simple techniques can be used to make antenna performance measurements when far-field ranges, anechoic chambers, or precision near-field ranges are neither available nor practical. Test data and photographs are provided.
    Keywords: Communications and Radar
    Type: Antenna Applications; Monticello, IL; United States
    Format: application/pdf
  • 13
    Publication Date: 2013-08-31
    Description: The process of designing and analyzing a multiple-reflector system has traditionally been time-intensive, requiring large amounts of both computational and human time. At many frequencies, a discrete approximation of the radiation integral may be used to model the system. The code which implements this physical optics (PO) algorithm was developed at the Jet Propulsion Laboratory. It analyzes systems of antennas in pairs, and for each pair, the analysis can be computationally time-consuming. Additionally, the antennas must be described using a local coordinate system for each antenna, which makes it difficult to integrate the design into a multi-disciplinary framework in which there is traditionally one global coordinate system, even before considering deforming the antenna as prescribed by external structural and/or thermal factors. Finally, setting up the code to correctly analyze all the antenna pairs in the system can take a fair amount of time, and introduces possible human error. The use of parallel computing to reduce the computational time required for the analysis of a given pair of antennas has been previously discussed. This paper focuses on the other problems mentioned above. It will present a methodology and examples of use of an automated tool that performs the analysis of a complete multiple-reflector system in an integrated multi-disciplinary environment (including CAD modeling, and structural and thermal analysis) at the click of a button. This tool, named MOD Tool (Millimeter-wave Optics Design Tool), has been designed and implemented as a distributed tool, with a client that runs almost identically on Unix, Mac, and Windows platforms, and a server that runs primarily on a Unix workstation and can interact with parallel supercomputers with simple instruction from the user interacting with the client.
    Keywords: Computer Programming and Software
    Format: application/pdf
  • 14
    Publication Date: 2013-08-31
    Description: Submillimeter-wave cloud ice radiometry is an innovative technique for determining the amount of ice present in cirrus clouds, measuring median crystal size, and constraining crystal shape. The radiometer described in this poster is being developed to acquire data to validate radiometric retrievals of cloud ice at submillimeter wavelengths. The goal of this effort is to develop a technique to enable spaceborne characterization of cirrus, meeting key climate modeling and NASA measurement needs.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 15
    Publication Date: 2013-08-31
    Description: A summary is presented of basic lightning characteristics/criteria applicable to current and future aerospace vehicles. The paper provides estimates of the probability of occurrence of a 200 kA peak lightning return current, should lightning strike an aerospace vehicle in various operational phases, i.e., roll-out, on-pad, launch, reentry/landing, and return-to-launch-site. A literature search was conducted for previous work concerning occurrence and measurement of peak lightning currents, modeling, and estimating the probabilities of launch vehicles/objects being struck by lightning. This paper presents a summary of these results.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 16
    Publication Date: 2016-06-07
    Description: A frequency sub-band based adaptive spectral subtraction algorithm is developed to remove noise from noise-corrupted speech signals. A single microphone is used to obtain both the noise-corrupted speech and the estimate of the statistics of the noise. The statistics of the noise are estimated during time frames that do not contain speech. These statistics are used to determine if future time frames contain speech. During speech time frames, the algorithm determines which frequency sub-bands contain useful speech information and which frequency sub-bands contain only noise. The frequency sub-bands, which contain only noise, are subtracted off at a larger proportion so the noise does not compete with the speech information. Simulation results are presented.
    Keywords: Communications and Radar
    Type: 1999 Research Reports: NASA/ASEE Summer Faculty Fellowship Program; 97-106; NASA/CR-1999-208586
    Format: application/pdf
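    The abstract above outlines the algorithm at a level a short sketch can illustrate: estimate the noise spectrum from speech-free frames, then subtract it more aggressively in sub-bands judged to contain only noise. The Python sketch below follows that idea, but the frame length, the energy-based per-bin speech test, and the over-subtraction factors are placeholders, not the paper's actual parameters.

    import numpy as np

    FRAME = 256  # samples per frame (assumed)

    def frames(x):
        n = len(x) // FRAME
        return x[:n * FRAME].reshape(n, FRAME)

    def denoise(noisy, noise_only):
        """Sub-band spectral subtraction: heavier subtraction where no speech is detected."""
        noise_mag = np.abs(np.fft.rfft(frames(noise_only), axis=1)).mean(axis=0)  # noise statistics
        out = []
        for frame in frames(noisy):
            spec = np.fft.rfft(frame)
            mag, phase = np.abs(spec), np.angle(spec)
            speech_band = mag > 2.0 * noise_mag          # crude per-bin speech test (assumed)
            alpha = np.where(speech_band, 1.0, 2.5)      # subtract noise-only bins more strongly
            clean = np.maximum(mag - alpha * noise_mag, 0.0)
            out.append(np.fft.irfft(clean * np.exp(1j * phase), n=FRAME))
        return np.concatenate(out)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        t = np.arange(4 * FRAME) / 8000.0
        speech = np.sin(2 * np.pi * 440 * t)             # stand-in for a speech signal
        noise = 0.5 * rng.standard_normal(t.size)
        cleaned = denoise(speech + noise, noise)
        print("noisy RMS error:", np.sqrt(np.mean(noise ** 2)),
              "cleaned RMS error:", np.sqrt(np.mean((cleaned - speech[:cleaned.size]) ** 2)))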
  • 17
    In:  CASI
    Publication Date: 2016-06-07
    Description: In this project we continued the development of a visual editor in the Java programming language to create screens on which to display real-time data. The data comes from the numerous systems monitoring the operation of the space shuttle while on the ground and in space, and from the many tests of subsystems. The data can be displayed on any computer platform running a Java-enabled World Wide Web (WWW) browser and connected to the Internet. Previously, a special-purpose program had been written to display data on emulations of character-based display screens used for many years at NASA. The goal now is to display bit-mapped screens created by a visual editor. We report here on the visual editor that creates the display screens. This project continues the work we had done previously. Previously we had followed the design of the 'beanbox,' a prototype visual editor created by Sun Microsystems. We abandoned this approach and implemented a prototype using a more direct approach. In addition, our prototype is based on newly released Java 2 graphical user interface (GUI) libraries. The result has been a visually more appealing appearance and a more robust application.
    Keywords: Computer Programming and Software
    Type: 1999 Research Reports: NASA/ASEE Summer Faculty Fellowship Program; 189-195; NASA/CR-1999-208586
    Format: application/pdf
  • 18
    Publication Date: 2016-06-07
    Description: This paper provides synopses of the design, implementation, and results of key high data rate communications experiments utilizing the technologies of NASA's Advanced Communications Technology Satellite (ACTS). Specifically, the network protocol and interoperability performance aspects will be highlighted. The objectives of these key experiments will be discussed in their relevant context for NASA missions, as well as for the broader communications industry. Discussion of the experiment implementation will highlight the technical aspects of hybrid network connectivity, a variety of high-speed interoperability architectures, a variety of network node platforms, protocol layers, internet-based applications, and new work focused on distinguishing between link errors and congestion. In addition, this paper describes the impact of leveraging government-industry partnerships to achieve technical progress and forge synergistic relationships. These relationships will be the key to success as NASA seeks to combine commercially available technology with its own internal technology developments to realize more robust and cost effective communications for space operations.
    Keywords: Communications and Radar
    Type: Proceeding of the Advanced Communications Technology Satellite (ACTS) Conference 2000; 151-157 and 301-310; NASA/CP-2000-210530
    Format: application/pdf
  • 19
    Publication Date: 2016-06-07
    Description: For ships at sea, satellites provide the only option for high data rate (HDR), long haul communications. Furthermore, the demand for HDR satellite communications (SATCOM) for military and commercial ships, and other offshore platforms, is increasing. Presently the bulk of this maritime HDR SATCOM connectivity is provided via C-band and X-band. However, the shipboard antenna sizes required to achieve a data rate of, say, T1 (1.544 Mbps) with present C-/X-band SATCOM systems range from seven to ten feet in diameter. This limits the classes of ships to which HDR services can be provided to those which are large enough to accommodate the massive antennas. With its high powered K/Ka-band spot beams, the National Aeronautics and Space Administration's (NASA) Advanced Communications Technology Satellite (ACTS) was able to provide T1 and higher rate services to ships at sea using much smaller shipboard antennas. This paper discusses three shipboard HDR SATCOM demonstrations that were conducted with ACTS between 1996 and 1998. The first demonstration involved a 2 Mbps link provided to the seismic survey ship M/V Geco Diamond, equipped with a 16-inch wide, 4.5-inch tall, mechanically steered slotted waveguide array antenna developed by the Jet Propulsion Laboratory. In this February 1996 demonstration ACTS allowed supercomputers ashore to process Geco Diamond's voluminous oceanographic seismic data in near real time. This capability allowed the ship to adjust its search parameters on a daily basis based on feedback from the processed data, thereby greatly increasing survey efficiency. The second demonstration was conducted on the US Navy cruiser USS Princeton (CG 59) with the same antenna used on Geco Diamond. Princeton conducted a six-month (January-July 1997) Western Hemisphere solo deployment during which time T1 connectivity via ACTS provided the ship with a range of valuable tools for operational, administrative and quality-of-life tasks. In one instance, video teleconferencing (VTC) via ACTS allowed the ship to provide life-saving emergency medical aid, assisted by specialists ashore, to a fellow mariner, the Master of a Greek cargo ship. The third demonstration set what is believed to be the all-time SATCOM data rate record to a ship at sea, 45 Mbps, in October 1998. This Lake Michigan (Chicago area) demonstration employed one of ACTS' fixed beams and involved the smallest of the three vessels, the 45-foot Bayliner M/V Entropy, equipped with a modified commercial-off-the-shelf one-meter antenna. A variety of multi-media services were provided to Entropy through a stressing range of sea states. These three demonstrations provided a preview of the capabilities that could be provided to future mariners on a more routine basis when K/Ka-band SATCOM systems are widely deployed.
    Keywords: Communications and Radar
    Type: Proceeding of the Advanced Communications Technology Satellite (ACTS) Conference 2000; 121-133 and 271-278; NASA/CP-2000-210530
    Format: application/pdf
  • 20
    Publication Date: 2016-06-07
    Description: New satellite communication systems are steadily seeking to use higher frequency bands to accommodate the requirements for additional capacity. At these higher frequencies, propagation impairments that did not significantly affect the signal at lower frequencies begin to have considerable impact. In Ka-band, the next logical commercial frequency band to be used for satellite communication, attenuation of the signal due to rain is a primary concern. An experimental satellite built by NASA, the Advanced Communications Technology Satellite (ACTS), launched in September 1993, is the first U.S. communication satellite operating in the Ka-band. In addition to higher carrier frequencies, a number of other new technologies, including on-board baseband processing, multiple beam antennas, and rain fade detection and compensation techniques, were designed into the ACTS. Verification experiments have been conducted since the launch to characterize the new technologies. The focus of this paper is to characterize the method used by the ACTS T1 Very Small Aperture Terminal (T1 VSAT) ground stations in detecting the presence of fade in the communication signal and adaptively compensating for it by the addition of burst rate reduction and forward error correction. Measured data obtained from the ACTS program were used to validate the compensation technique. A software process was developed and demonstrated to statistically characterize the increased availability achieved by the compensation techniques in terms of the bit error rate time enhancement factor. Several improvements to the ACTS technique are discussed and possible implementations for future Ka-band systems are offered.
    Keywords: Communications and Radar
    Type: Proceeding of the Advanced Communications Technology Satellite (ACTS) Conference 2000; 23-33 and 203-209; NASA/CP-2000-210530
    Format: application/pdf
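    A minimal sketch of the availability bookkeeping behind this kind of fade-compensation study: given a rain-attenuation time series, count the fraction of time the link margin covers the fade, with and without the extra margin gained from burst-rate reduction and coding. The attenuation statistics and the 3 dB / 10 dB margins below are invented numbers, not ACTS measurements.

    import numpy as np

    rng = np.random.default_rng(2)
    # Stand-in rain attenuation time series (dB), one sample per second over a day.
    atten = np.maximum(rng.gamma(shape=0.5, scale=2.0, size=86_400) - 0.5, 0.0)

    def availability(attenuation_db, margin_db):
        """Fraction of time the link margin covers the rain fade."""
        return float(np.mean(attenuation_db <= margin_db))

    clear_sky_margin = 3.0        # dB, uncompensated link (assumed)
    compensated_margin = 10.0     # dB, with burst-rate reduction + FEC gain (assumed)

    a0 = availability(atten, clear_sky_margin)
    a1 = availability(atten, compensated_margin)
    print(f"availability: {a0:.4f} -> {a1:.4f}")
    print(f"outage time reduced by a factor of {(1 - a0) / (1 - a1):.1f}")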
  • 21
    Publication Date: 2016-06-07
    Description: The Advanced Communications Technology Satellite (ACTS) Project invested heavily in prototype Ka-band satellite ground terminals to conduct an experiments program with the ACTS satellite. The ACTS experiments program proposed to validate Ka-band satellite and ground station technology, demonstrate future telecommunication services, demonstrate commercial viability and market acceptability of these new services, evaluate system networking and processing technology, and characterize Ka-band propagation effects, including development of techniques to mitigate signal fading. This paper will present a summary of the fixed ground terminals developed by the NASA Glenn Research Center and its industry partners, emphasizing the technology and performance of the terminals (Part 1) and the lessons learned throughout their six-year operation, including the inclined orbit phase of operations (Full Report). An overview of the Ka-band technology and components developed for the ACTS ground stations is presented. Next, the performance of the ground station technology and its evolution during the ACTS campaign are discussed to illustrate the technical tradeoffs made during the program and highlight technical advances by industry to support the ACTS experiments program and terminal operations. Finally, lessons learned during development and operation of the user terminals are discussed for consideration of commercial adoption into future Ka-band systems. The fixed ground stations used for experiments by government, academic, and commercial entities used reflector-based offset-fed antenna systems ranging in size from 0.35 m to 3.4 m antenna diameter. Gateway earth stations included two systems, referred to as the NASA Ground Station (NGS) and the Link Evaluation Terminal (LET). The NGS provides tracking, telemetry, and control (TT&C) and Time Division Multiple Access (TDMA) network control functions. The LET supports technology verification and high data rate experiments. The ground stations successfully demonstrated many services and applications at Ka-band in three different modes of operation: circuit-switched TDMA using the satellite on-board processor, satellite-switched SS-TDMA applications using the on-board Microwave Switch Matrix (MSM), and conventional transponder (bent-pipe) operation. Data rates ranged from 4.8 kbps up to 622 Mbps. Experiments included: 1) low rate (4.8 to 100s of kbps) remote data acquisition and control using small earth stations, 2) moderate rate (1-45 Mbps) experiments including full duplex voice and video conferencing and both full duplex and asymmetric data rate protocol and network evaluation using mid-size ground stations, and 3) link characterization experiments and high data rate (155-622 Mbps) terrestrial and satellite interoperability application experiments conducted by a consortium of experimenters using the large transportable ground stations.
    Keywords: Communications and Radar
    Type: Proceeding of the Advanced Communications Technology Satellite (ACTS) Conference 2000; 9-22 and 191-201; NASA/CP-2000-210530
    Format: application/pdf
  • 22
    Publication Date: 2016-06-07
    Description: This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal method tools and the development cycle used by software developers. The Java PathFinder tool which directly translates from Java to PROMELA was developed as part of this research, as well as automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one which was the focus of the first verification effort. A second quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper demonstrates that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that eventually will enable formal methods tools to be inserted directly into the aerospace software development cycle.
    Keywords: Computer Programming and Software
    Type: Lfm2000: Fifth NASA Langley Formal Methods Workshop; NASA/CP-2000-210100
    Format: application/pdf
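    The concurrency errors described above are found by exhaustively exploring a model's state space, which is what SPIN does over PROMELA models. The Python sketch below is a conceptual stand-in only: a brute-force search over a two-process, two-lock toy system that reports a reachable deadlock (a stuck, non-final state). It is not SPIN, Java PathFinder, or the flight software model.

    from collections import deque

    LOCKS = ("A", "B")
    # Process 0 acquires A then B; process 1 acquires B then A (the classic deadlock pattern).
    ORDER = {0: ("A", "B"), 1: ("B", "A")}

    def successors(state):
        """state = (pc0, pc1, holder_of_A, holder_of_B); yield every enabled next state."""
        pcs, held = list(state[:2]), dict(zip(LOCKS, state[2:]))
        for p in (0, 1):
            pc = pcs[p]
            if pc < 2:                                   # steps 0-1: try to acquire ORDER[p][pc]
                lock = ORDER[p][pc]
                if held[lock] is None:
                    new_held = dict(held)
                    new_held[lock] = p
                    new_pcs = list(pcs)
                    new_pcs[p] += 1
                    yield (new_pcs[0], new_pcs[1], new_held["A"], new_held["B"])
            elif pc == 2:                                # step 2: release both locks and terminate
                new_held = {k: (None if v == p else v) for k, v in held.items()}
                new_pcs = list(pcs)
                new_pcs[p] = 3
                yield (new_pcs[0], new_pcs[1], new_held["A"], new_held["B"])

    def find_deadlock():
        """Exhaustively explore the reachable state space (the search SPIN automates)."""
        start = (0, 0, None, None)
        seen, frontier = {start}, deque([start])
        while frontier:
            state = frontier.popleft()
            succs = list(successors(state))
            if not succs and state[:2] != (3, 3):        # stuck, but not both terminated: deadlock
                return state
            for nxt in succs:
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return None

    print("reachable deadlock:", find_deadlock())        # e.g. (1, 1, 0, 1): each holds one lock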
  • 23
    Publication Date: 2014-10-07
    Description: The goals of this study are the evaluation of current fast radiative transfer models (RTMs) and line-by-line (LBL) models. The intercomparison focuses on the modeling of 11 representative sounding channels routinely used at numerical weather prediction centers: seven HIRS (High-resolution Infrared Sounder) and four AMSU (Advanced Microwave Sounding Unit) channels. Interest in this topic was evidenced by the participation of 24 scientists from 16 institutions. An ensemble of 42 diverse atmospheres was used and results compiled for 19 infrared models and 10 microwave models, including several LBL RTMs. For the first time, not only radiances, but also Jacobians (of temperature, water vapor, and ozone) were compared to various LBL models for many channels. In the infrared, LBL models typically agree to within 0.05-0.15 K (standard deviation) in terms of top-of-the-atmosphere brightness temperature (BT). Individual differences up to 0.5 K still exist, systematic in some channels, and linked to the type of atmosphere in others. The best fast models emulate LBL BTs to within 0.25 K, but no model achieves this desirable level of success for all channels. The ozone modeling is particularly challenging. In the microwave, fast models generally do quite well against the LBL model to which they were tuned. However significant differences were noted among LBL models. Extending the intercomparison to the Jacobians proved very useful in detecting subtle and more obvious modeling errors. In addition, total and single gas optical depths were calculated, which provided additional insight on the nature of differences. Recommendations for future intercomparisons are suggested.
    Keywords: Meteorology and Climatology
    Format: application/pdf
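    A minimal sketch of the kind of statistics reported above: per-channel bias and standard deviation of fast-model minus LBL brightness temperatures over an ensemble of profiles, plus a finite-difference temperature Jacobian. The two "models" below are trivial stand-in functions, not HIRS/AMSU radiative transfer codes, and the profile ensemble is random.

    import numpy as np

    def fake_lbl_bt(profile):
        """Stand-in for a line-by-line model: one brightness temperature per channel."""
        return np.array([profile.mean(), profile[-1], 0.7 * profile[0] + 0.3 * profile.mean()])

    def fake_fast_bt(profile):
        """Stand-in fast model: the LBL stand-in plus a small channel-dependent error."""
        return fake_lbl_bt(profile) + np.array([0.05, -0.10, 0.02])

    rng = np.random.default_rng(1)
    profiles = 250.0 + 30.0 * rng.random((42, 40))      # 42 atmospheres (as in the study), 40 levels

    diff = np.array([fake_fast_bt(p) - fake_lbl_bt(p) for p in profiles])
    print("per-channel bias (K):", diff.mean(axis=0))
    print("per-channel std  (K):", diff.std(axis=0))

    def jacobian(model, profile, dT=0.1):
        """Finite-difference temperature Jacobian dBT/dT_k for one profile and one model."""
        base = model(profile)
        J = np.empty((base.size, profile.size))
        for k in range(profile.size):
            perturbed = profile.copy()
            perturbed[k] += dT
            J[:, k] = (model(perturbed) - base) / dT
        return J

    print("fast-model Jacobian shape:", jacobian(fake_fast_bt, profiles[0]).shape)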
  • 24
    Publication Date: 2013-08-29
    Description: Nearly three years of Tropical Rainfall Measuring Mission Satellite (TRMM Satellite) monthly estimates of tropical surface rainfall are analyzed to document and understand the differences among the TRMM-based estimates and how these differences relate to the pre-TRMM estimates and current operational analyses. Variation among the TRMM estimates is shown to be considerably smaller than among a pre-TRMM collection of passive microwave-based products. Use of both passive and active microwave techniques in TRMM should lead to increased confidence in converged estimates. Current TRMM estimates are shown to have a range of about 20% for the tropical ocean as a whole, with variations in heavily raining ocean areas of the ITCZ and SPCZ having differences over 30%. In mid-latitude ocean areas the differences are smaller. Over land there is a distinct difference between the tropics and mid-latitude with a reversal between some of the products as to which tends to be relatively high or low. Comparisons of TRMM estimates with ocean atoll and land gauge information point to products that might have significant regional biases. The radar-based product is significantly low biased compared with atoll raingauge data, while the passive microwave product is significantly high compared to raingauge data in the deep tropics. The evolution of rainfall patterns during the recent change from intense El Nino to a long period of La Nina and then a gradual return to near neutral conditions is described using TRMM. The time history of integrated rainfall over the tropical oceans (and land) during this period differs among the passive and active microwave TRMM estimates.
    Keywords: Meteorology and Climatology
    Type: Symposium on Cloud Systems, Hurricanes and TRMM; Unknown
    Format: application/pdf
  • 25
    Publication Date: 2013-08-29
    Description: Observations made by the Precipitation Radar (PR) and the Microwave Imager (TMI) radiometer on board the Tropical Rainfall Measuring Mission (TRMM) satellite help us to show the significance of the 85 GHz polarization difference, PD85, measured by TMI. Rain type, convective or stratiform, deduced from the PR allows us to infer that PD85 is generally positive in stratiform rain clouds, while PD85 can be markedly negative in deep convective rain clouds. Furthermore, PD85 increases in a gross manner as stratiform rain rate increases. On the contrary, in a crude fashion PD85 decreases as convective rain rate increases. From the observations of TMI and PR, we find that PD85 is a weak indicator of rain rate. Utilizing information from existing polarimetric radar studies, we infer that negative values of PD85 are likely associated with vertically-oriented small oblate or wet hail that are found in deep convective updrafts.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 26
    Publication Date: 2013-08-29
    Description: Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on-board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis on model-based autonomy, model-checking techniques, and concrete experiments at NASA.
    Keywords: Computer Programming and Software
    Format: application/pdf
  • 27
    Publication Date: 2013-08-29
    Description: Major droughts and floods over the U.S. continent may be related to a far field energy source in the Asian Pacific. This is illustrated by two climate patterns associated with summertime rainfall over the U.S. and large-scale circulation on interannual timescale. The first shows an opposite variation between the drought/flood over the Midwest and that over eastern and southeastern U.S., coupled to a coherent wave pattern spanning the entire East Asia-North Pacific-North America region related to the East Asian jetstream. The second shows a continental-scale drought/flood in the central U.S., coupled to a wavetrain linking Asian/Pacific monsoon region to North America.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 28
    Publication Date: 2013-08-29
    Description: A set of global, monthly rainfall products has been intercompared to understand the quality and utility of the estimates. The products include 25 observational (satellite-based), four model and two climatological products. The results of the intercomparison indicate a very large range (factor of two or three) of values when all products are considered. The range of values is reduced considerably when the set of observational products is limited to those considered quasi-standard. The model products do significantly poorer in the tropics, but are competitive with satellite-based fields in mid-latitudes over land. Over ocean, products are compared to the frequency of precipitation from ship observations. The evaluation of the observational products points to merged data products (including rain gauge information) as providing the overall best results.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 29
    Publication Date: 2013-08-29
    Description: We describe an event-based, publish-and-subscribe mechanism based on using 'smart subscriptions' to recognize weakly-structured events. We present a hierarchy of subscription languages (propositional, predicate, temporal and agent) and algorithms for efficiently recognizing event matches. This mechanism has been applied to the management of distributed applications.
    Keywords: Computer Programming and Software
    Type: Distributed Objects in Computational Science; Unknown
    Format: application/pdf
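    A minimal Python sketch of the propositional/predicate end of such a subscription-language hierarchy: subscriptions are predicates over weakly-structured events, and the broker delivers an event to every matching handler. The Broker class, the event shape, and the example subscriptions are illustrative assumptions, not the system described in the paper.

    from typing import Callable, Dict, List, Tuple

    Event = Dict[str, object]
    Predicate = Callable[[Event], bool]

    class Broker:
        """Event-based publish/subscribe with predicate ('smart') subscriptions."""
        def __init__(self):
            self._subs: List[Tuple[Predicate, Callable[[Event], None]]] = []

        def subscribe(self, predicate: Predicate, handler: Callable[[Event], None]):
            self._subs.append((predicate, handler))

        def publish(self, event: Event):
            for predicate, handler in self._subs:
                if predicate(event):          # subscription matches the weakly-structured event
                    handler(event)

    broker = Broker()
    # Propositional-style subscription: exact attribute test.
    broker.subscribe(lambda e: e.get("type") == "heartbeat",
                     lambda e: print("heartbeat from", e.get("host")))
    # Predicate-style subscription: arbitrary condition over event attributes.
    broker.subscribe(lambda e: e.get("type") == "load" and e.get("value", 0) > 0.9,
                     lambda e: print("overload on", e.get("host"), e["value"]))

    broker.publish({"type": "heartbeat", "host": "node-3"})
    broker.publish({"type": "load", "host": "node-7", "value": 0.95})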
  • 30
    Publication Date: 2013-08-29
    Description: Observational and modeling studies have described the relationships between convective/stratiform rain proportion and the vertical distributions of vertical motion, latent heating, and moistening in mesoscale convective systems. Therefore, remote sensing techniques which can quantify the relative areal proportion of convective and stratiform rainfall can provide useful information regarding the dynamic and thermodynamic processes in these systems. In the present study, two methods for deducing the convective/stratiform areal extent of precipitation from satellite passive microwave radiometer measurements are combined to yield an improved method. If sufficient microwave scattering by ice-phase precipitating hydrometeors is detected, the method relies mainly on the degree of polarization in oblique-view, 85.5 GHz radiances to estimate the area fraction of convective rain within the radiometer footprint. In situations where ice scattering is minimal, the method draws mostly on texture information in radiometer imagery at lower microwave frequencies to estimate the convective area fraction. Based upon observations of ten convective systems over ocean and nine systems over land, instantaneous 0.5 degree resolution estimates of convective area fraction from the Tropical Rainfall Measuring Mission Microwave Imager (TRMM TMI) are compared to nearly coincident estimates from the TRMM Precipitation Radar (TRMM PR). The TMI convective area fraction estimates are slightly low-biased with respect to the PR, with TMI-PR correlations of 0.78 and 0.84 over ocean and land backgrounds, respectively. TMI monthly-average convective area percentages in the tropics and subtropics from February 1998 exhibit the greatest values along the ITCZ and in continental regions of the summer (southern) hemisphere. Although convective area percentages from the TMI are systematically lower than those from the PR, monthly rain patterns derived from the TMI and PR rain algorithms are very similar. TMI rain depths are significantly higher than corresponding rain depths from the PR in the ITCZ, but are similar in magnitude elsewhere.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 31
    Publication Date: 2013-08-29
    Description: We built a direct detection Doppler lidar based on the double-edge molecular technique and made the first molecular based wind measurements using the eyesafe 355 nm wavelength. Three etalon bandpasses are obtained with Step etalons on a single pair of etalon plates. Long-term frequency drift of the laser and the capacitively stabilized etalon is removed by locking the etalon to the laser frequency. We use a low angle design to avoid polarization effects. Wind measurements of 1 to 2 m/s accuracy are obtained to 10 km altitude with 5 mJ of laser energy, a 750s integration, and a 25 cm telescope. Good agreement is obtained between the lidar and rawinsonde measurements.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 32
    Publication Date: 2013-08-29
    Description: We use clear-sky heating rates to show that convective outflow in the tropics decreases rapidly with height between the 350 K and 360 K potential temperature surfaces (or between roughly 13 and 15 km). There is also a rapid fall-off in the pseudoequivalent potential temperature probability distribution of near-surface air parcels between 350 K and 360 K. This suggests that the vertical variation of convective outflow in the upper tropical troposphere is to a large degree determined by the distribution of sub-cloud layer entropy.
    Keywords: Meteorology and Climatology
    Format: application/pdf
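    The 350 K and 360 K surfaces discussed above are defined by potential temperature, theta = T (p0/p)^(R/cp). For reference, a short sketch of that standard conversion applied to an illustrative tropical upper-troposphere sounding follows; the temperature and pressure values are invented, not data from the study.

    import numpy as np

    R_OVER_CP = 287.04 / 1004.0    # dry-air gas constant over specific heat at constant pressure
    P0 = 1000.0                    # reference pressure (hPa)

    def potential_temperature(T, p):
        """theta = T * (p0 / p) ** (R/cp), with T in kelvin and p in hPa."""
        return T * (P0 / np.asarray(p)) ** R_OVER_CP

    # Illustrative tropical upper-troposphere sounding (roughly 12-16 km).
    p = np.array([200.0, 175.0, 150.0, 125.0, 100.0])     # hPa
    T = np.array([220.0, 213.0, 207.0, 201.0, 196.0])     # K
    for pk, th in zip(p, potential_temperature(T, p)):
        print(f"p = {pk:5.0f} hPa  theta = {th:6.1f} K")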
  • 33
    Publication Date: 2013-08-29
    Description: A symposium celebrating the first 50 years of Dr. Joanne Simpson's career took place at the NASA/Goddard Space Flight Center from December 1 - 3, 1999. This symposium consisted of presentations that focused on: historical and personal points of view concerning Dr. Simpson's research career, her interactions with the American Meteorological Society, and her leadership in TRMM; scientific interactions with Dr. Simpson that influenced personal research; research related to observations and modeling of clouds, cloud systems and hurricanes; and research related to the Tropical Rainfall Measuring Mission (TRMM). There were a total of 36 presentations and 103 participants from the US, Japan and Australia. The specific presentations during the symposium are summarized in this paper.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 34
    Publication Date: 2013-08-29
    Description: We present results of simulations of the distribution of 1809 keV radiation from the decay of Al-26 in the Galaxy. Recent observations of this emission line using the Gamma Ray Imaging Spectrometer (GRIS) have indicated that the bulk of the Al-26 must have a velocity of approx. 500 km/s. We have previously shown that a velocity this large could be maintained over the 10(exp 6) year lifetime of the Al-26 if it is trapped in dust grains that are reaccelerated periodically in the ISM. Here we investigate whether a dust grain velocity of approx. 500 km/s will produce a distribution of 1809 keV emission in latitude that is consistent with the narrow distribution seen by COMPTEL. We find that dust grain velocities in the range 275 - 1000 km/s are able to reproduce the COMPTEL 1809 keV emission maps reconstructed using the Richardson-Lucy and Maximum Entropy image reconstruction methods, while the emission map reconstructed using the Multiresolution Regularized Expectation Maximization algorithm is not well fit by any of our models. The Al-26 production rate that is needed to reproduce the observed 1809 keV intensity yields a Galactic mass of Al-26 of approx. 1.5 - 2 solar masses, which is in good agreement with both other observations and theoretical production rates.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 35
    Publication Date: 2013-08-29
    Description: This study examines the uncertainty in forecasts of the January-February-March (JFM) mean extratropical circulation, and how that uncertainty is modulated by the El Nino/Southern Oscillation (ENSO). The analysis is based on ensembles of hindcasts made with an Atmospheric General Circulation Model (AGCM) forced with sea surface temperatures observed during the 1983 El Nino and 1989 La Nina events. The AGCM produces pronounced interannual differences in the magnitude of the extratropical seasonal mean noise (intra-ensemble variability). The North Pacific, in particular, shows extensive regions where the 1989 seasonal mean noise kinetic energy (SKE), which is dominated by a "PNA-like" spatial structure, is more than twice that of the 1983 forecasts. The larger SKE in 1989 is associated with a larger than normal barotropic conversion of kinetic energy from the mean Pacific jet to the seasonal mean noise. The generation of SKE due to sub-monthly transients also shows substantial interannual differences, though these are much smaller than the differences in the mean flow conversions. An analysis of the generation of monthly mean noise kinetic energy (NIKE) and its variability suggests that the seasonal mean noise is predominantly a statistical residue of variability resulting from dynamical processes operating on monthly and shorter time scales. A stochastically-forced barotropic model (linearized about the AGCM's 1983 and 1989 base states) is used to further assess the role of the basic state, submonthly transients, and tropical forcing in modulating the uncertainties in the seasonal AGCM forecasts. When forced globally with spatially-white noise, the linear model generates much larger variance for the 1989 base state, consistent with the AGCM results. The extratropical variability for the 1989 base state is dominated by a single eigenmode, and is strongly coupled with forcing over the tropical western Pacific and the Indian Ocean, again consistent with the AGCM results. Linear calculations that include forcing from the AGCM variance of the tropical forcing and submonthly transients show a small impact on the variability over the Pacific/North American region compared with that of the base state differences.
    Keywords: Meteorology and Climatology
    Format: application/pdf
  • 36
    Publication Date: 2013-08-29
    Description: Using global rainfall and sea surface temperature (SST) data for the past two decades (1979-1998), we have investigated the intrinsic modes of Asian summer monsoon (ASM) and ENSO co-variability. Three recurring ASM rainfall-SST coupled modes were identified. The first is a basin-scale mode that features SST and rainfall variability over the entire tropics (including the ASM region), identifiable with those occurring during El Nino or La Nina. This mode is further characterized by a pronounced biennial variation in ASM rainfall and SST associated with fluctuations of the anomalous Walker circulation that occur during El Nino/La Nina transitions. The second mode comprises mixed regional and basin-scale rainfall and SST signals, with pronounced intraseasonal and interannual variabilities. This mode features a SST pattern associated with a developing La Nina, with a pronounced low-level anticyclone in the subtropics of the western Pacific off the coast of East Asia. The third mode depicts an east-west rainfall and SST dipole across the southern equatorial Indian Ocean, most likely stemming from coupled ocean-atmosphere processes within the ASM region. This mode also possesses a decadal time scale and a linear trend, which are not associated with El Nino/La Nina variability. Possible causes of year-to-year rainfall variability over the ASM and sub-regions have been evaluated from a reconstruction of the observed rainfall from singular eigenvectors of the coupled modes. It is found that while basin-scale SST can account for portions of ASM rainfall variability during ENSO events (up to 60% in 1998), regional processes can account for up to 20-25% of the rainfall variability in typical non-ENSO years. A stronger monsoon-ENSO relationship tends to occur in the boreal summer immediately preceding a pronounced La Nina, i.e., 1998, 1988 and 1983. Based on these results, we discuss the possible impacts of the ASM on ENSO variability via the west Pacific anticyclone and articulate a hypothesis that anomalous wind forcings derived from the anticyclone may be instrumental in inducing a strong biennial modulation of natural ENSO cycles.
    Keywords: Meteorology and Climatology
    Format: application/pdf
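    Coupled rainfall-SST modes of the kind described above are commonly obtained from a singular value decomposition of the cross-covariance between the two anomaly fields (maximum covariance analysis), with rainfall then reconstructed from the leading singular vectors. The sketch below shows that generic machinery on random stand-in arrays with assumed shapes; it is not the authors' analysis code.

    import numpy as np

    rng = np.random.default_rng(0)
    nt, nx_rain, nx_sst = 20 * 12, 500, 800     # months x grid points (assumed shapes)
    rain = rng.standard_normal((nt, nx_rain))   # stand-in rainfall anomalies
    sst = rng.standard_normal((nt, nx_sst))     # stand-in SST anomalies

    # Remove the time mean, then SVD the cross-covariance matrix.
    rain -= rain.mean(axis=0)
    sst -= sst.mean(axis=0)
    C = rain.T @ sst / (nt - 1)                 # cross-covariance, shape (nx_rain, nx_sst)
    U, s, Vt = np.linalg.svd(C, full_matrices=False)

    scf = s ** 2 / np.sum(s ** 2)               # squared covariance fraction per coupled mode
    pc_rain = rain @ U[:, :3]                   # expansion coefficients of the leading modes
    pc_sst = sst @ Vt[:3].T
    recon = pc_rain @ U[:, :3].T                # rainfall reconstructed from 3 coupled modes

    print("leading squared-covariance fractions:", np.round(scf[:3], 3))
    print("fraction of rainfall variance reconstructed:",
          round(1 - np.var(rain - recon) / np.var(rain), 3))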
  • 37
    Publication Date: 2013-08-29
    Description: In general, there are two broad scientific objectives when using cloud resolving models (CRMs, or cloud ensemble models, CEMs) to study tropical convection. The first is to use them as physics-resolving models to understand the dynamic and microphysical processes associated with the tropical water and energy cycles and their role in the climate system. The second is to use the CRMs to improve the representation of moist processes and their interaction with radiation in large-scale models. In order to improve the credibility of the CRMs and achieve the above goals, CRMs using identical initial conditions and large-scale influences need to produce very similar results. Two CRMs produced different statistical equilibrium (SE) states even though both used the same initial thermodynamic and wind conditions. Sensitivity tests to identify the major physical processes that determine the SE states for the different CRM simulations were performed. Their results indicated that atmospheric horizontal wind is treated quite differently in these two CRMs. The model that had stronger surface winds, and consequently larger latent and sensible heat fluxes from the ocean, produced a warmer and more humid modeled thermodynamic SE state. In addition, the domain mean thermodynamic state is more unstable for those experiments that produced a warmer and more humid SE state. Their simulated wet (warm and humid) SE states are thermally more stable in the lower troposphere (from the surface to 4-5 km in altitude). The large-scale horizontal advective effects on temperature and water vapor mixing ratio are needed when using CRMs to perform long-term integrations to study convective feedback under specified large-scale environments. In addition, it is suggested that the dry and cold SE state simulated was caused by enhanced precipitation but not enough surface evaporation. We find some problems with the interpretation of these three phenomena.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 38
    Publication Date: 2013-08-29
    Description: The effectiveness of techniques for creating "bogus" vortices in numerical simulations of hurricanes is examined by using the Penn State/NCAR nonhydrostatic mesoscale model (MM5) and its adjoint system. A series of four-dimensional variational data assimilation (4-D VAR) experiments is conducted to generate an initial vortex for Hurricane Georges (1998) in the Atlantic Ocean by assimilating bogus sea-level pressure and surface wind information into the mesoscale numerical model; the general form of the cost function being minimized is sketched after this record. Several different strategies are tested for improving the vortex representation. The initial vortices produced by the 4-D VAR technique are able to reproduce many of the structural features of mature hurricanes. The vortices also result in significant improvements to the hurricane forecasts in terms of both intensity and track. In particular, with assimilation of only bogus sea-level pressure information, the response in the wind field is contained largely within the divergent component, with strong convergence leading to strong upward motion near the center. Although the intensity of the initial vortex seems to be well represented, a dramatic spin-down of the storm occurs within the first 6 h of the forecast. With assimilation of bogus surface wind data only, an expected dominance of the rotational component of the wind field is generated, but the minimum pressure is adjusted inadequately compared to the actual hurricane minimum pressure. Only when both the bogus surface pressure and wind information are assimilated together does the model produce a vortex that represents the actual intensity of the hurricane and results in significant improvements to forecasts of both hurricane intensity and track.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
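    Sketch: For reference, 4-D VAR assimilation of bogus sea-level pressure and surface wind "observations" of the kind described above minimizes a cost function of the standard form below; the notation is generic, and the specific background and observation error weights used with MM5 and its adjoint are not reproduced here.
      J(\mathbf{x}_0) = \tfrac{1}{2}(\mathbf{x}_0-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}_0-\mathbf{x}_b)
        + \tfrac{1}{2}\sum_{k=0}^{K}\bigl[H_k(\mathbf{x}_k)-\mathbf{y}_k\bigr]^{\mathsf T}\mathbf{R}_k^{-1}\bigl[H_k(\mathbf{x}_k)-\mathbf{y}_k\bigr],
        \qquad \mathbf{x}_k = M_{0\to k}(\mathbf{x}_0),
    where x_b is the background state, y_k are the bogus observations (sea-level pressure and/or surface winds) within the assimilation window, H_k is the observation operator, M is the forecast model, and B and R_k are the background and observation error covariances.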
  • 39
    Publication Date: 2013-08-29
    Description: This paper represents the first attempt to use TRMM rainfall information to estimate the four-dimensional latent heating structure over the global tropics for February 1998. The mean latent heating profiles over six oceanic regions (TOGA COARE IFA, Central Pacific, S. Pacific Convergence Zone, East Pacific, Indian Ocean and Atlantic Ocean) and three continental regions (S. America, Central Africa and Australia) are estimated and studied. Heating profiles obtained from diagnostic budget studies over a broad range of geographic locations are used to provide comparisons and indirect validation for the algorithm-estimated heating profiles. Three different latent heating algorithms, the Goddard Convective-Stratiform (CSH) heating, the Goddard Profiling (GPROF) heating, and the Hydrometeor heating (HH), are used and their results are intercompared. The horizontal distributions or patterns of latent heat release from the three different heating retrieval methods are quite similar. They all identify the areas of major convective activity (i.e., a well defined ITCZ in the Pacific, a distinct SPCZ) in the global tropics. The magnitudes of the estimated latent heat release are also in reasonable agreement with each other and with those determined from diagnostic budget studies. However, the major difference among these three heating retrieval algorithms is the altitude of the maximum heating level. The CSH algorithm's estimated heating profiles show only one maximum heating level, and the level varies with the convective activity at various geographic locations. These features are in good agreement with diagnostic budget studies. By contrast, two maximum heating levels were found using the GPROF heating and HH algorithms. The latent heating profiles estimated from all three methods cannot show cooling between active convective events. We also examined the impact of different TMI (Multi-channel Passive Microwave Sensor) and PR (Precipitation Radar) rainfall information on latent heating structures.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 40
    Publication Date: 2013-08-29
    Description: The NASA/GSFC Scanning Raman Lidar (SRL) was stationed on Andros Island in the Bahamas during August-September 1998 as a part of the third Convection and Moisture Experiment (CAMEX-3), which focused on hurricane development and tracking. During the period August 21-24, hurricane Bonnie passed near Andros Island and influenced the water vapor and cirrus cloud measurements acquired by the SRL. Two drying signatures related to the hurricane were recorded by the SRL and other sensors. Cirrus cloud optical depths (at 351 nm) were also measured during this period. Optical depth values ranged from approximately 0.01 to 1.4. The influence of multiple scattering on these optical depth measurements was studied, with the conclusion that the measured values of optical depth are less than the actual values by up to 20%. The UV/IR cirrus cloud optical depth ratio was estimated based on a comparison of lidar and GOES measurements. Simple radiative transfer model calculations compared with GOES satellite brightness temperatures indicate that satellite radiances are significantly affected by the presence of cirrus clouds if IR optical depths are approximately 0.02 or greater. This has implications for satellite cirrus detection requirements.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 41
    Publication Date: 2013-08-29
    Description: Central Florida is the ideal test laboratory for studying convergence-zone-induced convection. The region regularly experiences sea breeze fronts and rainfall-induced outflow boundaries. The focus of this study is the common yet poorly studied convergence zone established by the interaction of the sea breeze front and an outflow boundary. Previous studies have investigated mechanisms primarily affecting storm initiation by such convergence zones. Few have focused on rainfall morphology, yet these storms contribute a significant amount of precipitation to the annual rainfall budget. Low-level convergence and mid-tropospheric moisture have both been shown to correlate with rainfall amounts in Florida. Using 2D and 3D numerical simulations, the roles of low-level convergence and mid-tropospheric moisture in rainfall evolution are examined. The results indicate that the time-averaged vertical moisture flux (VMF) at the sea breeze front/outflow convergence zone (one plausible definition of which is sketched after this record) is directly and linearly proportional to initial condensation rates. This proportionality establishes a similar relationship between VMF and initial rainfall. Vertical moisture flux, which encompasses both the depth and the magnitude of convergence, is better correlated with initial rainfall production than surface moisture convergence. This extends early observational studies which linked rainfall in Florida to surface moisture convergence. The amount and distribution of mid-tropospheric moisture determine how rainfall associated with secondary cells develops. Rainfall amount and efficiency varied significantly over an observable range of relative humidities in the 850-500 mb layer even though rainfall evolution was similar during the initial or "first-cell" period. Rainfall variability was attributed to drier mid-tropospheric environments inhibiting secondary cell development through entrainment effects. Observationally, the 850-500 mb moisture structure exhibits wider variability than lower-level moisture, which is virtually always present in Florida. A likely consequence of the variability in 850-500 mb moisture is a stronger statistical correlation to rainfall, which observational studies have noted. The study indicates that vertical moisture flux forcing at convergence zones is critical in determining rainfall in the initial stage of development but plays a decreasing role in rainfall evolution as the system matures. The mid-tropospheric moisture (i.e., the environment) plays an increasing role in rainfall evolution as the system matures. This suggests the need to improve measurements of the magnitude/depth of convergence and the mid-tropospheric moisture distribution. It also highlights the need for better parameterization of entrainment and vertical moisture distribution in larger-scale models.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
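    Sketch: One plausible definition of the time-averaged vertical moisture flux (VMF) referred to above, assuming a column integral over the convergence zone; the averaging interval T and the integration limits are assumptions, since the abstract does not give them.
      \mathrm{VMF} = \frac{1}{T}\int_{0}^{T}\!\!\int_{z_s}^{z_t} \rho\, w\, q_v \; dz\, dt,
    where \rho is air density, w is vertical velocity, q_v is the water vapor mixing ratio, z_s is the surface, and z_t is the top of the layer considered.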
  • 42
    Publication Date: 2013-08-29
    Description: This paper shows that if one is provided with a loss function, it can be used in a natural way to specify a distance measure quantifying the similarity of any two supervised learning algorithms, even non-parametric algorithms. Intuitively, this measure gives the fraction of targets and training sets for which the expected performance of the two algorithms differs significantly; a toy numerical illustration of this quantity follows this record. Bounds on the value of this distance are calculated for the case of binary outputs and 0-1 loss, indicating that any two learning algorithms are almost exactly identical for such scenarios. As an example, for any two algorithms A and B, even for small input spaces and training sets, for less than 2e-50 of all targets will the difference between A's and B's generalization performance exceed 1%. In particular, this is true if B is bagging applied to A, or boosting applied to A. These bounds can be viewed alternatively as telling us, for example, that the simple English phrase 'I expect that algorithm A will generalize from the training set with an accuracy of at least 75% on the rest of the target' conveys 20,000 bytes of information concerning the target. The paper ends by discussing some of the subtleties of extending the distance measure to give a full (non-parametric) differential geometry of the manifold of learning algorithms.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
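    Sketch: A toy Monte Carlo illustration of the quantity bounded above: the fraction of random binary targets and training sets on a small input space for which two algorithms' off-training-set error differs by more than 1%. The two algorithms (1-nearest-neighbour and a majority-vote rule), the input-space size, and the uniform sampling are illustrative assumptions; the empirical fraction here will not reproduce the paper's analytic bounds, which average expected performance over targets.
      import itertools, random

      def majority(train):
          """Predict the majority label of the training set everywhere."""
          ones = sum(y for _, y in train)
          label = 1 if ones * 2 >= len(train) else 0
          return lambda x: label

      def nearest(train):
          """1-nearest-neighbour on bit strings (Hamming distance)."""
          def ham(a, b): return sum(i != j for i, j in zip(a, b))
          return lambda x: min(train, key=lambda t: ham(t[0], x))[1]

      def ots_error(predict, target, train_inputs, inputs):
          """Off-training-set 0-1 loss."""
          test = [x for x in inputs if x not in train_inputs]
          return sum(predict(x) != target[x] for x in test) / len(test)

      random.seed(0)
      n_bits, n_train, trials, eps = 4, 5, 2000, 0.01
      inputs = list(itertools.product([0, 1], repeat=n_bits))

      differs = 0
      for _ in range(trials):
          target = {x: random.randint(0, 1) for x in inputs}      # random binary target
          train_inputs = random.sample(inputs, n_train)
          train = [(x, target[x]) for x in train_inputs]
          eA = ots_error(nearest(train), target, set(train_inputs), inputs)
          eB = ots_error(majority(train), target, set(train_inputs), inputs)
          differs += abs(eA - eB) > eps

      print("fraction of (target, training set) pairs with |err_A - err_B| > 1%:",
            differs / trials)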
  • 43
    Publication Date: 2013-08-29
    Description: In this paper, we discuss our approach to making the behavior of planetary rovers more robust for the purpose of increased productivity. Due to the inherent uncertainty in rover exploration, the traditional approach to rover control is conservative, limiting the autonomous operation of the rover and sacrificing performance for safety. Our objective is to increase the science productivity possible within a single uplink by allowing the rover's behavior to be specified with flexible, contingent plans and by employing dynamic plan adaptation during execution. We have deployed a system exhibiting flexible, contingent execution; this paper concentrates on our ongoing efforts on plan adaptation. Plans can be revised in two ways: plan steps may be deleted, with execution continuing with the plan suffix; or the current plan may be merged with an "alternate plan" from an on-board library. The plan revision action is chosen to maximize the expected utility of the plan (a minimal sketch of this selection follows this record). Plan merging and action deletion are more conservative than a general-purpose planning system; in return, our approach is more efficient and more easily verified, two important criteria for deployed rovers.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
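    Sketch: A minimal illustration of choosing a plan revision (keep the plan, delete the next step and continue with the suffix, or merge an alternate plan from an on-board library) by maximizing expected utility. The plan representation, the utility numbers, and the prepend-style merge are assumptions for illustration, not the deployed system's data structures.
      # Each step is a dict {"name": str, "utility": float, "p_success": float} (assumed format).
      def expected_utility(plan):
          """Expected utility if execution aborts at the first failed step."""
          eu, p_reach = 0.0, 1.0
          for s in plan:
              eu += p_reach * s["p_success"] * s["utility"]
              p_reach *= s["p_success"]
          return eu

      def revise(current, library):
          """Pick the revision (keep, delete next step, or merge an alternate
          plan from the on-board library) with the highest expected utility."""
          candidates = {"keep": current}
          if current:
              candidates["delete"] = current[1:]           # continue with the suffix
          for name, alt in library.items():
              candidates["merge:" + name] = alt + current  # simple prepend-style merge
          best = max(candidates, key=lambda k: expected_utility(candidates[k]))
          return best, candidates[best]

      current = [{"name": "drive_to_rock", "utility": 2, "p_success": 0.6},
                 {"name": "spectrometer_read", "utility": 10, "p_success": 0.9}]
      library = {"panorama": [{"name": "take_panorama", "utility": 4, "p_success": 0.95}]}

      choice, plan = revise(current, library)
      print(choice, [s["name"] for s in plan])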
  • 44
    Publication Date: 2013-08-29
    Description: This paper proposes that the distinguishing characteristic of Aspect-Oriented Programming (AOP) systems is that they allow programming by making quantified programmatic assertions over programs written by programmers oblivious to such assertions (a toy illustration of quantification over oblivious code follows this record). Thus, AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantification allowed, the nature of the actions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are expressive enough to allow programming an AOP system within them. A corollary is that while AOP can be applied to Object-Oriented Programming, it is an independent concept applicable to other programming styles.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
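    Sketch: A toy illustration of the quantified-assertion view of AOP: an assertion quantified over "all accessor methods" is woven into a class whose author is oblivious to it. The weaving mechanism below is a simple Python introspection exercise, not any particular AOP system's implementation.
      import functools

      class Telescope:
          """Base code written with no knowledge of the aspect below (obliviousness)."""
          def get_azimuth(self):   return 42.0
          def get_elevation(self): return 17.5
          def slew(self, az, el):  return f"slewing to {az}, {el}"

      def weave_logging(cls, predicate):
          """Quantified assertion: for every method of cls satisfying `predicate`,
          combine the base action with a logging action (a toy weaving mechanism)."""
          for name in dir(cls):
              attr = getattr(cls, name)
              if callable(attr) and not name.startswith("_") and predicate(name):
                  @functools.wraps(attr)
                  def wrapped(self, *args, _base=attr, _name=name, **kw):
                      print(f"[aspect] calling {_name}")
                      return _base(self, *args, **kw)
                  setattr(cls, name, wrapped)
          return cls

      # Quantification: "all accessor methods", asserted over code oblivious to it.
      weave_logging(Telescope, predicate=lambda n: n.startswith("get"))

      t = Telescope()
      t.get_azimuth()   # logged by the injected advice
      t.slew(10, 20)    # not quantified over; unchanged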
  • 45
    Publication Date: 2013-08-29
    Description: In this first of two papers, strong limits on the accuracy of physical computation are established. First it is proven that there cannot be a physical computer C to which one can pose any and all computational tasks concerning the physical universe. Next it is proven that no physical computer C can correctly carry out any computational task in the subset of such tasks that can be posed to C. This result holds whether the computational tasks concern a system that is physically isolated from C, or instead concern a system that is coupled to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly 'processing information faster than the universe does'. The results also mean that there cannot exist an infallible, general-purpose observation apparatus, and that there cannot be an infallible, general-purpose control apparatus. These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - a definition of 'physical computation' - is needed to address the issues considered in these papers. While this definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. The second in this pair of papers presents a preliminary exploration of some of this mathematical structure, including in particular that of prediction complexity, which is a 'physical computation analogue' of algorithmic information complexity. It is proven in that second paper that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 46
    Publication Date: 2013-08-29
    Description: Chao's numerical and theoretical work on multiple quasi-equilibria of the intertropical convergence zone (ITCZ) and the origin of monsoon onset is extended to solve two additional puzzles. One is the highly nonlinear dependence on latitude of the "force" acting on the ITCZ due to the earth's rotation, which makes the multiple quasi-equilibria of the ITCZ and monsoon onset possible. The other is the dramatic difference in such dependence when different cumulus parameterization schemes are used in a model. Such a difference can lead to a switch between a single ITCZ at the equator and a double ITCZ when a different cumulus parameterization scheme is used. Sometimes one of the double ITCZs diminishes and only the other remains, but this can still mean different latitudinal locations for the single ITCZ. A single idea solves both puzzles: it rests on two off-equator attractors for the ITCZ, due to the earth's rotation and symmetric with respect to the equator, and on the dependence of the strength and size of these attractors on the cumulus parameterization scheme. The origin of these rotational attractors, explained in Part I, is further discussed. The "force" acting on the ITCZ due to the earth's rotation is the sum of the "forces" of the two attractors. Each attractor exerts on the ITCZ a "force" of simple shape in latitude, but the sum gives a shape that varies strongly with latitude. Also, the strength and the domain of influence of each attractor vary when a change is made in the cumulus parameterization. This gives rise to the high sensitivity of the "force" shape to cumulus parameterization. Numerical results from experiments using Goddard's GEOS general circulation model that support this idea are presented. It is also found that the model results are sensitive to changes outside of the cumulus parameterization. The significance of this study to El Nino forecasting, and to tropical forecasting in general, is discussed.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 47
    Publication Date: 2013-08-29
    Description: The use of soft computing techniques for phase synchronization in coherent communications provides an alternative to analytical or hard computing methods. This paper discusses a novel use of Adaptive Neuro-Fuzzy Inference Systems (ANFIS) for phase synchronization in coherent communications systems utilizing Multiple Phase Shift Keying (M-PSK) modulation. A brief overview of the M-PSK digital communications bandpass modulation technique is presented, and its need for phase synchronization is discussed. We briefly describe the hybrid platform developed by Jang that incorporates fuzzy/neural structures, namely the Adaptive Neuro-Fuzzy Inference System (ANFIS). We then discuss the application of ANFIS to phase estimation for M-PSK (a classical non-fuzzy baseline estimator is sketched after this record for comparison). The modeling of both explicit and implicit phase estimation schemes for M-PSK symbols with unknown structure is discussed. Performance results from simulation of the above scheme are presented.
    Keywords: Communications and Radar
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
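    Sketch: The code below is not the ANFIS estimator discussed above; it is the classical M-th power feedforward phase estimator for M-PSK, included only as a conventional baseline that makes the phase-synchronization problem concrete. The modulation order, symbol count, and SNR are arbitrary assumptions.
      import numpy as np

      rng = np.random.default_rng(1)
      M, n_sym, snr_db = 4, 2000, 15          # QPSK example (assumed parameters)
      true_offset = 0.30                      # unknown carrier phase (radians)

      # Generate M-PSK symbols, rotate by the unknown offset, and add complex noise
      symbols = np.exp(1j * 2 * np.pi * rng.integers(0, M, n_sym) / M)
      noise_std = 10 ** (-snr_db / 20) / np.sqrt(2)
      r = symbols * np.exp(1j * true_offset) + noise_std * (
              rng.standard_normal(n_sym) + 1j * rng.standard_normal(n_sym))

      # M-th power estimator: raising to the M-th power removes the data modulation,
      # leaving M times the phase offset (ambiguous modulo 2*pi/M).
      est = np.angle(np.mean(r ** M)) / M
      print(f"true offset {true_offset:.3f} rad, estimate {est:.3f} rad")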
  • 48
    facet.materialart.
    Unknown
    In:  CASI
    Publication Date: 2013-08-29
    Description: This column will be provided each quarter as a source for reliability, radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter starts a series of notes concentrating on analysis techniques, with this issue's section discussing worst-case analysis requirements.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 49
    Publication Date: 2013-08-29
    Description: The 1997-1999 ENSO period was very powerful, but also well observed. Multiple satellite rainfall estimates combined with gauge observations allow for a quantitative analysis of precipitation anomalies in the tropics and elsewhere accompanying the 1997-99 ENSO cycle. An examination of the evolution of the El Nino and accompanying precipitation anomalies revealed that a dry Maritime Continent preceded the formation of positive SST anomalies in the eastern Pacific Ocean. 30-60 day oscillations in the winter of 1996/97 may have contributed to this lag relationship. Furthermore, westerly wind burst events may have maintained the drought over the Maritime Continent. The warming of the equatorial Pacific was then followed by an increase in convection. A rapid transition from El Nino to La Nina occurred in May 1998, but as early as October-November 1997 precipitation indices captured substantial changes in Pacific rainfall anomalies. The global precipitation patterns for this event were in good agreement with the strong consistent ENSO-related precipitation signals identified in earlier studies. Differences included a shift in precipitation anomalies over Africa during the 1997-98 El Nino and unusually wet conditions over northeast Australia during the later stages of the El Nino. Also, the typically wet region in the north tropical Pacific was mostly dry during the 1998-99 La Nina. Reanalysis precipitation was compared to observations during this time period and substantial differences were noted. In particular, the model had a bias towards positive precipitation anomalies and the magnitudes of the anomalies in the equatorial Pacific were small compared to the observations. Also, the evolution of the precipitation field, including the drying of the Maritime Continent and eastward progression of rainfall in the equatorial Pacific was less pronounced for the model compared to the observations.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 50
    Publication Date: 2013-08-29
    Description: This paper describes a translator called JAVA PATHFINDER from JAVA to PROMELA, the "programming language" of the SPIN model checker. The purpose is to establish a framework for verification and debugging of JAVA programs based on model checking (a toy illustration of such a source-to-source mapping follows this record). This work should be seen as part of a broader attempt to make formal methods applicable "in the loop" of programming within NASA's areas such as space, aviation, and robotics. Our main goal is to create automated formal methods such that programmers themselves can apply these in their daily work (in the loop) without the need for specialists to manually reformulate a program into a different notation in order to analyze it. This work is a continuation of an effort to formally verify, using SPIN, a multi-threaded operating system programmed in Lisp for the Deep Space 1 spacecraft, and of previous work in applying existing model checkers and theorem provers to real applications.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
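    Sketch: A toy, hand-rolled mapping from a tiny Java-like model to PROMELA text, meant only to convey the flavor of source-to-source model extraction; it is not the rule set used by the JAVA PATHFINDER translator, and the input representation is invented for illustration.
      # Toy mapping from a Java-like fragment to PROMELA (illustrative only).
      java_model = {
          "globals": [("int", "counter", "0")],
          "threads": {"Worker": {"count": 2,
                                 "body": ["synchronized { counter = counter + 1 }"]}},
          "property": "counter <= 2",
      }

      def to_promela(model):
          lines = [f"{t} {name} = {init};" for t, name, init in model["globals"]]
          for name, th in model["threads"].items():
              lines.append(f"active [{th['count']}] proctype {name}() {{")
              for stmt in th["body"]:
                  # Java synchronized blocks are approximated here by PROMELA atomic blocks.
                  lines.append("  " + stmt.replace("synchronized", "atomic"))
              lines.append("}")
          # The correctness property becomes a monitor process with an assertion.
          lines.append("active proctype Monitor() {")
          lines.append(f"  assert({model['property']})")
          lines.append("}")
          return "\n".join(lines)

      print(to_promela(java_model))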
  • 51
    Publication Date: 2013-08-29
    Description: The predictability of the 1997 and 1998 south Asian summer monsoon winds is examined from an ensemble of 10 Atmospheric General Circulation Model (AGCM) simulations with prescribed sea surface temperatures (SSTs) and soil moisture. The simulations are started in September 1996 so that they have lost all memory of the atmospheric initial conditions for the periods of interest. The model simulations show that the 1998 monsoon is considerably more predictable than the 1997 monsoon. During May and June of 1998 the predictability of the low-level wind anomalies is largely associated with a local response to anomalously warm Indian Ocean SSTs. Predictability increases late in the season (July and August) as a result of the strengthening of the anomalous Walker circulation and the associated development of easterly low-level wind anomalies that extend westward across India and the Arabian Sea. During these months the model is also the most skillful, with the observations showing a similar late-season westward extension of the easterly wind anomalies. The model shows little predictability or skill in the low-level winds over southeast Asia during 1997. Predictable wind anomalies do occur over the western Indian Ocean and Indonesia; however, over the Indian Ocean they are a response to SST anomalies that were wind driven, and they show no skill. The reduced predictability in the low-level winds during 1997 appears to be the result of a weaker (compared with 1998) simulated anomalous Walker circulation, while the reduced skill is associated with pronounced intraseasonal activity that is not well captured by the model. Remarkably, the model does produce an ensemble-mean Madden-Julian Oscillation (MJO) response that is approximately in phase with (though weaker than) the observed MJO anomalies. This is consistent with the idea that SST coupling may play an important role in the MJO.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    Publication Date: 2013-08-29
    Description: This article is partly a review and partly a new research paper on the monsoon-ENSO relationship. The paper begins with a discussion of the basic relationship between the Indian monsoon and ENSO, dating back to the work of Sir Gilbert Walker and continuing up to research results in more recent years. Various factors that may affect the monsoon-ENSO relationship, including regional coupled ocean-atmosphere processes, Eurasian snow cover, land-atmosphere hydrologic feedback, intraseasonal oscillation, biennial variability, and inter-decadal variations, are discussed. The extremely complex and highly nonlinear nature of the monsoon-ENSO relationship is stressed. We find that for regional impacts on the monsoon, El Nino and La Nina are far from simply mirror images of each other. These two polarities of ENSO can have strong impacts, or none, on monsoon anomalies depending on the strength of the intraseasonal oscillations and the phases of the inter-decadal variations. For the Asian-Australian monsoon (AAM) as a whole, the ENSO impact is effected through an east-west shift in the Walker Circulation. For rainfall anomalies over specific monsoon areas, regional processes play important roles in addition to the shift in the Walker Circulation. One of the key regional processes identified for the boreal summer monsoon is the anomalous West Pacific Anticyclone (WPA). This regional feature has similar signatures on interannual and intraseasonal time scales and appears to determine whether the monsoon-ENSO relationship is strong or weak in a given year. Another important regional feature is a rainfall and SST dipole across the Indian Ocean, which may have a strong impact on the austral summer monsoon. Results are shown indicating that monsoon surface wind forcing may induce a strong biennial signal in ENSO and that strong monsoon-ENSO coupling may translate into pronounced biennial variability in ENSO. Finally, a new paradigm is proposed for the study of monsoon variability. This paradigm provides a unified framework in which monsoon predictability, the role of regional vs. basin-scale processes, the monsoon's relationship with different climate subsystems, and the causes of secular changes in the monsoon-ENSO relationship can be investigated.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    Publication Date: 2013-08-29
    Description: We describe the Object Infrastructure Framework (OIF), a system that seeks to simplify the creation of distributed applications by injecting behavior on the communication paths between components. We touch on some of the "ilities" and services that can be achieved with injector technology, and then focus on the uses of redirecting injectors: injectors that take requests directed at a particular server and generate requests directed at others (a toy analogue is sketched after this record). We close by noting that OIF is an Aspect-Oriented Programming system, and by comparing OIF to related work.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
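    Sketch: A toy Python analogue of a redirecting injector: a proxy placed on the client/server communication path that injects behavior (logging) and, on failure, redirects the request to an alternate server. The class and method names are invented; this is not the OIF API.
      class PrimaryServer:
          def fetch(self, key):  return f"primary:{key}"

      class MirrorServer:
          def fetch(self, key):  return f"mirror:{key}"

      class RedirectingInjector:
          """Sits on the client/server communication path and injects behavior
          (here: logging and failover redirection) without changing either side."""
          def __init__(self, target, alternate):
              self._target, self._alternate = target, alternate

          def __getattr__(self, name):
              target_method = getattr(self._target, name)
              alt_method = getattr(self._alternate, name)
              def injected(*args, **kwargs):
                  print(f"[injector] {name}{args}")          # injected 'ility': logging
                  try:
                      return target_method(*args, **kwargs)
                  except Exception:
                      # Redirecting injector: re-issue the request to another server.
                      print(f"[injector] redirecting {name} to alternate server")
                      return alt_method(*args, **kwargs)
              return injected

      client_view = RedirectingInjector(PrimaryServer(), MirrorServer())
      print(client_view.fetch("spacecraft/telemetry"))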
  • 54
    Publication Date: 2013-08-29
    Description: Testing involves commercial radio equipment approved for export and use in Canada. Testing was conducted in the Canadian High Arctic, where hilly terrain provided worst-case test conditions. SFU and Canadian governmental agencies made significant technical contributions. The only technical data related to radio testing was exchanged with SFU. The test protocols are standard radio tests performed by communication technicians worldwide. The Joint Fields Operations objectives included the following: (1) to provide Internet communications services for field science work and mobile exploration systems; (2) to evaluate the range and throughput of three different medium-range radio link technologies for providing coverage of the crater area; and (3) to demonstrate collaborative software such as NetMeeting with multi-point video for exchange of scientific information between the remote node, the base camp, and science centers as part of communications testing.
    Keywords: Communications and Radar
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 55
    Publication Date: 2013-08-29
    Description: FutureFlight Central will permit integration of tomorrow's technologies in a risk-free simulation of any airport, airfield, and tower cab environment. The facility provides an opportunity for airlines to mitigate passenger delays by fine-tuning airport hub operations, gate management, and ramp movement procedures. It also gives airport managers an opportunity to study the effects of various improvements at their airports. Finally, it enables air traffic controllers to provide feedback and to become familiar with new airport operations and technologies before final installation.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    Publication Date: 2013-08-29
    Description: Surface profiles were generated by a fractal algorithm and haptically rendered on a force feedback joystick (a sketch of one common fractal-profile synthesis method follows this record). Subjects were asked to use the joystick to explore pairs of surfaces and report to the experimenter which of the surfaces they felt was rougher. Surfaces were characterized by their root mean square (RMS) amplitude and their fractal dimension. The most important factor affecting the perceived roughness of the fractal surfaces was the RMS amplitude of the surface. When comparing surfaces of fractal dimension 1.2-1.35, it was found that the fractal dimension was negatively correlated with perceived roughness.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
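    Sketch: One common way to synthesize a 1-D fractal surface profile with a prescribed fractal dimension and RMS amplitude is spectral synthesis, shown below. The paper does not specify its generation algorithm, so the spectral-exponent relation beta = 5 - 2D and all parameter values are assumptions.
      import numpy as np

      def fractal_profile(n, fractal_dim, rms, rng):
          """Spectral-synthesis sketch of a 1-D fractal surface profile.
          For a fractional-Brownian-type profile the power spectrum falls off as
          f**(-beta) with beta = 5 - 2*D (assumed relation); the profile is then
          rescaled to the requested RMS amplitude."""
          beta = 5.0 - 2.0 * fractal_dim
          freqs = np.fft.rfftfreq(n, d=1.0)
          amp = np.zeros_like(freqs)
          amp[1:] = freqs[1:] ** (-beta / 2.0)            # amplitude ~ f^(-beta/2)
          phases = rng.uniform(0, 2 * np.pi, freqs.size)
          profile = np.fft.irfft(amp * np.exp(1j * phases), n)
          profile -= profile.mean()
          return profile * (rms / np.std(profile))

      rng = np.random.default_rng(0)
      rough = fractal_profile(4096, fractal_dim=1.35, rms=1.0, rng=rng)
      smooth = fractal_profile(4096, fractal_dim=1.20, rms=1.0, rng=rng)
      print(rough.std(), smooth.std())   # both rescaled to the same RMS amplitude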
  • 57
    Publication Date: 2013-08-29
    Description: A 1999 study reports an advancement of spring in Europe by 0.2 days per year in the 30 years since 1960. Our analysis indicates that this trend results directly from a change in the late-winter surface winds over the eastern North Atlantic: the southwesterly direction became more dominant, and the speed of these southwesterlies increased slightly. Splitting the 52-year NCEP reanalysis dataset into the First Half, FH (1948-1973), and the Second Half, SH (1974-1999), we analyze the wind direction for the February mean at three sites at 45N: site A at 30W, site B at 20W, and site C at 10W. The incidence (number of years) of southwesterlies in SH versus FH at these sites increased as follows: 24 (18), 19 (12), 14 (11); whereas the incidence of northeasterlies decreased: 0 (2), 1 (2), and 1 (6). When the February mean wind is southwesterly, the monthly mean sensible heat flux from the ocean at these sites takes zero or slightly negative values; that is, the surface air is warmer than the ocean. Analyzing the scenario in the warm late winter of 1990, we observe that the sensible heat flux from the ocean surface in February 1990 shows a "tongue" of negative values extending southwest from southern England to 7N. This indicates that the source of the maritime air advected into Europe lies to the south of the "tongue." Streamline analysis suggests that the southwestern or south-central North Atlantic is the source. For February 1990, we find strong ascending motions over Europe at 700 mb, up to -0.4 Pa/s as monthly averages. Associated with the unstable low levels of the troposphere are positive rain and cloud anomalies. Thus, positive in situ feedback over land in late winter (when shortwave absorption is not significant) apparently further enhances the surface temperature through an increase in the greenhouse effect due to increased water vapor and cloudiness.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    Publication Date: 2013-08-29
    Description: It is a long-held fundamental belief that the basic cause of a monsoon is land-sea thermal contrast on the continental scale. Through general circulation model experiments we demonstrate that this belief should be changed. The Asian and Australian summer monsoon circulations remain largely intact in an experiment in which Asia, the Maritime Continent, and Australia are replaced by ocean. It is also shown that the change resulting from such replacement is in general due more to the removal of topography than to the removal of land-sea contrast. Therefore, land-sea contrast plays only a minor modifying role in the Asian and Australian summer monsoons. The same holds for the Central American summer monsoon. However, the same cannot be said of the African and South American summer monsoons. In the Asian and Australian winter monsoons, land-sea contrast also plays only a minor role. Our interpretation of the origin of the monsoon is that the summer monsoon is the result of the ITCZ's (intertropical convergence zone's) peak being substantially (more than 10 degrees) away from the equator. The origin of the ITCZ has been previously interpreted by Chao. The circulation around the thus-located ITCZ, previously interpreted by Chao and Chen through the modified Gill solution and briefly described in this paper, explains the monsoon circulation. The longitudinal location of the ITCZs is determined by the distribution of surface conditions. ITCZs favor locations of higher SST, as in the western Pacific and Indian Ocean, or tropical landmasses, due to land-sea contrast, as in tropical Africa and South America. Thus, the role of landmass in the origin of the monsoon can be replaced by ocean of sufficiently high SST. Furthermore, the ITCZ circulation extends into the tropics of the other hemisphere to give rise to the winter monsoon circulation there. Also, through the equivalence of land-sea contrast and higher SST, it is argued that the basic monsoon onset mechanism proposed by Chao is valid for all monsoons.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    Publication Date: 2013-08-29
    Description: It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Publication Date: 2013-08-29
    Description: Radar has been proposed as a way to track wake vortices to reduce aircraft spacing, and tests have revealed radar echoes from aircraft wakes in clear air. These results have generally been interpreted qualitatively using Tatarski's theory of weak scattering by isotropic atmospheric turbulence. The goal of the present work was to predict the value of the radar cross-section (RCS) using simpler models. This is accomplished in two steps. First, the refractive index is obtained. Since the structure of aircraft wakes is different from atmospheric turbulence, three simple mechanisms specific to vortex wakes are considered: (A) the radial density gradient in a two-dimensional vortex, (B) three-dimensional fluctuations in the vortex cores, and (C) adiabatic transport of the atmospheric fluid in a two-dimensional oval surrounding the pair of vortices. The index of refraction is obtained more precisely for the two-dimensional mechanisms than for the three-dimensional one. In the second step, knowing the index of refraction, a scattering analysis is performed. Tatarski's weak scattering approximation is kept, but the usual assumptions of a far field and a uniform incident wave are dropped. Neither assumption is generally valid for a wake that is coherent across the radar beam. For analytical insight, a simpler approximation that invokes, in addition to weak scattering, the far-field and wide cylindrical beam assumptions, is also developed and compared with the more general analysis. The predicted RCS values for the oval surrounding the vortices (mechanism C) agree with the experiments of Bilson conducted over a wide range of frequencies. However, the predictions have a cut-off away from normal incidence which is not present in the measurements. Estimates suggest that this is due to turbulence in the baroclinic vorticity generated at the boundary of the oval. The reflectivity of a vortex itself (mechanism A) is comparable to that of the oval (mechanism C) but cuts off at frequencies lower than those considered in the experiments to date. The RCS of a vortex happens to peak at the frequency (about 49 MHz) where atmospheric radars (known as ST radars) operate, and so the present prediction could be verified in the future. Finally, we suggest that hot engine exhaust could increase the RCS by 40 dB and reveal vortex circulation, provided its mixing with the surroundings is prevented in the laminarising flow of the vortices.
    Keywords: Communications and Radar
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 61
    Publication Date: 2013-08-29
    Description: We evaluated the impact of several newly available sources of meteorological data on mesoscale model forecasts of precipitation produced by the extra-tropical cyclone that struck Florida on February 2, 1998. Precipitation distributions of convective rainfall events were derived from Special Sensor Microwave Imager (SSM/I) and Multi-Channel Passive Microwave Sensor (TMI) microwave radiometric data by means of the Goddard PROFiling (GPROF) algorithm. Continuous lightning distributions were obtained from sferics measurements made by a network of VLF radio receivers. Histograms of coincident sferics frequency distributions were matched to those of precipitation to derive bogus convective rainfall rates from the continuously available sferics measurements. SSM/I and TMI microwave data were used to derive Integrated Precipitable Water (IPW) distributions. The TMI also provided sea surface temperatures (SSTs) of the Loop Current and Gulf Stream with improved structural detail. A series of experiments assimilated IPW and latent heating from the bogus convective rainfall for six hours in the MM5 mesoscale forecast model to produce nine-hour forecasts of all rainfall as well as other weather parameters. Although continuously assimilating latent heating only slightly improved the surface pressure distribution forecast, it significantly improved the precipitation forecasts. Correctly locating convective rainfall was found to be critical for assimilating latent heating in the forecast model, but measurement of the rainfall intensity proved to be less important. The improved SSTs also had a positive impact on rainfall forecasts for this case. Assimilating bogus rainfall in the model produced nine-hour forecasts of radar reflectivity distributions that agreed well with coincident observations from the TRMM spaceborne precipitation radar, ground-based radar, and spaceborne microwave measurements.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 62
    Publication Date: 2013-08-29
    Description: We believe that the next evolutionary step in supporting wide-area application and services delivery to customers is a network framework that provides for collocation of applications and services at distinct sites in the network, an interconnection between these sites that is performance-optimized for these applications, and value-added services for applications. We use the term IsoWAN to describe an advanced, isolated network interconnect services framework that will enable applications to be more secure and to be accessible and usable in both local and remote environments. The main functions of an IsoWAN are virtual localization of application services, an application service interface, coordinated delivery of applications and associated data to the customer, and support for collaborative application development for customers. An initial pilot network between three NASA Centers (Ames Research Center, the Jet Propulsion Laboratory, and Marshall Space Flight Center) has been built, and its properties will be discussed.
    Keywords: Communications and Radar
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 63
    Publication Date: 2013-08-29
    Description: In the first of this pair of papers, it was proven that there cannot be a physical computer to which one can properly pose any and all computational tasks concerning the physical universe. It was then further proven that no physical computer C can correctly carry out all computational tasks that can be posed to C. As a particular example, this result means that there cannot be a physical computer that can, for any physical system external to that computer, take the specification of that external system's state as input and then correctly predict its future state before that future state actually occurs; one cannot build a physical computer that can be assured of correctly "processing information faster than the universe does". These results do not rely on systems that are infinite, and/or non-classical, and/or obey chaotic dynamics. They also hold even if one uses an infinitely fast, infinitely dense computer, with computational powers greater than that of a Turing Machine. This generality is a direct consequence of the fact that a novel definition of computation - "physical computation" - is needed to address the issues considered in these papers, which concern real physical computers. While this novel definition does not fit into the traditional Chomsky hierarchy, the mathematical structure and impossibility results associated with it have parallels in the mathematics of the Chomsky hierarchy. This second paper of the pair presents a preliminary exploration of some of this mathematical structure. Analogues of Chomskian results concerning universal Turing Machines and the Halting theorem are derived, as are results concerning the (im)possibility of certain kinds of error-correcting codes. In addition, an analogue of algorithmic information complexity, "prediction complexity", is elaborated. A task-independent bound is derived on how much the prediction complexity of a computational task can differ for two different reference universal physical computers used to solve that task, a bound similar to the "encoding" bound governing how much the algorithmic information complexity of a Turing machine calculation can differ for two reference universal Turing machines. Finally, it is proven that either the Hamiltonian of our universe proscribes a certain type of computation, or prediction complexity is unique (unlike algorithmic information complexity), in that there is one and only one version of it that can be applicable throughout our universe.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    Publication Date: 2013-08-29
    Description: We consider the design of multi-agent systems so as to optimize an overall world utility function when (1) those systems lack centralized communication and control, and (2) each agent runs a distinct Reinforcement Learning (RL) algorithm. A crucial issue in such design problems is how to initialize/update each agent's private utility function so as to induce the best possible world utility. Traditional 'team game' solutions to this problem sidestep this issue and simply assign to each agent the world utility as its private utility function. In previous work we used the 'Collective Intelligence' framework to derive a better choice of private utility functions, one that results in world utility performance up to orders of magnitude superior to that ensuing from use of the team game utility (a toy illustration of such utility shaping follows this record). In this paper we extend these results. We derive the general class of private utility functions that are both easy for the individual agents to learn and that, if learned well, result in high world utility. We demonstrate experimentally that using these new utility functions can result in significantly improved performance over that of our previously proposed utility, over and above that previous utility's superiority to the conventional team game utility.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
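    Sketch: A toy congestion game illustrating the idea of shaping private utilities rather than handing every agent the world utility. The "difference utility" used below follows the earlier Collective Intelligence work referred to above; it is not the new utility class derived in the paper, and the game, learning rule, and parameters are invented for illustration.
      import numpy as np

      rng = np.random.default_rng(0)
      n_agents, n_resources, n_steps, eps = 60, 6, 3000, 0.1

      def world_utility(counts):
          # World utility is higher when load is spread evenly (negative squared loads).
          return -np.sum(counts ** 2)

      q = np.zeros((n_agents, n_resources))            # per-agent action values
      for _ in range(n_steps):
          greedy = rng.random(n_agents) > eps
          actions = np.where(greedy, q.argmax(axis=1),
                             rng.integers(0, n_resources, n_agents))
          counts = np.bincount(actions, minlength=n_resources)
          G = world_utility(counts)
          for i, a in enumerate(actions):
              counts_without_i = counts.copy()
              counts_without_i[a] -= 1                 # remove agent i from the world
              D_i = G - world_utility(counts_without_i)   # difference utility
              q[i, a] += 0.1 * (D_i - q[i, a])         # simple value update
      print("final world utility:", world_utility(counts))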
  • 65
    Publication Date: 2013-08-29
    Description: The NASA/GSFC Scanning Raman Lidar (SRL) was stationed on Andros Island in the Bahamas during August-September 1998 as a part of the third Convection and Moisture Experiment (CAMEX-3), which focused on hurricane development and tracking. During the period August 21-24, hurricane Bonnie passed near Andros Island and influenced the water vapor and cirrus cloud measurements acquired by the SRL. Two drying signatures related to the hurricane were recorded by the SRL and other sensors. Cirrus cloud optical depths (at 351 nm) were also measured during this period. Optical depth values ranged from less than 0.01 to 1.5. The influence of multiple scattering on these optical depth measurements was studied. A correction technique is presented which minimizes the influences of multiple scattering and derives information about cirrus cloud optical and physical properties. The UV/IR cirrus cloud optical depth ratio was estimated based on a comparison of lidar and GOES measurements. Simple radiative transfer model calculations compared with GOES satellite brightness temperatures indicate that satellite radiances are significantly affected by the presence of cirrus clouds if IR optical depths are approximately 0.005 or greater. Using the ISCCP detection threshold for cirrus clouds on the GOES data presented here, a high bias of up to 40% in the GOES precipitable water retrieval was found.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 66
    Publication Date: 2013-08-29
    Description: Quantitative use of satellite-derived maps of monthly rainfall requires some measure of the accuracy of the satellite estimates. The rainfall estimate for a given map grid box is subject to both remote-sensing error and, in the case of low-orbiting satellites, sampling error due to the limited number of observations of the grid box provided by the satellite. A simple model of rain behavior predicts that the root-mean-square (RMS) random error in grid-box averages should depend in a simple way on the local average rain rate, and the predicted behavior has been seen in simulations using surface rain-gauge and radar data. This relationship was examined using satellite SSM/I data obtained over the western equatorial Pacific during TOGA COARE. RMS error inferred directly from SSM/I rainfall estimates was found to be larger than predicted from surface data, and to depend less on local rain rate than was predicted. Preliminary examination of TRMM microwave estimates shows better agreement with surface data. A simple method of estimating RMS error in satellite rainfall estimates is suggested, based on quantities that can be directly computed from the satellite data; a sketch of one way to fit such an error-versus-rain-rate relationship follows this record.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
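    Sketch: One way to fit a simple error-versus-rain-rate relationship of the kind suggested above, using quantities computable from collocated estimates. The power-law form RMS = a * rbar**b, the synthetic "collocated" data, and the binning choices are assumptions for illustration; they are not the paper's model or data.
      import numpy as np

      rng = np.random.default_rng(2)

      # Placeholder "collocated" monthly grid-box values: a surface-based reference
      # rain rate and a satellite estimate with a rate-dependent random error.
      n_boxes = 500
      r_ref = rng.gamma(shape=2.0, scale=3.0, size=n_boxes)        # mm/day
      sat = r_ref + rng.standard_normal(n_boxes) * 0.8 * np.sqrt(r_ref)

      # Bin by reference rain rate and compute the RMS difference per bin.
      bins = np.quantile(r_ref, np.linspace(0, 1, 11))
      idx = np.clip(np.digitize(r_ref, bins) - 1, 0, 9)
      rbar = np.array([r_ref[idx == k].mean() for k in range(10)])
      rms = np.array([np.sqrt(np.mean((sat - r_ref)[idx == k] ** 2)) for k in range(10)])

      # Fit the assumed simple form  RMS = a * rbar**b  in log-log space.
      b, log_a = np.polyfit(np.log(rbar), np.log(rms), 1)
      print(f"fitted RMS error model: {np.exp(log_a):.2f} * rbar^{b:.2f}")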
  • 67
    Publication Date: 2013-08-29
    Description: The tropical cyclone rainfall climatology study that was performed for the North Pacific was extended to the North Atlantic. As in the North Pacific study, mean monthly rainfall within 444 km of the centers of North Atlantic tropical cyclones (i.e., those that reached storm stage or greater) was estimated from passive microwave satellite observations during an eleven-year period. These satellite-observed rainfall estimates were used to assess the impact of tropical cyclone rainfall in altering the geographical, seasonal, and inter-annual distribution of the North Atlantic total rainfall during June-November, when tropical cyclones are most abundant. The main results from this study indicate: 1) tropical cyclones contribute, respectively, 4%, 3%, and 4% of the rainfall in the western, eastern, and entire North Atlantic; 2) similar to what was observed in the North Pacific, the maximum in North Atlantic tropical cyclone rainfall is approximately 5-10 deg poleward (depending on longitude) of the maximum non-tropical cyclone rainfall; 3) tropical cyclones contribute regionally a maximum of 30% of the total rainfall northeast of Puerto Rico, within a region near 15 deg N 55 deg W, and off the west coast of Africa; 4) there is no lag between the months with maximum tropical cyclone rainfall and non-tropical cyclone rainfall in the western North Atlantic, while in the eastern North Atlantic maximum tropical cyclone rainfall precedes maximum non-tropical cyclone rainfall; 5) like the North Pacific, North Atlantic tropical cyclones of hurricane intensity generate the greatest amount of rainfall in the higher latitudes; and 6) warm ENSO events inhibit tropical cyclone rainfall.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    Publication Date: 2013-08-29
    Description: The mechanism of the quasi-biennial tendency in the El Nino Southern Oscillation (ENSO)-monsoon coupled system is investigated using an intermediate coupled model. The monsoon wind forcing is prescribed as a function of sea surface temperature (SST) anomalies, based on the relationship between zonal wind anomalies over the western Pacific and sea level change in the equatorial eastern Pacific. The key mechanism of the quasi-biennial tendency in El Nino evolution is found to lie in the strong coupling of ENSO to monsoon wind forcing over the western Pacific. Strong boreal summer monsoon wind forcing, which lags the maximum SST anomaly in the equatorial eastern Pacific by approximately 6 months, tends to generate Kelvin waves of the opposite sign to anomalies in the eastern Pacific and initiates the turnabout in the eastern Pacific. Boreal winter monsoon forcing, which has zero lag with the maximum SST in the equatorial eastern Pacific, tends to damp the ENSO oscillations.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 69
    Publication Date: 2013-08-29
    Description: Idealized numerical simulations are performed with a coupled atmosphere/land-surface model to identify the roles of initial soil moisture, coastline curvature, and land breeze circulations in sea-breeze-initiated precipitation. Data collected on 27 July 1991 during the Convection and Precipitation Electrification Experiment (CAPE) in central Florida are used. The 3D Goddard Cumulus Ensemble (GCE) cloud resolving model is coupled with the Goddard Parameterization for Land-Atmosphere-Cloud Exchange (PLACE) land surface model, thus providing a tool to simulate more realistically land-surface/atmosphere interaction and convective initiation. Eight simulations are conducted with either straight or curved coastlines, initially homogeneous or initially variable soil moisture, and initially homogeneous horizontal winds or initially variable horizontal winds (land breezes). All model simulations capture the diurnal evolution and general distribution of sea-breeze-initiated precipitation over central Florida. The distribution of initial soil moisture influences the timing, intensity, and location of subsequent precipitation. Soil moisture acts as a moisture source for the atmosphere, increases the convective available potential energy, and thus preferentially focuses heavy precipitation over existing wet soil. Strong soil-moisture-induced mesoscale circulations are not evident in these simulations. Coastline curvature has a major impact on the timing and location of precipitation. Earlier low-level convergence occurs inland of convex coastlines, and subsequent precipitation occurs earlier in simulations with curved coastlines. The presence of initial land breezes alone has little impact on subsequent precipitation; however, simulations with both coastline curvature and initial land breezes produce significantly larger peak rain rates due to nonlinear interactions.
    Keywords: Meteorology and Climatology
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 70
    Publication Date: 2013-08-31
    Description: The numerical simulation of precipitation helps scientists understand the complex mechanisms that determine how and why rainfall is distributed across the globe. Simulation aids in the development of forecasting efforts that inform policies regarding the management of water resources. Precipitation modeling also provides short-term warnings for emergencies such as flash floods and mudslides. Just as precipitation modeling can warn of an impending abundance of rainfall, it can help anticipate the absence of rainfall in drought. What constitutes a drought? A meteorological drought simply means that an area is getting a significantly lower amount of rain than usual over a prolonged period of time, whereas an agricultural drought is based on the level of soil moisture.
    Keywords: Meteorology and Climatology
    Type: 2000 NCCS Highlights: Enabling NASA Earth and Space Sciences; 56-65
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 71
    facet.materialart.
    Unknown
    In:  CASI
    Publication Date: 2013-08-31
    Description: This paper presents aspects of programming language transformations that were unknown in the early 1980s.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 72
    Publication Date: 2013-08-31
    Description: An aerosol is any small particle of matter that remains suspended in the atmosphere. Natural sources, such as deserts, create some aerosols; consumption of fossil fuels and industrial activity creates others. All the microscopic aerosol particles add up to a large amount of material floating in the atmosphere. You can see the particles in the haze that floats over polluted cities. Beyond this visible effect, aerosols can actually lower temperatures. They do this by blocking, or scattering, a portion of the sun's energy from reaching the surface. Because of this influence, scientists study the physical properties of atmospheric aerosols. Reliable numerical models for atmospheric aerosols play an important role in this research.
    Keywords: Meteorology and Climatology
    Type: 2000 NCCS Highlights: Enabling NASA Earth and Space Science; 38-45
    Format: application/pdf
  • 73
    In:  CASI
    Publication Date: 2013-08-31
    Description: How are scientists going to make use of the Internet several years from now? This is a case study of a leading-edge experiment in building a 'virtual institute' -- using electronic communication tools to foster collaboration among geographically dispersed scientists. Our experience suggests: scientists will want to use web-based document management systems; there will be a demand for Internet-enabled meeting support tools; while Internet videoconferencing will have limited value for scientists, webcams will be in great demand as a tool for transmitting pictures of objects and settings rather than "talking heads"; and a significant share of scientists who do fieldwork will embrace mobile voice, data, and video communication tools. The setting for these findings is a research consortium called the NASA Astrobiology Institute.
    Keywords: Computer Programming and Software
    Format: application/pdf
  • 74
    Publication Date: 2013-08-31
    Description: A method is developed for semi-automated classification of SAR images of the tropical forest. Information is extracted using the wavelet transform (WT). The transform allows for extraction of structural information in the image as a function of scale. In order to classify the SAR image, a Decision Tree Classifier is used. Pruning is used to optimize classification rate versus tree size. The results give explicit insight into the type of information useful for a given class.
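    A minimal sketch of this kind of pipeline, assuming the PyWavelets and scikit-learn libraries: per-scale energies of the wavelet detail coefficients serve as features, and a cost-complexity-pruned decision tree does the classification. The gamma-distributed patches and labels below are synthetic stand-ins for labeled SAR training data, not the data used in the paper.

        import numpy as np
        import pywt
        from sklearn.tree import DecisionTreeClassifier

        def wavelet_features(patch, levels=3):
            # Per-scale energy of the detail coefficients: structural information vs. scale.
            coeffs = pywt.wavedec2(patch, 'haar', level=levels)
            feats = []
            for cH, cV, cD in coeffs[1:]:
                for c in (cH, cV, cD):
                    feats.append(np.log1p(np.mean(c ** 2)))
            return np.array(feats)

        rng = np.random.default_rng(0)
        patches = rng.gamma(2.0, 1.0, size=(40, 32, 32))   # synthetic stand-in for SAR patches
        labels  = rng.integers(0, 2, size=40)              # synthetic class labels

        X = np.vstack([wavelet_features(p) for p in patches])
        clf = DecisionTreeClassifier(ccp_alpha=0.01)       # pruning trades classification rate vs. tree size
        clf.fit(X, labels)
        print(clf.predict(X[:5]))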
    Keywords: Communications and Radar
    Format: application/pdf
  • 75
    Publication Date: 2013-08-31
    Description: The purpose of this paper is to provide a description of the NASA JPL Distributed Systems Technology (DST) Section's object-oriented component approach to open, interoperable systems software development and software reuse. It will address what is meant by the term object component software, give an overview of the component-based development approach and how it relates to infrastructure support of software architectures and promotes reuse, enumerate the benefits of this approach, and give examples of application prototypes demonstrating its usage and advantages. Utilization of the object-oriented component technology approach for system development and software reuse will apply to several areas within JPL, and possibly across other NASA Centers.
    Keywords: Computer Programming and Software
    Format: application/pdf
  • 76
    Publication Date: 2013-08-31
    Description: Radiation belts around planets have sufficient high-energy electron flux to penetrate spacecraft skins and statically charge insulators inside electronic boxes. For example, geosynchronous-orbit Earth spacecraft require 100 mils of aluminum shielding to sufficiently attenuate the fast electron flux. Electrons are stopped and accumulate slowly in the insulating materials to produce strong electric fields. Typically the field strength reaches a threshold for occasional spontaneous discharge in the insulating material. The field strength remains high yet pulsing is infrequent. Charge can leak off if the insulator is sufficiently leaky. The conductivity of insulators in ground service is usually controlled by mobile ions such as H and OH. In space the mobile ions are eventually out-gassed. The resistivity of several insulators is known to increase by more than three decades (three orders of magnitude) after exposure to vacuum for several months. Insulators in space have been seen to pulse more frequently as they aged.
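    To make the resistivity effect concrete, the small sketch below evaluates the charge-relaxation time constant of a leaky dielectric, tau = eps0 * eps_r * rho; the permittivity and resistivity values are assumed for illustration and are not measurements from the paper.

        EPS0 = 8.854e-12            # vacuum permittivity, F/m
        eps_r = 3.0                 # assumed relative permittivity of a typical polymer
        for rho in (1e15, 1e18):    # ohm-m: as-delivered vs. after months of vacuum outgassing (assumed)
            tau = EPS0 * eps_r * rho
            print(f"rho = {rho:.0e} ohm-m  ->  charge-relaxation time ~ {tau / 3600.0:.1f} hours")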
    Keywords: Communications and Radar
    Format: application/pdf
  • 77
    Publication Date: 2011-08-23
    Description: During the TEFLUN-B (Texas-Florida under-flights for TRMM) field experiment of August-September, 1998, a number of ER-2 aircraft flights with a host of microwave instruments were conducted over many convective storms, including some hurricanes, in the coastal region of Florida and Texas. These instruments include MIR (Millimeter-wave Imaging Radiometer), AMPR (Advanced Microwave Precipitation Radiometer), and EDOP (ER-2 Doppler Radar). EDOP is operated at the frequency of 9.7 GHz, while the AMPR and the MIR together give eleven channels of radiometric measurements in the frequency range of 10-340 GHz. The concurrent measurements from these instruments provide unique data sets for studying the details of the microphysics of hydrometeors. Preliminary examination of these data sets shows features that are generally well understood; i.e., radiometric measurements at frequencies less than or equal to 37 GHz mainly respond to rain, while those at frequencies greater than or equal to 150 GHz, to ice particles above the freezing level. Model calculations of brightness temperature and radar reflectivity are performed and results compared with these measurements. For simplicity the analysis is limited to the anvil region of the storms where hydrometeors are predominantly frozen. Only one ice particle size distribution is examined in the calculations of brightness temperature and radar reflectivity in this initial study. Estimation of ice water path is made based on the best agreement between the measurements and calculations of brightness temperature and reflectivity. Problems associated with these analyses and measurement accuracy will be discussed.
    Keywords: Meteorology and Climatology
    Type: Microwave Remote Sensing of the Atmosphere and Environment II; Volume 4152; 25-32
    Format: text
  • 78
    Publication Date: 2011-08-23
    Description: We have studied the influence of the geometrical interaction of different detectors with the impinging optical/laser received beam for a wide range of laser sensing applications. Although different techniques apply, it is found that similar aspects of geometrical physics play a role in direct detection with a range-resolved, large M(sup 2) OPO atmospheric lidar, in heterodyne multi-detector reception of atmospheric-turbulence-distorted coherent-lidar laser sensing, and in the distribution and summation of laser-induced fluorescence signals after being spectrally resolved with a spectrometer and detected by a column-summing CCD detector. In each of these systems, the focused received light is spatially and spectrally distributed due to several factors including field-of-view considerations, laser beam quality/divergence, multi-detector aspects, and hardware and software summation (coherent and non-coherent) of multi-element or spatially integrated signals. This invited talk will present some of our recent results in these areas and show the similarities in the detector spatial and temporal summation techniques of these different laser sensing systems.
    Keywords: Communications and Radar
    Type: Lidar Remote Sensing for Industry and Environment Monitoring; Volume 4153; 13-20
    Format: text
  • 79
    Publication Date: 2011-08-23
    Description: GLOW (Goddard Lidar Observatory for Winds) is a mobile Doppler lidar system which uses direct detection Doppler lidar techniques to measure wind profiles from the surface into the lower stratosphere. The system is contained in a modified van to allow deployment in field operations. The lidar system uses a Nd:YAG laser transmitter to measure winds using either aerosol backscatter at 1064 nm or molecular backscatter at 355 nm. The receiver telescope is a 45 cm Dall-Kirkham which is fiber coupled to separate Doppler receivers, one optimized for the aerosol backscatter wind measurement and another optimized for the molecular backscatter wind measurement. The receivers are implementations of the 'double edge' technique and use high spectral resolution Fabry-Perot etalons to measure the Doppler shift. A 45 cm aperture azimuth-over-elevation scanner is mounted on the roof of the van to allow full sky access and a variety of scanning options. GLOW is intended to be used as a deployable field system for studying atmospheric dynamics and transport and can also serve as a testbed to evaluate candidate technologies developed for use in future spaceborne systems. In addition, it can be used for calibration/validation activities following launch of spaceborne wind lidar systems. A description of the mobile system is presented along with the examples of lidar wind profiles obtained with the system.
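    For scale, the backscatter Doppler shift such a receiver must resolve is delta_nu = 2 * v_radial / lambda; the sketch below evaluates it at the two operating wavelengths for an illustrative 1 m/s radial wind.

        # Doppler shift seen by a backscatter lidar: delta_nu = 2 * v_radial / wavelength.
        v_radial = 1.0                                  # m/s, illustrative
        for name, wavelength in (("molecular channel, 355 nm", 355e-9),
                                 ("aerosol channel, 1064 nm", 1064e-9)):
            shift_mhz = 2.0 * v_radial / wavelength / 1e6
            print(f"{name}: {shift_mhz:.2f} MHz per (m/s)")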
    Keywords: Meteorology and Climatology
    Type: Lidar Remote Sensing for Industry and Environment Monitoring; Volume 4153; 314-320
    Format: text
  • 80
    Publication Date: 2011-08-23
    Description: Solid-state 2-micron lasers have been receiving considerable interest because of their eye-safe property and efficient diode-pumped operation. They have potential for multiple lidar applications to detect water vapor, carbon dioxide, and winds. In this paper, we describe a 2-micron double-pulsed Ho:Tm:YLF laser and end-pumped amplifier system. A comprehensive theoretical model has been developed to aid the design and optimization of the laser performance. In single Q-switched pulse operation the residual energy stored in the Tm atoms is wasted. However, in double-pulse operation the residual energy stored in the Tm atoms repopulates the Ho atoms that were depleted by the extraction of the first Q-switched pulse. Thus, the Tm-sensitized Ho:YLF laser provides a unique advantage in applications that require double-pulse operation, such as Differential Absorption Lidar (DIAL). A total output energy of 146 mJ per pulse pair under Q-switched operation is achieved with as high as 4.8% optical-to-optical efficiency. Compared to a single-pulse laser, 70% higher laser efficiency is realized. To obtain high energy while maintaining high beam quality, a master-oscillator-power-amplifier 2-micron system is designed. We developed an end-pumped Ho:Tm:YLF disk amplifier. This amplifier uses two diode arrays as the pump source. A non-imaging lens duct is used to couple the radiation from the laser diode arrays to the laser disk. Preliminary results show that the efficiency of this laser can be as high as 3%, a factor of three increase over the side-pump configuration. This high-energy, highly efficient, high-beam-quality laser is a promising candidate for multiple lidar applications.
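    Taking the quoted numbers at face value, and assuming the 4.8% optical-to-optical efficiency is defined per Q-switched pulse pair (an assumption, since the baseline is not stated above), the implied pump energy follows directly:

        e_out = 0.146          # J per Q-switched pulse pair
        eta   = 0.048          # optical-to-optical efficiency
        print(f"implied pump energy ~ {e_out / eta:.2f} J per pulse pair")   # about 3 J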
    Keywords: Communications and Radar
    Type: Lidar Remote Sensing for Industry and Environment Monitoring; Volume 4153; 70-76
    Format: text
  • 81
    Publication Date: 2011-08-23
    Description: Since the beginning of space remote sensing of the earth, there has been a natural progression widening the range of electromagnetic radiation used to sense the earth, and slowly, steadily increasing the spatial, spectral, and radiometric resolution of the measurements. There has also been a somewhat slower trend toward active measurements across the electromagnetic spectrum, motivated in part by increased resolution, but also by the ability to make new measurements. Active microwave instruments have been used to measure ocean topography, to study the land surface, and to study rainfall from space. Future NASA active microwave missions may add detail to the topographical studies, sense soil moisture, and better characterize the cryosphere. Only recently have active optical instruments been flown in space by NASA; however, there are currently several missions in development which will sense the earth with lasers and many more conceptual active optical missions which address the priorities of NASA's earth science program. Missions are under development to investigate the structure of the terrestrial vegetation canopy, to characterize the earth's ice caps, and to study clouds and aerosols. Future NASA missions may measure tropospheric vector winds and make vastly improved measurements of the chemical components of the earth's atmosphere.
    Keywords: Communications and Radar
    Type: Lidar Remote Sensing for Industry and Environment Monitoring; Volume 4153; 5-12
    Format: text
  • 82
    Publication Date: 2016-06-07
    Description: NASA-KSC currently uses three bridged 100-Mbps FDDI segments as its backbone for data traffic. The FDDI Transmission System (FTXS) connects the KSC industrial area, the KSC Launch Complex 39 area, and the Cape Canaveral Air Force Station. The report presents a performance modeling study of the FTXS and the proposed ATM Transmission System (ATXS). The focus of the study is on the performance of MPEG video transmission on these networks. Commercial modeling tools - the CACI Predictor and Comnet tools - were used. In addition, custom software tools were developed to characterize conversation pairs in Sniffer trace (capture) files for use as input to these tools. A baseline study of both non-launch and launch day data traffic on the FTXS is presented. MPEG-1 and MPEG-2 video traffic was characterized and its shaping evaluated. It is shown that the characteristics of a video stream have a direct effect on its performance in a network. It is also shown that shaping of video streams is necessary to prevent overflow losses and the resulting poor video quality. The developed models can be used to predict when the existing FTXS will 'run out of room' and to optimize the parameters of ATM links used for transmission of MPEG video. Future work with these models can provide useful input and validation to set-top box projects within the Advanced Networks Development group in NASA-KSC Development Engineering.
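    A generic token-bucket shaper is one way to smooth the bursty frame sizes that cause the overflow losses noted above; the sketch below is a simplified illustration with assumed frame sizes and link parameters, not the shaping scheme evaluated in the report.

        def shape(frame_bits, rate_bps, bucket_bits, fps=30):
            # Token-bucket shaper: tokens refill once per frame slot; a frame larger than
            # the available tokens is delayed until the excess can drain at rate_bps.
            tokens, delays = bucket_bits, []
            for bits in frame_bits:
                tokens = min(bucket_bits, tokens + rate_bps / fps)
                if bits <= tokens:
                    tokens -= bits
                    delays.append(0.0)
                else:
                    delays.append((bits - tokens) / rate_bps)
                    tokens = 0.0
            return delays

        # MPEG I-frames are much larger than P/B frames, which is what makes shaping necessary.
        gop = [120_000, 30_000, 30_000, 45_000, 30_000, 30_000] * 5   # bits per frame (assumed)
        worst = max(shape(gop, rate_bps=1_500_000, bucket_bits=60_000))
        print(f"worst-case added delay: {worst * 1e3:.1f} ms")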
    Keywords: Communications and Radar
    Type: 1999 Research Reports: NASA/ASEE Summer Faculty Fellowship Program; 53-63; NASA/CR-1999-208586
    Format: application/pdf
  • 83
    Publication Date: 2016-06-07
    Description: Several techniques have been proposed to enhance the multimode fiber bandwidth-distance product. A single-mode-to-multimode offset launch technique has been experimented with at Kennedy Space Center. Significant enhancement in multimode fiber link bandwidth is achieved using this technique. It is found that close to a three-fold bandwidth enhancement can be achieved compared to the standard zero-offset launch technique. Moreover, a significant reduction in modal noise has been observed as a function of offset launch displacement. However, a significant reduction in the overall signal-to-noise ratio is also observed, due to signal attenuation caused by mode radiation from the fiber core into its cladding.
    Keywords: Communications and Radar
    Type: 1999 Research Reports: NASA/ASEE Summer Faculty Fellowship Program; 11-21; NASA/CR-1999-208586
    Format: application/pdf
  • 84
    Publication Date: 2016-06-07
    Description: The ability to exchange information between different engineering software packages (e.g., CAD, CAE, CAM) is necessary to aid in collaborative engineering. There are a number of different ways to accomplish this goal. One popular method is to transfer data via different file formats. However, this method can lose data and becomes complex as more file formats are added. Another method is to use a standard protocol. STEP is one such standard. This paper gives an overview of STEP, provides a list of where to access more information, and develops guidelines to aid the reader in deciding if STEP is appropriate for his/her use.
    Keywords: Computer Programming and Software
    Type: 1999 Research Reports: NASA/ASEE Summer Faculty Fellowship Program; 23-32; NASA/CR-1999-208586
    Format: application/pdf
  • 85
    In:  CASI
    Publication Date: 2016-06-07
    Description: The goal of this project was to support the development of a full duplex, spread spectrum voice communications system. The assembly and testing of a prototype system consisting of a Harris PRISM spread spectrum radio, a TMS320C54x signal processing development board and a Zilog Z80180 microprocessor was underway at the start of this project. The efforts under this project were the development of multiple access schemes, analysis of full duplex voice feedback delays, and the development and analysis of forward error correction (FEC) algorithms. The multiple access analysis involved the selection between code division multiple access (CDMA), frequency division multiple access (FDMA) and time division multiple access (TDMA). Full duplex voice feedback analysis involved the analysis of packet size and delays associated with full loop voice feedback for confirmation of radio system performance. FEC analysis included studies of the performance under the expected burst error scenario with the relatively short packet lengths, and analysis of implementation in the TMS320C54x digital signal processor. When the capabilities and the limitations of the components used were considered, the multiple access scheme chosen was a combination TDMA/FDMA scheme that will provide up to eight users on each of three separate frequencies. Packets to and from each user will consist of 16 samples at a rate of 8,000 samples per second for a total of 2 ms of voice information. The resulting voice feedback delay will therefore be 4 - 6 ms. The most practical FEC algorithm for implementation was a convolutional code with a Viterbi decoder. Interleaving of the bits of each packet will be required to offset the effects of burst errors.
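    The packet-timing arithmetic behind the 4 - 6 ms voice-feedback figure quoted above can be checked directly:

        samples_per_packet = 16
        sample_rate = 8000                                  # samples per second
        t_packet = samples_per_packet / sample_rate         # seconds of voice per packet
        print(f"packet duration: {t_packet * 1e3:.1f} ms")
        # One packet out plus one packet back sets the ~4 ms floor; queuing and
        # processing in the radio link push the total toward ~6 ms.
        print(f"minimum loop delay: {2 * t_packet * 1e3:.1f} ms")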
    Keywords: Communications and Radar
    Type: 1999 Research Reports: NASA/ASEE Summer Faculty Fellowship Program; 73-83; NASA/CR-1999-208586
    Format: application/pdf
  • 86
    Publication Date: 2016-06-07
    Description: This paper outlines the main results of a number of ACTS experiments on the efficacy of using standard Internet protocols over long-delay satellite channels. These experiments have been jointly conducted by NASA's Glenn Research Center and Ohio University over the last six years. The focus of our investigations has been the impact of long-delay networks with non-zero bit-error rates on the performance of the suite of Internet protocols. In particular, we have focused on the most widely used transport protocol, the Transmission Control Protocol (TCP), as well as several application layer protocols. This paper presents our main results, as well as references to more verbose discussions of our experiments.
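    One way to see why long-delay channels stress TCP is to compare the bandwidth-delay product with the classic 64 KB receive window; the link rate and ground-segment delay below are assumed illustrative values, not ACTS measurements.

        C = 3.0e8                                  # speed of light, m/s
        GEO_ALTITUDE = 35_786e3                    # m
        rtt = 4 * GEO_ALTITUDE / C + 0.05          # up and down, both directions, plus ~50 ms terrestrial (assumed)
        rate = 1.536e6                             # bits/s, a T1-class channel (assumed)
        bdp_bytes = rate * rtt / 8
        print(f"RTT ~ {rtt * 1e3:.0f} ms, bandwidth-delay product ~ {bdp_bytes / 1024:.0f} KiB "
              f"(default TCP window: 64 KiB)")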
    Keywords: Communications and Radar
    Type: Proceeding of the Advanced Communications Technology Satellite (ACTS) Conference 2000; 135-143 and 279-290; NASA/CP-2000-210530
    Format: application/pdf
  • 87
    Publication Date: 2016-06-07
    Description: The Satellite Networks and Architectures Branch of NASA's Glenn Research Center has developed and demonstrated several advanced satellite communications technologies through the Advanced Communications Technology Satellite (ACTS) program. One of these technologies is the implementation of a Satellite Telemammography Network (STN) encompassing NASA Glenn, the Cleveland Clinic Foundation, the University of Virginia, and the Ashtabula County Medical Center. This paper will present a look at the STN from its beginnings to the impact it may have on future telemedicine applications. Results obtained using the experimental ACTS satellite demonstrate the feasibility of satellite telemammography. These results have improved teleradiology processes and mammography image manipulation, and enabled advances in remote screening methodologies. Future implementation of satellite telemammography using next-generation commercial satellite networks will be explored. In addition, the technical aspects of the project will be discussed, in particular how the project has evolved from using NASA-developed hardware and software to commercial off-the-shelf (COTS) products. Development of asymmetrical link technologies was an outcome of this work. Improvements in the display of digital mammographic images, better understanding of end-to-end system requirements, and advances in radiological image compression were achieved as a result of the research. Finally, rigorous clinical medical studies are required for new technologies such as digital satellite telemammography to gain acceptance in the medical establishment. These experiments produced data that were useful in two key medical studies that addressed the diagnostic accuracy of compressed, satellite-transmitted digital mammography images. The results of these studies will also be discussed.
    Keywords: Communications and Radar
    Type: Proceeding of the Advanced Communications Technology Satellite (ACTS) Conference 2000; 109-115 and 253-260; NASA/CP-2000-210530
    Format: application/pdf
  • 88
    Publication Date: 2016-06-07
    Description: The Advanced Communications Technology Satellite (ACTS) system provided a national testbed that enabled advanced applications to be tested and demonstrated over a live satellite link. Of the applications that used ACTS, some offered unique advantages over current methods, while others simply could not be accommodated by conventional systems. The initial technical and experiment results of the program were reported at the 1995 ACTS Results Conference in Cleveland, Ohio. Since then, the Experiments Program has involved 45 new experiments comprising 30 application experiments and 15 technology-related experiments that took advantage of the advanced technologies and unique capabilities offered by ACTS. The experiments are categorized and quantified to show the organizational mix of the experiments program and relative usage of the satellite. Since paper length guidelines preclude each experiment from being individually reported, the application experiments and significant demonstrations are surveyed to show the breadth of the activities that have been supported. Experiments in a similar application category, such as telemedicine or networking and protocol evaluation, are discussed collectively. Where available, experiment conclusions and impact are presented, and references to results and experiment information are provided. The quantity and diversity of the experiments program demonstrated a variety of service areas for the next generation of commercially available, advanced satellite communications.
    Keywords: Communications and Radar
    Type: Proceeding of the Advanced Communications Technology Satellite (ACTS) Conference 2000; 97-101 and 241-249; NASA/CP-2000-210530
    Format: application/pdf
  • 89
    Publication Date: 2016-06-07
    Description: For at least 20 years, the Space Communications Program at NASA Glenn Research Center (GRC) has focused on enhancing the capability and competitiveness of the U.S. commercial communications satellite industry. GRC has partnered with the industry on the development of enabling technologies to help maintain U.S. preeminence in the worldwide communications satellite marketplace. The Advanced Communications Technology Satellite (ACTS) has been the most significant space communications technology endeavor ever performed at GRC, and the centerpiece of GRC's communication technology program for the last decade. Under new sponsorship from NASA's Human Exploration and Development of Space Enterprise, GRC has transitioned the focus and direction of its program from commercial relevance to NASA mission relevance. Instead of one major experimental spacecraft and one headquarters sponsor, GRC is now exploring opportunities for all of NASA's Enterprises to benefit from advances in space communications technologies, and accomplish their missions through the use of existing and emerging commercially provided services. A growing vision within NASA is to leverage the best commercial standards, technologies, and services as a starting point to satisfy NASA's unique needs. GRC's heritage of industry partnerships is closely aligned with this vision. NASA intends to leverage the explosive growth of the telecommunications industry through its impressive technology advancements and potential new commercial satellite systems. GRC's partnerships with the industry, academia, and other government agencies will directly support all four NASA Enterprises' future mission needs, while advancing the state of the art of commercial practice. GRC now conducts applied research and develops and demonstrates advanced communications and network technologies in support of all four NASA Enterprises (Human Exploration and Development of Space, Space Science, Earth Science, and Aero-Space Technologies).
    Keywords: Communications and Radar
    Type: Proceeding of the Advanced Communications Technology Satellite (ACTS) Conference 2000; 171-178 and 321-326; NASA/CP-2000-210530
    Format: application/pdf
  • 90
    Publication Date: 2016-06-07
    Description: The Advanced Communications Technology Satellite (ACTS) communications system operates in the Ka frequency band. ACTS uses multiple hopping narrow beams and very small aperture terminal (VSAT) technology to establish a system availability of 99.5% for bit-error rates of 5 x 10(exp -7) or better over the continental United States. In order to maintain this minimum system availability in all US rain zones, ACTS uses an adaptive rain fade compensation protocol to reduce the impact of signal attenuation resulting from propagation effects. The purpose of this paper is to present the results of system and sub-system characterizations considering the statistical effects of system variances due to antenna wetting and depolarization effects. In addition, the availability enhancements obtained using short-distance diversity in a sub-tropical rain zone are investigated.
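    For reference, the outage budget implied by a 99.5% availability target is a one-line calculation:

        availability = 0.995
        hours_per_year = 8766                     # average year length in hours
        outage_hours = (1 - availability) * hours_per_year
        print(f"allowed outage ~ {outage_hours:.0f} hours per year")   # roughly 44 hours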
    Keywords: Communications and Radar
    Type: Proceeding of the Advanced Communications Technology Satellite (ACTS) Conference 2000; 85-96 and 233-240; NASA/CP-2000-210530
    Format: application/pdf
  • 91
    Publication Date: 2016-06-07
    Description: The Advanced Communications Technology Satellite (ACTS) was conceived and developed in the mid-1980s as an experimental satellite to demonstrate unproven Ka-band technology and potential new commercial applications and services. Since launch into geostationary orbit in September 1993, ACTS has accumulated almost seven years of essentially trouble-free operation and met all program objectives. The unique technology, service experiments, and system-level demonstrations accomplished by ACTS have been reported in many forums over the past several years. As ACTS completes its final experiment activities, this paper will relate the top-level program goals that have been achieved in the design, operation, and performance of the particular satellite subsystems. Pre-launch decisions to ensure satellite reliability and the subsequent operational experiences contribute to lessons learned that may be applicable to other comsat programs.
    Keywords: Communications and Radar
    Type: Proceeding of the Advanced Communications Technology Satellite (ACTS) Conference 2000; 3-8 and 181-190; NASA/CP-2000-210530
    Format: application/pdf
  • 92
    Publication Date: 2017-09-27
    Description: Historically NASA has trained teams of astronauts by bringing them to the Johnson Space Center in Houston to undergo generic training, followed by mission-specific training. This latter training begins after a crew has been selected for a mission (perhaps two years before the launch of that mission). While some Space Shuttle flights have included an astronaut from a foreign country, the International Space Station will be consistently crewed by teams comprised of astronauts from two or more of the partner nations. The cost of training these international teams continues to grow in both monetary and personal terms. Thus, NASA has been seeking alternative training approaches for the International Space Station program. Since 1994 we have been developing, testing, and refining shared virtual environments for astronaut team training, including the use of virtual environments for use while in or in transit to the task location. In parallel with this effort, we have also been preparing applications for training teams of military personnel engaged in peacekeeping missions. This paper will describe the applications developed to date, some of the technological challenges that have been overcome in their development, and the research performed to guide the development and to measure the efficacy of these shared environments as training tools.
    Keywords: Computer Programming and Software
    Type: The Capability of Virtual Reality to Meet Military Requirements; 22-1 - 22-6; RTO-MP-54
    Format: text
  • 93
    Publication Date: 2017-09-27
    Description: An integrally stiffened graphite/epoxy composite rotorcraft structure is evaluated via computational simulation. A computer code that scales up constituent micromechanics level material properties to the structure level and accounts for all possible failure modes is used for the simulation of composite degradation under loading. Damage initiation, growth, accumulation, and propagation to fracture are included in the simulation. Design implications with regard to defect and damage tolerance of integrally stiffened composite structures are examined. A procedure is outlined regarding the use of this type of information for setting quality acceptance criteria, design allowables, damage tolerance, and retirement-for-cause criteria.
    Keywords: Computer Programming and Software
    Type: Application of Damage Tolerance Principles for Improved Airworthiness of Rotorcraft; 12 - 1 - 12 - 13; RTO-MP-24
    Format: text
  • 94
    Publication Date: 2017-10-04
    Description: Parallelized versions of genetic algorithms (GAs) are popular primarily for three reasons: the GA is an inherently parallel algorithm, typical GA applications are very compute intensive, and powerful computing platforms, especially Beowulf-style computing clusters, are becoming more affordable and easier to implement. In addition, the low communication bandwidth required allows the use of inexpensive networking hardware such as standard office ethernet. In this paper we describe a parallel GA and its use in automated high-level circuit design. Genetic algorithms are a type of trial-and-error search technique that is guided by principles of Darwinian evolution. Just as the genetic material of two living organisms can intermix to produce offspring that are better adapted to their environment, GAs expose genetic material, frequently strings of 1s and 0s, to the forces of artificial evolution: selection, mutation, recombination, etc. GAs start with a pool of randomly-generated candidate solutions which are then tested and scored with respect to their utility. Solutions are then bred by probabilistically selecting high quality parents and recombining their genetic representations to produce offspring solutions. Offspring are typically subjected to a small amount of random mutation. After a pool of offspring is produced, this process iterates until a satisfactory solution is found or an iteration limit is reached. Genetic algorithms have been applied to a wide variety of problems in many fields, including chemistry, biology, and many engineering disciplines. There are many styles of parallelism used in implementing parallel GAs. One such method is called the master-slave or processor farm approach. In this technique, slave nodes are used solely to compute fitness evaluations (the most time consuming part). The master processor collects fitness scores from the nodes and performs the genetic operators (selection, reproduction, variation, etc.). Because of dependency issues in the GA, it is possible to have idle processors. However, as long as the load at each processing node is similar, the processors are kept busy nearly all of the time. In applying GAs to circuit design, a suitable genetic representation is that of a circuit-construction program. We discuss one such circuit-construction programming language and show how evolution can generate useful analog circuit designs. This language has the desirable property that virtually all sets of combinations of primitives result in valid circuit graphs. Our system allows circuit size (number of devices), circuit topology, and device values to be evolved. Using a parallel genetic algorithm and circuit simulation software, we present experimental results as applied to three analog filter and two amplifier design tasks. For example, a figure shows an 85 dB amplifier design evolved by our system, and another figure shows the performance of that circuit (gain and frequency response). In all tasks, our system is able to generate circuits that achieve the target specifications.
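    A minimal sketch of the master-slave (processor farm) arrangement described above, written in Python with a process pool standing in for the slave nodes; the bit-counting fitness function is a toy stand-in for the circuit simulator, and all parameter values are illustrative rather than those used in the paper.

        import random
        from concurrent.futures import ProcessPoolExecutor

        GENOME_LEN, POP_SIZE, GENERATIONS = 32, 40, 20

        def fitness(bits):
            # Toy objective (count of 1s); in the paper this is a circuit simulation.
            return sum(bits)

        def tournament(pop, scores, k=3):
            best = max(random.sample(range(len(pop)), k), key=lambda i: scores[i])
            return pop[best]

        def crossover(a, b):
            cut = random.randrange(1, GENOME_LEN)
            return a[:cut] + b[cut:]

        def mutate(bits, p=0.02):
            return [1 - b if random.random() < p else b for b in bits]

        if __name__ == "__main__":
            pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
            with ProcessPoolExecutor() as workers:                   # the "processor farm"
                for _ in range(GENERATIONS):
                    scores = list(workers.map(fitness, pop))         # slaves score candidates in parallel
                    pop = [mutate(crossover(tournament(pop, scores),
                                            tournament(pop, scores)))
                           for _ in range(POP_SIZE)]                 # master does selection and variation
            print("best fitness in final population:", max(map(fitness, pop)))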
    Keywords: Computer Programming and Software
    Type: Welcome to the NASA High Performance Computing and Communications Computational Aerosciences (CAS) Workshop 2000; D-000001
    Format: text
  • 95
    Publication Date: 2017-10-04
    Description: Within NASA's High Performance Computing and Communication (HPCC) program, the NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. To this end, NPSS integrates multiple disciplines such as aerodynamics, structures, and heat transfer and supports "numerical zooming" between 0-dimensional and 1-, 2-, and 3-dimensional component engine codes. In order to facilitate the timely and cost-effective capture of complex physical processes, NPSS uses object-oriented technologies such as C++ objects to encapsulate individual engine components and CORBA ORBs for object communication and deployment across heterogeneous computing platforms. Recently, the HPCC program has initiated a concept called the Information Power Grid (IPG), a virtual computing environment that integrates computers and other resources at different sites. IPG implements a range of Grid services such as resource discovery, scheduling, security, instrumentation, and data access, many of which are provided by the Globus toolkit. IPG facilities have the potential to benefit NPSS considerably. For example, NPSS should in principle be able to use Grid services to discover dynamically and then co-schedule the resources required for a particular engine simulation, rather than relying on manual placement of ORBs as at present. Grid services can also be used to initiate simulation components on parallel computers (MPPs) and to address inter-site security issues that currently hinder the coupling of components across multiple sites. These considerations led NASA Glenn and Globus project personnel to formulate a collaborative project designed to evaluate whether and how benefits such as those just listed can be achieved in practice. This project involves, first, development of the basic techniques required to achieve co-existence of commodity object technologies and Grid technologies; and, second, the evaluation of these techniques in the context of NPSS-oriented challenge problems. The work on basic techniques seeks to understand how "commodity" technologies (CORBA, DCOM, Excel, etc.) can be used in concert with specialized "Grid" technologies (for security, MPP scheduling, etc.). In principle, this coordinated use should be straightforward because of the Globus and IPG philosophy of providing low-level Grid mechanisms that can be used to implement a wide variety of application-level programming models. (Globus technologies have previously been used to implement Grid-enabled message-passing libraries, collaborative environments, and parameter study tools, among others.) Results obtained to date are encouraging: we have successfully demonstrated a CORBA to Globus resource manager gateway that allows the use of CORBA RPCs to control submission and execution of programs on workstations and MPPs; a gateway from the CORBA Trader service to the Grid information service; and a preliminary integration of CORBA and Grid security mechanisms. The two challenge problems that we consider are the following: 1) Desktop-controlled parameter study. Here, an Excel spreadsheet is used to define and control a CFD parameter study, via a CORBA interface to a high throughput broker that runs individual cases on different IPG resources. 2) Aviation safety. Here, about 100 near-real-time jobs running NPSS need to be submitted and run, and the data returned, in near real time. Evaluation will address such issues as time to port, execution time, potential scalability of simulation, and reliability of resources. The full paper will present the following information: 1. A detailed analysis of the requirements that NPSS applications place on IPG. 2. A description of the techniques used to meet these requirements via the coordinated use of CORBA and Globus. 3. A description of results obtained to date in the first two challenge problems.
    Keywords: Computer Programming and Software
    Format: text
  • 96
    In:  CASI
    Publication Date: 2017-10-04
    Description: The solution data computed from large scale simulations are sometimes too big for main memory, for local disks, and possibly even for a remote storage disk, creating tremendous processing time as well as technical difficulties in analyzing the data. The excessive storage demands a corresponding huge penalty in I/O time, rendering time and transmission time between different computer systems. In this paper, a multiresolution scheme is proposed to compress field simulation or experimental data without much loss of important information in the representation. Originally, the wavelet based multiresolution scheme was introduced in image processing, for the purposes of data compression and feature extraction. Unlike photographic image data which has rather simple settings, computational field simulation data needs more careful treatment in applying the multiresolution technique. While the image data sits on a regular spaced grid, the simulation data usually resides on a structured curvilinear grid or unstructured grid. In addition to the irregularity in grid spacing, the other difficulty is that the solutions consist of vectors instead of scalar values. The data characteristics demand more restrictive conditions. In general, the photographic images have very little inherent smoothness with discontinuities almost everywhere. On the other hand, the numerical solutions have smoothness almost everywhere and discontinuities in local areas (shock, vortices, and shear layers). The wavelet bases should be amenable to the solution of the problem at hand and applicable to constraints such as numerical accuracy and boundary conditions. In choosing a suitable wavelet basis for simulation data among a variety of wavelet families, the supercompact wavelets designed by Beam and Warming provide one of the most effective multiresolution schemes. Supercompact multi-wavelets retain the compactness of Haar wavelets, are piecewise polynomial and orthogonal, and can have arbitrary order of approximation. The advantages of the multiresolution algorithm are that no special treatment is required at the boundaries of the interval, and that the application to functions which are only piecewise continuous (internal boundaries) can be efficiently implemented. In this presentation, Beam's supercompact wavelets are generalized to higher dimensions using multidimensional scaling and wavelet functions rather than alternating the directions as in the 1D version. As a demonstration of actual 3D data compression, supercompact wavelet transforms are applied to a 3D data set for wing tip vortex flow solutions (2.5 million grid points). It is shown that high data compression ratio can be achieved (around 50:1 ratio) in both vector and scalar data set.
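    The threshold-and-keep idea described above can be illustrated with ordinary Haar wavelets via PyWavelets (a stand-in for the supercompact multi-wavelets of Beam and Warming); the 3D field below is synthetic and the 2% retention level is an assumed parameter, so the printed ratio and error are illustrative only.

        import numpy as np
        import pywt

        rng = np.random.default_rng(1)
        field = rng.standard_normal((32, 32, 32)).cumsum(axis=0)   # smooth-ish synthetic 3D scalar field

        coeffs = pywt.wavedecn(field, 'haar', level=3)
        arr, slices = pywt.coeffs_to_array(coeffs)

        keep = 0.02                                                # retain the largest 2% of coefficients
        threshold = np.quantile(np.abs(arr), 1.0 - keep)
        arr_thresh = np.where(np.abs(arr) >= threshold, arr, 0.0)

        recon = pywt.waverecn(pywt.array_to_coeffs(arr_thresh, slices, output_format='wavedecn'), 'haar')
        ratio = arr.size / np.count_nonzero(arr_thresh)
        rel_err = np.abs(recon - field).max() / np.abs(field).max()
        print(f"compression ~ {ratio:.0f}:1, max relative error ~ {rel_err:.3f}")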
    Keywords: Computer Programming and Software
    Format: text
  • 97
    Publication Date: 2017-10-02
    Description: The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance, early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure customer software risks are identified and/or that requirements are met and/or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center Of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: Ensure safety of NASA missions, ensure requirements are met, minimize programmatic and technological risks of software development and operations, improve software quality, reduce costs and time to delivery, and improve the science of software engineering
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Fourth Annual Software Engineering Workshop; NASA/CP-2000-209890
    Format: text
  • 98
    Publication Date: 2017-10-02
    Description: This paper discusses the following topics: (1) Autonomy for Future Missions- Mars Outposts, Titan Aerobot, and Europa Cryobot / Hydrobot; (2) Emergence of Autonomy- Remote Agent Architecture, Closing Loops Onboard, and New Millennium Flight Experiment; and (3) Software Engineering Challenges- Influence of Remote Agent, Scalable Autonomy, Autonomy Software Validation, Analytic Verification Technology, and Autonomy and Software Software Engineering.
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Fourth Annual Software Engineering Workshop; NASA/CP-2000-209890
    Format: text
  • 99
    Publication Date: 2017-10-02
    Description: Software testing is a well-defined phase of the software development life cycle. Functional ("black box") testing and structural ("white box") testing are two methods of test case design commonly used by software developers. A lesser known testing method is risk-based testing, which takes into account the probability of failure of a portion of code as determined by its complexity. For object oriented programs, a methodology is proposed for identification of risk-prone classes. Risk-based testing is a highly effective testing technique that can be used to find and fix the most important problems as quickly as possible.
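    One simple way to realize such a ranking is to score each class by a count of decision points (a rough proxy for cyclomatic complexity) and test the highest-scoring classes first; the sketch below, using Python's ast module on a toy source string, is a generic illustration rather than the methodology proposed in the paper.

        import ast
        import textwrap

        SOURCE = textwrap.dedent("""
            class Telemetry:
                def decode(self, frame):
                    if not frame:
                        return None
                    for word in frame:
                        if word < 0 or word > 255:
                            raise ValueError(word)
                    return frame

            class Logger:
                def write(self, msg):
                    print(msg)
            """)

        BRANCH_NODES = (ast.If, ast.For, ast.While, ast.Try, ast.BoolOp, ast.ExceptHandler)

        def risk_score(class_node):
            # 1 + number of branching constructs inside the class (all methods included).
            return 1 + sum(isinstance(n, BRANCH_NODES) for n in ast.walk(class_node))

        classes = [n for n in ast.walk(ast.parse(SOURCE)) if isinstance(n, ast.ClassDef)]
        for cls in sorted(classes, key=risk_score, reverse=True):
            print(f"{cls.name}: risk score {risk_score(cls)}")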
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Fourth Annual Software Engineering Workshop; NASA/CP-2000-209890
    Format: text
  • 100
    Publication Date: 2017-10-02
    Description: This paper contains the following sections: GSFC Space Missions of the 21st Century, Information Technology Challenges, Components of a GSFC Solution, and Conclusions.
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Fourth Annual Software Engineering Workshop; NASA/CP-2000-209890
    Format: text