ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Publication Date: 2009-05-04
Description: This presentation looks at logic design from early in the US Space Program, examines faults in recent logic designs, and gives some examples from the analysis of new tools and techniques.
    Keywords: Computer Programming and Software
    Format: text
  • 2
    Publication Date: 2004-12-03
Description: In this presentation we review the current ongoing research within George Mason University's (GMU) Center for Information Systems Integration and Evolution (CISE). We define characteristics of advanced information systems, discuss a family of agents for such systems, and show how GMU's domain modeling tools and techniques can be used to define a product line architecture for configuring NASA missions. These concepts can be used to define Advanced Engineering Environments such as those envisioned for NASA's new initiative for intelligent design and synthesis environments.
    Keywords: Computer Programming and Software
    Type: Intelligent Agents and Their Potential for Future Design and Synthesis Environment; 21-38; NASA/CP-1999-208986
    Format: text
  • 3
    Publication Date: 2004-12-03
    Description: This paper presents an overview of the attitude control subsystem flight software development process, identifies how the process has changed due to automatic code generation, analyzes each software development phase in detail, and concludes with a summary of our lessons learned.
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Third Annual Software Engineering Workshop; NASA/CP-1999-209236
    Format: text
  • 4
    Publication Date: 2004-12-03
Description: The Software Engineering Laboratory (SEL) has been operating for more than two decades in the Flight Dynamics Division (FDD) and has adapted to the constant movement of the software development environment. The SEL's Quality Improvement Paradigm (QIP) shows that process improvement is an iterative process. Understanding, Assessing, and Packaging are the three steps followed in this cyclical paradigm. As the improvement process cycles back to the first step, after having packaged some experience, the level of understanding will be greater. In the past, products resulting from the packaging step have been large process documents, guidebooks, and training programs. As the technical world moves toward more modularized software, we have made a move toward more modularized software development process documentation; as such, the products of the packaging step are becoming smaller and more frequent. In this manner, the QIP takes on a spiral approach rather than a waterfall one. This paper describes the state of the FDD in the area of software development processes, as revealed through the understanding and assessing activities conducted by the COTS study team. The insights presented include: (1) a characterization of a typical FDD Commercial Off the Shelf (COTS) intensive software development life-cycle process, (2) lessons learned through the COTS study interviews, and (3) a description of changes in the SEL due to the changing and accelerating nature of software development in the FDD.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Proceedings of the Twenty-First Annual Software Engineering Workshop; 21-55; NASA/TM-1998-208618
    Format: text
  • 5
    Publication Date: 2004-12-03
Description: For the past five years, the Flight Dynamics Division (FDD) at NASA's Goddard Space Flight Center has been carrying out a detailed domain analysis effort and is now beginning to implement Generalized Support Software (GSS) based on this analysis. GSS is part of the larger Flight Dynamics Distributed System (FDDS), and is designed to run under the FDDS User Interface / Executive (UIX). The FDD is transitioning from a mainframe-based environment to systems running on engineering workstations. The GSS will be a library of highly reusable components that may be configured within the standard FDDS architecture to quickly produce low-cost satellite ground support systems. The estimate for the first release is that this library will contain approximately 200,000 lines of code. The main driver for developing generalized software is development cost and schedule improvement. The goal is to ultimately have at least 80 percent of all software required for a spacecraft mission (within the domain supported by the GSS) be configured from the generalized components.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Collected Software Engineering Papers; Volume 13; 5-3-5-8; NASA/TM-1998-208615/VOL13
    Format: text
  • 6
    Publication Date: 2004-12-03
    Description: This paper presents the highlights and key findings of 10 years of use and study of Ada and object-oriented design in NASA Goddard's Flight Dynamics Division (FDD). In 1985, the Software Engineering Laboratory (SEL) began investigating how the Ada language might apply to FDD software development projects. Although they began cautiously using Ada on only a few pilot projects, they expected that, if the Ada pilots showed promising results, the FDD would fully transition its entire development organization from FORTRAN to Ada within 10 years. However, 10 years later, the FDD still produced 80 percent of its software in FORTRAN and had begun using C and C++, despite positive results on Ada projects. This paper presents the final results of a SEL study to quantify the impact of Ada in the FDD, to determine why Ada has not flourished, and to recommend future directions regarding Ada. Project trends in both languages are examined as are external factors and cultural issues that affected the infusion of this technology. The detailed results of this study were published in a formal study report in March of 1995. This paper supersedes the preliminary results of this study that were presented at the Eighteenth Annual Software Engineering Workshop in 1993.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Collected Software Engineering Papers; Volume 14; NASA/TM-1998-208613
    Format: text
  • 7
    Publication Date: 2004-12-03
Description: An acknowledgement was given to three people whose significant behind-the-scenes contributions had supported the success of the Software Engineering Workshops. The rest of the presentation reports on the results of a survey sent to everyone on the workshop mailing list. The questionnaire, similar to one sent 10 years before, elicited information about the state of software engineering in the respondents' organizations. A review of the results of the questionnaire is presented, comparing them to those of the earlier survey.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Proceedings of the Twentieth Annual Software Engineering Workshop; 163-166; NASA/TM-1998-208616
    Format: text
  • 8
    Publication Date: 2004-12-03
Description: Since 1976 the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD) at Goddard Space Flight Center, develops, maintains, and manages complex flight dynamics systems. This paper presents an overview of recent activities and studies in the SEL, using as a framework the SEL's organizational goals and experience-based software improvement approach. It focuses on two SEL experience areas: (1) the evolution of the measurement program and (2) an analysis of three generations of Cleanroom experiments.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Proceedings of the Twentieth Annual Software Engineering Workshop; 3-19; NASA/TM-1998-208616
    Format: text
  • 9
    Publication Date: 2004-12-03
Description: The Software Assurance Technology Center (SATC) is part of the Office of Mission Assurance of the Goddard Space Flight Center (GSFC). The SATC's mission is to assist National Aeronautics and Space Administration (NASA) projects in improving the quality of the software they acquire or develop. The SATC's efforts are currently focused on the development and use of metric methodologies and tools that identify and assess risks associated with software performance and scheduled delivery. This starts at the requirements phase, where the SATC, in conjunction with software projects at GSFC and other NASA centers, is working to identify tools and metric methodologies to assist project managers in identifying and mitigating risks. This paper discusses requirement metrics currently being used at NASA in a collaborative effort between the SATC and the Quality Assurance Office at GSFC to utilize the information available through the application of requirements management tools.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Proceedings of the Twenty-First Annual Software Engineering Workshop; 335-351; NASA/TM-1998-208617
    Format: text
  • 10
    Publication Date: 2004-12-03
Description: Mechanical Advantage is a mechanical design decision support system. Unlike our CAD/CAM cousins, Mechanical Advantage addresses true engineering processes, not just the form and fit of geometry. If we look at a traditional engineering environment, we see that an engineer starts with two things - performance goals and design rules. The intent is to have a product perform specific functions and accomplish that within a designated environment. Geometry should be a simple byproduct of that engineering process - not the controller of it. Mechanical Advantage is a performance modeler allowing engineers to consider all these criteria in making their decisions by providing such capabilities as critical parameter analysis, tolerance and sensitivity analysis, math-driven geometry, and automated design optimization. If you should desire an industry-standard solid model, we would produce an ACIS-based solid model. If you should desire an ANSI/ISO standard drawing, we would produce this as well with a virtual push of the button. For more information on this and other Advantage Series products, please contact the author.
    Keywords: Computer Programming and Software
    Type: Computational Tools and Facilities for the Next-Generation Analysis and Design Environment; 153-170; NASA-CP-3346
    Format: text
  • 11
    Publication Date: 2004-12-03
Description: Real-time three-dimensional simulations depend on three areas: databases and models (3D geometric polygon structure, multiple levels of detail, and sound sample waveforms, optimized for interactive, real-time use); software applications (the program that controls input devices, visual channels, scene objects, interaction, collision detection, special effects, time of day, weather, I/O, and communication); and hardware (computer, sound processor, monitors, control sticks, position trackers, head-mounted displays, amplifiers, speakers, and communications).
    Keywords: Computer Programming and Software
    Type: Computational Tools and Facilities for the Next-Generation Analysis and Design Environment; 77-96; NASA-CP-3346
    Format: text
  • 12
    Publication Date: 2004-12-03
Description: Economic stresses are forcing many industries to reduce cost and time-to-market, and to insert emerging technologies into their products. Engineers are asked to design faster, ever more complex systems. Hence, there is a need for novel design paradigms and effective design tools to reduce design and development times. Several computational tools and facilities have been developed to support the design process. Some of these are described in subsequent presentations. The focus of the workshop is on the computational tools and facilities that have high potential for use in future design environments for aerospace systems. The outline for the introductory remarks is given. First, the characteristics and design drivers for future aerospace systems are outlined; second, a simulation-based design environment and some of its key modules are described; third, the vision for the next-generation design environment being planned by NASA, the UVA ACT Center, and JPL is presented, and the anticipated major benefits of the planned environment are listed; fourth, some of the government-supported programs related to simulation-based design are listed; and fifth, the objectives and format of the workshop are presented.
    Keywords: Computer Programming and Software
    Type: Computational Tools and Facilities for the Next-Generation Analysis and Design Environment; 1-24; NASA-CP-3346
    Format: text
  • 13
    Publication Date: 2004-12-03
    Description: We present a multigrid one-shot algorithm, and a smoothing analysis, for the numerical solution of optimal control problems which are governed by an elliptic PDE. The analysis provides a simple tool to determine a smoothing minimization process which is essential for multigrid application. Numerical results include optimal control of boundary data using different discretization schemes and an optimal shape design problem in 2D with Dirichlet boundary conditions.
    Keywords: Computer Programming and Software
    Type: Seventh Copper Mountain Conference on Multigrid Methods; Part 1; 15-30; NASA-CP-3339
    Format: text
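As a hedged illustration (the record above does not reproduce the paper's exact cost functional or constraints), a typical model problem for optimal control of boundary data with Dirichlet conditions, of the kind the abstract describes, reads:

```latex
\min_{g}\; J(u,g) \;=\; \frac{1}{2}\int_{\Omega}\bigl(u - u^{*}\bigr)^{2}\,dx
\qquad\text{subject to}\qquad
-\Delta u = f \ \text{in } \Omega, \qquad u = g \ \text{on } \partial\Omega .
```

In a one-shot multigrid method the coupled optimality system (state, adjoint, and control equations) is relaxed together within a single multigrid cycle, rather than solving the PDE to convergence inside every optimization step; the smoothing analysis mentioned in the abstract determines a minimization step for the control that makes this coupled relaxation an effective smoother.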
  • 14
    Publication Date: 2004-12-03
    Description: We will demonstrate and describe the impact of our use of multimedia and network connectivity on a sophomore-level introductory course in materials science. This class services all engineering students, resulting in large (more than 150) class sections with no hands-on laboratory. In 1990 we began to develop computer graphics that might substitute for some laboratory or real-world experiences, and demonstrate relationships hard to show with static textbook images or chalkboard drawings. We created a comprehensive series of modules that cover the entire course content. Called VIMS (Visualizations in Materials Science), these are available in the form of a CD-ROM and also via the internet.
    Keywords: Computer Programming and Software
    Type: National Educators' Workshop: Update 95. Standard Experiments in Engineering Materials Science and Technology; 289-298; NASA-CP-3330
    Format: text
  • 15
    Publication Date: 2004-12-03
Description: The GSFC IVS Technology Development Center (TDC) develops station software, including the Field System (FS) and scheduling software (SKED); hardware, including tools for station timing and meteorology; scheduling algorithms; and operational procedures. It also provides a pool of individuals to assist with station implementation, check-out, upgrades, and training.
    Keywords: Computer Programming and Software
    Type: International VLBI Service for Geodesy and Astrometry: 1999 Annual Report; 256-258; NASA/TP-1999-209243
    Format: text
  • 16
    Publication Date: 2004-12-03
    Description: This presentation gives a summary of intelligent agents for design synthesis environments. We'll start with the conclusions, and work backwards to justify them. First, an important assumption is that agents (whatever they are) are good for software engineering. This is especially true for software that operates in an uncertain, changing environment. The "real world" of physical artifacts is like that: uncertain in what we can measure, changing in that things are always breaking down, and we must interact with non-software entities. The second point is that software engineering techniques can contribute to good design. There may have been a time when we wanted to build simple artifacts containing little or no software. But modern aircraft and spacecraft are complex, and rely on a great deal of software. So better software engineering leads to better designed artifacts, especially when we are designing a series of related artifacts and can amortize the costs of software development. The third point is that agents are especially useful for design tasks, above and beyond their general usefulness for software engineering, and the usefulness of software engineering to design.
    Keywords: Computer Programming and Software
    Type: Intelligent Agents and Their Potential for Future Design and Synthesis Environment; 129-138; NASA/CP-1999-208986
    Format: text
  • 17
    Publication Date: 2004-12-03
Description: Mission to Planet Earth (MTPE) is a long-term NASA research mission to study the processes leading to global climate change. The Earth Observing System (EOS) is a NASA campaign of satellite observatories that are a major component of MTPE. The EOS Data and Information System (EOSDIS) is another component of MTPE that will provide the Earth science community with easy, affordable, and reliable access to Earth science data. EOSDIS is a distributed system, with major facilities at six Distributed Active Archive Centers (DAACs) located throughout the United States. The EOSDIS software architecture is being designed to receive, process, and archive several terabytes of science data on a daily basis. Thousands of science users and perhaps several hundred thousand non-science users are expected to access the system. While there are many segments in EOSDIS (e.g., flight operations, network), this case study discusses the development of the science data processing segment (SDPS). We briefly review the architecture of the system, the goals of the SDPS, and the development progress to date. This study highlights key software development challenges, experiences integrating COTS, and the difficulties of managing a complex system development effort.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Proceedings of the Twenty-Second Annual Software Engineering Workshop; 475-494; NASA/TM-1998-208618
    Format: text
  • 18
    Publication Date: 2004-12-03
Description: The software measures and estimation techniques appropriate to a Commercial Off the Shelf (COTS) integration project differ from those commonly used for custom software development. Labor and schedule estimation tools that model COTS integration are available. Like all estimation tools, they must be calibrated with the organization's local project data. This paper describes the calibration of a commercial model using data collected by the Flight Dynamics Division (FDD) of the NASA Goddard Space Flight Center (GSFC). The model calibrated is SLIM Release 4.0 from Quantitative Software Management (QSM). By adopting the SLIM reuse model and by treating configuration parameters as lines of code, we were able to establish a consistent calibration for COTS integration projects. The paper summarizes the metrics, the calibration process and results, and the validation of the calibration.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Proceedings of the Twenty-second Annual Software Engineering Workshop; 81-98; NASA/TM-1998-208618
    Format: text
  • 19
    Publication Date: 2004-12-03
Description: For the past 17 years, Professor Leveson and her graduate students have been developing a theoretical foundation for safety in complex systems and building a methodology upon that foundation. The methodology includes special management structures and procedures, system hazard analyses, software hazard analysis, requirements modeling and analysis for completeness and safety, special software design techniques including the design of human-machine interaction, verification, operational feedback, and change analysis. The Safeware methodology is based on system safety techniques that are extended to deal with software and human error. Automation is used to enhance our ability to cope with complex systems. Identification, classification, and evaluation of hazards are done using modeling and analysis. To be effective, the models and analysis tools must consider the hardware, software, and human components in these systems. They also need to include a variety of analysis techniques and orthogonal approaches: there exists no single safety analysis or evaluation technique that can handle all aspects of complex systems. Applying only one or two may make us feel satisfied, but will produce limited results. We report here on a demonstration, performed as part of a contract with NASA Langley Research Center, of the Safeware methodology on the Center-TRACON Automation System (CTAS) portion of the air traffic control (ATC) system and procedures currently employed at the Dallas/Fort Worth (DFW) TRACON (Terminal Radar Approach CONtrol). CTAS is an automated system to assist controllers in handling arrival traffic in the DFW area. Safety is a system property, not a component property, so our safety analysis considers the entire system and not simply the automated components. Because safety analysis of a complex system is an interdisciplinary effort, our team included system engineers, software engineers, human factors experts, and cognitive psychologists.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Proceedings of the Twenty-Second Annual Software Engineering Workshop; 337-352; NASA/TM-1998-208618
    Format: text
  • 20
    Publication Date: 2004-12-03
Description: The Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center (GSFC) recently embarked on a far-reaching revision of its process for developing and maintaining satellite support software. The new process relies on an object-oriented software development method supported by a domain-specific library of generalized components. This Generalized Support Software (GSS) Domain Engineering Process is currently in use at the NASA GSFC Software Engineering Laboratory (SEL). The key facets of the GSS process are (1) an architecture for rapid deployment of FDD applications, (2) a reuse asset library for FDD classes, and (3) a paradigm shift from developing software to configuring software for mission support. This paper describes the GSS architecture and process, results of fielding the first applications, lessons learned, and future directions.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Collected Software Engineering Papers; Volume 15; 189-195; NASA/TM-1998-208614/VOL15
    Format: text
  • 21
    Publication Date: 2004-12-03
Description: The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. Results from studying over 125 FDD projects have guided the standards, management practices, technologies, and training within the division, yielding a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified and are now stated as: (1) understand baseline processes and product characteristics, (2) assess improvements that have been incorporated into the development projects, and (3) package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement, and feedback to projects within the FDD environment. The SEL supports understanding of the process by studying several measures, including effort distribution and error detection rates. The SEL then assesses and refines the processes; once the assessment and refinement of a process is completed, the SEL packages the process in standards, tools, and training.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Collected Software Engineering Papers; Volume 13; 2-3-2-7; NASA/TM-1998-208615/VOL13
    Format: text
  • 22
    Publication Date: 2004-12-03
Description: This paper presents the interim results from the Software Engineering Laboratory's (SEL) Reuse Study. The team conducting this study has, over the past few months, been studying the Generalized Support Software (GSS) domain asset library and architecture, and the various processes associated with it. In particular, we have characterized the process used to configure GSS-based attitude ground support systems (AGSS) to support satellite missions at NASA's Goddard Space Flight Center. To do this, we built detailed models of the tasks involved, the people who perform these tasks, and the interdependencies and information flows among these people. These models were based on information gleaned from numerous interviews with people involved in this process at various levels. We also analyzed effort data in order to determine the cost savings in moving from actual development of AGSSs to support each mission (which was necessary before GSS was available) to configuring AGSS software from the domain asset library. While characterizing the GSS process, we became aware of several interesting factors that affect the successful continued use of GSS. Many of these issues involve evolving technologies that were not available at the inception of GSS but are now. Some of these technologies could be incorporated into the GSS process, thus making the whole asset library more usable. Other technologies are being considered as an alternative to the GSS process altogether. In this paper, we outline some of the issues we will be considering in our continued study of GSS and the impact of evolving technologies.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Proceedings of the Twenty-First Annual Software Engineering Workshop; 27-58; NASA/TM-1998-208617
    Format: text
  • 23
    Publication Date: 2004-12-03
    Description: The focus of the HPCC Earth and Space Sciences (ESS) Project is capability computing - pushing highly scalable computing testbeds to their performance limits. The drivers of this focus are the Grand Challenge problems in Earth and space science: those that could not be addressed in a capacity computing environment where large jobs must continually compete for resources. These Grand Challenge codes require a high degree of communication, large memory, and very large I/O (throughout the duration of the processing, not just in loading initial conditions and saving final results). This set of parameters led to the selection of an SGI/Cray T3E as the current ESS Computing Testbed. The T3E at the Goddard Space Flight Center is a unique computational resource within NASA. As such, it must be managed to effectively support the diverse research efforts across the NASA research community yet still enable the ESS Grand Challenge Investigator teams to achieve their performance milestones, for which the system was intended. To date, all Grand Challenge Investigator teams have achieved the 10 GFLOPS milestone, eight of nine have achieved the 50 GFLOPS milestone, and three have achieved the 100 GFLOPS milestone. In addition, many technical papers have been published highlighting results achieved on the NASA T3E, including some at this Workshop. The successes enabled by the NASA T3E computing environment are best illustrated by the 512 PE upgrade funded by the NASA Earth Science Enterprise earlier this year. Never before has an HPCC computing testbed been so well received by the general NASA science community that it was deemed critical to the success of a core NASA science effort. NASA looks forward to many more success stories before the conclusion of the NASA-SGI/Cray cooperative agreement in June 1999.
    Keywords: Computer Programming and Software
    Type: HPCCP/CAS Workshop Proceedings 1998; 53-58; NASA/CP-1999-208757
    Format: text
  • 24
    Publication Date: 2004-12-03
    Description: Capabilities of STIS currently supported and those going beyond the subset available at the start of Cycle 7 are described. The latter are candidates for later implementation.
    Keywords: Computer Programming and Software
    Type: The 1997 HST Calibration Workshop with a New Generation of Instruments; 60-64; NASA/TM-97-208141
    Format: text
  • 25
    Publication Date: 2004-12-03
Description: A time-efficient algorithm for registering two images that differ by a translation is discussed. The algorithm is based on a coarse-to-fine strategy using wavelet decomposition of both images. The wavelet decomposition serves two purposes: (1) its high-frequency components are used to detect feature points (here, corner points), and (2) it provides the coarse-to-fine structure that makes the algorithm time efficient. Corner points are detected in one of the images, called the reference image, and corresponding points are computed in the other image, called the test image, using local correlations over 7x7 windows centered on the corner points. The correspondences are detected at the lowest decomposition level in a search area of about 11x11 (depending on the translation), and potential points of correspondence are projected onto higher levels. At subsequent levels the local correlations are computed in a search area of no more than 3x3 to refine the correspondence.
    Keywords: Computer Programming and Software
    Type: Image Registration Workshop Proceedings; 243-246; NASA/CP-1998-206853
    Format: text
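    The coarse-to-fine search described in this abstract can be sketched as follows. This is a minimal illustration, not the authors' code: a 2x2 averaging pyramid stands in for the wavelet decomposition, a sum-of-squared-differences match stands in for the local correlation, and a single window at the image center stands in for the detected corner points; the window and search-area sizes are illustrative.

```python
import numpy as np

def downsample(img):
    # 2x2 block average: a stand-in for one wavelet approximation level
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    a = img[:h, :w]
    return (a[0::2, 0::2] + a[1::2, 0::2] + a[0::2, 1::2] + a[1::2, 1::2]) / 4.0

def coarse_to_fine_translation(ref, tst, levels=2, win=3):
    # Build pyramids, estimate the shift at the coarsest level with a wide
    # search, then refine with a +/-1 pixel search at each finer level.
    refs, tsts = [ref], [tst]
    for _ in range(levels):
        refs.append(downsample(refs[-1]))
        tsts.append(downsample(tsts[-1]))
    dy = dx = 0
    for lev in range(levels, -1, -1):
        dy, dx = dy * 2, dx * 2          # project the estimate to this level
        r, t = refs[lev], tsts[lev]
        cy, cx = r.shape[0] // 2, r.shape[1] // 2   # one "feature point"
        radius = 5 if lev == levels else 1
        patch = r[cy - win:cy + win + 1, cx - win:cx + win + 1]
        best, best_d = (dy, dx), np.inf
        for ddy in range(-radius, radius + 1):
            for ddx in range(-radius, radius + 1):
                yy, xx = cy + dy + ddy, cx + dx + ddx
                q = t[yy - win:yy + win + 1, xx - win:xx + win + 1]
                if q.shape != patch.shape:
                    continue                         # window fell off the image
                d = float(np.sum((patch - q) ** 2))  # SSD as the match measure
                if d < best_d:
                    best_d, best = d, (dy + ddy, dx + ddx)
        dy, dx = best
    return dy, dx
```

    For a test image that is a cyclic shift of the reference, the recovered offset matches the applied shift; the coarse estimate is doubled at each level and only locally refined, which is what keeps the search cheap.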
  • 26
    Publication Date: 2004-12-03
    Description: Most automatic registration methods are correlation-based, feature-based, or a combination of both. Examples of features which can be utilized for automatic image registration are edges, regions, corners, or wavelet-extracted features. In this paper, we describe two proposed approaches, based on edge or edge-like features, which are well suited to highlighting regions of interest such as coastlines. The two iterative methods utilize the normalized cross-correlation of edge and wavelet features and are applied to such problems as image-to-map registration, landmarking, and channel-to-channel co-registration, using test data, AVHRR data, and GOES image data.
    Keywords: Computer Programming and Software
    Type: Image Registration Workshop Proceedings; 137-146; NASA/CP-1998-206853
    Format: text
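    A minimal sketch of registration by normalized cross-correlation of edge features, in the spirit of the methods described above. Gradient-magnitude edges stand in for the paper's edge and wavelet features; the exhaustive integer-shift search and window sizes are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def edge_map(img):
    # gradient-magnitude edges: a stand-in for edge/wavelet features
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def ncc(a, b):
    # normalized cross-correlation of two equally sized windows
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def register_by_edge_ncc(ref, tst, max_shift=5):
    # exhaustive search for the integer shift whose edge maps correlate best
    e_ref, e_tst = edge_map(ref), edge_map(tst)
    m = max_shift
    a = e_ref[m:-m, m:-m]            # central window of the reference edges
    best, best_score = (0, 0), -np.inf
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            b = e_tst[m + dy:e_tst.shape[0] - m + dy,
                      m + dx:e_tst.shape[1] - m + dx]
            s = ncc(a, b)
            if s > best_score:
                best_score, best = s, (dy, dx)
    return best
```

    Correlating edge maps rather than raw intensities is what makes this kind of method usable across channels and sensors with different radiometry.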
  • 27
    Publication Date: 2004-12-03
    Description: This paper briefly describes and then compares the effectiveness of five image registration approaches for GOES visible-band imagery. The techniques compared are (1) NOAA "point" manual landmarking, (2) manual landmarking on an extant feature (whole island, lake, etc.), (3) automatic phase correlation, (4) automatic spatial correlation on edge features, and (5) automatic spatial correlation of region boundaries derived from image segmentation.
    Keywords: Computer Programming and Software
    Type: Image Registration Workshop Proceedings; 133-136; NASA/CP-1998-206853
    Format: text
  • 28
    Publication Date: 2004-12-03
    Description: The middle atmosphere (20 to 90 km altitude) has received increasing interest from the scientific community in recent decades, especially since such problems as polar ozone depletion and climatic change have become so important. Temperature profiles have been obtained in this region using a variety of satellite-, rocket-, and balloon-borne instruments as well as some ground-based systems. One of the more promising of these instruments, especially for long-term high-resolution measurements, is the lidar. Measurements of laser radiation Rayleigh backscattered, or Raman scattered, by atmospheric air molecules can be used to determine the relative air density profile and subsequently the temperature profile, if it is assumed that the atmosphere is in hydrostatic equilibrium and follows the ideal gas law. The high vertical and spatial resolution makes the lidar a well-adapted instrument for the study of many middle atmospheric processes and phenomena as well as for the evaluation and validation of temperature measurements from satellites, such as the Upper Atmosphere Research Satellite (UARS). In the Network for Detection of Stratospheric Change (NDSC), lidar is the core instrument for measuring middle atmosphere temperature profiles. Using the best lidar analysis algorithm possible is therefore of crucial importance. In this work, the JPL and CNRS/SA lidar analysis software packages were evaluated. The results of this evaluation allowed the programs to be corrected and optimized, and new production software versions were produced. First, a brief description of the lidar technique and the method used to simulate lidar raw-data profiles from a given temperature profile is presented. Evaluation and optimization of the JPL and CNRS/SA algorithms are then discussed.
    Keywords: Computer Programming and Software
    Type: Nineteenth International Laser Radar Conference; 481-484; NASA/CP-1998-207671/PT1
    Format: text
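    The density-to-temperature step described above can be illustrated as follows, assuming hydrostatic equilibrium and the ideal gas law, with gravity and the mean molecular mass held constant (both vary slowly with altitude in practice). The downward integration from a seeded top temperature is the standard Rayleigh-lidar approach, but this sketch is not the JPL or CNRS/SA code.

```python
import numpy as np

K_B = 1.380649e-23    # Boltzmann constant, J/K
M_AIR = 4.81e-26      # mean mass of an air molecule, kg (assumed constant)
G = 9.5               # gravitational acceleration, m/s^2 (assumed constant)

def temperature_from_density(z, n, t_top):
    """T(z) = [n_top*T_top + (m*g/k) * integral_z^top n dz'] / n(z).

    Hydrostatic equilibrium plus the ideal gas law; z in metres (increasing),
    n a *relative* density profile (any overall scaling cancels), t_top the
    seed temperature at the top of the profile in K.
    """
    t = np.empty_like(n)
    t[-1] = t_top
    integral = 0.0
    for i in range(len(z) - 2, -1, -1):
        # trapezoidal accumulation of the density integral, top down
        integral += 0.5 * (n[i] + n[i + 1]) * (z[i + 1] - z[i])
        t[i] = (n[-1] * t_top + (M_AIR * G / K_B) * integral) / n[i]
    return t
```

    For an isothermal test atmosphere n proportional to exp(-z/H) with H = kT/(mg), the retrieval returns the input temperature; an error in the seed value at the top decays downward as the density grows, which is why such retrievals are most accurate well below the top of the profile.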
  • 29
    Publication Date: 2004-12-03
    Description: The goal of my project was to create a graphical user interface for a prototype crop scheduler. The crop scheduler was developed by Dr. Jorge Leon and Laura Whitaker for the ALSS (Advanced Life Support System) program. The addition of a system-independent graphical user interface to the crop planning tool will make the application more accessible to a wider range of users and enhance its value as an analysis, design, and planning tool. My presentation will demonstrate the form and functionality of this interface. This graphical user interface allows users to edit system parameters stored in the file system. Data on the interaction of the crew, crops, and waste processing system with the available system resources is organized and labeled. Program output, which is stored in the file system, is also presented to the user in performance-time plots and organized charts. The menu system is designed to guide the user through analysis and decision making tasks, providing some help if necessary. The Java programming language was used to develop this interface in hopes of providing portability and remote operation.
    Keywords: Computer Programming and Software
    Type: National Aeronautics and Space Administration (NASA)/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program: 1996; Volume 2; NASA-CR-202008-Vol-2
    Format: text
  • 30
    Publication Date: 2004-12-03
    Description: An approach for objective and quantitative technical and cost risk analysis during product development, which is applicable from the earliest stages, is discussed. The approach is supported by a software tool called the Analytical System for Uncertainty and Risk Estimation (ASURE). Details of ASURE, the underlying concepts and its application history, are provided.
    Keywords: Computer Programming and Software
    Type: Computational Tools and Facilities for the Next-Generation Analysis and Design Environment; 55-76; NASA-CP-3346
    Format: text
  • 31
    Publication Date: 2004-12-03
    Description: The goal of this research is to improve the performance of automated schedulers by designing and implementing an algorithm that automatically generates heuristics for selecting a schedule. The particular application selected for this method solves the problem of scheduling telescope observations and is called the Associate Principal Astronomer (APA). The input to the APA scheduler is a set of observation requests submitted by one or more astronomers. Each observation request specifies an observation program as well as scheduling constraints and preferences associated with the program. The scheduler employs greedy heuristic search to synthesize a schedule that satisfies all hard constraints of the domain and achieves a good score with respect to soft constraints, expressed as an objective function established by an astronomer-user.
    Keywords: Computer Programming and Software
    Format: text
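    The greedy heuristic search can be sketched as below. The Request fields, the score-per-unit-time ordering, and the single contiguous timeline are illustrative assumptions, not the APA scheduler's actual representation; the point is only the split between hard window constraints (never violated) and a soft objective (greedily improved).

```python
from dataclasses import dataclass

@dataclass
class Request:
    name: str
    duration: float    # telescope time needed (hours)
    earliest: float    # hard constraint: window start
    latest: float      # hard constraint: must finish by this time
    priority: float    # soft preference weight

def greedy_schedule(requests, horizon):
    """Greedy heuristic search: order requests by a score (priority per unit
    time) and place each at the earliest feasible time. Hard window
    constraints are never violated; the soft objective (total priority
    scheduled) is improved greedily, not optimally."""
    schedule = []                       # (start, end, request) triples
    t = 0.0
    for req in sorted(requests, key=lambda r: -r.priority / r.duration):
        start = max(t, req.earliest)
        if start + req.duration <= min(req.latest, horizon):
            schedule.append((start, start + req.duration, req))
            t = start + req.duration
    return schedule
```

    A request whose hard window has closed by the time it is considered is simply dropped, which is the usual price of a greedy, non-backtracking pass.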
  • 32
    Publication Date: 2004-12-03
    Description: We describe a high performance parallel multigrid algorithm for a rather general class of unstructured grid problems in two and three dimensions. The algorithm PUMG, for parallel unstructured multigrid, is related in structure to the parallel multigrid algorithm PSMG introduced by McBryan and Frederickson, for they both obtain a higher convergence rate through the use of multiple coarse grids. Another reason for the high convergence rate of PUMG is its smoother, an approximate inverse developed by Baumgardner and Frederickson.
    Keywords: Computer Programming and Software
    Type: Seventh Copper Mountain Conference on Multigrid Methods; Part 1; 317-326; NASA-CP-3339
    Format: text
  • 33
    Publication Date: 2004-12-03
    Description: This is an overview of new and updated industrial codes for seal design and testing. GCYLT (gas cylindrical seals -- turbulent), SPIRALI (spiral-groove seals -- incompressible), the KTK (knife-to-knife) labyrinth seal code, and DYSEAL (dynamic seal analysis) are covered. GCYLT uses G-factors for Poiseuille and Couette turbulence coefficients. SPIRALI is updated to include turbulence and inertia but maintains the narrow-groove theory. The KTK labyrinth seal code handles straight or stepped seals. DYSEAL provides dynamics for the seal geometry.
    Keywords: Computer Programming and Software
    Type: Seals Code Development Workshop; 115-138; NASA-CP-10181
    Format: text
  • 34
    Publication Date: 2004-12-03
    Description: Modeling using mathematical optimization is a design tool used in magnetic suspension system development. MATLAB software is used to calculate minimum cost and to satisfy other desired constraints. The parameters to be measured are programmed into mathematical equations. MATLAB calculates answers for each set of inputs; the inputs cover the boundary limits of the design. A Magnetic Suspension System Using Electromagnets Mounted in a Planar Array is a design system that makes use of optimization modeling.
    Keywords: Computer Programming and Software
    Type: The 1995 NASA-ODU American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program; 105; NASA-CR-198210
    Format: text
  • 35
    Publication Date: 2004-12-03
    Description: Multidisciplinary design optimization (MDO) is expected to play a major role in the competitive transportation industries of tomorrow, i.e., in the design of aircraft and spacecraft, of high speed trains, boats, and automobiles. All of these vehicles require maximum performance at minimum weight to keep fuel consumption low and conserve resources. Here, MDO can deliver mathematically based design tools to create systems with optimum performance subject to the constraints of disciplines such as structures, aerodynamics, controls, etc. Although some applications of MDO are beginning to surface, the key to a widespread use of this technology lies in the improvement of its efficiency. This aspect is investigated here for the MDO subset of structural optimization, i.e., for the weight minimization of a given structure under size, strength, and displacement constraints. Specifically, finite element based multilevel optimization of structures (here, statically indeterminate trusses and beams for proof of concept) is performed. In the system level optimization, the design variables are the coefficients of assumed displacement functions, and the load unbalance resulting from the solution of the stiffness equations is minimized. Constraints are placed on the deflection amplitudes and the weight of the structure. In the subsystems level optimizations, the weight of each element is minimized under the action of stress constraints, with the cross sectional dimensions as design variables. This approach is expected to prove very efficient, especially for complex structures, since the design task is broken down into a large number of small and efficiently handled subtasks, each with only a small number of variables. 
This partitioning will also allow for the use of parallel computing, first, by sending the system and subsystems level computations to two different processors, ultimately, by performing all subsystems level optimizations in a massively parallel manner on separate processors. It is expected that the subsystems level optimizations can be further improved through the use of controlled growth, a method which reduces an optimization to a more efficient analysis with only a slight degradation in accuracy. The efficiency of all proposed techniques is being evaluated relative to the performance of the standard single level optimization approach where the complete structure is weight minimized under the action of all given constraints by one processor and to the performance of simultaneous analysis and design which combines analysis and optimization into a single step. It is expected that the present approach can be expanded to include additional structural constraints (buckling, free and forced vibration, etc.) or other disciplines (passive and active controls, aerodynamics, etc.) for true MDO.
    Keywords: Computer Programming and Software
    Type: The 1995 NASA-ODU American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program; 104; NASA-CR-198210
    Format: text
  • 36
    Publication Date: 2004-12-03
    Description: This report describes an investigation into using model checking to assist validation of domain models for the HSTS planner. The planner models are specified using a qualitative temporal interval logic with quantitative duration constraints. We conducted several experiments to translate the domain modeling language into the SMV, Spin and Murphi model checkers. This allowed a direct comparison of how the different systems would support specific types of validation tasks. The preliminary results indicate that model checking is useful for finding faults in models that may not be easily identified by generating test plans.
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Third Annual Software Engineering Workshop; NASA/CP-1999-209236
    Format: text
  • 37
    Publication Date: 2004-12-03
    Description: Since 1976, the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD) at Goddard Space Flight Center, develops, maintains, and manages complex flight dynamics systems. It has done this by developing and refining a continual process improvement approach that allows an organization such as the FDD to fine-tune its process for its particular domain. Experimental software engineering and measurement play a significant role in this approach. The SEL is a partnership of NASA Goddard, its major software contractor, Computer Sciences Corporation (CSC), and the University of Maryland's (UM) Department of Computer Science. The FDD primarily builds software systems that provide ground-based flight dynamics support for scientific satellites. They fall into two sets: ground systems and simulators. Ground systems are midsize systems that average around 250 thousand source lines of code (KSLOC). Ground system development projects typically last 1-2 years. Recent systems have been rehosted to workstations from IBM mainframes, and also contain significant new subsystems written in C and C++. The simulators are smaller systems, averaging around 60 KSLOC, that provide the test data for the ground systems. Simulator development lasts up to 1 year. Most of the simulators have been built in Ada on workstations. The SEL is responsible for the management and continual improvement of the software engineering processes used on these FDD projects.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Proceedings of the Twenty-First Annual Software Engineering Workshop; 3-19; NASA/TM-1998-208618
    Format: text
  • 38
    Publication Date: 2004-12-03
    Description: The National Aeronautics and Space Administration (NASA) Data Assimilation Office (DAO) at the Goddard Space Flight Center (GSFC) has developed the GEOS DAS, a data assimilation system that provides production support for NASA missions and will support NASA's Earth Observing System (EOS) in the coming years. The GEOS DAS will be used to provide background fields of meteorological quantities to EOS satellite instrument teams for use in their data algorithms as well as providing assimilated data sets for climate studies on decadal time scales. The DAO has been involved in prototyping parallel implementations of the GEOS DAS for a number of years and is now embarking on an effort to convert the production version from shared-memory parallelism to distributed-memory parallelism using the portable Message-Passing Interface (MPI). The GEOS DAS consists of two main components, an atmospheric General Circulation Model (GCM) and a Physical-space Statistical Analysis System (PSAS). The GCM operates on data that are stored on a regular grid while PSAS works with observational data that are scattered irregularly throughout the atmosphere. As a result, the two components have different data decompositions. The GCM is decomposed horizontally as a checkerboard with all vertical levels of each box existing on the same processing element(PE). The dynamical core of the GCM can also operate on a rotated grid, which requires communication-intensive grid transformations during GCM integration. PSAS groups observations on PEs in a more irregular and dynamic fashion.
    Keywords: Computer Programming and Software
    Type: Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications; 157-161; NASA/CP-1998-206860
    Format: text
  • 39
    Publication Date: 2004-12-03
    Description: Despite the large number of different image registration techniques, most use the correlation operation to match spatial image characteristics. Correlation is one of the most computationally intensive operations, and its computational needs grow rapidly with image size. In this article, we show that, in many cases, it may be sufficient to determine image transformations by considering only one or several parts of the image rather than the entire image, which can result in substantial computational savings. This paper introduces the concept of registration by parts and investigates its viability. It describes alternative techniques for such registration by parts and presents early empirical results that address the underlying trade-offs.
    Keywords: Computer Programming and Software
    Type: Image Registration Workshop Proceedings; 299-306; NASA/CP-1998-206853
    Format: text
  • 40
    Publication Date: 2004-12-03
    Description: The first step in the integration of multiple data is registration, either relative image-to-image registration or absolute geo-registration, to a map or a fixed coordinate system. As the need for automating registration techniques is recognized, we feel that there is a need to survey all the registration methods which may be applicable to Earth and space science problems and to evaluate their performance on a large variety of existing remote sensing data as well as on simulated data of soon-to-be-flown instruments. In this paper we describe: 1) the operational toolbox which we are developing, which will consist of some of the most important registration techniques; and 2) the quantitative intercomparison of the different methods, which will allow a user to select the desired registration technique based on this evaluation and on visualization of the registration results.
    Keywords: Computer Programming and Software
    Type: Image Registration Workshop Proceedings; 307-316; NASA/CP-1998-206853
    Format: text
  • 41
    Publication Date: 2004-12-03
    Description: This paper presents a general strategy for assembling mosaics from numerous individual images where uncertainty exists in the position and orientation of those images. Both of the presented applications relate to remotely operated camera platforms, the first being the Galileo solid state imaging (SSI) camera presently in orbit around Jupiter, and the second being the Imager for Mars Pathfinder (IMP) stereo camera on Mars. A basic strategy in both applications is to determine the correct relative camera pointing followed by direct map projection of the images. It is assumed that approximate camera pointing exists, sufficient to locate adjacent images and to place initial tiepoints within reach of the correlator. Spatial correlation is used to fix tiepoints whose initial locations are predicted by the camera pointing. We use either a fast Fourier transform (FFT) algorithm or a variant of Gruen's scheme permitting limited image rotation and skew. The Gruen correlator has three hierarchical modes: 1) a classical spatial least-squares correlation on integral pixel boundaries, used when rotation is small; 2) an annealing, non-deterministic search, used when rotations are unknown; and 3) a simplex deterministic search used for the end game. The correlation operation can be performed either interactively or autonomously. The final camera pointing solution relies upon a simplex downhill search in 2n or 3n dimensions, where n is the number of images comprising the mosaic and the objective function to be minimized is the disagreement between tiepoint locations predicted from the camera pointing and those observed by the correlator. For Galileo the 3n unknowns are Euler angles defining camera pointing in planet coordinates, and for Mars Pathfinder they are 2n unknowns representing commanded azimuth and elevation in the Lander coordinate system.
    Keywords: Computer Programming and Software
    Type: Image Registration Workshop Proceedings; 123-132; NASA/CP-1998-206853
    Format: text
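    The FFT correlator mentioned above can be illustrated by classical phase correlation for a pure translation. This sketch assumes cyclic shifts and integer offsets; the Gruen modes and the final pointing solution are not shown.

```python
import numpy as np

def phase_correlate(ref, tst):
    # cross-power spectrum with unit magnitude: keeps phase differences only
    F = np.fft.fft2(ref)
    G = np.fft.fft2(tst)
    cross = np.conj(F) * G
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real      # a peak at the translation
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # map peak coordinates to signed shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)
```

    Normalizing away the magnitude makes the peak sharp and largely insensitive to illumination differences, which is why phase correlation is a common choice for the initial, translation-only stage of tiepoint fixing.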
  • 42
    Publication Date: 2004-12-03
    Description: Feature-based registration is one of the most reliable methods to register multi-sensor images (both active and passive imagery), since features are often more reliable than intensity or radiometric values. The only situation where a feature-based approach will fail is when the scene is completely homogeneous or densely textural, in which case a combination of feature- and intensity-based methods may yield better results. In this paper, we present some preliminary results of testing our scale-space feature-based registration technique, a modified version of a feature-based method developed earlier for classification of multi-sensor imagery. The proposed approach removes the sensitivity to parameter selection experienced with the earlier version, as explained later.
    Keywords: Computer Programming and Software
    Type: Image Registration Workshop Proceedings; 35-36; NASA/CP-1998-206853
    Format: text
  • 43
    Publication Date: 2004-12-03
    Description: The first part of this article introduces the notion of translation invariance in wavelets and discusses several wavelets that have this property. The second part discusses the possible applications of such wavelets to image registration. In the case of registration of affinely transformed images, we would conclude that the notion of translation invariance is not really necessary. What is needed is affine invariance and one way to do this is via the method of moment invariants. Wavelets or, in general, pyramid processing can then be combined with the method of moment invariants to reduce the computational load.
    Keywords: Computer Programming and Software
    Type: Image Registration Workshop Proceedings; 29-34; NASA/CP-1998-206853
    Format: text
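    The method of moment invariants mentioned above can be illustrated with the first two Hu invariants, which are unchanged by translation, rotation, and uniform scaling of the image content (full affine invariance requires the more general affine moment invariants, not shown here).

```python
import numpy as np

def hu_invariants_12(img):
    """First two Hu moment invariants of an image, treated as a 2-D density.
    Central moments remove translation; the eta normalization removes
    uniform scale; phi1 and phi2 are additionally rotation invariant."""
    y, x = np.mgrid[:img.shape[0], :img.shape[1]].astype(float)
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00

    def mu(p, q):                         # central moment
        return (((x - xc) ** p) * ((y - yc) ** q) * img).sum()

    def eta(p, q):                        # scale-normalized central moment
        return mu(p, q) / m00 ** (1 + (p + q) / 2.0)

    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4.0 * eta(1, 1) ** 2
    return phi1, phi2
```

    Matching such invariants computed on subimages or pyramid levels is one way to sidestep the explicit search over transformations that correlation requires.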
  • 44
    Publication Date: 2004-12-03
    Description: Over the last several years, integration of multiple media sources into a single information system has been rapidly developing. It has been found that when sound, graphics, text, animations, and simulations are skillfully integrated, the sum of the parts exceeds the individual parts for effective learning. In addition, simulations can be used to design and understand complex engineering processes. With the recent introduction of many high-level authoring, animation, modeling, and rendering programs for personal computers, significant multimedia programs can be developed by practicing engineers, scientists and even managers for both training and education. However, even with these new tools, a considerable amount of time is required to produce an interactive multimedia program. The development of both CD-ROM and Web-based programs are discussed in addition to the use of technically oriented animations. Also examined are various multimedia development tools and how they are used to develop effective engineering education courseware. Demonstrations of actual programs in engineering mechanics are shown.
    Keywords: Computer Programming and Software
    Type: Computational Tools and Facilities for the Next-Generation Analysis and Design Environment; 195-208; NASA-CP-3346
    Format: text
  • 45
    Publication Date: 2004-12-03
    Description: The Portable War Room is an internal TASC project to research and develop a visualization and simulation environment to provide for decision makers the power to review the past, understand the present, and peer into the future.
    Keywords: Computer Programming and Software
    Type: Computational Tools and Facilities for the Next-Generation Analysis and Design Environment; 97-117; NASA-CP-3346
    Format: text
  • 46
    Publication Date: 2004-12-03
    Description: Simulation models of acquisition processes range in scope from isolated applications to the 'Big Picture' captured by SNAP technology. SNAP integrates a family of models to portray the full scope of acquisition planning and management activities, including budgeting, scheduling, testing and risk analysis. SNAP replicates the dynamic management processes that underlie design, production and life-cycle support. SNAP provides the unique 'Big Picture' capability needed to simulate the entire acquisition process and explore the 'what-if' tradeoffs and consequences of alternative policies and decisions. Comparison of cost, schedule and performance tradeoffs help managers choose the lowest-risk, highest payoff at each step in the acquisition process.
    Keywords: Computer Programming and Software
    Type: Computational Tools and Facilities for the Next-Generation Analysis and Design Environment; 39-53; NASA-CP-3346
    Format: text
  • 47
    Publication Date: 2004-12-03
    Description: Thirty years ago, the CAD industry was created as electronic drafting tools were developed to move people from traditional two-dimensional drafting boards. While these tools provided an improvement in accuracy (true perpendicular lines, etc.), they did not offer a significant improvement in productivity, nor did they affect development times: they electronically captured a manual process.
    Keywords: Computer Programming and Software
    Type: Computational Tools and Facilities for the Next-Generation Analysis and Design Environment; 25-37; NASA-CP-3346
    Format: text
  • 48
    Publication Date: 2004-12-03
    Description: Multigrid algorithms for nonconforming and mixed finite element methods for second order elliptic problems on triangular and rectangular finite elements are considered. The construction of several coarse-to-fine intergrid transfer operators for nonconforming multigrid algorithms is discussed. The equivalence between the nonconforming and mixed finite element methods with and without projection of the coefficient of the differential problems into finite element spaces is described.
    Keywords: Computer Programming and Software
    Type: Seventh Copper Mountain Conference on Multigrid Methods; Part 1; 183-197; NASA-CP-3339
    Format: text
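    The role of coarse-to-fine intergrid transfer operators can be illustrated with the simplest conforming, 1-D finite-difference analogue of a two-grid cycle (the paper's nonconforming and mixed finite-element setting requires more elaborate transfer operators). This sketch uses full-weighting restriction and linear-interpolation prolongation for -u'' = f with zero Dirichlet boundaries; all sizes and smoother parameters are illustrative.

```python
import numpy as np

def jacobi(u, f, h, iters=3, w=2.0 / 3.0):
    # weighted-Jacobi smoother for -u'' = f with zero Dirichlet boundaries
    for _ in range(iters):
        u[1:-1] = (1 - w) * u[1:-1] + w * 0.5 * (u[:-2] + u[2:] + h * h * f[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] + (u[:-2] - 2 * u[1:-1] + u[2:]) / (h * h)
    return r

def two_grid_cycle(u, f, h):
    # pre-smooth, restrict the residual, solve coarse exactly, prolong, post-smooth
    u = jacobi(u.copy(), f, h)
    r = residual(u, f, h)
    nc = (len(u) - 1) // 2 + 1
    rc = np.zeros(nc)
    for j in range(1, nc - 1):               # full-weighting restriction
        rc[j] = 0.25 * r[2 * j - 1] + 0.5 * r[2 * j] + 0.25 * r[2 * j + 1]
    hc = 2 * h
    A = (2.0 * np.eye(nc - 2) - np.eye(nc - 2, k=1) - np.eye(nc - 2, k=-1)) / hc ** 2
    ec = np.zeros(nc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])  # exact coarse-grid solve
    e = np.zeros_like(u)
    e[::2] = ec                              # injection at coincident points
    e[1::2] = 0.5 * (ec[:-1] + ec[1:])       # linear interpolation between
    return jacobi(u + e, f, h)
```

    The smoother damps oscillatory error while the coarse-grid correction removes the smooth error the smoother cannot touch; the restriction and prolongation operators are exactly the "intergrid transfer operators" whose construction the abstract discusses for the harder nonconforming case.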
  • 49
    Publication Date: 2004-12-03
    Description: A multiscale algorithm for the problem of optimal statistical interpolation of observed data has been developed. This problem includes the calculation of the vector of the 'analyzed' (best estimated) atmosphere flow field w(sup a) by the formula: w(sup a) = w(sup f) + P(sup f) H(sup T) y, where the quantity y is defined by the equation (H P(sup f) H(sup T) + R)y = w(sup o) - H w(sup f), using the given model forecast first guess w(sup f) and the vector of observations w(sup o); H is an interpolation operator from the regular grid to the observation network, P(sup f) is the forecast error covariance matrix, and R is the observation error covariance matrix. At this initial stage the case of univariate analysis of single level radiosonde height data is considered. The matrix R is assumed to be diagonal, and the matrix P(sup f) is assumed to be given by the formula P(sub ij)(sup f) = sigma(sub i)(sup f) mu(sub ij) sigma(sub j)(sup f), where mu(sub ij) is a smooth, decreasing function of the distance between the i-th and the j-th points. In this paper we describe a multiscale iterative process based on a multiresolution, simultaneous displacement technique and a localized variational calculation of iteration parameters.
    Keywords: Computer Programming and Software
    Type: Seventh Copper Mountain Conference on Multigrid Methods; Part 1; 87-96; NASA-CP-3339
    Format: text
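    The analysis step defined by the two equations above can be sketched directly. In this sketch, H is taken as a selection of grid points co-located with the observations (the simplest case of the interpolation operator), and mu is modeled as a Gaussian of distance, one smooth, decreasing choice among many; the multiscale iterative solution of the linear system is not shown.

```python
import numpy as np

def oi_analysis(w_f, grid, w_o, obs_idx, sigma_f, sigma_o, length):
    """w_a = w_f + P H^T y, where (H P H^T + R) y = w_o - H w_f.

    P_ij = sigma_f_i * mu_ij * sigma_f_j with mu a Gaussian of distance;
    R is diagonal; H selects the grid points listed in obs_idx.
    """
    d = grid[:, None] - grid[None, :]
    mu = np.exp(-0.5 * (d / length) ** 2)        # smooth, decreasing in |d|
    P = sigma_f[:, None] * mu * sigma_f[None, :]
    H = np.zeros((len(w_o), len(grid)))
    H[np.arange(len(w_o)), obs_idx] = 1.0
    R = np.diag(np.asarray(sigma_o) ** 2)
    y = np.linalg.solve(H @ P @ H.T + R, w_o - H @ w_f)
    return w_f + P @ H.T @ y
```

    With near-zero observation error the analysis reproduces the observations at the observed points and relaxes back to the first guess away from them, on the correlation scale set by `length`.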
  • 50
    Publication Date: 2004-12-03
    Description: A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.
    Keywords: Computer Programming and Software
    Type: ; 209-212
    Format: text
  • 51
    Publication Date: 2004-12-03
    Description: Thermal engineering has long been left out of the concurrent engineering environment dominated by CAD (computer-aided design) and FEM (finite element method) software. Current tools attempt to force the thermal design process into an environment primarily created to support structural analysis, which results in inappropriate thermal models. As a result, many thermal engineers either build models "by hand" or use geometric user interfaces that are separate from, and have little useful connection (if any) to, CAD and FEM systems. This paper describes the development of a new thermal design environment called the Thermal Desktop. This system, fully integrated into a neutral, low-cost CAD system and utilizing both FEM and FD methods, does not compromise the needs of the thermal engineer. Rather, the features needed for concurrent thermal analysis are specifically addressed by combining traditional parametric, surface-based radiation and FD-based conduction modeling with CAD and FEM methods. The use of flexible and familiar temperature solvers such as SINDA/FLUINT (Systems Improved Numerical Differencing Analyzer/Fluid Integrator) is retained.
    Keywords: Computer Programming and Software
    Type: Ninth Thermal and Fluids Analysis Workshop Proceedings; 217-232; NASA/CP-1999-208695
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    Publication Date: 2004-12-03
    Description: Integrated analysis methods have the potential to substantially decrease the time required for analysis modeling. Integration with computer aided design (CAD) software can also allow a model to be more accurate by facilitating import of exact design geometry. However, the integrated method utilized must sometimes be tailored to the specific modeling situation, in order to make the process most efficient. Two cases are presented here that illustrate different processes used for thermal analysis on two different models. These examples are used to illustrate how the requirements, available input, expected output, and tools available all affect the process selected by the analyst for the most efficient and effective analysis.
    Keywords: Computer Programming and Software
    Type: Ninth Thermal and Fluids Analysis Workshop Proceedings; 37-48; NASA/CP-1999-208695
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    Publication Date: 2004-12-03
    Description: The AIRPLANE process starts with an aircraft geometry stored in a CAD system. The surface is modeled with a mesh of triangles and then the flow solver produces pressures at surface points which may be integrated to find forces and moments. The biggest advantage is that the grid generation bottleneck of the CFD process is eliminated when an unstructured tetrahedral mesh is used. MESH3D is the key to turning around the first analysis of a CAD geometry in days instead of weeks. The flow solver part of AIRPLANE has proven to be robust and accurate over a decade of use at NASA. It has been extensively validated with experimental data and compares well with other Euler flow solvers. AIRPLANE has been applied to all the HSR geometries treated at Ames over the course of the HSR program in order to verify the accuracy of other flow solvers. The unstructured approach makes handling complete and complex geometries very simple because only the surface of the aircraft needs to be discretized, i.e. covered with triangles. The volume mesh is created automatically by MESH3D. AIRPLANE runs well on multiple platforms. Vectorization on the Cray Y-MP is reasonable for a code that uses indirect addressing. Massively parallel computers such as the IBM SP2, SGI Origin 2000, and the Cray T3E have been used with an MPI version of the flow solver and the code scales very well on these systems. AIRPLANE can run on a desktop computer as well. AIRPLANE has a future. The unstructured technologies developed as part of the HSR program are now targeting high Reynolds number viscous flow simulation. The pacing item in this effort is Navier-Stokes mesh generation.
    Keywords: Computer Programming and Software
    Type: 1999 NASA High-Speed Research Program Aerodynamic Performance Workshop; Volume 1; Part 1; 213-252; NASA/CP-1999-209704/VOL1/PT1
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 54
    In:  CASI
    Publication Date: 2004-12-03
    Description: A general overview of the Information Systems Center (ISC) at NASA's Goddard Space Flight Center is presented. The background of the center, its reorganization, and the Applied Engineering and Technology Directorate and its organization are outlined. The ISC's mission and vision, as well as a breakdown of the branch, are reviewed. Finally, the role of the Software Engineering Laboratory (SEL) within the ISC is reported, and its short- and long-term goals and current activities are discussed.
    Keywords: Computer Programming and Software
    Type: Proceedings of the Twenty-Third Annual Software Engineering Workshop; NASA/CP-1999-209236
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 55
    Publication Date: 2004-12-03
    Description: This paper will discuss the use of the Intermetrics AppletMagic tool to build an applet to display a satellite ground track on a world map. This applet is the result of a prototype project that was developed by the Goddard Space Flight Center's Flight Dynamics Division (FDD), starting in June of 1996. Both Version 1 and Version 2 of this applet can be accessed via the URL http://fdd.gsfc.nasa.gov/Java.html. This paper covers Version 1, as Version 2 did not make radical changes to the Ada part of the applet. This paper will briefly describe the design of the applet, discuss the issues that arose during development, and will conclude with lessons learned and future plans for the FDD's use of Ada and Java. The purpose of this paper is to show examples of a successful project using AppletMagic, and to highlight some of the pitfalls that occurred along the way. It is hoped that this discussion will be useful both to users of AppletMagic and to organizations such as Intermetrics that develop new technology.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Collected Software Engineering Papers; Volume 15; 173-188; NASA/TM-1998-208614/VOL15
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    Publication Date: 2004-12-03
    Description: NASA's Software Engineering Laboratory (SEL), one of the earliest pioneers in the areas of software process improvement and measurement, has had a significant impact on the software business at NASA Goddard. At the heart of the SEL's improvement program is a belief that software products can be improved by optimizing the software engineering process used to develop them and a long-term improvement strategy that facilitates small incremental improvements that accumulate into significant gains. As a result of its efforts, the SEL has incrementally reduced development costs by 60%, decreased error rates by 85%, and reduced cycle time by 25%. In this paper, we analyze the SEL's experiences on three major improvement initiatives to better understand the cyclic nature of the improvement process and to understand why some improvements take much longer than others.
    Keywords: Computer Programming and Software
    Type: Software Engineering Laboratory Series: Proceedings of the Twenty-First Annual Software Engineering Workshop; 3-25; NASA/TM-1998-208617
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 57
    Publication Date: 2004-12-03
    Description: Domain-specific automatic program synthesis tools, also called application generators, are playing an ever-increasing role in software development. However, high-performance application generators require difficult manual construction, and are very difficult to verify correct. This paper describes research and an implemented system that transforms program synthesis tools based on deductive synthesis into high-performance application generators. Deductive synthesis uses theorem-proving to construct solutions when given problem specifications. The verification condition for a deductive synthesis tool is essentially the soundness of the implemented inference rules. Theory Operationalization for Program Synthesis (TOPS) synergistically combines reformulation, automated mathematical classification, and compilation through partial deduction to decision procedures. It transforms general-purpose deductive synthesis, with exponential performance, into efficient special-purpose deductive synthesis, with near-linear performance. This paper describes our experience with and empirical results of PD(TH) theory-based partial deduction - in which partial deduction of a set of first-order formulae is performed within the context of a background theory. The implemented TOPS system currently performs a special variant of PD(TH) in which the compilation process results in the transformation of a set of first order formulae into the theory of an instantiated library decision procedure augmented by a compiled unit theory.
    Keywords: Computer Programming and Software
    Type: Fourth NASA Langley Formal Methods Workshop; 129-135; NASA-CP-3356
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    Publication Date: 2004-12-03
    Description: There are four major components to SmartScene. First, it ships with everything necessary to begin productive work quickly. Second, it is immersive: while working in SmartScene, a user cannot see anything except the world being manipulated.
    Keywords: Computer Programming and Software
    Type: Computational Tools and Facilities for the Next-Generation Analysis and Design Environment; 119-140; NASA-CP-3346
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    Publication Date: 2004-12-03
    Description: Taken together the components of the Einstein Suite provide two revolutionary capabilities - they have the potential to change the way engineering and financial engineering are performed by: (1) providing currently unavailable functionality, and (2) providing a 10-100 times improvement over currently available but impractical or costly functionality.
    Keywords: Computer Programming and Software
    Type: Computational Tools and Facilities for the Next-Generation Analysis and Design Environment; 171-193; NASA-CP-3346
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Publication Date: 2004-12-03
    Description: Deneb Robotics is a leader in the development of commercially available, leading-edge three-dimensional simulation software tools for virtual prototyping, simulation-based design, manufacturing process simulation, and factory floor simulation and training applications. Deneb has developed and commercially released a preliminary Virtual Collaborative Engineering (VCE) capability for Integrated Product and Process Development (IPPD). This capability allows distributed, real-time visualization and evaluation of design concepts, manufacturing processes, and total factories and enterprises in one seamless simulation environment.
    Keywords: Computer Programming and Software
    Type: Computational Tools and Facilities for the Next-Generation Analysis and Design Environment; 141-151; NASA-CP-3346
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 61
    Publication Date: 2004-12-03
    Description: A multigrid algorithm is developed and analyzed for generalized Stokes problems discretized by various nonnested mixed finite elements within a unified framework. It is abstractly proved by an element-independent analysis that the multigrid algorithm converges with an optimal order if there exists a 'good' prolongation operator. A technique to construct a 'good' prolongation operator for nonnested multilevel finite element spaces is proposed. Its basic idea is to introduce a sequence of auxiliary nested multilevel finite element spaces and define a prolongation operator as the composite of two single-grid-level operators. This not only makes the construction of a prolongation operator much easier (the final explicit forms of such prolongation operators are fairly simple), but also simplifies the verification of the approximation properties of prolongation operators. Finally, as an application, the framework and technique are applied to seven typical nonnested mixed finite elements.
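    The composite-operator idea in the abstract can be sketched in a few lines. This is a hedged toy illustration, not the paper's construction: two hypothetical single-level prolongation matrices P1 (coarse space into an auxiliary nested space) and P2 (auxiliary space into the fine space) are simply multiplied to give the composite prolongation.

    ```python
    import numpy as np

    # Illustrative sketch only: the composite prolongation is the product of
    # two single-grid-level operators, P = P2 @ P1. The matrices below are
    # toy linear-interpolation operators, not finite element constructions.
    def composite_prolongation(P1, P2):
        """Compose two single-level prolongation operators."""
        return P2 @ P1

    # Coarse (2 dofs) -> auxiliary nested space (3 dofs) by linear interpolation.
    P1 = np.array([[1.0, 0.0],
                   [0.5, 0.5],
                   [0.0, 1.0]])
    # Auxiliary space (3 dofs) -> fine space (5 dofs) by linear interpolation.
    P2 = np.array([[1.0, 0.0, 0.0],
                   [0.5, 0.5, 0.0],
                   [0.0, 1.0, 0.0],
                   [0.0, 0.5, 0.5],
                   [0.0, 0.0, 1.0]])
    P = composite_prolongation(P1, P2)
    coarse = np.array([1.0, 3.0])
    fine = P @ coarse   # prolongated vector on the fine grid
    ```

    Because each factor lives between nested spaces, its explicit form stays simple, which is the ease-of-construction point the abstract makes.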
    Keywords: Computer Programming and Software
    Type: Seventh Copper Mountain Conference on Multigrid Methods; Part 1; 241-253; NASA-CP-3339
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 62
    Publication Date: 2004-12-03
    Description: A multilevel algorithm is presented that solves general second-order elliptic partial differential equations on adaptive sparse grids. The multilevel algorithm consists of several V-cycles. Suitable discretizations ensure that the discrete equation system can be solved efficiently. Numerical experiments show a convergence rate of order O(1) for the multilevel algorithm.
    Keywords: Computer Programming and Software
    Type: Seventh Copper Mountain Conference on Multigrid Methods; 661-672; NASA-CP-3339-Pt-2
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 63
    Publication Date: 2004-12-03
    Description: Recent changes within NASA's space exploration program favor the design, implementation, and operation of low cost, lightweight, small and micro spacecraft with multiple launches per year. In order to meet the future needs of these missions with regard to the use of spacecraft microelectronics, NASA's advanced flight computing (AFC) program is currently considering industrial cooperation and advanced packaging architectures. In relation to this, the AFC program is reviewed, considering the design and implementation of NASA's AFC multichip module.
    Keywords: Computer Programming and Software
    Type: Autonomous sensor systems; Volume 1; 98-120; DLR-FB-95-43
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 64
    In:  CASI
    Publication Date: 2004-12-03
    Description: The Virtual Reality Lab Assistant (VRLA) demonstration model is designed for engineering and materials science experiments to be performed by undergraduate and graduate students as a pre-lab simulation experience. This will help students preview how to use the lab equipment and run experiments without using the lab hardware/software. The quality of the time available for laboratory experiments can be significantly improved through the use of virtual reality technology.
    Keywords: Computer Programming and Software
    Type: National Educators' Workshop: Update 95. Standard Experiments in Engineering Materials Science and Technology; 299-306; NASA-CP-3330
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 65
    Publication Date: 2009-05-04
    Description: The issues of image compression and pattern classification have been a primary focus of researchers in a variety of fields, including signal and image processing, pattern recognition, and data classification. These issues depend on finding an efficient representation of the source data. In this paper we collate our earlier results, in which we introduced the application of the Hilbert scan to a principal component analysis (PCA) algorithm with the Adaptive Principal Component Extraction (APEX) neural network model. We apply these techniques to medical imaging, particularly image representation and compression. We apply the Hilbert scan to the APEX algorithm to improve results.
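    The Hilbert-scan preprocessing step can be sketched as follows. This is a hedged illustration under stated assumptions, not the paper's code: pixels of an n x n image (n a power of two) are reordered along a Hilbert curve into a 1-D sequence that preserves spatial locality, before a PCA-style algorithm such as APEX is applied to the sequence. The standard distance-to-coordinate conversion is used.

    ```python
    import numpy as np

    def d2xy(n, d):
        """Convert distance d along the Hilbert curve to (x, y) on an n x n grid."""
        x = y = 0
        t = d
        s = 1
        while s < n:
            rx = 1 & (t // 2)
            ry = 1 & (t ^ rx)
            if ry == 0:                      # rotate the quadrant if needed
                if rx == 1:
                    x, y = s - 1 - x, s - 1 - y
                x, y = y, x
            x += s * rx
            y += s * ry
            t //= 4
            s *= 2
        return x, y

    def hilbert_scan(image):
        """Flatten a square image into 1-D Hilbert-curve order."""
        n = image.shape[0]
        return np.array([image[d2xy(n, d)] for d in range(n * n)])
    ```

    Consecutive samples in the resulting sequence are always spatially adjacent pixels, which is why a Hilbert scan tends to keep correlated pixel values close together for the downstream compression stage.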
    Keywords: Computer Programming and Software
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 66
    Publication Date: 2005-06-30
    Description: With advancing computer processing capabilities, practical computer applications are mostly limited by the amount of human programming required to accomplish a specific task. This necessary human participation creates many problems, such as dramatically increased cost. To alleviate the problem, computers must become more autonomous; in other words, computers must be capable of programming and reprogramming themselves to adapt to changing environments, tasks, demands, and domains. Evolutionary computation offers a potential means, but it must be advanced beyond its current practical limitations. Evolutionary algorithms model nature. They maintain a population of structures representing potential solutions to the problem at hand. These structures undergo a simulated evolution by means of mutation, crossover, and a Darwinian selective pressure. Genetic programming (GP) is the most promising example of an evolutionary algorithm. In GP, the structures that evolve are trees, a dramatic departure from previously used representations such as the strings of genetic algorithms. The space of potential trees is defined by means of their elements: functions, which label internal nodes, and terminals, which label leaves. By attaching semantic interpretation to those elements, trees can be interpreted as computer programs (given an interpreter), evolved architectures, etc. JSC has begun exploring GP as a potential tool for its long-term project on evolving dextrous robotic capabilities. Last year we identified representation redundancies as the primary source of inefficiency in GP. Subsequently, we proposed a method to use problem constraints to reduce those redundancies, effectively reducing GP complexity. This method was later implemented at the University of Missouri. This summer, we evaluated the payoff from using problem constraints to reduce search complexity on two classes of problems: learning Boolean functions and solving the forward kinematics problem. We have also developed and implemented methods to use additional problem heuristics to fine-tune the searchable space, and to use typing information to further reduce the search space. Additional improvements have been proposed, but they are yet to be explored and implemented.
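    The tree representation described above can be made concrete with a minimal sketch. This is an illustrative toy, not JSC's system: programs are nested tuples whose first element names a function and whose leaves are strings naming input variables, and an interpreter evaluates a tree against an input environment.

    ```python
    # Minimal GP-style tree representation: internal nodes are functions,
    # leaves are terminals (input variable names). All names here are
    # illustrative choices for the Boolean-function problem class.
    FUNCS = {'and': lambda a, b: a and b,
             'or':  lambda a, b: a or b,
             'not': lambda a: not a}

    def evaluate(tree, env):
        """Interpret a tree: a string is a terminal, a tuple is (function, *children)."""
        if isinstance(tree, str):
            return env[tree]
        op, *kids = tree
        return FUNCS[op](*(evaluate(k, env) for k in kids))

    # One candidate structure in a population: a tree computing XOR as
    # (a or b) and not (a and b).
    xor = ('and', ('or', 'a', 'b'), ('not', ('and', 'a', 'b')))
    ```

    A constrained GP system of the kind the abstract describes would additionally restrict which subtrees crossover and mutation may produce (e.g., via typing rules), pruning redundant regions of this search space.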
    Keywords: Computer Programming and Software
    Type: National Aeronautics and Space Administration (NASA)/American Society for Engineering Education (ASEE) Summer Faculty Fellowship Program: 1996; Volume 2; NASA-CR-202008-Vol-2
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 67
    Publication Date: 2011-10-14
    Description: A turbulence model has been developed for blade-element helicopter simulation. This model, called Simulation of Rotor Blade Element Turbulence (SORBET), uses an innovative temporal and geometrical distribution algorithm that preserves the statistical characteristics of the turbulence spectra over the rotor disc, while providing velocity components in real time to each of five blade-element stations along each of four blades. An initial investigation of SORBET has been performed using a piloted, motion-based simulation of the Sikorsky UH60A Black Hawk. Although only the vertical component of stochastic turbulence was used in this investigation, vertical turbulence components induce vehicle responses in all translational and rotational degrees of freedom of the helicopter. The single-degree-of-freedom configuration of SORBET was compared to a conventional full 6-degrees-of-freedom baseline configuration, where translational velocity inputs are superimposed at the vehicle center of gravity, and rotational velocity inputs are created from filters that approximate the immersion rate into the turbulent field. For high-speed flight the vehicle responses were satisfactory for both models. Test pilots could not distinguish differences between the baseline configuration and SORBET. In low-speed flight the baseline configuration received criticism for its high frequency content, whereas the SORBET model elicited favorable pilot opinion. For this helicopter, which has fully articulated blades, results from SORBET show that vehicle responses to turbulent blade-station disturbances are severely attenuated. This is corroborated by in-flight observation of the rotor tip path plane as compared to vehicle responses.
    Keywords: Computer Programming and Software
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    Publication Date: 2011-08-23
    Description: By rigorously extending modern communication theory to the assessment of sampled imaging systems, we develop the formulations that are required to optimize the performance of these systems within the critical constraints of image gathering, data transmission, and image display. The goal of this optimization is to produce images with the best possible visual quality for the wide range of statistical properties of the radiance field of natural scenes that one normally encounters. Extensive computational results are presented to assess the performance of sampled imaging systems in terms of information rate, theoretical minimum data rate, and fidelity. Comparisons of this assessment with perceptual and measurable performance demonstrate that (1) the information rate that a sampled imaging system conveys from the captured radiance field to the observer is closely correlated with the fidelity, sharpness and clarity with which the observed images can be restored and (2) the associated theoretical minimum data rate is closely correlated with the lowest data rate with which the acquired signal can be encoded for efficient transmission.
    Keywords: Computer Programming and Software
    Type: Optical Engineering (ISSN 0091-3286); Volume 38; No. 5; 742-762
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 69
    Publication Date: 2011-08-23
    Description: The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimisation (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behaviour by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.
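    The cellular-automata example mentioned above is a good illustration of an intrinsically parallel algorithm. The following hedged sketch (not from the paper) shows a 1-D binary cellular automaton: each cell's next state depends only on its immediate neighbours, so every cell can be updated independently and the work scales linearly with the number of processors.

    ```python
    # One synchronous update of a 1-D binary cellular automaton with periodic
    # boundaries. The 8-bit 'rule' number encodes the next state for each of
    # the 8 possible (left, centre, right) neighbourhoods (Wolfram numbering).
    def step(cells, rule=90):
        """Apply one update; each output cell depends only on 3 input cells."""
        n = len(cells)
        return [(rule >> ((cells[i - 1] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
                for i in range(n)]
    ```

    Rule 90 reduces to "new state = left XOR right"; iterating it from a single seed cell produces the well-known Sierpinski-triangle pattern, a simple instance of complex behaviour emerging from many interacting simple models.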
    Keywords: Computer Programming and Software
    Type: Aeronautical Journal of the Royal Aeronautical Society; No. 2451; 373-382
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 70
    Publication Date: 2011-08-23
    Description: The paper identifies speed, agility, human interface, generation of sensitivity information, task decomposition, and data transmission (including storage) as important attributes for a computer environment to have in order to support engineering design effectively. It is argued that when examined in terms of these attributes the presently available environment can be shown to be inadequate. A radical improvement is needed, and it may be achieved by combining new methods that have recently emerged from multidisciplinary design optimization (MDO) with massively parallel processing computer technology. The caveat is that, for successful use of that technology in engineering computing, new paradigms for computing will have to be developed - specifically, innovative algorithms that are intrinsically parallel so that their performance scales up linearly with the number of processors. It may be speculated that the idea of simulating a complex behavior by interaction of a large number of very simple models may be an inspiration for the above algorithms; the cellular automata are an example. Because of the long lead time needed to develop and mature new paradigms, development should begin now, even though the widespread availability of massively parallel processing is still a few years away.
    Keywords: Computer Programming and Software
    Type: Multidisciplinary Design and Optimisaton: Proceedings; 11.1 - 11.14
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 71
    Publication Date: 2011-08-23
    Description: Software development projects face numerous challenges that threaten their successful completion. Whether it is not enough money, too little time, or a case of "requirements creep" that has turned into a full sprint, projects must meet these challenges or face possible disastrous consequences. A robust, yet flexible process model can provide a mechanism through which software development teams can meet these challenges head on and win. This article describes how the spiral model has been successfully tailored to a specific project and relates some notable results to date.
    Keywords: Computer Programming and Software
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 72
    Publication Date: 2013-08-31
    Description: In this paper, we work out a detailed mathematical analysis for a new learning algorithm termed Cascade Error Projection (CEP) and a general learning framework. This framework can be used to obtain the cascade correlation learning algorithm by choosing a particular set of parameters. Furthermore, the CEP learning algorithm operates on only one layer, whereas the other set of weights can be calculated deterministically. In association with the dynamical stepsize change concept to convert the weight update from an infinite space into a finite space, the relation between the current stepsize and the previous energy level is also given, and the estimation procedure for the optimal stepsize is used to validate our proposed technique. Weight values of zero are used to start the learning for every layer, and a single hidden unit is applied instead of a pool of candidate hidden units as in the cascade correlation scheme; simplicity in hardware implementation is thereby also obtained. Furthermore, this analysis allows us to select from other methods (such as conjugate gradient descent or Newton's second-order method) one that will be a good candidate for the learning technique. The choice of learning technique depends on the constraints of the problem (e.g., speed, performance, and hardware implementation); one technique may be more suitable than others. Moreover, for a discrete weight space, the theoretical analysis presents the capability of learning with limited weight quantization. Finally, 5- to 8-bit parity and chaotic time series prediction problems are investigated; the simulation results demonstrate that 4-bit or greater weight quantization is sufficient for learning a neural network using CEP. In addition, it is demonstrated that this technique can compensate for lower-bit weight resolution by incorporating additional hidden units. However, generalization results may suffer somewhat with lower-bit weight quantization.
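    The limited weight quantization studied in the abstract can be illustrated with a small sketch. This is a hypothetical uniform quantizer, not the paper's CEP code: weights are snapped to the nearest of 2**bits uniformly spaced levels in a fixed range, bounding the per-weight rounding error by half a grid step.

    ```python
    import numpy as np

    # Illustrative uniform weight quantizer (assumed range [-w_max, w_max];
    # the range and grid choice are this sketch's assumptions, not the paper's).
    def quantize(w, bits, w_max=1.0):
        """Round weights to the nearest of 2**bits uniformly spaced levels."""
        step = 2.0 * w_max / (2 ** bits - 1)
        return np.clip(np.round(np.asarray(w) / step) * step, -w_max, w_max)
    ```

    With 4-bit weights the grid spacing is 2/15, roughly 0.133, so the worst-case rounding error per weight is about 0.067; the abstract's finding is that this resolution already suffices for CEP learning on the parity problems studied.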
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 73
    Publication Date: 2013-08-31
    Description: In this era of shrinking federal budgets in the USA we need to dramatically improve our efficiency in the spacecraft engineering design process. We have come up with a method which captures much of the experts' expertise in a dataflow design graph: Seamlessly connectable set of local and remote design tools; Seamlessly connectable web based design tools; and Web browser interface to the developing spacecraft design. We have recently completed our first web browser interface and demonstrated its utility in the design of an aeroshell using design tools located at web sites at three NASA facilities. Multiple design engineers and managers are now able to interrogate the design engine simultaneously and find out what the design looks like at any point in the design cycle, what its parameters are, and how it reacts to adverse space environments.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 74
    Publication Date: 2013-08-31
    Description: Proton tests were performed on the Actel RH1020 at the Indiana University Cyclotron Facility. The devices were active during the irradiation. Upsets and currents were monitored in real time with the devices being clocked at 1 MHz. The results of the tests are summarized.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 75
    Publication Date: 2013-08-31
    Description: Proton irradiation tests were performed on the Actel CKJ911 prototype Field Programmable Gate Array (FPGA) at the Indiana University Cyclotron Facility. The devices, tests, and results are discussed.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 76
    Publication Date: 2013-08-31
    Description: A summary is given of tests performed at the Indiana University Cyclotron Facility on the Actel A1280A device. The intent of the study was to investigate the proton response of the hard-wired S-Module flip-flops with a large sample size. The S-Modules of this device are sensitive to protons. The device's performance in the test, shown in graphs, was typical for devices of this class.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 77
    Publication Date: 2013-08-31
    Description: Code can be generated manually or using code-generation software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with automatic code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an automatic code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the manual-to-automatic code interface; (2) applying object-oriented design to the manual flight code; (3) implementing the object-oriented design in C.
    Keywords: Computer Programming and Software
    Type: Embedded Systems Programming
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 78
    In:  CASI
    Publication Date: 2013-08-31
    Description: The UniTree product as released emits a rather large amount of logging information into various log files, but this data is typically meant for operator information in difficult operational situations or directly for debugging purposes. It is strongly advised that the existing logging functionality in the standard release be extended by adding messages with relevant parameters for every major event in the course of executing file- or device-oriented requests as described. This upgrade would be relatively easy in terms of implementation effort. The benefit is a complete sequence of transaction descriptions generated by external user requests or by internal administration commands. This collection of transaction records can be used for intensive statistics and performance evaluations. It is furthermore a perfect source to drive realistic system simulations to study the effects of possible hardware or software changes.
    Keywords: Computer Programming and Software
    Type: Fifth NASA Goddard Conference on Mass Storage Systems and Technologies.; Volume 1; 91 - 97; NASA-CP-3340-vol-1
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 79
    Publication Date: 2016-06-07
    Description: This report summarizes the author's summer 1995 work at NASA Kennedy Space Center in the Advanced System Division. The assignment was path planning for the Payload Inspection and Processing System (PIPS). PIPS is an automated system, programmed off-line, for inspection of Space Shuttle payloads after integration and prior to launch. PIPS features a hyper-redundant 18-dof serpentine truss manipulator capable of snake-like motions to avoid obstacles. The path planning problem was divided into two segments: (1) determining an obstacle-free trajectory for the inspection camera at the manipulator tip to follow; and (2) development of a follow-the-leader (FTL) algorithm which ensures whole-arm collision avoidance by forcing ensuing links to follow the same tip trajectory. The summer 1995 work focused on the FTL algorithm. This report summarizes development, implementation, testing, and graphical demonstration of the FTL algorithm for prototype PIPS hardware. The method and code were developed in a modular manner so that the final PIPS hardware may use them with minimal changes. The FTL algorithm was implemented in MATLAB and demonstrated with a high-fidelity IGRIP model. The author also supported implementation of the algorithm in C++ for hardware control. The FTL algorithm proved successful and robust in graphical simulation. The author intends to return to the project in summer 1996 to implement path planning for PIPS.
    Keywords: Computer Programming and Software
    Type: ; 613-640
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
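    The follow-the-leader constraint described in the abstract above can be sketched geometrically (a toy in Python, not the report's MATLAB or C++ code; the function name and the discretized-path representation are assumptions): each joint is placed a fixed arc length behind the joint ahead of it, measured along the recorded tip trajectory, so that every link retraces the path the tip has already swept.

```python
import numpy as np

def follow_the_leader(path, n_links, link_len):
    """Place the joints of a serial manipulator on the tip path so that
    joint i sits link_len of arc length behind joint i-1 (tip = joint 0),
    measured along the sampled tip trajectory."""
    # cumulative arc length along the sampled tip path
    seg = np.linalg.norm(np.diff(path, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    joints = []
    for i in range(n_links + 1):                # n_links links -> n_links + 1 joints
        target = max(s[-1] - i * link_len, 0.0)  # arc length of joint i along the path
        # interpolate each coordinate of the path at that arc length
        joints.append([np.interp(target, s, path[:, k]) for k in range(path.shape[1])])
    return np.array(joints)

# straight-line tip path of length 10, followed by a 4-link arm of link length 2
path = np.column_stack([np.linspace(0, 10, 101), np.zeros(101)])
cfg = follow_the_leader(path, n_links=4, link_len=2.0)
```

On a straight path this reduces to evenly spaced joints; on a curved path the same arc-length rule keeps every link inside the corridor the tip has already traversed, which is the property that yields whole-arm collision avoidance.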
  • 80
    Publication Date: 2016-06-07
    Description: The research performed this summer focused on the development of a predictive model for the loading of liquid oxygen (LO2) into the external tank (ET) of the shuttle prior to launch. A predictive model can greatly aid the operational personnel, since instrumentation aboard the orbiter and ET is limited due to weight constraints. The model, which focuses primarily on the orbiter section of the system, was developed using a block-diagram-based simulation language known as VisSim. Simulations were run on LO2 loading data for shuttle flights STS-50 and STS-55, and the model was demonstrated to accurately predict the sensor data recorded for these flights. As a consequence of the simulation results, it can be concluded that the software tool can be very useful for rapid prototyping of complex models.
    Keywords: Computer Programming and Software
    Type: ; 587-612
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 81
    Publication Date: 2016-06-07
    Description: The Robotic Tile Inspection System (RTPS), under development at NASA-KSC, is expected to automate the processes of post-flight re-waterproofing and inspection of the Shuttle's heat-absorbing tiles. An important task of the robot vision sub-system is to register the 'real-world' coordinates with the coordinates of the robot model of the Shuttle tiles. The model coordinates relate to a tile database and pre-flight tile images. In the registration process, current (post-flight) images are aligned with pre-flight images to detect the rotation and translation displacement required for the coordinate-system rectification. The research activities performed this summer included study and evaluation of the registration algorithm currently implemented by the RTPS, as well as investigation of the utility of other registration algorithms. It was found that the current algorithm is not robust enough: it has a success rate of less than 80% and is therefore not suitable for complying with the requirements of the RTPS. Modifications to the current algorithm have been developed and tested. These modifications improve the performance of the registration algorithm significantly, but the improvement is not sufficient to satisfy system requirements. A new algorithm for registration has been developed and tested. This algorithm demonstrated a very high degree of robustness, with a success rate of 96%.
    Keywords: Computer Programming and Software
    Type: The 1995 Research Reports: NASA/ASEE Summer Faculty Fellowship Program; 473-500; NASA-CR-199891
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
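    One standard way to recover the translation component of such a pre-flight/post-flight alignment is phase correlation, offered here as an illustration of the registration problem rather than as the algorithm the report actually developed: the inverse FFT of the normalized cross-power spectrum of the two images is an impulse at the displacement.

```python
import numpy as np

def phase_correlate(ref, cur):
    """Estimate the integer (dy, dx) shift that, applied to 'cur' via
    np.roll, aligns it with 'ref', using phase correlation."""
    F = np.fft.fft2(ref)
    G = np.fft.fft2(cur)
    R = F * np.conj(G)
    R /= np.maximum(np.abs(R), 1e-12)          # keep only the phase
    corr = np.fft.ifft2(R).real                # impulse at the displacement
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap shifts larger than half the image size around to negative shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(0)
pre = rng.random((64, 64))                     # stand-in pre-flight tile image
post = np.roll(pre, (3, -5), axis=(0, 1))      # simulated post-flight offset
shift = phase_correlate(pre, post)             # → (-3, 5), which undoes the offset
```

Phase correlation only handles translation; recovering the rotation mentioned in the abstract needs an extra step (e.g. resampling to log-polar coordinates before correlating), which is one reason full registration algorithms are considerably more involved.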
  • 82
    Publication Date: 2016-06-07
    Description: The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing human engineering and operations analyses for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for use by the ANVIL in the evaluation of the X-34 engine changeout procedure. These simulations were developed with the software tool dVISE 4.0.0, produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with the simulations, other summer activities, and possible future work. We begin with a brief description of virtual reality systems.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 83
    Publication Date: 2016-06-07
    Description: As part of RadCAD's development process, it is necessary to compare RadCAD's results with other radiation tools and with exact solutions when and where possible. Form factor algorithms have previously been verified against exact solutions. This paper considers RadCAD's specular capabilities. First, radiation exchange factors are compared against exact solutions and results from TRASYS for various geometries; critical dimensions and optical properties are changed for each geometry. Second, a specular adjunct plate system is used to verify absorbed heat fluxes. This particular geometric problem has received some attention in the literature, and previous authors have used it to validate software results against exact analytical solutions. This paper compares absorbed heat rates against the exact solution and against published results from other thermal radiation tools. The agreement between RadCAD and the exact solutions is good: the maximum error in both specular and diffuse exchange factors, for both geometries and all optical properties, was 3%, and the absorbed fluxes differed by a maximum of 4% for the adjunct plate problem.
    Keywords: Computer Programming and Software
    Type: Proceedings of the Eighth Annual Thermal and Fluids Analysis Workshop: Spacecraft Analysis and Design; 1.1-1.15; NASA-CP-3359
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 84
    Publication Date: 2016-06-07
    Description: This research was undertaken to provide NASA with a survey of state-of-the-art techniques used in industry and academia to provide safe, reliable, and maintainable software for driving large systems. Such systems must match the complexity and strict safety requirements of NASA's shuttle system. In particular, the Launch Processing System (LPS) is being considered for replacement. The LPS is responsible for monitoring and commanding the shuttle during test, repair, and launch phases. NASA built this system in the 1970s using mostly hardware techniques to provide increased reliability, but it did so using largely custom-built equipment, which has not been able to keep up with current technologies. This report surveys the major techniques used in industry and academia to ensure reliability in large and critical computer systems.
    Keywords: Computer Programming and Software
    Type: NASA/ASEE Summer Faculty Fellowship Program; 111-122; NASA-CR-202756
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 85
    Publication Date: 2016-06-07
    Description: This paper describes a blueprint that can be used for developing a distributed computing environment (DCE) for NASA in general, and the Kennedy Space Center (KSC) in particular. A comprehensive, open, secure, integrated, and multi-vendor DCE such as OSF DCE has been suggested. Design issues, as well as recommendations for each component have been given. Where necessary, modifications were suggested to fit the needs of KSC. This was done in the areas of security and directory services. Readers requiring a more comprehensive coverage are encouraged to refer to the eight-chapter document prepared for this work.
    Keywords: Computer Programming and Software
    Type: NASA/ASEE Summer Faculty Fellowship Program; 73-88; NASA-CR-202756
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 86
    Publication Date: 2016-06-07
    Description: The Active Digital Controller (ADC) is a system used to control the various functions of wind tunnel models. It is capable of digitizing and saving up to sixty-four channels of analog data and can output up to sixteen channels of analog command signals. In addition to its use as a general controller, it can run up to two distinct control laws. All of this is done at a regulated rate of two hundred hertz. The ADC was modified for use in the Actively Controlled Response of Buffet Affected Tails (ACROBAT) tests and for side-wall pressure data acquisition. The changes included general maintenance and updating of the controller as well as setting up special modes of operation. The ACROBAT tests required that two sets of output signals be available, and the pressure data acquisition needed a sampling rate of four hundred hertz, twice the standard ADC rate. These modifications were carried out, and the ADC was used during the ACROBAT wind tunnel entry.
    Keywords: Computer Programming and Software
    Type: Langley Aerospace Research Summer Scholars; Part 2; 833-836; NASA-CR-202464
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 87
    Publication Date: 2016-06-07
    Description: My project is designing a flight control program in the C language. It consists of paths made up of fixed-radius arcs and straight lines. Arcs will be defined by a center, a radius, and a turn angle; straight lines will be defined by an end way point and an inbound course. Way points will be pre-defined such that the location of the end of each leg accurately matches the beginning of the next leg. The simulation paths will closely match paths normally flown by the Transport System Research Vehicle (TSRV), but will not necessarily be defined identically in terms of type and number of way points.
    Keywords: Computer Programming and Software
    Type: Langley Aerospace Research Summer Scholars; 743-746; NASA-CR-202464
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 88
    Publication Date: 2016-06-07
    Description: The purpose of this project was to produce a CD-ROM for the Technology Applications Group. The CD was developed to allow interested people, organizations, or companies to view the technologies available to them that were developed through NASA research. The CD's main audience, however, is small businesses. The CD gives a small business an inexpensive way to see what technologies are available. Most companies probably have a CD-ROM drive on their computers but may not have access to the Internet. By using only the Internet to publicize the technologies, NASA was not reaching a large segment of the population; the CD-ROM can now cover that group.
    Keywords: Computer Programming and Software
    Type: Langley Aerospace Research Summer Scholars; Part 2; 625-628; NASA-CR-202464
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 89
    Publication Date: 2016-06-07
    Description: Two of the most popular current implementations of the Message-Passing Interface (MPI) standard were contrasted: MPICH, by Argonne National Laboratory, and LAM, by the Ohio Supercomputer Center at Ohio State University. A parallel skyline matrix solver was adapted to run in a heterogeneous environment using MPI. The Message-Passing Interface Forum, held in May 1994, led to a specification of library functions that implement the message-passing model of parallel communication. LAM, which creates its own environment, is more robust in a highly heterogeneous network; MPICH uses the environment native to the machine architecture. While neither of these freeware implementations provides the performance of native message-passing or of vendors' implementations, MPICH begins to approach that performance on the SP-2. The machines used in this study were an IBM RS6000, three Sun4s, an SGI, and the IBM SP-2. Each machine is unique, and a few machines required specific modifications during installation. When installed correctly, both implementations worked well with only minor problems.
    Keywords: Computer Programming and Software
    Type: Langley Aerospace Research Summer Scholars; Part 2; 589-598; NASA-CR-202464
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 90
    Publication Date: 2016-06-07
    Description: The objective of the Reusable Software System (RSS) is to provide NASA Langley Research Center and its contractor personnel with reusable software technology through the Internet. The RSS is easily accessible, provides information that can be extracted, and offers the capability to submit information or data for the purpose of scientific research within the Atmospheric Science Division at NASA Langley Research Center.
    Keywords: Computer Programming and Software
    Type: Langley Aerospace Research Summer Scholars; Part 2; 545-553; NASA-CR-202464
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 91
    Publication Date: 2016-06-07
    Description: The objective was to produce a multimedia CD-ROM for the Technology Applications Group presenting the Technology Opportunity Showcase (TOPS) exhibits and Small Business Innovative Research (SBIR) projects to interested companies. The CD-ROM format is being used and developed especially for those companies that do not have Internet access and cannot visit Langley directly through the World Wide Web. The CD-ROM will include text, pictures, sound, and movies. The information for the CD-ROM will be stored in a database from which users can query and browse the information, and future CDs can be maintained and updated.
    Keywords: Computer Programming and Software
    Type: Langley Aerospace Research Summer Scholars; Part 2; 467-471; NASA-CR-202464
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 92
    Publication Date: 2016-06-07
    Description: During the summer experience in the LARSS program, the author created a performance support system demonstrating the techniques of creating text in QuarkXPress and placing the text into Adobe Illustrator along with scanned images, signatures, and artwork partially created in Adobe Photoshop. The purpose of the project was to familiarize the Office of Education staff with graphic arts and the computer skills used to typeset and design certificates, brochures, cover pages, manuals, etc.
    Keywords: Computer Programming and Software
    Type: Technical Reports: Langley Aerospace Research Summer Scholars; Part 1; 137-144; NASA-CR-202463
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 93
    Publication Date: 2013-08-29
    Description: Bayesian inference has been used successfully for many problems where the aim is to infer the parameters of a model of interest. In this paper we formulate the three-dimensional reconstruction problem as the problem of inferring the parameters of a surface model from image data, and show how Bayesian methods can be used to estimate the parameters of this model given the image data. Thus we recover the three-dimensional description of the scene. This approach also gives great flexibility: we can specify the geometrical properties of the model to suit our purpose, and can also use different models for how the surface reflects the light incident upon it. In common with other Bayesian inference problems, the estimation methodology requires that we can simulate the data that would have been recorded for any values of the model parameters. In this application, this means that if we have image data we must be able to render the surface model. However, it also means that we can infer the parameters of a model whose resolution can be chosen irrespective of the resolution of the images, and may be super-resolved. We present results of the inference of surface models from simulated aerial photographs for the case of super-resolution, where many surface elements project into a single pixel in the low-resolution images.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
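    The inference loop the abstract describes (render the surface model for candidate parameters, then estimate the parameters that explain the observed image) can be illustrated with a 1-D toy under an assumed Gaussian noise model and Gaussian smoothness prior; the paper's surface model and renderer are far richer. The MAP surface estimate then reduces to a regularized least-squares solve, and the recovered model has more elements than the data has pixels, i.e. it is super-resolved.

```python
import numpy as np

# Toy 1-D analogue: a fine surface model (M elements) is observed through
# low-resolution pixels, each averaging `factor` adjacent elements.
M, factor = 32, 4
rng = np.random.default_rng(1)
true = np.sin(np.linspace(0, 2 * np.pi, M))           # "true" surface heights

# forward model ("renderer"): fine surface -> coarse pixels
A = np.kron(np.eye(M // factor), np.full((1, factor), 1.0 / factor))
data = A @ true + 0.01 * rng.standard_normal(M // factor)   # noisy low-res image

# Gaussian smoothness prior on the surface, expressed as a first-difference
# penalty; the MAP estimate is then a regularized least-squares solution.
D = np.diff(np.eye(M), axis=0)
lam = 0.1
map_est = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ data)
```

Here a 32-element surface is inferred from 8 pixels: the prior supplies the missing information, which is exactly the sense in which the model can be "super-resolved" relative to the images.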
  • 94
    Publication Date: 2013-08-29
    Description: This chapter presents the science of "COllective INtelligence" (COIN). A COIN is a large multi-agent system where: (i) the agents each run reinforcement learning (RL) algorithms; (ii) there is little to no centralized communication or control; and (iii) there is a provided world utility function that rates the possible histories of the full system. The conventional approach to designing large distributed systems to optimize a world utility does not use agents running RL algorithms. Rather, that approach begins with explicit modeling of the overall system's dynamics, followed by detailed hand-tuning of the interactions between the components to ensure that they "cooperate" as far as the world utility is concerned. This approach is labor-intensive, often results in highly non-robust systems, and usually yields design techniques that have limited applicability. In contrast, with COINs we wish to solve the system design problems implicitly, via the 'adaptive' character of the RL algorithms of each of the agents. This COIN approach introduces an entirely new, profound design problem: assuming the RL algorithms are able to achieve high rewards, what reward functions for the individual agents will, when pursued by those agents, result in high world utility? In other words, what reward functions will best ensure that we do not have phenomena like the tragedy of the commons or Braess's paradox? Although still very young, the science of COINs has already produced successes in artificial domains, in particular in packet routing, the leader-follower problem, and variants of Arthur's "El Farol" bar problem. It is expected that as it matures, COIN science will not only greatly expand the range of tasks addressable by human engineers, but will also provide much insight into already established scientific fields such as economics, game theory, and population biology.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
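    The reward-design question at the heart of the chapter can be made concrete with a toy bar problem in the spirit of the "El Farol" variants mentioned in the abstract (the utility function and numbers below are hypothetical): when the bar is overcrowded, rewarding an attendee directly with the world utility G still pays a positive reward, while the *difference reward* (G minus the utility the world would have had with that agent removed) correctly signals that the agent's attendance hurt the system.

```python
import math

# World utility for k attendees at a bar of capacity C: k * exp(-k / C)
# rises until k = C and then decays as the bar becomes overcrowded.
def G(k, C=6.0):
    return k * math.exp(-k / C)

k = 10                                  # crowded: well past the optimum C = 6
global_reward = G(k)                    # > 0: "attending looks good" -- misleading
difference_reward = G(k) - G(k - 1)     # < 0: this agent's attendance lowered G
```

An RL agent trained on the difference reward is pushed toward actions with positive marginal contribution to G, which is one concrete answer to the question of which private reward functions lead to high world utility.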
  • 95
    Publication Date: 2013-08-29
    Description: As the demand for higher-performance computers for the processing of remote sensing science algorithms increases, the need to investigate new computing paradigms is justified. Field Programmable Gate Arrays enable the implementation of algorithms at the hardware gate level, leading to orders-of-magnitude performance increases over microprocessor-based systems. The automatic classification of spaceborne multispectral images is an example of a computation-intensive application that can benefit from implementation on an FPGA-based custom computing machine (adaptive or reconfigurable computer). A probabilistic neural network is used here to classify pixels of a multispectral LANDSAT-2 image. The implementation described utilizes Java client/server application programs to access the adaptive computer from a remote site. Results verify that a remote hardware version of the algorithm (implemented on an adaptive computer) is significantly faster than a local software version of the same algorithm implemented on a typical general-purpose computer.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
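    A probabilistic neural network of the kind applied above is essentially a Parzen-window density classifier. This minimal sketch (toy two-band data rather than the LANDSAT imagery, and plain Python rather than the Java/FPGA implementation) assigns each pixel the class whose Gaussian kernel density estimate over the training pixels is highest.

```python
import numpy as np

def pnn_classify(train_x, train_y, x, sigma=0.5):
    """Parzen-window PNN: each class's conditional density is a sum of
    Gaussian kernels centered on its training pixels; a query pixel is
    assigned the class with the highest estimated density."""
    classes = np.unique(train_y)
    scores = []
    for c in classes:
        pts = train_x[train_y == c]
        # squared distances from every training pixel of class c to every query
        d2 = np.sum((pts[:, None, :] - x[None, :, :]) ** 2, axis=2)
        scores.append(np.mean(np.exp(-d2 / (2 * sigma ** 2)), axis=0))
    return classes[np.argmax(np.stack(scores), axis=0)]

# two spectral "classes" in a toy 2-band feature space
train_x = np.array([[0.1, 0.2], [0.2, 0.1], [0.9, 0.8], [0.8, 0.9]])
train_y = np.array([0, 0, 1, 1])
pixels = np.array([[0.15, 0.15], [0.85, 0.85]])
labels = pnn_classify(train_x, train_y, pixels)   # → [0, 1]
```

The per-kernel exponentials are independent, which is the structural reason the algorithm parallelizes so well onto FPGA gate-level hardware.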
  • 96
    Publication Date: 2013-08-29
    Description: This column will be provided each quarter as a source for reliability and radiation results, NASA capabilities, and other information on programmable logic devices and related applications. This quarter the focus is on experimental data on low-dropout regulators to support mixed 5 V and 3.3 V systems. A discussion of the Small Explorer WIRE spacecraft will also be given. Lastly, we take a first look at robust state machines in a hardware description language (VHDL) and their use in critical systems. If you have information that you would like to submit, or an area you would like discussed or researched, please give me a call or e-mail.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 97
    Publication Date: 2013-08-29
    Description: An SRAM (static random access memory)-based reprogrammable FPGA (field programmable gate array) is investigated for space applications. A new commercial prototype, named the RS family, was used as an example for the investigation. The device is fabricated in a 0.25-micrometer CMOS technology. Its architecture is reviewed to provide a better understanding of the impact of single event upset (SEU) on the device during operation. The SEU response of the different memories available on the device is evaluated. Heavy ion test data and SPICE simulations are used together to extract the threshold LET (linear energy transfer). Together with the saturation cross-section measured from the layout, a rate prediction is made for each memory type. SEU in the configuration SRAM is identified as the dominant failure mode and is discussed in detail. The single event transient error in combinational logic is also investigated and simulated with SPICE. SEU mitigation by hardening the memories and employing EDAC (error detection and correction) at the device level is presented. For the configuration SRAM (CSRAM) cell, the trade-off between resistor decoupling and redundancy hardening techniques is investigated, with interesting results. Preliminary heavy ion test data show no sign of SEL (single event latch-up). With regard to ionizing radiation effects, the measured increase in static leakage current (static I(sub CC)) indicates a device tolerance of approximately 50 krad(Si).
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 98
    Publication Date: 2013-08-29
    Description: Architecture and process, combined, significantly affect the hardness of programmable technologies. The effects of high energy ions, ferroelectric memory architectures, and shallow trench isolation are investigated. A detailed single event latchup (SEL) study has been performed.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 99
    Publication Date: 2013-08-29
    Description: Dragonfly: NASA and the Crisis Aboard MIR (New York: HarperCollins Publishers) tells the story of the Russian-American misadventures on MIR. An exposé with almost embarrassing detail about the inner workings of Johnson Space Center in Houston, this book is best read with the JSC organization chart in hand. Here is the real world of engineering and life in extreme environments; it makes most other accounts of "requirements analysis" appear glib and simplistic. The book vividly portrays the sometimes harrowing experiences of the American astronauts in the web of Russian interpersonal relations and, literally, in the web of MIR's wiring. Burrough's exposition reveals how handling bureaucratic procedures and bulky facilities is as much a matter of moxie and goodwill as technical capability. Lessons from MIR showed NASA that getting to Mars required a different view of knowledge and improvisation: long-duration missions are not at all like the scripted and pre-engineered flights of Apollo or the Space Shuttle.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 100
    Publication Date: 2013-08-29
    Description: In a recent paper, two operational algorithms to derive ice concentration from satellite multichannel passive microwave sensors were compared. Although the results of these, known as the NASA Team algorithm and the Bootstrap algorithm, have been validated and are generally in good agreement, there are areas where the ice concentrations differ by up to 30%. These differences can be explained by shortcomings in one or the other algorithm. Here we present an algorithm which, in addition to the 19 and 37 GHz channels used by both the Bootstrap and NASA Team algorithms, makes use of the 85 GHz channels as well. Atmospheric effects, particularly at 85 GHz, are reduced by using a forward atmospheric radiative transfer model. Comparisons with the NASA Team and Bootstrap algorithms show that their individual shortcomings are not apparent in this new approach. The results further show better quantitative agreement with ice concentrations derived from NOAA AVHRR infrared data.
    Keywords: Computer Programming and Software
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
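    The basic physics behind all such concentration algorithms can be illustrated with a toy linear-mixing model (this is not the NASA Team, Bootstrap, or 85 GHz algorithm, and the tie-point values below are hypothetical): the brightness temperature of a partly ice-covered pixel is a concentration-weighted mix of the ice and open-water signatures, which can be inverted for the concentration.

```python
import numpy as np

def ice_concentration(tb, tb_water, tb_ice):
    """Invert TB = C * TB_ice + (1 - C) * TB_water for the concentration C,
    clipped to the physical range 0..1 (0..100% ice)."""
    c = (tb - tb_water) / (tb_ice - tb_water)
    return float(np.clip(c, 0.0, 1.0))

# hypothetical single-channel tie points (kelvin) for open water and ice
TB_WATER, TB_ICE = 180.0, 250.0
conc = ice_concentration(229.0, TB_WATER, TB_ICE)   # → 0.70
```

Operational algorithms work with ratios of several channels, and here with an 85 GHz radiative transfer correction, precisely because single-channel tie points drift with weather, ice type, and surface conditions.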