ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

Filter
  • Collection: Other Sources  (9,246)
  • Source: NASA Technical Reports  (9,246)
  • Keywords: EARTH RESOURCES AND REMOTE SENSING  (6,077); CYBERNETICS  (3,169)
  • Years: 1955-1959; 1980-1984  (3,574); 1985-1989  (2,571); 1990-1994  (3,101)
  • 1
    Publication Date: 2004-12-03
    Description: The development of an integrated approach to the modeling of forest dynamics encompassing submodels of forest growth and succession, soil processes and radiation interactions, is reported. Remote sensing technology is a key element of this study in that it provides data for developing, initializing, updating, and validating the models. The objectives are reviewed, the data collected and models in use are discussed, and a framework for studying interactions between the forest growth, soil process and energy interaction components, is described. Remote sensing technology used in the study includes optical and microwave field, aircraft and satellite borne instruments. The types of data collected during intensive field and aircraft campaigns included bidirectional reflectance, thermal emittance and multifrequency, multipolarization synthetic aperture radar backscatter. Synthetic imagery of derived products such as forest biomass and NDVI (Normalized Difference Vegetative Index), and collections of ground data are being assembled in a georeferenced data base. These data are used to drive or test multidiscipline simulations of forested ecosystems. Enhancements to the modeling environment permit considerable flexibility in configuring simulations and selecting results for reporting and graphical display.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 1005-1012
    Format: text
  • 2
    Publication Date: 2004-12-03
    Description: The quantitative interpretation of satellite observations requires the use of mathematical tools to extract the desired information on terrestrial environments from the radiation data collected in space. A whole range of approaches can be pursued, from the development of models capable of explaining the nature of the physical signal being measured and of characterizing the state of the system under observation, to empirical correlations between the variables of interest and the space measurements. The premises and implications of these approaches are outlined, paying special attention to the mathematical and numerical requirements. The role and specific applications of empirical bidirectional reflectance models are also discussed, even though these models do not contribute to the understanding of the theory of radiation transfer or to the assessment of the variables of interest. The advantages and drawbacks of these various approaches and the research priorities for the next few years are discussed in the context of the planned availability of new sensors.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 993-1004
    Format: text
  • 3
    Publication Date: 2004-12-03
    Description: Management of crop residues, the portion of a crop left in the field after harvest, is an important conservation practice for minimizing soil erosion and for improving water quality. Quantification of crop residue cover is required to evaluate the effectiveness of conservation tillage practices. Rapid, accurate, and objective methods are needed to quantify residue cover. The fluorescence of crop residue was found to be a broadband phenomenon with emission maxima at 420 to 495 nm for excitations of 350 to 420 nm. Soils had low intensity broadband emissions over the 400 to 690 nm region for excitations of 300 to 600 nm. The range of relative fluorescence intensities for the crop residues was much greater than that observed for the soils. As the crop residues decompose, their blue fluorescence values approach that of the soil. Fluorescence techniques are concluded to be less ambiguous and better suited than reflectance methods for discriminating crop residues from soils. If properly implemented, fluorescence techniques can be used to quantify not only crop residue cover but also photosynthetic efficiency in the field.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 855-862
    Format: text
  • 4
    Publication Date: 2004-12-03
    Description: The importance of the measurement of wind fields is discussed. Wind regime data can be used to infer the amount and type of wind-induced (aeolian) transport of sand and dust, or to establish global circulation models, for example on other planets. Since local measurements are costly and often impossible, it is desirable to infer such data from remotely sensed information. A potential mechanism for remotely inferring the wind regime, using synthetic aperture radar data to describe the roughness of the surface, is described. A project to estimate the practicality of using such a mechanism is described, and an experiment that extends the mechanism to vegetated sites, where the goal is to measure the potential for erosion, is reported.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 451-456
    Format: text
  • 5
    Publication Date: 2004-12-03
    Description: Surface reflectance is required to quantitatively investigate molecular absorption and particle scattering properties of materials on the Earth's surface. Atmospheric aerosol optical depth, surface pressure and water vapor are required to constrain a radiative transfer code for the inversion of measured spectral radiance to apparent surface reflectance. A suite of algorithms using nonlinear least squares fitting techniques is described that directly estimates these atmospheric parameters from spectral radiance measured by the Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). The derived atmospheric parameters are used to constrain a radiative transfer code for the inversion of the imaging spectrometer radiance to apparent reflectance. The derived apparent reflectance is validated against in situ measurements of the same target.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 193-200
    Format: text
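    A minimal sketch of the kind of nonlinear least squares retrieval described in this record, assuming a hypothetical forward model radiance_model() standing in for the radiative transfer code; it illustrates the fitting step only and is not the AVIRIS algorithm itself.

        import numpy as np
        from scipy.optimize import least_squares

        def radiance_model(params, wavelength):
            # Hypothetical stand-in for a radiative transfer code: maps aerosol
            # optical depth (aod) and column water vapor (cwv) to at-sensor radiance.
            aod, cwv = params
            return np.exp(-aod * (0.55 / wavelength)) * np.exp(-0.1 * cwv / wavelength)

        def fit_atmosphere(measured_radiance, wavelength):
            # Estimate (aod, cwv) by minimizing the residual between measured and
            # modeled spectral radiance across the instrument's wavelengths.
            residual = lambda p: radiance_model(p, wavelength) - measured_radiance
            fit = least_squares(residual, x0=[0.1, 1.0], bounds=([0.0, 0.0], [5.0, 10.0]))
            return fit.x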
  • 6
    Publication Date: 2004-12-03
    Description: Investigations designed to study land surface hydrologic-atmospheric interactions, showing the potential of L band passive microwave radiometry for measuring surface soil moisture over large areas, are discussed. Satisfying the data needs of these investigations requires the ability to map large areas rapidly. With aircraft systems this means a need for more beam positions over a wider swath on each flightline. For satellite systems the essential problem is resolution. Both of these needs are currently being addressed through the development and verification of Electronically Scanned Thinned Array Radiometer (ESTAR) technology. The ESTAR L band radiometer was evaluated for soil moisture mapping applications in two studies. The first was conducted over the semiarid rangeland Walnut Gulch watershed located in south eastern Arizona (U.S.). The second was performed in the subhumid Little Washita watershed in south west Oklahoma (U.S.). Both tests showed that the ESTAR is capable of providing soil moisture with the same level of accuracy as existing systems.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 467-474
    Format: text
  • 7
    Publication Date: 2004-12-03
    Description: A weather resistant automatic scanning Sun photometer system is assessed and demonstrated as practical for measurements of aerosol concentrations and properties at remote sites. Interfaced with a transmitter using the Geostationary Data Collection System (GDCS), the data are processed in near real time. The processing allows a time dependence of the aerosols and water vapor and an ongoing assessment of the health and calibration of the instruments. The system's automatic data acquisition, transmission, and processing offer immediate application to atmospheric monitoring and modeling on a regional to global scale and validation of satellite retrievals. It is estimated that under normal circumstances the retrieved aerosol optical thickness has a network wide accuracy of +/- 0.02 from 340 nm to 1020 nm, water vapor +/- 0.2 cm and size distribution from 0.1 to 3 micrometers.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 75-83
    Format: text
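    The aerosol optical thickness retrieval described in this record can be illustrated with the standard Beer-Lambert (Langley) relation commonly used with Sun photometers; the sketch below is a generic form, not the network's own processing code, and the calibration signal and Rayleigh term are assumed inputs.

        import numpy as np

        def aerosol_optical_thickness(v, v0, airmass, tau_rayleigh):
            # Total optical thickness from the Beer-Lambert law, tau = ln(V0/V) / m,
            # with the molecular (Rayleigh) contribution removed to leave the aerosol part.
            # v            -- measured detector signal
            # v0           -- extraterrestrial calibration signal (e.g. from Langley plots)
            # airmass      -- optical air mass along the Sun path
            # tau_rayleigh -- Rayleigh optical thickness at the same wavelength
            tau_total = np.log(v0 / v) / airmass
            return tau_total - tau_rayleigh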
  • 8
    Publication Date: 2004-12-03
    Description: As part of a global program to validate the ocean surface sensors on board ERS-1, a joint experiment on the Grand Banks of Newfoundland was carried out in Nov. 1991. The principal objective was to provide a field validation of ERS-1 Synthetic Aperture Radar (SAR) measurement of ocean surface structure. The NASA-P3 aircraft measurements made during this experiment provide independent measurements of the ocean surface along the validation swath. The Radar Ocean Wave Spectrometer (ROWS) is a radar sensor designed to measure direction of the long wave components using spectral analysis of the tilt induced radar backscatter modulation. This technique greatly differs from SAR and thus, provides a unique set of measurements for use in evaluating SAR performance. Also, an altimeter channel in the ROWS gives simultaneous information on the surface wave height and radar mean square slope parameter. The sets of geophysical parameters (wind speed, significant wave height, directional spectrum) are used to study the SAR's ability to accurately measure ocean gravity waves. The known distortion imposed on the true directional spectrum by the SAR imaging mechanism is discussed in light of the direct comparisons between ERS-1 SAR, airborne Canadian Center for Remote Sensing (CCRS) SAR, and ROWS spectra and the use of the nonlinear ocean SAR transform.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: ESA, Proceedings of 2nd ERS-1 Symposium on Space at the Service of Our Environment, Volume 2; p 1161-1164
    Format: text
  • 9
    Publication Date: 2004-12-03
    Description: Terrain slopes, which can be measured with Synthetic Aperture Radar (SAR) interferometry either from a height map or from the interferometric phase gradient, were used to calculate the local incidence angle and the correct pixel area. Both are required for correct thematic interpretation of SAR data. The interferometric correlation depends on the pixel area projected on a plane perpendicular to the look vector and requires correction for slope effects. Methods for normalization of the backscatter and interferometric correlation for ERS-1 SAR are presented.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: ESA, Proceedings of 2nd ERS-1 Symposium on Space at the Service of Our Environment, Volume 2; p 723-726
    Format: text
  • 10
    Publication Date: 2004-12-03
    Description: The relationship between the gradient of the interferometric phase and the terrain slope, which, it is thought, would allow a derivation of the terrain slopes without phase unwrapping, is presented. A linear relationship between the interferometric phase gradient and the terrain slopes was found. A quantitative error analysis showed that only very small errors are introduced by these approximations for orbital Synthetic Aperture Radar (SAR) geometries. An example of a slope map for repeat pass interferometry from ERS-1 SAR data is given. A number of direct and indirect applications of the terrain slope are indicated: erosion and avalanche hazard studies, radiometric calibration of SAR data, and normalization of the interferometric correlation coefficient.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: ESA, Proceedings of 2nd ERS-1 Symposium on Space at the Service of Our Environment, Volume 2; p 711-715
    Format: text
  • 11
    In:  CASI
    Publication Date: 2004-12-03
    Description: Research on the use of active microwaves in remote sensing, presented during plenary and poster sessions, is summarized. The main highlights are: calibration techniques are well understood; innovative modeling approaches have been developed which increase active microwave applications (segmentation prior to model inversion, use of ERS-1 scatterometer, simulations); polarization angle and frequency diversity improves characterization of ice sheets, vegetation, and determination of soil moisture (X band sensor study); SAR (Synthetic Aperture Radar) interferometry potential is emerging; use of multiple sensors/extended spectral signatures is important (increase emphasis).
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 1219-1221
    Format: text
  • 12
    Publication Date: 2004-12-03
    Description: Crop residues, the portion of the crop left in the field after harvest, can be an important management factor in controlling soil erosion. Methods to quantify residue cover are needed that are rapid, accurate, and objective. Scenes with known amounts of crop residue were illuminated with longwave ultraviolet (UV) radiation, and fluorescence images were recorded with an intensified video camera fitted with a 453 to 488 nm band pass filter. A light colored soil and a dark colored soil were used as backgrounds for the weathered soybean stems. Residue cover was determined by counting the proportion of the pixels in the image with fluorescence values greater than a threshold. Soil pixels had the lowest gray levels in the images. The values of the soybean residue pixels spanned nearly the full range of the 8-bit video data. Classification accuracies were typically within 3 (absolute units) of measured cover values. Video imaging can provide an intuitive understanding of the fraction of the soil covered by residue.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 923-928
    Format: text
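    The residue cover estimate described in this record reduces to counting pixels whose fluorescence exceeds a threshold; a minimal sketch, with an illustrative threshold rather than the value used in the study:

        import numpy as np

        def residue_cover_fraction(fluorescence_image, threshold=64):
            # Pixels brighter than the threshold are classified as crop residue;
            # cover is their share of all pixels in the scene.
            return (np.asarray(fluorescence_image) > threshold).mean()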
  • 13
    Publication Date: 2004-12-03
    Description: Global studies of land surface properties use AVHRR channels 1 and 2, but channel 3 may also be of interest, although its use requires preprocessing. Channel 3 consists of both a reflective part and an emissive part; the former can be derived from T3, T4 and T5. Since water vapor affects channel 3, its content is retrieved from channels 4 and 5 using the split window technique. A formula for retrieving the reflective part at 3.75 micrometers is tested for sunglint observations, where the emissivities of channels 4 and 5 can be set to unity. The formula is adapted to and validated over land surfaces using the FIFE-87 data set. Preliminary applications of the reflectance at 3.75 micrometers to surface property retrieval, aerosol retrieval over land, and desert aerosol retrieval are addressed.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 817-824
    Format: text
  • 14
    Publication Date: 2004-12-03
    Description: The polarization of the sunlight scattered by atmospheric aerosols or cloud droplets and reflected from ground surfaces or plant canopies may convey much information when used for remote sensing purposes. The typical polarization features of aerosols, cloud droplets, and plant canopies, as observed by ground based and airborne sensors, are investigated, looking especially for those invariant properties amenable to description by simple models when possible. The question of polarization measurements from space is addressed. The interest of such measurements for remote sensing purposes is investigated, and their feasibility is tested by using results obtained during field campaigns of the airborne POLDER instrument, a radiometer designed to measure the directionality and polarization of the sunlight scattered by the ground atmosphere system.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 569-580
    Format: text
  • 15
    Publication Date: 2004-12-03
    Description: Papers focused on land surface, atmospheric, and ocean properties are reported. Specific comments pertaining to polarization, models and inversion, and measurements are given. The recommendations are: (1) continued research into the application potential of the BRDF (Bidirectional Reflectance Distribution Function) and polarization properties of ground surface and atmospheric targets; (2) three-dimensional models, which account for the statistical behavior of remotely sensed data, should be extended and inverted to support analysis of data potentially covering rolling terrain, where pixels represent heterogeneous mixtures of surface cover types and project ground footprints with sizes between 10 to 6 km, the ground pixel sizes of planned future sensors; (3) available reflectance models should be further validated by means of multi-dimensional (directional, spectral, temporal) field data, and existing models should be intercompared in more depth to evaluate their performance and limitations; (4) existing methods for model inversion should be validated in more depth to quantify their practical limitations and the expected accuracy of the retrieved parameters, and new approaches should be developed based on a priori knowledge of plant canopy development and spectral BRDF properties; (5) a protocol should be established for validating and intercomparing the indices and compositing techniques proposed in recent years.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 1225-1227
    Format: text
  • 16
    Publication Date: 2004-12-03
    Description: Traditionally, the remote sensing community has relied totally on spectral knowledge to extract vegetation characteristics. However, there are other knowledge bases (KB's) that can be used to significantly improve the accuracy and robustness of inference techniques. Using AI (artificial intelligence) techniques a KB system (VEG) was developed that integrates input spectral measurements with diverse KB's. These KB's consist of data sets of directional reflectance measurements, knowledge from literature, and knowledge from experts which are combined into an intelligent and efficient system for making vegetation inferences. VEG accepts spectral data of an unknown target as input, determines the best techniques for inferring the desired vegetation characteristic(s), applies the techniques to the target data, and provides a rigorous estimate of the accuracy of the inference. VEG was developed to: infer spectral hemispherical reflectance from any combination of nadir and/or off-nadir view angles; infer percent ground cover from any combination of nadir and/or off-nadir view angles; infer unknown view angle(s) from known view angle(s) (known as view angle extension); and discriminate between user defined vegetation classes using spectral and directional reflectance relationships developed from an automated learning algorithm. The errors for these techniques were generally very good ranging between 2 to 15% (proportional root mean square). The system is designed to aid scientists in developing, testing, and applying new inference techniques using directional reflectance data.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 581-592
    Format: text
  • 17
    Publication Date: 2004-12-03
    Description: Most earth surfaces, particularly those supporting natural vegetation ecosystems, constitute structurally and spectrally complex surfaces that are distinctly non-Lambertian reflectors. Obtaining meaningful measurements of the directional radiances of landscapes and obtaining estimates of the complete bidirectional reflectance distribution functions of ground targets with complex and variable landscape and radiometric features are challenging tasks. Reasons for the increased interest in directional radiance measurements are presented, and the issues that must be addressed when trying to acquire directional radiances for vegetated land surfaces from different types of remote sensing platforms are discussed. Priority research emphases are suggested, concerning field measurements of directional surface radiances and reflectances for future research. Primarily, emphasis must be given to the acquisition of more complete and directly associated radiometric and biometric parameter data sets that will empower the exploitation of the 'angular dimension' in remote sensing of vegetation through enabling the further development and rigorous validation of state of the art plant canopy models.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 561-567
    Format: text
  • 18
    Publication Date: 2004-12-03
    Description: Synthetic aperture radar (SAR) images of the Greenland ice sheet collected by an airborne system clearly reveal the four melting facies of this sheet defined 30 years ago from snow stratigraphy studies by glaciologists. In particular, the radar echoes from the percolation facies have radiometric and polarimetric characteristics that are unique among terrestrial surfaces, but that resemble the exotic radar echoes recorded from the icy Galilean satellites. There, the radar signals interact with subsurface, massive ice features created in the cold, dry snow by seasonal melting and refreezing events. The subsurface features act as efficient reflectors of the incident radiation most likely via internal reflections. In the soaked-snow facies, the radar reflectivity is much lower because radar signals are attenuated by the wetter snow before they can interact with subsurface structures. Inversion algorithms to derive geophysical information from the SAR data are developed in both cases to estimate snow wetness in the soaked-snow facies and the mass of ice water retained in the percolation facies.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 431-436
    Format: text
  • 19
    Publication Date: 2004-12-03
    Description: An operational stratospheric correction scheme used after the Mount Pinatubo (Philippines) eruption (Jun. 1991) is presented. The stratospheric aerosol distribution is assumed to vary only with latitude. Every 9 days the latitudinal distribution of the optical thickness is computed by inverting radiances observed in NOAA AVHRR channel 1 (0.63 micrometers) and channel 2 (0.83 micrometers) over the Pacific Ocean. This radiance data set is used to check the validity of the model used for inversion, by checking the consistency of the optical thickness deduced from each channel as well as from different scattering angles. Using the optical thickness profile previously computed and a radiative transfer code assuming a Lambertian boundary condition, each pixel of channels 1 and 2 is corrected prior to computation of the NDVI (Normalized Difference Vegetation Index). Comparison of corrected and uncorrected NDVI composites with composites from the years prior to the Pinatubo eruption (1989 to 1990) shows the necessity and the accuracy of the operational correction scheme.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 151-158
    Format: text
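    For reference, the NDVI computed from the corrected channels is the usual normalized difference of the near infrared and red reflectances (AVHRR channel 2 and channel 1); a minimal sketch:

        import numpy as np

        def ndvi(red_ch1, nir_ch2):
            # Normalized Difference Vegetation Index from atmospherically corrected
            # AVHRR channel 1 (red, ~0.63 um) and channel 2 (near infrared, ~0.83 um).
            return (nir_ch2 - red_ch1) / (nir_ch2 + red_ch1)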
  • 20
    Publication Date: 2004-12-03
    Description: Aspects of aerosol studies and remote sensing are reviewed. Aerosol scatters solar radiation before it reaches the surface, and scatters and absorbs it again after it is reflected from the surface and before it reaches the satellite sensor. The effect is spectrally and spatially dependent; atmospheric aerosol (dust, smoke and air pollution particles) therefore has a significant effect on remote sensing. Correction for the aerosol effect has never been achieved on an operational basis, though several case studies have been demonstrated. Correction can be done in a direct way, by deriving the aerosol loading from the image itself and correcting for it using an appropriate radiative transfer model, or in an indirect way, by defining remote sensing functions that are less dependent on the aerosol loading. To some degree the latter was already achieved in global remote sensing of vegetation, where a composite of several days of NDVI (Normalized Difference Vegetation Index) measurements, choosing the maximal value, was used instead of a single cloud screened value. The Atmospherically Resistant Vegetation Index (ARVI), introduced recently for the NASA Earth Observing System EOS-MODIS, is the most appropriate example of indirect correction: the index is defined in such a way that the atmospheric effect in the blue spectral channel cancels to a large degree the atmospheric effect in the red channel in computations of a vegetation index. Atmospheric corrections can also use aerosol climatology and ground based instrumentation.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 7-19
    Format: text
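    A sketch of the ARVI construction mentioned above, in its commonly published form: the red reflectance is replaced by a red-blue combination so that the aerosol effect seen in the blue channel partially cancels the effect in the red channel. The weighting factor gamma (often taken as 1) is shown as a parameter here, not a value prescribed by this record.

        import numpy as np

        def arvi(blue, red, nir, gamma=1.0):
            # Atmospherically Resistant Vegetation Index: a self-correcting
            # vegetation index that uses the blue channel to compensate for
            # the aerosol effect in the red channel.
            rb = red - gamma * (blue - red)
            return (nir - rb) / (nir + rb)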
  • 21
    Publication Date: 2004-12-03
    Description: A session dedicated to high spectral resolution in the solar spectrum, covering topics of calibration, atmospheric correction, geology/pedology, inland water, and vegetation, is reported. The session showed a high degree of diversity in the topics and the approaches used. It was highlighted that high spectral resolution data could provide atmospherically corrected ground level calibrated reflectance values. Important advances were shown in the use of radiative transfer models applied either on water bodies or vegetation. Several studies highlighted the high degree of redundancy contained in high spectral resolution data.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 1217-1218
    Format: text
  • 22
    Publication Date: 2011-08-24
    Description: Spectral absorption-coefficients (cross-sections) kappa(sub nu) (/cm/atm) have been measured in the 7.62, 8.97, and 12.3 micrometer bands of HCFC-22 (CHClF2) and the 10.6 micrometer bands of SF6 employing a high-resolution Fourier-transform spectrometer. Temperature and total pressure have been varied to simulate conditions corresponding to tropospheric and stratospheric layers in the atmosphere. The kappa(sub nu) are compared with values measured by us previously using a tunable diode laser spectrometer and with the appropriate entries in HITRAN and GEISA, two of the databases known to the atmospheric scientist. The measured absolute intensities of the bands are compared with previously published values.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: Journal of Quantitative Spectroscopy & Radiative Transfer (ISSN 0022-4073); 52; 3-4; p. 323-332
    Format: text
  • 23
    Publication Date: 2011-08-24
    Description: Microwave radar and radiometer measurements of grasslands indicate a substantial reduction in sensor sensitivity to soil moisture in the presence of a thatch layer. When this layer is wet it masks changes in the underlying soil, making the canopy appear warm in the case of passive sensors (radiometer) and decreasing backscatter in the active case (scatterometer). A model for a grass canopy with thatch will be presented in this paper to explain this behavior and to compare with observations. The canopy model consists of three layers: grass, thatch, and the underlying soil. The grass blades are modeled by elongated elliptical discs and the thatch is modeled as a collection of disk shaped water droplets (i.e., the dry matter is neglected). The ground is homogeneous and flat. The distorted Born approximation is used to compute the radar cross section of this three layer canopy and the emissivity is computed from the radar cross section using the Peake formulation for the passive problem. Results are computed at L-band (1.4 GHz) and C-band (4.75 GHz) using canopy parameters (i.e., plant geometry, soil moisture, plant moisture, etc.) representative of Konza Prairie grasslands. The results are compared to C-band scatterometer measurements and L-band radiometer measurements at these grasslands.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: IEEE Transactions on Geoscience and Remote Sensing (ISSN 0196-2892); 32; 1; p. 177-186
    Format: text
  • 24
    Publication Date: 2011-08-24
    Description: There have been many significant improvements in the public access to the Space Shuttle Earth Observations Photography Database. New information is provided for the user community on the recently released videodisc of this database. Topics covered included the following: earlier attempts; our first laser videodisc in 1992; the new laser videodisc in 1994; and electronic database access.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: Geocarto (ISSN 1010-6049); 9; 2; p. 65-66
    Format: text
  • 25
    Publication Date: 2011-08-24
    Description: The Challenge Awards are designed to provide a unique perspective to students gifted in the arts and humanities from which to understand scientific endeavor by giving students an opportunity to participate in an ongoing research project. In the graduate program, seven students who had participated in previous Challenge Awards programs were selected to help develop the tools for Earth observations for the astronauts on the Space Radar Laboratory (SRL) missions. The goal of the Challenge Awards program was to prepare a training manual for the astronauts on the SRL missions. This paper describes the observations to be made by the astronauts on the SRL missions. The emphasis is on the dynamic seasonal features of the Earth's surface and atmosphere which justify the need for more than one flight of the SRL. Complete notebooks of the sites, global seasonal patterns, examples of radar and the Measurement of Air Pollution from Satellites data, and shuttle photographs have been given to each of the SRL crews.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: Geocarto (ISSN 1010-6049); 9; 1; p. 61-80
    Format: text
  • 26
    Publication Date: 2011-08-24
    Description: MacSigma0 is an interactive tool for the Macintosh which allows you to display and make computations from radar data collected by the following sensors: the JPL AIRSAR, ERS-1, JERS-1, and Magellan. The JPL AIRSAR system is a multi-polarimetric airborne synthetic aperture radar developed and operated by the Jet Propulsion Laboratory. It includes the single-frequency L-band sensor mounted on the NASA CV990 aircraft and its replacement, the multi-frequency P-, L-, and C-band sensors mounted on the NASA DC-8. MacSigma0 works with data in the standard JPL AIRSAR output product format, the compressed Stokes matrix format. ERS-1 and JERS-1 are single-frequency, single-polarization spaceborne synthetic aperture radars launched by the European Space Agency and NASDA respectively. To be usable by MacSigma0, The data must have been processed at the Alaska SAR Facility and must be in the "low-resolution" format. Magellan is a spacecraft mission to map the surface of Venus with imaging radar. The project is managed by the Jet Propulsion Laboratory. The spacecraft carries a single-frequency, single-polarization synthetic aperture radar. MacSigma0 works with framelets of the standard MIDR CD-ROM data products. MacSigma0 provides four basic functions: synthesis of images (if necessary), statistical analysis of selected areas, analysis of corner reflectors as a calibration measure (if appropriate and possible), and informative mouse tracking. For instance, the JPL AIRSAR data can be used to synthesize a variety of images such as a total power image. The total power image displays the sum of the polarized and unpolarized components of the backscatter for each pixel. Other images which can be synthesized are HH, HV, VV, RL, RR, HHVV*, HHHV*, HVVV*, HHVV* phase and correlation coefficient images. For the complex and phase images, phase is displayed using color and magnitude is displayed using intensity. MacSigma0 can also be used to compute statistics from within a selected area. The statistics computed depend on the image type. For JPL AIRSAR data, the HH, HV, VV, HHVV* phase, and correlation coefficient means and standard deviation measures are calculated. The mean, relative standard deviation, minimum, and maximum values are calculated for all other data types. A histogram of the selected area is also calculated and displayed. The selected area can be rectangular, linear, or polygonal in shape. The user is allowed to select multiple rectangular areas, but not multiple linear or polygonal areas. The statistics and histogram are displayed to the user and can either be printed or saved as a text file. MacSigma0 can also be used to analyze corner reflectors as a measure of the calibration for JPL AIRSAR, ERS-1, and JERS-1 data types. It computes a theoretical radar cross section and the actual radar cross section for a selected trihedral corner reflector. The theoretical cross section, measured cross section, their ratio in dBs, and other information are displayed to the user and can be saved into a text file. For ERS-1, JERS-1, and Magellan data, MacSigma0 simultaneously displays pixel location in data coordinates and in latitude, longitude coordinates. It also displays sigma0, the incidence angle (for Magellan data), the original pixel value (for Magellan data), and the noise power value (for ERS-1 and JERS-1 data). 
Grey scale computed images can be saved in a byte format (a headerless format which saves the image as a string of byte values) or a PICT format (a standard format readable by other image processing programs for the Macintosh). Images can also be printed. MacSigma0 is written in C-language for use on Macintosh series computers. The minimum configuration requirements for MacSigma0 are System 6.0, Finder 6.1, 1Mb of RAM, and at least a 4-bit color or grey-scale graphics display. MacSigma0 is also System 7 compatible. To compile the source code, Apple's Macintosh Programmers Workbench (MPW) 3.2 and the MPW C language compiler version 3.2 are required. The source code will not compile with a later version of the compiler; however, the compiled application which will run under the minimum hardware configuration is provided on the distribution medium. In addition, the distribution media includes an executable which runs significantly faster but requires a 68881 compatible math coprocessor and a 68020 compatible CPU. Since JPL AIRSAR data files can be very large, it is often desirable to reduce the size of a data file before transferring it to the Macintosh for use in MacSigma0. A small FORTRAN program which can be used for this purpose is included on the distribution media. MacSigma0 will print statistics on any output device which supports QuickDraw, and it will print images on any device which supports QuickDraw or PostScript. The standard distribution medium for MacSigma0 is a set of five 1.4Mb Macintosh format diskettes. This program was developed in 1992 and is a copyrighted work with all copyright vested in NASA. Version 4.2 of MacSigma0 was released in 1993.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: NPO-19060
    Format: text
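    A rough illustration of the total power image MacSigma0 can synthesize, assuming calibrated complex HH, HV and VV channels; the span formula below is the usual polarimetric convention and is not taken from the MacSigma0 source code, and the region statistics are a simplified analogue of the area statistics it reports.

        import numpy as np

        def total_power_image(hh, hv, vv):
            # Sum of backscattered power over the polarization channels (the "span"
            # of the scattering matrix), assuming reciprocity so that HV = VH.
            return np.abs(hh) ** 2 + 2.0 * np.abs(hv) ** 2 + np.abs(vv) ** 2

        def area_statistics(image, mask):
            # Mean and standard deviation over a selected region, similar in spirit
            # to the statistics MacSigma0 reports for a rectangular or polygonal area.
            values = image[mask]
            return values.mean(), values.std()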
  • 27
    In:  Other Sources
    Publication Date: 2011-08-24
    Description: The integration of CLIPS into HyperCard combines the intuitive, interactive user interface of the Macintosh with the powerful symbolic computation of an expert system interpreter. HyperCard is an excellent environment for quickly developing the front end of an application with buttons, dialogs, and pictures, while the CLIPS interpreter provides a powerful inference engine for complex problem solving and analysis. In order to understand the benefit of integrating HyperCard and CLIPS, consider the following: HyperCard is an information storage and retrieval system which exploits the use of the graphics and user interface capabilities of the Apple Macintosh computer. The user can easily define buttons, dialog boxes, information templates, pictures, and graphic displays through the use of the HyperCard tools and scripting language. What is generally lacking in this environment is a powerful reasoning engine for complex problem solving, and this is where CLIPS plays a role. CLIPS 5.0 (C Language Integrated Production System, v5.0) was developed at the Johnson Space Center Software Technology Branch to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 5.0 supports forward chaining rule systems, object-oriented language, and procedural programming for the construction of expert systems. It features incremental reset, seven conflict resolution stategies, truth maintenance, and user-defined external functions. Since CLIPS is implemented in the C language it is highly portable; in addition, it is embeddable as a callable routine from a program written in another language such as Ada or Fortran. By integrating HyperCard and CLIPS the advantages and uses of both packages are made available for a wide range of applications: rapid prototyping of knowledge-based expert systems, interactive simulations of physical systems and intelligent control of hypertext processes, to name a few. HyperCLIPS 2.0 is written in C-Language (54%) and Pascal (46%) for Apple Macintosh computers running Macintosh System 6.0.2 or greater. HyperCLIPS requires HyperCard 1.2 or higher and at least 2Mb of RAM are recommended to run. An executable is provided. To compile the source code, the Macintosh Programmer's Workshop (MPW) version 3.0, CLIPS 5.0 (MSC-21927), and the MPW C-Language compiler are also required. NOTE: Installing this program under Macintosh System 7 requires HyperCard v2.1. This program is distributed on a 3.5 inch Macintosh format diskette. A copy of the program documentation is included on the diskette, but may be purchased separately. HyperCLIPS was developed in 1990 and version 2.0 was released in 1991. HyperCLIPS is a copyrighted work with all copyright vested in NASA. Apple, Macintosh, MPW, and HyperCard are registered trademarks of Apple Computer, Inc.
    Keywords: CYBERNETICS
    Type: NPO-18087
    Format: text
  • 28
    In:  Other Sources
    Publication Date: 2011-08-24
    Description: VICAR (Video Image Communication and Retrieval) is a general purpose image processing software system that has been under continuous development since the late 1960's. Originally intended for data from the NASA Jet Propulsion Laboratory's unmanned planetary spacecraft, VICAR is now used for a variety of other applications including biomedical image processing, cartography, earth resources, and geological exploration. The development of this newest version of VICAR emphasized a standardized, easily-understood user interface, a shield between the user and the host operating system, and a comprehensive array of image processing capabilities. Structurally, VICAR can be divided into roughly two parts; a suite of applications programs and an executive which serves as the interfaces between the applications, the operating system, and the user. There are several hundred applications programs ranging in function from interactive image editing, data compression/decompression, and map projection, to blemish, noise, and artifact removal, mosaic generation, and pattern recognition and location. An information management system designed specifically for handling image related data can merge image data with other types of data files. The user accesses these programs through the VICAR executive, which consists of a supervisor and a run-time library. From the viewpoint of the user and the applications programs, the executive is an environment that is independent of the operating system. VICAR does not replace the host computer's operating system; instead, it overlays the host resources. The core of the executive is the VICAR Supervisor, which is based on NASA Goddard Space Flight Center's Transportable Applications Executive (TAE). Various modifications and extensions have been made to optimize TAE for image processing applications, resulting in a user friendly environment. The rest of the executive consists of the VICAR Run-Time Library, which provides a set of subroutines (image I/O, label I/O, parameter I/O, etc.) to facilitate image processing and provide the fastest I/O possible while maintaining a wide variety of capabilities. The run-time library also includes the Virtual Raster Display Interface (VRDI) which allows display oriented applications programs to be written for a variety of display devices using a set of common routines. (A display device can be any frame-buffer type device which is attached to the host computer and has memory planes for the display and manipulation of images. A display device may have any number of separate 8-bit image memory planes (IMPs), a graphics overlay plane, pseudo-color capabilities, hardware zoom and pan, and other features). The VRDI supports the following display devices: VICOM (Gould/Deanza) IP8500, RAMTEK RM-9465, ADAGE (Ikonas) IK3000 and the International Imaging Systems IVAS. VRDI's purpose is to provide a uniform operating environment not only for an application programmer, but for the user as well. The programmer is able to write programs without being concerned with the specifics of the device for which the application is intended. The VICAR Interactive Display Subsystem (VIDS) is a collection of utilities for easy interactive display and manipulation of images on a display device. VIDS has characteristics of both the executive and an application program, and offers a wide menu of image manipulation options. VIDS uses the VRDI to communicate with display devices. 
The first step in using VIDS to analyze and enhance an image (one simple example of VICAR's numerous capabilities) is to examine the histogram of the image. The histogram is a plot of frequency of occurrence for each pixel value (0 - 255) loaded in the image plane. If, for example, the histogram shows that there are no pixel values below 64 or above 192, the histogram can be "stretched" so that the value of 64 is mapped to zero and 192 is mapped to 255. Now the user can use the full dynamic range of the display device to display the data and better see its contents. Another example of a VIDS procedure is the JMOVIE command, which allows the user to run animations interactively on the display device. JMOVIE uses the concept of "frames", which are the individual frames which comprise the animation to be viewed. The user loads images into the frames after the size and number of frames has been selected. VICAR's source languages are primarily FORTRAN and C, with some VAX Assembler and array processor code. The VICAR run-time library is designed to work equally easily from either FORTRAN or C. The program was implemented on a DEC VAX series computer operating under VMS 4.7. The virtual memory required is 1.5MB. Approximately 180,000 blocks of storage are needed for the saveset. VICAR (version 2.3A/3G/13H) is a copyrighted work with all copyright vested in NASA and is available by license for a period of ten (10) years to approved licensees. This program was developed in 1989.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: NPO-18076
    Format: text
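    The VIDS contrast stretch described in this record (mapping DN 64 to 0 and DN 192 to 255) is a plain linear stretch with clipping; a minimal sketch, not VICAR code:

        import numpy as np

        def linear_stretch(image, low=64, high=192):
            # Map the DN range [low, high] onto the full 8-bit display range [0, 255];
            # values below `low` become black and values above `high` become white.
            scaled = (image.astype(float) - low) * 255.0 / (high - low)
            return np.clip(scaled, 0, 255).astype(np.uint8)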
  • 29
    In:  Other Sources
    Publication Date: 2011-08-24
    Description: The Interactive Image Display Program (IMDISP) is an interactive image display utility for the IBM Personal Computer (PC, XT and AT) and compatibles. Until recently, efforts to utilize small computer systems for display and analysis of scientific data have been hampered by the lack of sufficient data storage capacity to accomodate large image arrays. Most planetary images, for example, require nearly a megabyte of storage. The recent development of the "CDROM" (Compact Disk Read-Only Memory) storage technology makes possible the storage of up to 680 megabytes of data on a single 4.72-inch disk. IMDISP was developed for use with the CDROM storage system which is currently being evaluated by the Planetary Data System. The latest disks to be produced by the Planetary Data System are a set of three disks containing all of the images of Uranus acquired by the Voyager spacecraft. The images are in both compressed and uncompressed format. IMDISP can read the uncompressed images directly, but special software is provided to decompress the compressed images, which can not be processed directly. IMDISP can also display images stored on floppy or hard disks. A digital image is a picture converted to numerical form so that it can be stored and used in a computer. The image is divided into a matrix of small regions called picture elements, or pixels. The rows and columns of pixels are called "lines" and "samples", respectively. Each pixel has a numerical value, or DN (data number) value, quantifying the darkness or brightness of the image at that spot. In total, each pixel has an address (line number, sample number) and a DN value, which is all that the computer needs for processing. DISPLAY commands allow the IMDISP user to display all or part of an image at various positions on the display screen. The user may also zoom in and out from a point on the image defined by the cursor, and may pan around the image. To enable more or all of the original image to be displayed on the screen at once, the image can be "subsampled." For example, if the image were subsampled by a factor of 2, every other pixel from every other line would be displayed, starting from the upper left corner of the image. Any positive integer may be used for subsampling. The user may produce a histogram of an image file, which is a graph showing the number of pixels per DN value, or per range of DN values, for the entire image. IMDISP can also plot the DN value versus pixels along a line between two points on the image. The user can "stretch" or increase the contrast of an image by specifying low and high DN values; all pixels with values lower than the specified "low" will then become black, and all pixels higher than the specified "high" value will become white. Pixels between the low and high values will be evenly shaded between black and white. IMDISP is written in a modular form to make it easy to change it to work with different display devices or on other computers. The code can also be adapted for use in other application programs. There are device dependent image display modules, general image display subroutines, image I/O routines, and image label and command line parsing routines. The IMDISP system is written in C-language (94%) and Assembler (6%). It was implemented on an IBM PC with the MS DOS 3.21 operating system. IMDISP has a memory requirement of about 142k bytes. IMDISP was developed in 1989 and is a copyrighted work with all copyright vested in NASA. 
Additional planetary images can be obtained from the National Space Science Data Center at (301) 286-6695.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: NPO-17977
    Format: text
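    Two of the IMDISP display operations described in this record, sketched with hypothetical helper names: integer subsampling (keep every n-th pixel of every n-th line) and a histogram of DN values.

        import numpy as np

        def subsample(image, factor=2):
            # Keep every `factor`-th line and sample, starting from the upper left
            # corner, so more of the original image fits on the display at once.
            return image[::factor, ::factor]

        def dn_histogram(image):
            # Number of pixels at each DN value (0-255) for an 8-bit image.
            counts, _ = np.histogram(image, bins=256, range=(0, 256))
            return counts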
  • 30
    Publication Date: 2011-08-24
    Description: PC-SEAPAK is a user-interactive satellite data analysis software package specifically developed for oceanographic research. The program is used to process and interpret data obtained from the Nimbus-7/Coastal Zone Color Scanner (CZCS), and the NOAA Advanced Very High Resolution Radiometer (AVHRR). PC-SEAPAK is a set of independent microcomputer-based image analysis programs that provide the user with a flexible, user-friendly, standardized interface, and facilitates relatively low-cost analysis of oceanographic satellite data. Version 4.0 includes 114 programs. PC-SEAPAK programs are organized into categories which include CZCS and AVHRR level-1 ingest, level-2 analyses, statistical analyses, data extraction, remapping to standard projections, graphics manipulation, image board memory manipulation, hardcopy output support and general utilities. Most programs allow user interaction through menu and command modes and also by the use of a mouse. Most programs also provide for ASCII file generation for further analysis in spreadsheets, graphics packages, etc. The CZCS scanning radiometer aboard the NIMBUS-7 satellite was designed to measure the concentration of photosynthetic pigments and their degradation products in the ocean. AVHRR data is used to compute sea surface temperatures and is supported for the NOAA 6, 7, 8, 9, 10, 11, and 12 satellites. The CZCS operated from November 1978 to June 1986. CZCS data may be obtained free of charge from the CZCS archive at NASA/Goddard Space Flight Center. AVHRR data may be purchased through NOAA's Satellite Data Service Division. Ordering information is included in the PC-SEAPAK documentation. Although PC-SEAPAK was developed on a COMPAQ Deskpro 386/20, it can be run on most 386-compatible computers with an AT bus, EGA controller, Intel 80387 coprocessor, and MS-DOS 3.3 or higher. A Matrox MVP-AT image board with appropriate monitor and cables is also required. Note that the authors have received some reports of incompatibilities between the MVP-AT image board and ZENITH computers. Also, the MVP-AT image board is not necessarily compatible with 486-based systems; users of 486-based systems should consult with Matrox about compatibility concerns. Other PC-SEAPAK requirements include a Microsoft mouse (serial version), 2Mb RAM, and 100Mb hard disk space. For data ingest and backup, 9-track tape, 8mm tape and optical disks are supported and recommended. PC-SEAPAK has been under development since 1988. Version 4.0 was updated in 1992, and is distributed without source code. It is available only as a set of 36 1.2Mb 5.25 inch IBM MS-DOS format diskettes. PC-SEAPAK is a copyrighted product with all copyright vested in the National Aeronautics and Space Administration. Phar Lap's DOS_Extender run-time version is integrated into several of the programs; therefore, the PC-SEAPAK programs may not be duplicated. Three of the distribution diskettes contain DOS_Extender files. One of the distribution diskettes contains Media Cybernetics' HALO88 font files, also licensed by NASA for dissemination but not duplication. IBM is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. HALO88 is a registered trademark of Media Cybernetics, but the product was discontinued in 1991.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: GSC-13320
    Format: text
  • 31
    In:  Other Sources
    Publication Date: 2011-08-24
    Description: The Land Analysis System (LAS) is an image analysis system designed to manipulate and analyze digital data in raster format and provide the user with a wide spectrum of functions and statistical tools for analysis. LAS offers these features under VMS with optional image display capabilities for IVAS and other display devices as well as the X-Windows environment. LAS provides a flexible framework for algorithm development as well as for the processing and analysis of image data. Users may choose between mouse-driven commands or the traditional command line input mode. LAS functions include supervised and unsupervised image classification, film product generation, geometric registration, image repair, radiometric correction and image statistical analysis. Data files accepted by LAS include formats such as Multi-Spectral Scanner (MSS), Thematic Mapper (TM) and Advanced Very High Resolution Radiometer (AVHRR). The enhanced geometric registration package now includes both image to image and map to map transformations. The over 200 LAS functions fall into image processing scenario categories which include: arithmetic and logical functions, data transformations, fourier transforms, geometric registration, hard copy output, image restoration, intensity transformation, multispectral and statistical analysis, file transfer, tape profiling and file management among others. Internal improvements to the LAS code have eliminated the VAX VMS dependencies and improved overall system performance. The maximum LAS image size has been increased to 20,000 lines by 20,000 samples with a maximum of 256 bands per image. The catalog management system used in earlier versions of LAS has been replaced by a more streamlined and maintenance-free method of file management. This system is not dependent on VAX/VMS and relies on file naming conventions alone to allow the use of identical LAS file names on different operating systems. While the LAS code has been improved, the original capabilities of the system have been preserved. These include maintaining associated image history, session logging, and batch, asynchronous and interactive mode of operation. The LAS application programs are integrated under version 4.1 of an interface called the Transportable Applications Executive (TAE). TAE 4.1 has four modes of user interaction: menu, direct command, tutor (or help), and dynamic tutor. In addition TAE 4.1 allows the operation of LAS functions using mouse-driven commands under the TAE-Facelift environment provided with TAE 4.1. These modes of operation allow users, from the beginner to the expert, to exercise specific application options. LAS is written in C-language and FORTRAN 77 for use with DEC VAX computers running VMS with approximately 16Mb of physical memory. This program runs under TAE 4.1. Since TAE 4.1 is not a current version of TAE, TAE 4.1 is included within the LAS distribution. Approximately 130,000 blocks (65Mb) of disk storage space are necessary to store the source code and files generated by the installation procedure for LAS and 44,000 blocks (22Mb) of disk storage space are necessary for TAE 4.1 installation. The only other dependencies for LAS are the subroutine libraries for the specific display device(s) that will be used with LAS/DMS (e.g. X-Windows and/or IVAS). The standard distribution medium for LAS is a set of two 9~track 6250 BPI magnetic tapes in DEC VAX BACKUP format. It is also available on a set of two TK50 tape cartridges in DEC VAX BACKUP format. 
This program was developed in 1986 and last updated in 1992.
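    As an illustration of the kind of intensity transformation listed above, the sketch below applies a simple linear contrast stretch to one 8-bit band. It is a generic example only, not LAS source code; the buffer layout and the in_min/in_max parameters are assumptions.

        #include <stdint.h>

        /* Minimal linear contrast stretch, a typical intensity transformation:
         * pixels in [in_min, in_max] are remapped to the full 0..255 range and
         * values outside that range are clipped.  Illustrative sketch only. */
        void contrast_stretch(uint8_t *band, long npixels,
                              uint8_t in_min, uint8_t in_max)
        {
            if (in_max <= in_min) return;              /* nothing to stretch */
            double scale = 255.0 / (double)(in_max - in_min);
            for (long i = 0; i < npixels; i++) {
                if (band[i] <= in_min)      band[i] = 0;
                else if (band[i] >= in_max) band[i] = 255;
                else band[i] = (uint8_t)((band[i] - in_min) * scale + 0.5);
            }
        }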
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: GSC-13075
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 32
    Publication Date: 2011-08-24
    Description: The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer-based artificial intelligence tools. CLIPS is a forward-chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches them against this rule network; matching takes into account both the values and the number of fields in a fact. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft-compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, the CLIPS Intelligent Tutoring System, for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on an IBM PC computer operating under DOS, a Macintosh, and DEC VAX series computers operating under VMS or ULTRIX. 
The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.
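    To make the forward-chaining idea concrete, the toy sketch below repeatedly matches single-condition rules against a working memory of facts and asserts new facts until nothing more fires. It is a conceptual illustration in plain C, not the Rete network CLIPS actually builds, and every name in it is hypothetical.

        #include <stdio.h>
        #include <string.h>

        /* Conceptual forward-chaining sketch (NOT the Rete algorithm CLIPS uses):
         * each rule fires when its condition fact is present in working memory
         * and then asserts a new fact; the loop repeats until no rule can fire. */
        #define MAX_FACTS 32

        static const char *facts[MAX_FACTS];
        static int nfacts = 0;

        static int known(const char *f) {
            for (int i = 0; i < nfacts; i++)
                if (strcmp(facts[i], f) == 0) return 1;
            return 0;
        }

        static int assert_fact(const char *f) {      /* returns 1 if the fact is new */
            if (known(f) || nfacts >= MAX_FACTS) return 0;
            facts[nfacts++] = f;
            return 1;
        }

        struct rule { const char *if_fact, *then_fact; };

        int main(void) {
            struct rule rules[] = {
                { "duck",   "quacks" },
                { "quacks", "makes-noise" },
            };
            assert_fact("duck");                      /* initial fact */

            int fired;
            do {                                      /* forward chain to a fixpoint */
                fired = 0;
                for (size_t r = 0; r < sizeof rules / sizeof rules[0]; r++)
                    if (known(rules[r].if_fact))
                        fired |= assert_fact(rules[r].then_fact);
            } while (fired);

            for (int i = 0; i < nfacts; i++) printf("fact: %s\n", facts[i]);
            return 0;
        }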
    Keywords: CYBERNETICS
    Type: COS-10025
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 33
    Publication Date: 2011-08-24
    Description: VASP is a variable dimension Fortran version of the Automatic Synthesis Program, ASP. The program is used to implement Kalman filtering and control theory. Basically, it consists of 31 subprograms for solving most modern control problems in linear, time-variant (or time-invariant) control systems. These subprograms include operations of matrix algebra, computation of the exponential of a matrix and its convolution integral, and the solution of the matrix Riccati equation. The user calls these subprograms by means of a FORTRAN main program, and so can easily obtain solutions to most general problems of extremization of a quadratic functional of the state of the linear dynamical system. Particularly, these problems include the synthesis of the Kalman filter gains and the optimal feedback gains for minimization of a quadratic performance index. VASP, as an outgrowth of the Automatic Synthesis Program, has the following improvements: more versatile programming language; more convenient input/output format; some new subprograms which consolidate certain groups of statements that are often repeated; and variable dimensioning. The pertinent difference between the two programs is that VASP has variable dimensioning and more efficient storage. The documentation for the VASP program contains a VASP dictionary and example problems. The dictionary contains a description of each subroutine and instructions on its use. The example problems include dynamic response, optimal control gain, solution of the sampled data matrix Riccati equation, matrix decomposition, and a pseudo-inverse of a matrix. This program is written in FORTRAN IV and has been implemented on the IBM 360. The VASP program was developed in 1971.
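    VASP itself is FORTRAN, but the scalar sketch below (kept in C for consistency with the other examples in this listing) shows the kind of computation the package automates: iterating the discrete-time Riccati recursion for a one-state system until the Kalman gain converges. The system constants are made-up assumptions.

        #include <stdio.h>
        #include <math.h>

        /* Scalar discrete-time Riccati recursion for the one-state system
         *   x[k+1] = a*x[k] + w,   z[k] = h*x[k] + v,
         * with process noise variance q and measurement noise variance r.
         * Iterates the covariance P and the Kalman gain K to steady state.
         * Illustrative only; VASP solves the full matrix form of this problem. */
        int main(void) {
            double a = 0.95, h = 1.0, q = 0.01, r = 0.1;   /* assumed example values */
            double P = 1.0, K = 0.0;

            for (int k = 0; k < 1000; k++) {
                double P_pred = a * P * a + q;             /* time update          */
                K = P_pred * h / (h * P_pred * h + r);     /* Kalman gain          */
                double P_new = (1.0 - K * h) * P_pred;     /* measurement update   */
                if (fabs(P_new - P) < 1e-12) { P = P_new; break; }
                P = P_new;
            }
            printf("steady-state gain K = %f, covariance P = %f\n", K, P);
            return 0;
        }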
    Keywords: CYBERNETICS
    Type: ARC-10616
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 34
    Publication Date: 2011-08-24
    Description: Spaceborne Synthetic Aperture Radar (SAR) images are useful for planetary mapping and Earth sciences investigations. However, swath widths rarely exceed 100 kilometers, and images must be patched together to create a mosaic in order to analyze larger areas. The primary function of this program is to generate large digital mosaics of SAR imagery without manually marked tiepoints. MOSK can produce multiframe mosaics by combining images along-track, across adjacent cross-track swaths, or from ascending and descending passes. Geocoded, map-registered images, such as the ones produced by MAPJTC (NPO-17718), are required as input. The output is a geocoded mosaic on a standard map grid which permits easy registration with other geocoded data sets. Mosaicking of geocoded SAR imagery involves three steps. First, a match point is selected at the center of the overlapping area; an image patch around the match point is extracted from both images and cross-correlated to refine the match point. Second, the images with their refined match points are merged together to form a mosaic. To handle the large data volume of overlapping intermediate stages, large mosaics are divided into equal-size quadrants with each quadrant cut from an intermediate mosaic. The full mosaic can then be assembled from the individual quadrants. Finally, radiometric disparities at the image seams are smoothed by a "feathering" technique. The automatic mosaic system generates output with minimal operator interaction. However, manual tiepointing is required in cases of a large registration error or two images with smooth surfaces such as ocean images. MOSK is implemented on a DEC VAX 11/785 running VMS 4.5. Most subroutines are in FORTRAN, but three are in MAXL and one is in APAL. The program requires 1 Mb of memory and a Floating Point Systems AP-5210 array processor. The system memory usage is approximately 1000 pages and the required page file size is 2000 blocks. MOSK was developed in 1988.
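    The "feathering" step can be pictured as a weighted blend across the overlap, with weights ramping linearly from one image to the other along each scan line. The sketch below shows that idea; it is an illustration of the general technique, not MOSK source code, and the buffer layout is an assumption.

        #include <stdint.h>

        /* Feathering sketch for one scan line: across an overlap of `width`
         * pixels, blend image A into image B with a weight that ramps linearly
         * from 1 to 0, smoothing radiometric differences at the seam. */
        void feather_line(const uint8_t *a, const uint8_t *b, uint8_t *out, int width)
        {
            if (width <= 0) return;
            if (width == 1) { out[0] = (uint8_t)(((int)a[0] + b[0]) / 2); return; }
            for (int x = 0; x < width; x++) {
                double wa = 1.0 - (double)x / (double)(width - 1);  /* weight for A */
                out[x] = (uint8_t)(wa * a[x] + (1.0 - wa) * b[x] + 0.5);
            }
        }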
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: NPO-17586
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 35
    Publication Date: 2011-08-24
    Description: MAPJTC was designed to rectify and transform the standard image output of the digital Synthetic Aperture Radar (SAR) correlator into a geocoded map registered image without operator interaction or manual tiepointing. This is accomplished by modeling the distortions and predicting the pixel displacements based on platform and radar parameters. The map projection implemented in MAPJTC is the Universal Transverse Mercator. Since the re-sampling operation is independent of the transformation data generation, other cartographic projections can be implemented with few software modifications. MAPJTC makes a precise determination of the geodetic location of an arbitrary pixel within the image frame based on the simultaneous solution of a set of earth model equations, SAR Doppler equations, and SAR range equations that identify the slant range from the sensor to the target at a specific image pixel. Based on a table of geodetic coordinates of the image pixels, the image is then mapped onto the desired cartographic projection by applying the appropriate transformation equations. Typically, mapping involves a two-dimensional re-sampling and is very computationally intensive. MAPJTC reduces the procedure to two one-dimensional passes, which saves computer time. Geocoding transforms the rectified image into a grid defined by a specific map projection. (The image is rotated and rectified to match the map projection.) Again, the two dimensional re-sampling process can be separated into two one-dimensional re-sampling processes. Optionally, MAPJTC can correct terrain-induced distortions in SAR imagery when a digital elevation map is available. MAPJTC was developed on a DEC VAX 11/785 under VMS 4.5. The program is written in FORTRAN (84%), APAL (2%), and MAXL (14%). It requires 6Mb of memory and a Floating Point Systems AP-5210 Array Processor equipped with 1Mb of memory. MAPJTC can run interactively or as a batch job. MAPJTC was developed in 1987.
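    The reduction of the two-dimensional re-sampling to two one-dimensional passes can be sketched as follows: resample every row, then every column of the intermediate result. The linear interpolation and the xmap/ymap displacement callbacks below are placeholders; MAPJTC derives the actual displacements from the SAR range/Doppler geometry.

        #include <stdlib.h>

        /* Two-pass separable resampling sketch: rows first, then columns.
         * xmap(row, col) / ymap(row, col) stand in for the geometric
         * transformation; here they are hypothetical callbacks.
         * Linear interpolation only; error handling kept minimal. */
        typedef double (*map_fn)(int out_row, int out_col);

        static double interp(const double *line, int n, double pos)
        {
            if (pos <= 0.0) return line[0];
            if (pos >= n - 1) return line[n - 1];
            int i = (int)pos;
            double f = pos - i;
            return (1.0 - f) * line[i] + f * line[i + 1];
        }

        void resample(const double *in, double *out, int rows, int cols,
                      map_fn xmap, map_fn ymap)
        {
            double *tmp    = malloc((size_t)rows * cols * sizeof *tmp);
            double *colbuf = malloc((size_t)rows * sizeof *colbuf);
            if (!tmp || !colbuf) { free(tmp); free(colbuf); return; }

            for (int r = 0; r < rows; r++)            /* pass 1: resample along rows */
                for (int c = 0; c < cols; c++)
                    tmp[r * cols + c] = interp(in + (size_t)r * cols, cols, xmap(r, c));

            for (int c = 0; c < cols; c++) {          /* pass 2: resample along columns */
                for (int r = 0; r < rows; r++) colbuf[r] = tmp[r * cols + c];
                for (int r = 0; r < rows; r++)
                    out[r * cols + c] = interp(colbuf, rows, ymap(r, c));
            }
            free(colbuf);
            free(tmp);
        }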
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: NPO-17418
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 36
    Publication Date: 2011-08-24
    Description: The Spectral Analysis Manager (SPAM) was developed to allow easy qualitative analysis of multi-dimensional imaging spectrometer data. Imaging spectrometers provide sufficient spectral sampling to define unique spectral signatures on a per-pixel basis. Thus direct material identification becomes possible for geologic studies. SPAM provides a variety of capabilities for carrying out interactive analysis of the massive and complex datasets associated with multispectral remote sensing observations. In addition to normal image processing functions, SPAM provides multiple levels of on-line help, a flexible command interpretation, graceful error recovery, and a program structure which can be implemented in a variety of environments. SPAM was designed to be visually oriented and user-friendly with the liberal employment of graphics for rapid and efficient exploratory analysis of imaging spectrometry data. SPAM provides functions to enable arithmetic manipulations of the data, such as normalization, linear mixing, band ratio discrimination, and low-pass filtering. SPAM can be used to examine the spectra of an individual pixel or the average spectra over a number of pixels. SPAM also supports image segmentation, fast spectral signature matching, spectral library usage, mixture analysis, and feature extraction. High-speed spectral signature matching is performed by using a binary spectral encoding algorithm to separate and identify mineral components present in the scene. The same binary encoding allows automatic spectral clustering. Spectral data may be entered from a digitizing tablet, stored in a user library, compared to the master library containing mineral standards, and then displayed as a time-sequence spectral movie. The output plots, histograms, and stretched histograms produced by SPAM can be sent to a lineprinter, stored as separate RGB disk files, or sent to a Quick Color Recorder. SPAM is written in C for interactive execution and is available for two different machine environments. There is a DEC VAX/VMS version with a central memory requirement of approximately 242K of 8-bit bytes and a machine-independent UNIX 4.2 version. The display device currently supported is the Raster Technologies display processor. Other 512 x 512 resolution color display devices, such as De Anza, may be added with minor code modifications. This program was developed in 1986.
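    Binary spectral encoding for fast signature matching can be thought of as thresholding each band against the spectrum's own mean and comparing the resulting bit codes by Hamming distance, as in the sketch below. This illustrates the general technique only; SPAM's exact encoding, band count, and data types are not documented here and are assumptions.

        #include <stdint.h>

        /* Binary spectral encoding sketch: set bit b when band b of the spectrum
         * exceeds the spectrum's mean, then compare two spectra by the Hamming
         * distance of their codes (smaller distance = better match). */
        #define NBANDS 64                  /* assumed number of spectral bands */

        uint64_t encode(const double *spectrum)
        {
            double mean = 0.0;
            for (int b = 0; b < NBANDS; b++) mean += spectrum[b];
            mean /= NBANDS;

            uint64_t code = 0;
            for (int b = 0; b < NBANDS; b++)
                if (spectrum[b] > mean) code |= (uint64_t)1 << b;
            return code;
        }

        int hamming(uint64_t a, uint64_t b)
        {
            uint64_t x = a ^ b;
            int d = 0;
            while (x) { d += (int)(x & 1); x >>= 1; }
            return d;
        }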
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: NPO-17182
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 37
    Publication Date: 2011-08-24
    Description: The Spectral Analysis Manager (SPAM) was developed to allow easy qualitative analysis of multi-dimensional imaging spectrometer data. Imaging spectrometers provide sufficient spectral sampling to define unique spectral signatures on a per-pixel basis. Thus direct material identification becomes possible for geologic studies. SPAM provides a variety of capabilities for carrying out interactive analysis of the massive and complex datasets associated with multispectral remote sensing observations. In addition to normal image processing functions, SPAM provides multiple levels of on-line help, a flexible command interpretation, graceful error recovery, and a program structure which can be implemented in a variety of environments. SPAM was designed to be visually oriented and user-friendly with the liberal employment of graphics for rapid and efficient exploratory analysis of imaging spectrometry data. SPAM provides functions to enable arithmetic manipulations of the data, such as normalization, linear mixing, band ratio discrimination, and low-pass filtering. SPAM can be used to examine the spectra of an individual pixel or the average spectra over a number of pixels. SPAM also supports image segmentation, fast spectral signature matching, spectral library usage, mixture analysis, and feature extraction. High-speed spectral signature matching is performed by using a binary spectral encoding algorithm to separate and identify mineral components present in the scene. The same binary encoding allows automatic spectral clustering. Spectral data may be entered from a digitizing tablet, stored in a user library, compared to the master library containing mineral standards, and then displayed as a time-sequence spectral movie. The output plots, histograms, and stretched histograms produced by SPAM can be sent to a lineprinter, stored as separate RGB disk files, or sent to a Quick Color Recorder. SPAM is written in C for interactive execution and is available for two different machine environments. There is a DEC VAX/VMS version with a central memory requirement of approximately 242K of 8-bit bytes and a machine-independent UNIX 4.2 version. The display device currently supported is the Raster Technologies display processor. Other 512 x 512 resolution color display devices, such as De Anza, may be added with minor code modifications. This program was developed in 1986.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: NPO-17180
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 38
    Publication Date: 2011-08-24
    Description: The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input/output required, and displays the results. The STAR interpreter is written in the C language for interactive execution. It has been implemented on a VAX 11/780 computer operating under VMS, and the UNIX version has been implemented on a Sun Microsystems 2/170 workstation. STAR has a memory requirement of approximately 200K of 8 bit bytes, excluding externally compiled functions and application-dependent symbolic definitions. This program was developed in 1985.
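    The coupling of interpreted symbolic code with externally compiled routines is commonly done through a table of registered function pointers that the interpreter looks up by name. The sketch below shows that general mechanism; the names and structures are hypothetical and are not STAR's actual internal API.

        #include <stdio.h>
        #include <string.h>

        /* Hypothetical sketch of how an interpreter can expose externally
         * compiled C functions: each function is registered under a name,
         * and the interpreter dispatches calls through this table. */
        typedef double (*ext_fn)(double);

        struct binding { const char *name; ext_fn fn; };

        static struct binding table[16];
        static int nbound = 0;

        void register_fn(const char *name, ext_fn fn) {
            if (nbound >= (int)(sizeof table / sizeof table[0])) return;
            table[nbound].name = name;
            table[nbound].fn = fn;
            nbound++;
        }

        double call_fn(const char *name, double arg) {
            for (int i = 0; i < nbound; i++)
                if (strcmp(table[i].name, name) == 0)
                    return table[i].fn(arg);
            fprintf(stderr, "unknown external function: %s\n", name);
            return 0.0;
        }

        static double square(double x) { return x * x; }   /* "externally compiled" routine */

        int main(void) {
            register_fn("square", square);
            printf("square(3) = %f\n", call_fn("square", 3.0));
            return 0;
        }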
    Keywords: CYBERNETICS
    Type: NPO-16965
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 39
    Publication Date: 2011-08-24
    Description: The Simple Tool for Automated Reasoning program (STAR) is an interactive, interpreted programming language for the development and operation of artificial intelligence (AI) application systems. STAR provides an environment for integrating traditional AI symbolic processing with functions and data structures defined in compiled languages such as C, FORTRAN and PASCAL. This type of integration occurs in a number of AI applications including interpretation of numerical sensor data, construction of intelligent user interfaces to existing compiled software packages, and coupling AI techniques with numerical simulation techniques and control systems software. The STAR language was created as part of an AI project for the evaluation of imaging spectrometer data at NASA's Jet Propulsion Laboratory. Programming in STAR is similar to other symbolic processing languages such as LISP and CLIP. STAR includes seven primitive data types and associated operations for the manipulation of these structures. A semantic network is used to organize data in STAR, with capabilities for inheritance of values and generation of side effects. The AI knowledge base of STAR can be a simple repository of records or it can be a highly interdependent association of implicit and explicit components. The symbolic processing environment of STAR may be extended by linking the interpreter with functions defined in conventional compiled languages. These external routines interact with STAR through function calls in either direction, and through the exchange of references to data structures. The hybrid knowledge base may thus be accessed and processed in general by either side of the application. STAR is initially used to link externally compiled routines and data structures. It is then invoked to interpret the STAR rules and symbolic structures. In a typical interactive session, the user enters an expression to be evaluated, STAR parses the input, evaluates the expression, performs any file input/output required, and displays the results. The STAR interpreter is written in the C language for interactive execution. It has been implemented on a VAX 11/780 computer operating under VMS, and the UNIX version has been implemented on a Sun Microsystems 2/170 workstation. STAR has a memory requirement of approximately 200K of 8 bit bytes, excluding externally compiled functions and application-dependent symbolic definitions. This program was developed in 1985.
    Keywords: CYBERNETICS
    Type: NPO-16832
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 40
    Publication Date: 2011-08-24
    Description: CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern-match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three-volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in Microsoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and Microsoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
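    Since the description notes that CLIPS can be embedded within procedural code and called as a subroutine, a minimal embedding sketch in C follows. The entry points used (InitializeEnvironment, Load, Reset, Run) are the embedded API names as this editor recalls them from the CLIPS Advanced Programming Guide, and the rule file name is hypothetical; verify both against the distributed documentation.

        #include "clips.h"   /* embedded API header shipped with the CLIPS source code */

        /* Minimal embedding sketch: initialize CLIPS, load a rule file, reset
         * the fact base, and run the inference engine until the agenda is
         * empty.  Names follow the classic embedded API as documented in the
         * Advanced Programming Guide; check them against your CLIPS release. */
        int main(void)
        {
            InitializeEnvironment();    /* set up the CLIPS environment        */
            Load("rules.clp");          /* "rules.clp" is a hypothetical file  */
            Reset();                    /* assert initial facts (deffacts)     */
            Run(-1L);                   /* -1 = run until no rules can fire    */
            return 0;
        }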
    Keywords: CYBERNETICS
    Type: MSC-22434
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 41
    Publication Date: 2011-08-24
    Description: CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern-match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three-volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in Microsoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and Microsoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
    Keywords: CYBERNETICS
    Type: MSC-22433
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 42
    Publication Date: 2011-08-24
    Description: CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern-match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three-volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in Microsoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and Microsoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
    Keywords: CYBERNETICS
    Type: MSC-22430
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 43
    Publication Date: 2011-08-24
    Description: CLIPS, the C Language Integrated Production System, is a complete environment for developing expert systems -- programs which are specifically intended to model human expertise or knowledge. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. CLIPS 6.0 provides a cohesive tool for handling a wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming allows knowledge to be represented as heuristics, or "rules-of-thumb" which specify a set of actions to be performed for a given situation. Object-oriented programming allows complex systems to be modeled as modular components (which can be easily reused to model other systems or create new components). The procedural programming capabilities provided by CLIPS 6.0 allow CLIPS to represent knowledge in ways similar to those allowed in languages such as C, Pascal, Ada, and LISP. Using CLIPS 6.0, one can develop expert system software using only rule-based programming, only object-oriented programming, only procedural programming, or combinations of the three. CLIPS provides extensive features to support the rule-based programming paradigm including seven conflict resolution strategies, dynamic rule priorities, and truth maintenance. CLIPS 6.0 supports more complex nesting of conditional elements in the if portion of a rule ("and", "or", and "not" conditional elements can be placed within a "not" conditional element). In addition, there is no longer a limitation on the number of multifield slots that a deftemplate can contain. The CLIPS Object-Oriented Language (COOL) provides object-oriented programming capabilities. Features supported by COOL include classes with multiple inheritance, abstraction, encapsulation, polymorphism, dynamic binding, and message passing with message-handlers. CLIPS 6.0 supports tight integration of the rule-based programming features of CLIPS with COOL (that is, a rule can pattern-match on objects created using COOL). CLIPS 6.0 provides the capability to define functions, overloaded functions, and global variables interactively. In addition, CLIPS can be embedded within procedural code, called as a subroutine, and integrated with languages such as C, FORTRAN and Ada. CLIPS can be easily extended by a user through the use of several well-defined protocols. CLIPS provides several delivery options for programs including the ability to generate stand-alone executables or to load programs from text or binary files. CLIPS 6.0 provides support for the modular development and execution of knowledge bases with the defmodule construct. CLIPS modules allow a set of constructs to be grouped together such that explicit control can be maintained over restricting the access of the constructs by other modules. This type of control is similar to global and local scoping used in languages such as C or Ada. By restricting access to deftemplate and defclass constructs, modules can function as blackboards, permitting only certain facts and instances to be seen by other modules. Modules are also used by rules to provide execution control. The CRSV (Cross-Reference, Style, and Verification) utility included with previous versions of CLIPS is no longer supported. The capabilities provided by this tool are now available directly within CLIPS 6.0 to aid in the development, debugging, and verification of large rule bases. 
COSMIC offers four distribution versions of CLIPS 6.0: UNIX (MSC-22433), VMS (MSC-22434), MACINTOSH (MSC-22429), and IBM PC (MSC-22430). Executable files, source code, utilities, documentation, and examples are included on the program media. All distribution versions include identical source code for the command line version of CLIPS 6.0. This source code should compile on any platform with an ANSI C compiler. Each distribution version of CLIPS 6.0, except that for the Macintosh platform, includes an executable for the command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory and require an 80386 CPU or better. Users needing an 8086/8088 or 80286 executable must recompile the CLIPS source code themselves. Users who wish to recompile the DOS executable using Borland C++ or MicroSoft C must use a DOS extender program to produce an executable capable of using extended memory. The version of CLIPS 6.0 for IBM PC compatibles requires DOS v3.3 or later and/or Windows 3.1 or later. It is distributed on a set of three 1.4Mb 3.5 inch diskettes. A hard disk is required. The Macintosh version is distributed in compressed form on two 3.5 inch 1.4Mb Macintosh format diskettes, and requires System 6.0.5, or higher, and 1Mb RAM. The version for DEC VAX/VMS is available in VAX BACKUP format on a 1600 BPI 9-track magnetic tape (standard distribution medium) or a TK50 tape cartridge. The UNIX version is distributed in UNIX tar format on a .25 inch streaming magnetic tape cartridge (Sun QIC-24). For the UNIX version, alternate distribution media and formats are available upon request. 
The CLIPS 6.0 documentation includes a User's Guide and a three-volume Reference Manual consisting of Basic and Advanced Programming Guides and an Interfaces Guide. An electronic version of the documentation is provided on the distribution medium for each version: in Microsoft Word format for the Macintosh and PC versions of CLIPS, and in both PostScript format and Microsoft Word for Macintosh format for the UNIX and DEC VAX versions of CLIPS. CLIPS was developed in 1986 and Version 6.0 was released in 1993.
    Keywords: CYBERNETICS
    Type: MSC-22429
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 44
    Publication Date: 2011-08-24
    Description: NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
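    For a single sigmoid output unit, one step of the back-propagation learning that NETS implements reduces to the weight update sketched below. This is a didactic fragment, not code produced by the Generate C Code option; the input count and learning rate are assumed.

        #include <math.h>

        /* One back-propagation step for a single sigmoid output neuron:
         * forward pass, error computation, and gradient-descent weight update.
         * Didactic sketch only; NETS handles full multi-layer networks. */
        #define NIN 4                       /* assumed number of inputs */

        double train_step(double w[NIN], double *bias,
                          const double x[NIN], double target, double rate)
        {
            double net = *bias;
            for (int i = 0; i < NIN; i++) net += w[i] * x[i];
            double y = 1.0 / (1.0 + exp(-net));          /* sigmoid activation   */

            double delta = (target - y) * y * (1.0 - y); /* output-layer delta   */
            for (int i = 0; i < NIN; i++)
                w[i] += rate * delta * x[i];             /* weight update        */
            *bias += rate * delta;

            return 0.5 * (target - y) * (target - y);    /* squared error        */
        }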
    Keywords: CYBERNETICS
    Type: MSC-22108
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 45
    Publication Date: 2011-08-24
    Description: The CLIPS Intelligent Tutoring System (CLIPSITS) is designed to be used to learn CLIPS, the C-language Integrated Production System expert system shell developed by the Software Technology Branch at Johnson Space Center. The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. This version of CLIPSITS is compatible with Versions 4.2 and 4.3 of the CLIPS User's Guide. However, the program does not cover any new features of CLIPS v4.3 that were added since the release of v4.2. The chapter numbers in the CLIPS User's Guide correspond directly with the lesson numbers in CLIPSITS. Each lesson in the program contains anywhere from 1 to 10 problems. Most of these have multiple parts. The student is given a subset of these problems from each lesson to work. The actual number of problems presented depends on how well the student masters the previous problem(s). The progression through these lessons is maintained in a personalized file under the student's name. As with most computer languages, there is usually more than one way to solve a problem. CLIPSITS attempts to be as flexible as possible and to allow as many correct solutions as possible. CLIPSITS gives the student the option of setting his/her own colors for the screen interface and the option of redefining special keystroke combinations used within the program. CLIPSITS requires an IBM PC compatible with 640K RAM and an optional 2- or 3-button mouse. A 286- or 386-based machine is preferable. Performance will be somewhat slower on an XT class machine. The program must be installed on a hard disk with 825 KB of space available. The program was developed in 1989. The standard distribution medium is three 5.25" IBM PC DOS format diskettes. The program is also sold bundled with CLIPS for a special combined price as COS-10025. NOTE: Only the executable code is distributed. Supporting documentation is included on the diskettes. IBM, IBM PC, and XT are registered trademarks of International Business Machines Corporation.
    Keywords: CYBERNETICS
    Type: MSC-21679
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 46
    Publication Date: 2011-08-24
    Description: F77NNS (A FORTRAN-77 Neural Network Simulator) simulates the popular back error propagation neural network. F77NNS is an ANSI-77 FORTRAN program designed to take advantage of vectorization when run on machines having this capability, but it will run on any computer with an ANSI-77 FORTRAN compiler. Artificial neural networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to biological nerve cells. Problems which involve pattern matching or system modeling readily fit the class of problems which F77NNS is designed to solve. The program's formulation trains a neural network using Rumelhart's back-propagation algorithm. Typically the nodes of a network are grouped together into clumps called layers. A network will generally have an input layer through which the various environmental stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. The back-propagation training algorithm can require massive computational resources to implement a large network such as a network capable of learning text-to-phoneme pronunciation rules as in the famous Sejnowski experiment. The Sejnowski neural network learns to pronounce 1000 common English words. The standard input data defines the specific inputs that control the type of run to be made, and input files define the NN in terms of the layers and nodes, as well as the input/output (I/O) pairs. The program has a restart capability so that a neural network can be solved in stages suitable to the user's resources and desires. F77NNS allows the user to customize the patterns of connections between layers of a network. The size of the neural network to be solved is limited only by the amount of random access memory (RAM) available to the user. The program has a memory requirement of about 900K. The standard distribution medium for this package is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. F77NNS was developed in 1989.
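    The restart capability mentioned above amounts to writing the current weight values out at the end of one training stage and reading them back at the start of the next. A minimal C sketch of such a checkpoint follows; it is not F77NNS code (which is FORTRAN 77), and the file name weights.bin and the weight count are arbitrary assumptions.

        #include <stdio.h>
        #include <stdlib.h>

        /* Illustrative checkpoint/restart of a weight vector (not F77NNS code). */
        #define NW 16                      /* assumed number of weights */

        static int save_weights(const char *path, const double *w, int n)
        {
            FILE *f = fopen(path, "wb");
            if (!f) return -1;
            size_t ok = fwrite(w, sizeof(double), (size_t)n, f);
            fclose(f);
            return ok == (size_t)n ? 0 : -1;
        }

        static int load_weights(const char *path, double *w, int n)
        {
            FILE *f = fopen(path, "rb");
            if (!f) return -1;             /* no checkpoint yet: caller keeps initial weights */
            size_t ok = fread(w, sizeof(double), (size_t)n, f);
            fclose(f);
            return ok == (size_t)n ? 0 : -1;
        }

        int main(void)
        {
            double w[NW] = {0};            /* initial weights before any training stage */
            if (load_weights("weights.bin", w, NW) == 0)
                printf("resuming from a previous training stage\n");
            /* ... run one stage of back-propagation here, updating w ... */
            if (save_weights("weights.bin", w, NW) != 0) {
                fprintf(stderr, "could not write checkpoint\n");
                return EXIT_FAILURE;
            }
            return 0;
        }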
    Keywords: CYBERNETICS
    Type: MSC-21638
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 47
    Publication Date: 2011-08-24
    Description: NETS, A Tool for the Development and Evaluation of Neural Networks, provides a simulation of Neural Network algorithms plus an environment for developing such algorithms. Neural Networks are a class of systems modeled after the human brain. Artificial Neural Networks are formed from hundreds or thousands of simulated neurons, connected to each other in a manner similar to brain neurons. Problems which involve pattern matching readily fit the class of problems which NETS is designed to solve. NETS uses the back propagation learning method for all of the networks which it creates. The nodes of a network are usually grouped together into clumps called layers. Generally, a network will have an input layer through which the various environment stimuli are presented to the network, and an output layer for determining the network's response. The number of nodes in these two layers is usually tied to some features of the problem being solved. Other layers, which form intermediate stops between the input and output layers, are called hidden layers. NETS allows the user to customize the patterns of connections between layers of a network. NETS also provides features for saving the weight values of a network during the learning process, which allows for more precise control over the learning process. NETS is an interpreter. Its method of execution is the familiar "read-evaluate-print" loop found in interpreted languages such as BASIC and LISP. The user is presented with a prompt which is the simulator's way of asking for input. After a command is issued, NETS will attempt to evaluate the command, which may produce more prompts requesting specific information or an error if the command is not understood. The typical process involved when using NETS consists of translating the problem into a format which uses input/output pairs, designing a network configuration for the problem, and finally training the network with input/output pairs until an acceptable error is reached. NETS allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. 
The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard distribution medium) or a .25 inch streaming magnetic tape cartridge in UNIX tar format. NETS was developed in 1989 and updated in 1992. IBM PC is a registered trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. DEC, VAX, and VMS are trademarks of Digital Equipment Corporation. SunIPC and SunOS are trademarks of Sun Microsystems, Inc. CRAY Y-MP and UNICOS are trademarks of Cray Research, Inc.
    Keywords: CYBERNETICS
    Type: MSC-21588
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 48
    Publication Date: 2011-08-24
    Description: The primary purpose of NNETS (Neural Network Environment on a Transputer System) is to provide users with a high degree of flexibility in creating and manipulating a wide variety of neural network topologies at processing speeds not found in conventional computing environments. To accomplish this purpose, NNETS supports back propagation and back propagation related algorithms. The back propagation algorithm used is an implementation of Rumelhart's Generalized Delta Rule. NNETS was developed on the INMOS Transputer. NNETS predefines a Back Propagation Network, a Jordan Network, and a Reinforcement Network to assist users in learning and defining their own networks. The program also allows users to configure other neural network paradigms from the NNETS basic architecture. The Jordan network is basically a feed forward network that has the outputs connected to a pseudo input layer. The state of the network is dependent on the inputs from the environment plus the state of the network. The Reinforcement network learns via a scalar feedback signal called reinforcement. The network propagates forward randomly. The environment looks at the outputs of the network to produce a reinforcement signal that is fed back to the network. NNETS was written for the INMOS C compiler D711B version 1.3 or later (MS-DOS version). A small portion of the software was written in the OCCAM language to perform the communications routing between processors. NNETS is configured to operate on a 4 X 10 array of Transputers in sequence with a Transputer based graphics processor controlled by a master IBM PC 286 (or better) Transputer. An RGB monitor capable of 512 X 512 resolution is required. It must be able to receive red, green, and blue signals via BNC connectors. NNETS is meant for experienced Transputer users only. The program is distributed on 5.25 inch 1.2Mb MS-DOS format diskettes. NNETS was developed in 1991. Transputer and OCCAM are registered trademarks of Inmos Corporation. MS-DOS is a registered trademark of Microsoft Corporation. IBM PC is a registered trademark of International Business Machines.
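    The Jordan network described above can be summarized in a few lines: the previous outputs are copied into a pseudo input (context) block before each ordinary forward pass. The C sketch below only illustrates that recurrence, not NNETS/Transputer code; the layer sizes and the stand-in feed_forward weights are made up.

        #include <stdio.h>
        #include <string.h>

        /* Illustrative Jordan-style recurrence: previous outputs become pseudo inputs. */
        #define N_IN   2   /* assumed environment inputs */
        #define N_OUT  2   /* assumed network outputs    */

        /* Stand-in forward pass: any trained feed-forward network could go here. */
        static void feed_forward(const double x[N_IN + N_OUT], double y[N_OUT])
        {
            y[0] = 0.5 * x[0] + 0.25 * x[2];   /* uses one environment and one context input */
            y[1] = 0.5 * x[1] + 0.25 * x[3];
        }

        static void jordan_step(const double env[N_IN], double context[N_OUT], double y[N_OUT])
        {
            double x[N_IN + N_OUT];
            memcpy(x, env, sizeof(double) * N_IN);              /* environment inputs */
            memcpy(x + N_IN, context, sizeof(double) * N_OUT);  /* fed-back outputs   */
            feed_forward(x, y);
            memcpy(context, y, sizeof(double) * N_OUT);         /* state for next step */
        }

        int main(void)
        {
            double context[N_OUT] = {0, 0}, y[N_OUT];
            const double env[2][N_IN] = {{1, 0}, {0, 1}};
            for (int t = 0; t < 2; t++) {
                jordan_step(env[t], context, y);
                printf("t=%d  y = %.3f %.3f\n", t, y[0], y[1]);
            }
            return 0;
        }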
    Keywords: CYBERNETICS
    Type: MSC-21485
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 49
    Publication Date: 2011-08-24
    Description: The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, The CLIPS Intelligent Tutoring System for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on an IBM PC computer operating under DOS, a Macintosh and DEC VAX series computers operating under VMS or ULTRIX. 
The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.
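    The forward-chaining idea described above, rules firing as facts are asserted until no more rules apply, can be illustrated with a deliberately tiny C sketch. It is not CLIPS source code and does not implement the Rete algorithm (which avoids re-matching every rule on every cycle); the facts and rules are invented placeholders.

        #include <stdio.h>
        #include <string.h>

        /* Tiny forward-chaining loop: rules fire when their condition fact is known. */
        #define MAX_FACTS 16

        static const char *facts[MAX_FACTS];
        static int n_facts;

        static int known(const char *f)
        {
            for (int i = 0; i < n_facts; i++)
                if (strcmp(facts[i], f) == 0) return 1;
            return 0;
        }

        static void assert_fact(const char *f)
        {
            if (!known(f) && n_facts < MAX_FACTS) facts[n_facts++] = f;
        }

        struct rule { const char *if_fact; const char *then_fact; };

        int main(void)
        {
            /* Hypothetical rule base and initial fact. */
            const struct rule rules[] = {
                {"battery-low",     "charge-required"},
                {"charge-required", "notify-operator"},
            };
            assert_fact("battery-low");

            int fired;
            do {                                  /* cycle until no rule adds a new fact */
                fired = 0;
                for (size_t r = 0; r < sizeof rules / sizeof rules[0]; r++)
                    if (known(rules[r].if_fact) && !known(rules[r].then_fact)) {
                        assert_fact(rules[r].then_fact);
                        fired = 1;
                    }
            } while (fired);

            for (int i = 0; i < n_facts; i++) printf("fact: %s\n", facts[i]);
            return 0;
        }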
    Keywords: CYBERNETICS
    Type: MSC-21467
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 50
    Publication Date: 2011-08-24
    Description: The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, The CLIPS Intelligent Tutoring System for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on an IBM PC computer operating under DOS, a Macintosh and DEC VAX series computers operating under VMS or ULTRIX. 
The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.
    Keywords: CYBERNETICS
    Type: MSC-21208
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 51
    Publication Date: 2011-08-24
    Description: The Nickel Cadmium Battery Expert System-2 (NICBES2) is a prototype diagnostic expert system for Nickel Cadmium Battery Health Management. NICBES2 is intended to support evaluation of the performance of Hubble Space Telescope spacecraft batteries, and to alert personnel to possible malfunctions. To achieve this, NICBES2 provides a reasoning system supported by appropriate battery domain knowledge. NICBES2 oversees the status of the batteries by evaluating data gathered in orbit packets, and when the status so merits, raises an alarm and provides fault diagnosis as well as advice on the actions to be taken to remedy the particular alarm. In addition to diagnosis and advice, it provides status history of the batteries' health, and a graphical display capability to help in assimilation of the information by the operator. NICBES2 is composed of three cooperating processes driven by a program written in SunOS C. A serial port process gathers incoming data from an RS-232 connection and places it into a raw data pipe. The data handler processes read this information from the raw data pipe and perform statistical data reduction to generate a set of reduced data files per orbit. The expert system process starts the Quintus Prolog interpreter and the expert system and then uses the reduced data files for the generation of status and advice information. The expert system presents the user with an interface window composed of six subwindows: Battery Status, Advice Selection, Support, Battery Selection, Graphics, and Actions. The Battery status subwindow can provide a display of the current status of a battery. Similarly, advice on battery reconditioning, charging, and workload can be obtained from the Advice Selection subwindow. A display of trends for the last orbit and over a sequence of the last twelve orbits is available in the Graph subwindow. A WHY button is available to give the user an explanation of the rules that the expert system used in determining the current information. The Support subwindow contains an editor for altering the knowledge base. NICBES2 is written in C-language and Quintus Prolog for Sun series computers running SunOS. It requires 8Mb of RAM for execution. The Quintus ProWindows graphics system is required for graphical display, and a Postscript printer is required to print graphics. A DEC LSI-11 is required to send telemetry via a RS-232 connection. The program is available on a .25 inch streaming magnetic tape cartridge in UNIX tar format. NICBES2 was developed in 1989. Sun and SunOS are trademarks of Sun Microsystems, Inc. PostScript is a registered trademark of Adobe Systems Incorporated. UNIX is a registered trademark of AT&T Bell Laboratories. DEC LSI-11 is a trademark of Digital Equipment Corporation.
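    The three-process arrangement described above, a reader feeding a raw data pipe and a handler reducing the data, can be sketched with standard UNIX pipe and fork calls. The C sketch below is only an illustration, not NICBES2 code; the telemetry values are made up and the reduction is a simple mean.

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>
        #include <sys/types.h>
        #include <sys/wait.h>
        #include <unistd.h>

        /* Illustrative two-process pipeline: a reader writes raw readings into a pipe,
           a handler reads them and performs a simple per-orbit reduction. */
        int main(void)
        {
            int fd[2];
            if (pipe(fd) == -1) { perror("pipe"); return EXIT_FAILURE; }

            pid_t pid = fork();
            if (pid == -1) { perror("fork"); return EXIT_FAILURE; }

            if (pid == 0) {                       /* child: stands in for the serial-port reader */
                close(fd[0]);
                const char *raw = "1.21\n1.19\n1.23\n";   /* hypothetical battery readings */
                if (write(fd[1], raw, strlen(raw)) < 0) _exit(1);
                close(fd[1]);
                _exit(0);
            }

            /* parent: stands in for the data handler that reduces the raw data */
            close(fd[1]);
            FILE *in = fdopen(fd[0], "r");
            double v, sum = 0.0; int n = 0;
            while (fscanf(in, "%lf", &v) == 1) { sum += v; n++; }
            fclose(in);
            wait(NULL);

            if (n > 0) printf("reduced record: mean = %.3f over %d samples\n", sum / n, n);
            return 0;
        }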
    Keywords: CYBERNETICS
    Type: MFS-28683
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 52
    Publication Date: 2011-08-24
    Description: The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. 
The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. MIT X Window System is licensed by Massachusetts Institute of Technology.
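    The kind of per-pixel land cover classification performed by ELAS modules can be illustrated with a minimum-distance-to-means sketch in C. This is not an ELAS module; the band count, class means, class labels, and pixel values are made-up numbers.

        #include <stdio.h>
        #include <math.h>

        /* Illustrative minimum-distance-to-means classifier for multispectral pixels. */
        #define NBANDS   4
        #define NCLASSES 3

        static int classify(const double pixel[NBANDS],
                            const double mean[NCLASSES][NBANDS])
        {
            int best = 0;
            double best_d = HUGE_VAL;
            for (int c = 0; c < NCLASSES; c++) {
                double d = 0.0;
                for (int b = 0; b < NBANDS; b++) {
                    double diff = pixel[b] - mean[c][b];
                    d += diff * diff;                 /* squared spectral distance */
                }
                if (d < best_d) { best_d = d; best = c; }
            }
            return best;                              /* index of the nearest class mean */
        }

        int main(void)
        {
            const char *label[NCLASSES] = {"water", "forest", "bare soil"};
            const double mean[NCLASSES][NBANDS] = {
                {12,  9,  6,  3},     /* water     */
                {25, 30, 20, 60},     /* forest    */
                {55, 50, 48, 40},     /* bare soil */
            };
            const double pixel[NBANDS] = {27, 31, 22, 55};
            printf("pixel classified as: %s\n", label[classify(pixel, mean)]);
            return 0;
        }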
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: ERL-10017
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 53
    Publication Date: 2011-08-24
    Description: The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. 
The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. MIT X Window System is licensed by Massachusetts Institute of Technology.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: ERL-10013
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 54
    Publication Date: 2011-08-24
    Description: IMAGEP is a FORTRAN computer algorithm containing various image processing, analysis, and enhancement functions. It is a keyboard-driven program organized into nine subroutines. Within these subroutines are other routines, also selected via the keyboard. Some of the functions performed by IMAGEP include digitization, storage and retrieval of images; image enhancement by contrast expansion, addition and subtraction, magnification, inversion, and bit shifting; display and movement of cursor; display of grey level histogram of image; and display of the variation of grey level intensity as a function of image position. This algorithm has possible scientific, industrial, and biomedical applications in material flaw studies, steel and ore analysis, and pathology, respectively. IMAGEP is written in VAX FORTRAN for DEC VAX series computers running VMS. The program requires the use of a Grinnell 274 image processor which can be obtained from Mark McCloud Associates, Campbell, CA. An object library of the required GMR series software is included on the distribution media. IMAGEP requires 1Mb of RAM for execution. The standard distribution medium for this program is a 1600 BPI 9-track magnetic tape in VAX FILES-11 format. It is also available on a TK50 tape cartridge in VAX FILES-11 format. This program was developed in 1991. DEC, VAX, VMS, and TK50 are trademarks of Digital Equipment Corporation.
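    Two of the functions listed above, the grey level histogram and contrast expansion, are simple enough to sketch in a few lines of C. This is not IMAGEP code (which is VAX FORTRAN tied to a Grinnell display); the tiny image and its grey levels are made up.

        #include <stdio.h>

        /* Illustrative grey-level histogram and linear contrast expansion on an 8-bit image. */
        #define W 4
        #define H 3

        int main(void)
        {
            unsigned char img[H][W] = {                 /* made-up grey levels */
                { 60,  62,  70,  65},
                { 90, 120, 110,  80},
                {100,  75,  95, 118},
            };
            unsigned long hist[256] = {0};
            unsigned char lo = 255, hi = 0;

            /* Histogram and minimum/maximum grey level. */
            for (int r = 0; r < H; r++)
                for (int c = 0; c < W; c++) {
                    unsigned char g = img[r][c];
                    hist[g]++;
                    if (g < lo) lo = g;
                    if (g > hi) hi = g;
                }

            /* Linear contrast expansion: map [lo, hi] onto the full [0, 255] range. */
            for (int r = 0; r < H; r++)
                for (int c = 0; c < W; c++)
                    img[r][c] = (unsigned char)((img[r][c] - lo) * 255 / (hi - lo));

            printf("grey levels spanned %d..%d; pixel (0,0) stretched to %d\n",
                   lo, hi, img[0][0]);
            return 0;
        }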
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: LEW-15370
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 55
    Publication Date: 2011-08-24
    Description: CO-ST-IN is a program developed for NASA to help facilitate the study of Control Structure Interaction, the dynamic coupling between control systems and flexible structures. Current space structures are larger and more flexible than previous designs. At the same time, increased demands are being placed on the performance of control systems. For many space structures it is essential to analyze the interaction of control systems with structural flexibility. CO-ST-IN was designed to complement and enhance rather than to replace the structural dynamics and control system analysis tools already available at NASA. The functions performed by CO-ST-IN can be roughly divided into three areas: 1) data transfer between structural dynamics and control systems software (MSC/NASTRAN, I-DEAS, EASY5 and MATRIXx are currently supported to varying degrees); 2) modal selection at both the component and system level as a means of model reduction; and 3) simulation of the coupled system (given simple controllers). CO-ST-IN reduces the size of the structural model by selecting system modes on the basis of input/output coupling (three algorithms along with a number of other options are offered). This allows the analyst to use far fewer modes in the coupled analysis, since the program will select those which are most closely coupled to the structural inputs and outputs. Another special capability is the calculation of structural outputs such as element forces and stresses using either the mode acceleration or mode displacement approach directly within the coupled simulation. This eliminates the need to return to MSC/NASTRAN for recovery of this data, accelerating the turnaround time of analyses. The transfer of input forces for transient analysis in MSC/NASTRAN is also supported. CO-ST-IN was implemented on a DEC VAX with the VMS operating system. This FORTRAN77 program has a memory requirement of 9.4 MB. CO-ST-IN was developed in 1989.
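    The mode displacement recovery mentioned above amounts to superposing the retained modes, u = sum over selected modes of phi_i * q_i. The C sketch below only illustrates that superposition for a single time step; it is not CO-ST-IN code, and the mode shapes and modal coordinates are made up.

        #include <stdio.h>

        /* Illustrative mode-displacement recovery: u(t) = sum_j phi[.][j] * q[j](t). */
        #define NDOF   3   /* physical degrees of freedom retained for output */
        #define NMODES 2   /* modes kept after modal selection */

        int main(void)
        {
            const double phi[NDOF][NMODES] = {      /* selected mode shapes (columns) */
                {0.8,  0.3},
                {0.5, -0.6},
                {0.2,  0.7},
            };
            const double q[NMODES] = {1.5, -0.4};   /* modal coordinates at one time step */

            for (int i = 0; i < NDOF; i++) {
                double u = 0.0;
                for (int j = 0; j < NMODES; j++)
                    u += phi[i][j] * q[j];          /* modal superposition */
                printf("u[%d] = %.3f\n", i, u);
            }
            return 0;
        }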
    Keywords: CYBERNETICS
    Type: LEW-14904
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 56
    Publication Date: 2011-08-24
    Description: This control theory design package, called Optimal Regulator Algorithms for the Control of Linear Systems (ORACLS), was developed to aid in the design of controllers and optimal filters for systems which can be modeled by linear, time-invariant differential and difference equations. Optimal linear quadratic regulator theory, currently referred to as the Linear-Quadratic-Gaussian (LQG) problem, has become the most widely accepted method of determining optimal control policy. Within this theory, the infinite duration time-invariant problems, which lead to constant gain feedback control laws and constant Kalman-Bucy filter gains for reconstruction of the system state, exhibit high tractability and potential ease of implementation. A variety of new and efficient methods in the field of numerical linear algebra have been combined into the ORACLS program, which provides for the solution to time-invariant continuous or discrete LQG problems. The ORACLS package is particularly attractive to the control system designer because it provides a rigorous tool for dealing with multi-input and multi-output dynamic systems in both continuous and discrete form. The ORACLS programming system is a collection of subroutines which can be used to formulate, manipulate, and solve various LQG design problems. The ORACLS program is constructed in a manner which permits the user to maintain considerable flexibility at each operational state. This flexibility is accomplished by providing primary operations, analysis of linear time-invariant systems, and control synthesis based on LQG methodology. The input-output routines handle the reading and writing of numerical matrices, printing heading information, and accumulating output information. The basic vector-matrix operations include addition, subtraction, multiplication, equation, norm construction, tracing, transposition, scaling, juxtaposition, and construction of null and identity matrices. The analysis routines provide for the following computations: the eigenvalues and eigenvectors of real matrices; the relative stability of a given matrix; matrix factorization; the solution of linear constant coefficient vector-matrix algebraic equations; the controllability properties of a linear time-invariant system; the steady-state covariance matrix of an open-loop stable system forced by white noise; and the transient response of continuous linear time-invariant systems. The control law design routines of ORACLS implement some of the more common techniques of time-invariant LQG methodology. For the finite-duration optimal linear regulator problem with noise-free measurements, continuous dynamics, and integral performance index, a routine is provided which implements the negative exponential method for finding both the transient and steady-state solutions to the matrix Riccati equation. For the discrete version of this problem, the method of backwards differencing is applied to find the solutions to the discrete Riccati equation. A routine is also included to solve the steady-state Riccati equation by the Newton algorithms described by Klein, for continuous problems, and by Hewer, for discrete problems. Another routine calculates the prefilter gain to eliminate control state cross-product terms in the quadratic performance index and the weighting matrices for the sampled data optimal linear regulator problem. 
For cases with measurement noise, duality theory and optimal regulator algorithms are used to calculate solutions to the continuous and discrete Kalman-Bucy filter problems. Finally, routines are included to implement the continuous and discrete forms of the explicit (model-in-the-system) and implicit (model-in-the-performance-index) model following theory. These routines generate linear control laws which cause the output of a dynamic time-invariant system to track the output of a prescribed model. In order to apply ORACLS, the user must write an executive (driver) program which inputs the problem coefficients, formulates and selects the routines to be used to solve the problem, and specifies the desired output. There are three versions of ORACLS source code available for implementation: CDC, IBM, and DEC. The CDC version has been implemented on a CDC 6000 series computer with a central memory of approximately 13K (octal) of 60 bit words. The CDC version is written in FORTRAN IV, was developed in 1978, and last updated in 1986. The IBM version has been implemented on an IBM 370 series computer with a central memory requirement of approximately 300K of 8 bit bytes. The IBM version is written in FORTRAN IV and was generated in 1981. The DEC version has been implemented on a VAX series computer operating under VMS. The VAX version is written in FORTRAN 77 and was generated in 1986.
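    For a scalar plant the discrete Riccati equation referred to above can be iterated to its steady state in a few lines. The C sketch below only illustrates that recursion and the resulting constant feedback gain; it is not ORACLS code, and the plant coefficients and weights are made up.

        #include <stdio.h>
        #include <math.h>

        /* Illustrative steady-state discrete Riccati solution for a scalar regulator:
           x(k+1) = a*x(k) + b*u(k), cost = sum q*x^2 + r*u^2.  The recursion is iterated
           until it converges, then the constant-gain feedback u = -K*x is formed. */
        int main(void)
        {
            const double a = 1.1, b = 0.5;     /* assumed plant   */
            const double q = 1.0, r = 0.1;     /* assumed weights */

            double P = q;                       /* terminal condition of the recursion */
            for (int i = 0; i < 1000; i++) {
                double Pnext = q + a * a * P - pow(a * b * P, 2) / (r + b * b * P);
                if (fabs(Pnext - P) < 1e-12) { P = Pnext; break; }
                P = Pnext;
            }
            double K = a * b * P / (r + b * b * P);   /* constant feedback gain */
            printf("steady-state P = %.6f, gain K = %.6f\n", P, K);
            printf("closed-loop pole a - b*K = %.6f\n", a - b * K);
            return 0;
        }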
    Keywords: CYBERNETICS
    Type: GSC-13067
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 57
    Publication Date: 2011-08-24
    Description: The Interactive Controls Analysis (INCA) program was developed to provide a user friendly environment for the design and analysis of linear control systems, primarily feedback control systems. INCA is designed for use with both small and large order systems. Using the interactive graphics capability, the INCA user can quickly plot a root locus, frequency response, or time response of either a continuous time system or a sampled data system. The system configuration and parameters can be easily changed, allowing the INCA user to design compensation networks and perform sensitivity analysis in a very convenient manner. A journal file capability is included. This stores an entire sequence of commands, generated during an INCA session into a file which can be accessed later. Also included in INCA are a context-sensitive help library, a screen editor, and plot windows. INCA is robust to VAX-specific overflow problems. The transfer function is the basic unit of INCA. Transfer functions are automatically saved and are available to the INCA user at any time. A powerful, user friendly transfer function manipulation and editing capability is built into the INCA program. The user can do all transfer function manipulations and plotting without leaving INCA, although provisions are made to input transfer functions from data files. By using a small set of commands, the user may compute and edit transfer functions, and then examine these functions by using the ROOT_LOCUS, FREQUENCY_RESPONSE, and TIME_RESPONSE capabilities. Basic input data, including gains, are handled as single-input single-output transfer functions. These functions can be developed using the function editor or by using FORTRAN- like arithmetic expressions. In addition to the arithmetic functions, special functions are available to 1) compute step, ramp, and sinusoid functions, 2) compute closed loop transfer functions, 3) convert from S plane to Z plane with optional advanced Z transform, and 4) convert from Z plane to W plane and back. These capabilities allow the INCA user to perform block diagram algebraic manipulations quickly for functions in the S, Z, and W domains. Additionally, a versatile digital control capability has been included in INCA. Special plane transformations allow the user to easily convert functions from one domain to another. Other digital control capabilities include: 1) totally independent open loop frequency response analyses on a continuous plant, discrete control system with a delay, 2) advanced Z-transform capability for systems with delays, and 3) multirate sampling analyses. The current version of INCA includes Dynamic Functions (which change when a parameter changes), standard filter generation, PD and PID controller generation, incorporation of the QZ-algorithm (function addition, inverse Laplace), and describing functions that allow the user to calculate the gain and phase characteristics of a nonlinear device. The INCA graphic modes provide the user with a convenient means to document and study frequency response, time response, and root locus analyses. General graphics features include: 1) zooming and dezooming, 2) plot documentation, 3) a table of analytic computation results, 4) multiple curves on the same plot, and 5) displaying frequency and gain information for a specific point on a curve. 
Additional capabilities in the frequency response mode include: 1) a full complement of graphical methods: Bode magnitude, Bode phase, Bode combined magnitude and phase, Bode strip plots, root contour plots, Nyquist, Nichols, and Popov plots; 2) user-selected plot scaling; and 3) gain and phase margin calculation and display. In the time response mode, additional capabilities include: 1) support for inverse Laplace and inverse Z transforms, 2) support for various input functions, 3) closed loop response evaluation, 4) loop gain sensitivity analyses, 5) intersample time response for discrete systems using the advanced Z transform, and 6) closed loop time response using mixed plane (S, Z, W) operations with delay. A Graphics mode command was added to the current version of INCA, version 3.13, to produce Metafiles (graphic files) of the currently displayed plot. The metafile can be displayed and edited using the QPLOT Graphics Editor and Replotter for Metafiles (GERM) program included with the INCA package. The INCA program is written in Pascal and FORTRAN for interactive or batch execution and has been implemented on a DEC VAX series computer under VMS. Both source code and executable code are supplied for INCA. Full INCA graphics capabilities are supported for various Tektronix 40xx and 41xx terminals; DEC VT graphics terminals; many PC and Macintosh terminal emulators; TEK014 hardcopy devices such as the LN03 Laserprinter; and bit map graphics external hardcopy devices. Also included for TEK4510 rasterizer users are a multiple copy feature, a wide line feature, and additional graphics fonts. The INCA program was developed in 1985, Version 2.04 was released in 1986, Version 3.00 was released in 1988, and Version 3.13 was released in 1989. An INCA version 2.0X conversion program is included.
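    The frequency response computations described above reduce to evaluating a transfer function at s = jw. The C sketch below does this for an arbitrary second-order example using C99 complex arithmetic; it is not INCA code (INCA is Pascal and FORTRAN), and the example numerator and denominator are assumptions.

        #include <stdio.h>
        #include <complex.h>
        #include <math.h>

        /* Illustrative frequency response of G(s) = num(s)/den(s) evaluated at s = j*w. */
        static double complex polyval(const double *c, int n, double complex s)
        {
            double complex v = 0.0;             /* Horner evaluation, c[0] = highest power */
            for (int i = 0; i < n; i++) v = v * s + c[i];
            return v;
        }

        int main(void)
        {
            const double num[] = {10.0};               /* G(s) = 10 / (s^2 + 2s + 10) */
            const double den[] = {1.0, 2.0, 10.0};
            const double pi = acos(-1.0);

            for (int k = -1; k <= 2; k++) {            /* one point per decade, 0.1..100 rad/s */
                double w = pow(10.0, k);
                double complex G = polyval(num, 1, I * w) / polyval(den, 3, I * w);
                printf("w = %6.1f rad/s  |G| = %7.2f dB   phase = %7.2f deg\n",
                       w, 20.0 * log10(cabs(G)), carg(G) * 180.0 / pi);
            }
            return 0;
        }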
    Keywords: CYBERNETICS
    Type: GSC-12998
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 58
    Publication Date: 2011-08-24
    Description: Expensive analysis programs are often combined with optimization procedures to solve engineering problems. An optimal solution requires numerous iterations between the analysis program and an optimizer. This often becomes prohibitive due to the cost and amount of computer time needed to converge to an optimal solution. NETS/PROSSS was developed to provide a system for combining NETS (MSC-21588), a neural network program developed at NASA's Johnson Space Center, and the optimization program CONMIN (Constrained Function Minimization, ARC-10836) developed at Ames Research Center. After training, NETS approximates the results from the analysis program, possibly allowing the user to reach a near-optimal solution in much less time than before. These results can then be used as a starting point in a normal optimization process, possibly allowing the user to converge to an optimal solution in significantly fewer iterations. NETS/PROSSS is written in C-language and FORTRAN 77 for Sun series computers running SunOS. The required CONMIN and NETS v3.0 files are included in this package. The documentation for CONMIN and NETS is included with the documentation of NETS/PROSSS. The program requires 342K of RAM for execution. The standard distribution medium for this program is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. NETS/PROSSS was developed in 1991.
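    The workflow described above, using a cheap approximation of an expensive analysis to find a near-optimal starting point and then refining it against the true objective, can be caricatured in a few lines of C. This is not NETS/PROSSS code: the surrogate here is a made-up algebraic function rather than a trained neural network, and the simple local search stands in for CONMIN.

        #include <stdio.h>
        #include <math.h>

        /* Illustrative surrogate-assisted start for optimization. */
        static double expensive(double x) { return (x - 2.0) * (x - 2.0) + 1.0; }   /* "analysis" */
        static double surrogate(double x) { return (x - 1.9) * (x - 1.9) + 1.1; }   /* imperfect fit */

        int main(void)
        {
            /* 1. Pick the best candidate according to the cheap surrogate. */
            double x0 = 0.0, best = surrogate(0.0);
            for (double x = -5.0; x <= 5.0; x += 0.5)
                if (surrogate(x) < best) { best = surrogate(x); x0 = x; }

            /* 2. Refine the near-optimal start against the true objective. */
            double x = x0, step = 0.25;
            for (int i = 0; i < 50; i++) {
                if (expensive(x + step) < expensive(x))      x += step;
                else if (expensive(x - step) < expensive(x)) x -= step;
                else                                         step *= 0.5;
            }
            printf("surrogate start x0 = %.2f, refined optimum x = %.4f\n", x0, x);
            return 0;
        }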
    Keywords: CYBERNETICS
    Type: LAR-14818
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 59
    Publication Date: 2011-08-24
    Description: AESOP was developed to solve a number of problems associated with the design of controls and state estimators for linear time-invariant systems. The systems considered are modeled in state-variable form by a set of linear differential and algebraic equations with constant coefficients. Two key problems solved by AESOP are the linear quadratic regulator (LQR) design problem and the steady-state Kalman filter design problem. AESOP is designed to be used in an interactive manner. The user can solve design problems and analyze the solutions in a single interactive session. Both numerical and graphical information are available to the user during the session. The AESOP program is structured around a list of predefined functions. Each function performs a single computation associated with control, estimation, or system response determination. AESOP contains over sixty functions and permits the easy inclusion of user-defined functions. The user accesses these functions either by inputting a list of desired functions in the order they are to be performed, or by specifying a single function to be performed. The latter case is used when the choice of function and function order depends on the results of previous functions. The available AESOP functions are divided into several general areas including: 1) program control, 2) matrix input and revision, 3) matrix formation, 4) open-loop system analysis, 5) frequency response, 6) transient response, 7) transfer function zeros, 8) LQR and Kalman filter design, 9) eigenvalues and eigenvectors, 10) covariances, and 11) user-defined functions. The most important functions are those that design linear quadratic regulators and Kalman filters. The user interacts with AESOP when using these functions by inputting design weighting parameters and by viewing displays of designed system response. Support functions obtain system transient and frequency responses, transfer functions, and covariance matrices. AESOP can also provide the user with open-loop system information including stability, controllability, and observability. The AESOP program is written in FORTRAN IV for interactive execution and has been implemented on an IBM 3033 computer using TSS 370. As currently configured, AESOP has a central memory requirement of approximately 2M of 8 bit bytes. Memory requirements can be reduced by redimensioning arrays in the AESOP program. Graphical output requires adaptation of the AESOP plot routines to whatever device is available. The AESOP program was developed in 1984.
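    The steady-state Kalman filter problem mentioned above can be illustrated for a scalar system: the predict/update recursion below drives the gain toward a constant value of the kind AESOP designs. It is not AESOP code (which is FORTRAN IV); the model coefficients, noise variances, and measurement sequence are made up.

        #include <stdio.h>

        /* Illustrative scalar Kalman filter: x(k+1) = a*x(k) + w,  y(k) = c*x(k) + v. */
        int main(void)
        {
            const double a = 0.95, c = 1.0;   /* assumed model        */
            const double Qw = 0.01, R = 0.5;  /* assumed noise levels */
            const double y[] = {1.2, 0.9, 1.1, 0.8, 1.0};   /* hypothetical measurements */

            double xhat = 0.0, P = 1.0;       /* initial estimate and covariance */
            for (int k = 0; k < 5; k++) {
                /* Predict. */
                xhat = a * xhat;
                P = a * a * P + Qw;
                /* Update with measurement y[k]. */
                double K = P * c / (c * c * P + R);       /* Kalman gain */
                xhat += K * (y[k] - c * xhat);
                P *= (1.0 - K * c);
                printf("k=%d  gain=%.4f  estimate=%.4f  covariance=%.4f\n", k, K, xhat, P);
            }
            return 0;
        }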
    Keywords: CYBERNETICS
    Type: LEW-14128
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 60
    Publication Date: 2011-08-24
    Description: The Automation Technology Branch of NASA's Langley Research Center is employing increasingly complex degrees of operator/robot cooperation (telerobotics). A good relationship between the operator and computer is essential for smooth performance by a telerobotic system. ESG (Expert Script Generator) is a software package that automatically generates command scripts in the NASA Intelligent Systems Research Lab's (ISRL's) complex menu-driven language from high-level task objective commands. ESG reduces errors and makes the telerobotics lab accessible to researchers who are not familiar with the comprehensive language developed by ISRL for interacting with the various systems of the ISRL testbed. ESG incorporates expert system technology to capture the typical rules of operation that a skilled operator would use. The result is an operator interface which optimizes the system's capability to perform a task remotely in a hazardous environment, in a timely manner, and without undue stress to the operator, while minimizing the chance for operator errors that may damage equipment. The intricate menu-driven command interface which provides for various control modes of both manipulators and their associated sensors in the TeleRobotic System Simulation (TRSS) has a syntax which is both irregular and verbose. ESG eliminates the following two problems with this command "language": 1) knowing the correct command sequence to accomplish a task, and 2) inputting a known command sequence without typos and other errors. ESG serves as an additional layer of interface, working in conjunction with the menu command processor, not supplanting it. By specifying task-level commands, such as GRASP, CONNECT, etc., ESG will generate the appropriate menu elements to accomplish the task. These elements will be collected in a script file which can then be executed by the ISRL menu command processor. In addition, the operator can extend the list of task-level commands to include customized tasks composed of sub-task commands. This mechanism gives the operator the ability to build a task-hierarchy tree of increasingly powerful commands. ESG also provides automatic regeneration of scripts based on system knowledge of telerobotic environment updates. The commands generated by ESG may be displayed at the terminal screen and/or stored. ESG is implemented as a rule-based expert system written in CLIPS (C Language Integrated Production System). The system consists of a knowledge-base of task heuristics, a static (unchanged during execution) database which describes the physical features of objects, and a dynamic (may change as a result of task achievement) database which maintains changes in the environment. Capabilities are provided for adding new environmental objects and for modifying existing objects and configuration data. Options are available for interactively viewing both the static and dynamic attribute values of database items. Execution of ESG may be suspended to allow access to system-level functions. ESG was implemented on a VAX 11/780 with the VMS 4.7 operating system using a VT100 compatible terminal. Its source code is 47% CLIPS and 53% C-language, with a memory requirement of approximately 205 KB. The program was developed in 1988.
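    The expansion of a task-level command such as GRASP into a sequence of menu elements can be illustrated with a small table-driven C sketch. ESG itself is a CLIPS rule base driven by task heuristics and environment databases, not a lookup table, so this is only a caricature of the idea; the command names and menu strings are invented placeholders.

        #include <stdio.h>
        #include <string.h>

        /* Illustrative table-driven expansion of a task-level command into menu commands. */
        struct expansion {
            const char *task;          /* task-level command, e.g. "GRASP" */
            const char *menu[4];       /* menu-level sequence it expands to */
        };

        static const struct expansion table[] = {
            {"GRASP",   {"SELECT ARM LEFT", "OPEN GRIPPER", "MOVE TO TARGET", "CLOSE GRIPPER"}},
            {"RELEASE", {"SELECT ARM LEFT", "OPEN GRIPPER", "RETRACT ARM", NULL}},
        };

        static void generate_script(const char *task, FILE *script)
        {
            for (size_t i = 0; i < sizeof table / sizeof table[0]; i++)
                if (strcmp(table[i].task, task) == 0) {
                    for (int j = 0; j < 4 && table[i].menu[j]; j++)
                        fprintf(script, "%s\n", table[i].menu[j]);
                    return;
                }
            fprintf(stderr, "unknown task-level command: %s\n", task);
        }

        int main(void)
        {
            generate_script("GRASP", stdout);    /* a real system would write a script file */
            return 0;
        }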
    Keywords: CYBERNETICS
    Type: LAR-14065
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 61
    Publication Date: 2011-08-24
    Description: The expert system called EXADS was developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. Because of the general purpose nature of ADS, it is difficult for a nonexpert to select the best choice of strategy, optimizer, and one-dimensional search options from the one hundred or so combinations that are available. EXADS aids engineers in determining the best combination based on their knowledge of the problem and the expert knowledge previously stored by experts who developed ADS. EXADS is a customized application of the AESOP artificial intelligence program (the general version of AESOP is available separately from COSMIC; the ADS program is also available from COSMIC). The expert system consists of two main components. The knowledge base contains about 200 rules and is divided into three categories: constrained, unconstrained, and constrained treated as unconstrained. The EXADS inference engine is rule-based and makes decisions about a particular situation using hypotheses (potential solutions), rules, and answers to questions drawn from the rule base. EXADS is backward-chaining, that is, it works from hypotheses to facts. The rule base was compiled from sources such as literature searches, ADS documentation, and engineer surveys. EXADS will accept answers such as yes, no, maybe, likely, and don't know, or a certainty factor ranging from 0 to 10. When any hypothesis reaches a confidence level of 90% or more, it is deemed the best choice and displayed to the user. If no hypothesis is confirmed, the user can examine explanations of why the hypotheses failed to reach the 90% level. The IBM PC version of EXADS is written in IQ-LISP for execution under DOS 2.0 or higher with a central memory requirement of approximately 512K of 8 bit bytes. This program was developed in 1986.
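    The 90% confirmation threshold described above can be illustrated by accumulating certainty toward a single hypothesis. The C sketch below uses a MYCIN-style combination rule as an assumption (the description does not state how EXADS combines answers), and the answer values are also made up.

        #include <stdio.h>

        /* Illustrative accumulation of evidence toward a hypothesis with a 90% threshold. */
        static double combine(double cf, double evidence)
        {
            return cf + evidence * (1.0 - cf);      /* both factors assumed in [0,1] */
        }

        int main(void)
        {
            /* User answers mapped from the 0..10 certainty scale to factors in 0..1. */
            const double answers[] = {0.7, 0.6, 0.8};     /* hypothetical responses */
            double cf = 0.0;

            for (int i = 0; i < 3; i++) {
                cf = combine(cf, answers[i]);
                printf("after answer %d: confidence = %.1f%%\n", i + 1, cf * 100.0);
            }
            if (cf >= 0.90)
                printf("hypothesis confirmed: recommend this strategy/optimizer combination\n");
            else
                printf("no hypothesis reached 90%%; explanations of failure would be offered\n");
            return 0;
        }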
    Keywords: CYBERNETICS
    Type: LAR-13687
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 62
    Publication Date: 2011-08-24
    Description: CFORM was developed by the Kennedy Space Center Robotics Lab to assist in linear control system design and analysis using closed form and transient response mechanisms. The program computes the closed form solution and transient response of a linear (constant coefficient) differential equation. CFORM allows a choice of three input functions: the Unit Step (a unit change in displacement); the Ramp function (step velocity); and the Parabolic function (step acceleration). It is accurate only in cases where the differential equation has distinct roots, and it does not handle roots at the origin (s=0). Initial conditions must be zero. Differential equations may be input to CFORM in two forms: polynomial and product of factors. In some linear control analyses, it may be more appropriate to use a related program, Linear Control System Design and Analysis (KSC-11376), which uses root locus and frequency response methods. CFORM was written in VAX FORTRAN for a VAX 11/780 under VAX VMS 4.7. It has a central memory requirement of 30K. CFORM was developed in 1987.
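    For the distinct-root case handled by CFORM, the closed form solution is a sum of exponentials obtained by partial-fraction (residue) expansion. The C sketch below works one unit-step example; it is not CFORM code (which is VAX FORTRAN), and the gain and pole locations are made up.

        #include <stdio.h>
        #include <math.h>

        /* Illustrative closed-form unit-step response of G(s) = K / ((s + p1)(s + p2)),
           distinct real poles, zero initial conditions, via residue expansion of
           Y(s) = K / (s (s + p1)(s + p2)). */
        int main(void)
        {
            const double K = 6.0, p1 = 1.0, p2 = 3.0;   /* assumed gain and distinct poles */

            /* Residues at s = 0, s = -p1, s = -p2. */
            const double A = K / (p1 * p2);
            const double B = K / (-p1 * (p2 - p1));
            const double C = K / (-p2 * (p1 - p2));

            printf("y(t) = %.3f + %.3f*exp(-%.1f t) + %.3f*exp(-%.1f t)\n", A, B, p1, C, p2);
            for (double t = 0.0; t <= 5.0; t += 1.0) {
                double y = A + B * exp(-p1 * t) + C * exp(-p2 * t);
                printf("t = %.1f  y = %.4f\n", t, y);
            }
            return 0;
        }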
    Keywords: CYBERNETICS
    Type: KSC-11394
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
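The closed-form transient-response idea behind CFORM (partial-fraction expansion of a linear constant-coefficient system with distinct roots) can be sketched with SciPy as below; the example transfer function is invented and CFORM's own input format is not reproduced.

```python
# A sketch of the closed-form transient-response idea behind CFORM, using SciPy's
# partial-fraction routine (the example system is invented; this is not CFORM code).
import numpy as np
from scipy.signal import residue

# G(s) = 1 / (s^2 + 3s + 2); a unit-step input adds a 1/s factor: Y(s) = G(s)/s
num = [1.0]
den = np.polymul([1.0, 3.0, 2.0], [1.0, 0.0])   # multiply the denominator by s

r, p, k = residue(num, den)                      # distinct roots assumed, as in CFORM
print("y(t) = " + " + ".join(f"({ri.real:+.3f})*exp({pi.real:+.3f}*t)"
                             for ri, pi in zip(r, p)))

# evaluate the closed-form solution on a time grid
t = np.linspace(0.0, 5.0, 11)
y = sum(ri * np.exp(pi * t) for ri, pi in zip(r, p)).real
print(y)
```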
  • 63
    Publication Date: 2011-08-24
    Description: This control theory design package, called Optimal Regulator Algorithms for the Control of Linear Systems (ORACLS), was developed to aid in the design of controllers and optimal filters for systems which can be modeled by linear, time-invariant differential and difference equations. Optimal linear quadratic regulator theory, currently referred to as the Linear-Quadratic-Gaussian (LQG) problem, has become the most widely accepted method of determining optimal control policy. Within this theory, the infinite duration time-invariant problems, which lead to constant gain feedback control laws and constant Kalman-Bucy filter gains for reconstruction of the system state, exhibit high tractability and potential ease of implementation. A variety of new and efficient methods in the field of numerical linear algebra have been combined into the ORACLS program, which provides for the solution to time-invariant continuous or discrete LQG problems. The ORACLS package is particularly attractive to the control system designer because it provides a rigorous tool for dealing with multi-input and multi-output dynamic systems in both continuous and discrete form. The ORACLS programming system is a collection of subroutines which can be used to formulate, manipulate, and solve various LQG design problems. The ORACLS program is constructed in a manner which permits the user to maintain considerable flexibility at each operational state. This flexibility is accomplished by providing primary operations, analysis of linear time-invariant systems, and control synthesis based on LQG methodology. The input-output routines handle the reading and writing of numerical matrices, printing heading information, and accumulating output information. The basic vector-matrix operations include addition, subtraction, multiplication, equation, norm construction, tracing, transposition, scaling, juxtaposition, and construction of null and identity matrices. The analysis routines provide for the following computations: the eigenvalues and eigenvectors of real matrices; the relative stability of a given matrix; matrix factorization; the solution of linear constant coefficient vector-matrix algebraic equations; the controllability properties of a linear time-invariant system; the steady-state covariance matrix of an open-loop stable system forced by white noise; and the transient response of continuous linear time-invariant systems. The control law design routines of ORACLS implement some of the more common techniques of time-invariant LQG methodology. For the finite-duration optimal linear regulator problem with noise-free measurements, continuous dynamics, and integral performance index, a routine is provided which implements the negative exponential method for finding both the transient and steady-state solutions to the matrix Riccati equation. For the discrete version of this problem, the method of backwards differencing is applied to find the solutions to the discrete Riccati equation. A routine is also included to solve the steady-state Riccati equation by the Newton algorithms described by Klein, for continuous problems, and by Hewer, for discrete problems. Another routine calculates the prefilter gain to eliminate control state cross-product terms in the quadratic performance index and the weighting matrices for the sampled data optimal linear regulator problem. 
For cases with measurement noise, duality theory and optimal regulator algorithms are used to calculate solutions to the continuous and discrete Kalman-Bucy filter problems. Finally, routines are included to implement the continuous and discrete forms of the explicit (model-in-the-system) and implicit (model-in-the-performance-index) model following theory. These routines generate linear control laws which cause the output of a dynamic time-invariant system to track the output of a prescribed model. In order to apply ORACLS, the user must write an executive (driver) program which inputs the problem coefficients, formulates and selects the routines to be used to solve the problem, and specifies the desired output. There are three versions of ORACLS source code available for implementation: CDC, IBM, and DEC. The CDC version has been implemented on a CDC 6000 series computer with a central memory of approximately 13K (octal) of 60 bit words. The CDC version is written in FORTRAN IV, was developed in 1978, and last updated in 1989. The IBM version has been implemented on an IBM 370 series computer with a central memory requirement of approximately 300K of 8 bit bytes. The IBM version is written in FORTRAN IV and was generated in 1981. The DEC version has been implemented on a VAX series computer operating under VMS. The VAX version is written in FORTRAN 77 and was generated in 1986.
    Keywords: CYBERNETICS
    Type: LAR-12313
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
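As a pocket illustration of the steady-state LQG computation that ORACLS packages in FORTRAN, the following Python sketch solves a continuous algebraic Riccati equation for an invented double-integrator plant and forms the constant feedback gain; it is not ORACLS code.

```python
# Minimal sketch of a steady-state LQR design of the kind ORACLS automates
# (plant and weighting matrices are invented for illustration).
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])          # double integrator
B = np.array([[0.0],
              [1.0]])
Q = np.eye(2)                        # state weighting
R = np.array([[1.0]])                # control weighting

P = solve_continuous_are(A, B, Q, R)         # steady-state Riccati solution
K = np.linalg.solve(R, B.T @ P)              # optimal feedback gain, u = -K x

print("P =\n", P)
print("K =", K)
print("closed-loop eigenvalues:", np.linalg.eigvals(A - B @ K))
```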
  • 64
    Publication Date: 2011-08-24
    Description: The paper presents the synthesis of neural network based feedback laws for dynamic systems using computed optimal time histories of the state and control variables. The efficacy of the proposed approach has been successfully demonstrated on a minimum time orbit injection problem. If the method proves effective for real-life problems with many state and control variables, it can be used for a variety of guidance and control problems.
    Keywords: CYBERNETICS
    Type: Journal of Guidance, Control, and Dynamics (ISSN 0731-5090); 17; 4; p. 868-870
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
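A minimal sketch of the approach described above: fit a small network to (state, control) pairs taken from precomputed optimal time histories, then use it on line as a feedback law. The data here are synthetic stand-ins for the optimal trajectories, and the network size and training details are illustrative only.

```python
# Sketch: learn a feedback law u = f(x) from "optimal" state/control samples
# (synthetic data; a tiny numpy MLP trained by plain gradient descent).
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))            # states along the sample trajectories
U = -(1.0 * X[:, :1] + 1.7 * X[:, 1:])           # stand-in for the optimal controls

W1 = rng.normal(0, 0.5, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(2000):                            # minimize mean squared control error
    H = np.tanh(X @ W1 + b1)
    Uhat = H @ W2 + b2
    err = Uhat - U
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H**2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

print("training MSE:", float((err**2).mean()))
# the trained network can now be evaluated on line as a feedback law u = f(x)
```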
  • 65
    Publication Date: 2011-08-24
    Description: A brief mission overview of STS-49 is given, and some of the pictorially outstanding and scientifically interesting photographs obtained during the mission are presented. The Earth observations are described and include the following: the Southwestern Pacific Ocean -- wind and water; the Southwestern Pacific Ocean -- coasts and volcanoes; the US; Cuba and the Bahamas; South America; Africa; the Red Sea and Western Indian Ocean; and the Indian Subcontinent.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: Geocarto (ISSN 1010-6049); 9; 2; p. 67-80
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 66
    Publication Date: 2011-08-24
    Description: The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. 
The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic IRIS tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. MIT X Window System is licensed by Massachusetts Institute of Technology.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: SSC-00020
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
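One of the ELAS capabilities mentioned above is per-pixel classification of multispectral data. The following Python sketch shows a generic maximum-likelihood classifier of the kind such modules implement; the class statistics and pixel values are invented, and this is not ELAS code.

```python
# Sketch of per-pixel Gaussian maximum-likelihood classification
# (training statistics and pixels invented for illustration).
import numpy as np

def ml_classify(pixels, class_means, class_covs):
    """Assign each pixel (row of band values) to the class with the highest
    Gaussian log-likelihood."""
    scores = []
    for mu, cov in zip(class_means, class_covs):
        inv = np.linalg.inv(cov)
        d = pixels - mu
        maha = np.einsum("ij,jk,ik->i", d, inv, d)          # Mahalanobis distance
        scores.append(-0.5 * (maha + np.log(np.linalg.det(cov))))
    return np.argmax(np.stack(scores, axis=1), axis=1)

# two invented classes in a two-band image
means = [np.array([40.0, 90.0]), np.array([120.0, 60.0])]    # e.g. vegetation vs. bare soil
covs  = [np.eye(2) * 25.0, np.eye(2) * 100.0]
pixels = np.array([[45.0, 85.0], [110.0, 70.0], [80.0, 75.0]])
print(ml_classify(pixels, means, covs))                      # class index per pixel
```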
  • 67
    Publication Date: 2011-08-24
    Description: The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. 
The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic IRIS tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. MIT X Window System is licensed by Massachusetts Institute of Technology.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: SSC-00019
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 68
    Publication Date: 2011-08-24
    Description: MacMultiview is an interactive tool for the Macintosh II family which allows one to display and make computations utilizing polarimetric radar data collected by the Jet Propulsion Laboratory's imaging SAR (synthetic aperture radar) polarimeter system. The system includes the single-frequency L-band sensor mounted on the NASA CV990 aircraft and its replacement, the multi-frequency P-, L-, and C-band sensors mounted on the NASA DC-8. MacMultiview provides two basic functions: computation of synthesized polarimetric images and computation of polarization signatures. The radar data can be used to compute a variety of images. The total power image displays the sum of the polarized and unpolarized components of the backscatter for each pixel. The magnitude/phase difference image displays the HH (horizontal transmit and horizontal receive polarization) to VV (vertical transmit and vertical receive polarization) phase difference using color. Magnitude is displayed using intensity. The user may also select any transmit and receive polarization combination from which an image is synthesized. This image displays the backscatter which would have been observed had the sensor been configured using the selected transmit and receive polarizations. MacMultiview can also be used to compute polarization signatures, three dimensional plots of backscatter versus transmit and receive polarizations. The standard co-polarization signatures (transmit and receive polarizations are the same) and cross-polarization signatures (transmit and receive polarizations are orthogonal) can be plotted for any rectangular subset of pixels within a radar data set. In addition, the ratio of co- and cross-polarization signatures computed from different subsets within the same data set can also be computed. Computed images can be saved in a variety of formats: byte format (headerless format which saves the image as a string of byte values), MacMultiview (a byte image preceded by an ASCII header), and PICT2 format (standard format readable by MacMultiview and other image processing programs for the Macintosh). Images can also be printed on PostScript output devices. Polarization signatures can be saved in either a PICT format or as a text file containing PostScript commands and printed on any QuickDraw output device. The associated Stokes matrices can be stored in a text file. MacMultiview is written in C-language for Macintosh II series computers. MacMultiview will only run on Macintosh II series computers with 8-bit video displays (gray shades or color). The program also requires a minimum configuration of System 6.0, Finder 6.1, and 1Mb of RAM. MacMultiview is NOT compatible with System 7.0. It requires 32-Bit QuickDraw. Note: MacMultiview may not be fully compatible with preliminary versions of 32-Bit QuickDraw. Macintosh Programmer's Workshop and Macintosh Programmer's Workshop C (version 3.0) are required for recompiling and relinking. The standard distribution medium for this package is a set of three 800K 3.5 inch diskettes in Macintosh format. This program was developed in 1989 and updated in 1991. MacMultiview is a copyrighted work with all copyright vested in NASA. QuickDraw, Finder, Macintosh, and System 7 are trademarks of Apple Computer, Inc.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: NPO-18966
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
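The polarization-synthesis computation described for MacMultiview can be sketched as below: form Stokes vectors for chosen transmit and receive polarizations and contract them with a pixel's Stokes (Mueller) matrix. Normalization conventions vary between systems, so treat this as illustrative rather than as MacMultiview's exact formula; the matrix values are invented.

```python
# Sketch of polarization synthesis from a per-pixel Stokes (Mueller) matrix
# (conventions and matrix values are illustrative, not MacMultiview's).
import numpy as np

def stokes_vector(psi_deg, chi_deg):
    """Unit Stokes vector for a polarization ellipse with orientation psi and
    ellipticity chi (degrees)."""
    psi, chi = np.radians(psi_deg), np.radians(chi_deg)
    return np.array([1.0,
                     np.cos(2 * chi) * np.cos(2 * psi),
                     np.cos(2 * chi) * np.sin(2 * psi),
                     np.sin(2 * chi)])

def synthesized_power(M, tx=(0.0, 0.0), rx=(0.0, 0.0)):
    """Backscattered power for chosen transmit/receive polarizations and a 4x4
    Stokes scattering matrix M."""
    return 0.5 * stokes_vector(*rx) @ M @ stokes_vector(*tx)

M = np.diag([2.0, 1.0, 1.0, -0.5])          # invented pixel Stokes matrix
print(synthesized_power(M, tx=(0.0, 0.0), rx=(0.0, 0.0)))    # HH-like power
print(synthesized_power(M, tx=(0.0, 0.0), rx=(90.0, 0.0)))   # HV-like power
```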
  • 69
    Publication Date: 2004-12-03
    Description: The accurate partitioning of available energy into sensible and latent heat flux is crucial to the understanding of surface-atmosphere interactions. This issue is more complicated in arid and semi-arid regions, where the relative contributions to surface fluxes from the soil and vegetation may vary significantly throughout the day and throughout the season. A three-component model to estimate sensible heat flux over heterogeneous surfaces is presented. The surface was represented with two adjacent compartments. The first compartment is made up of two components, shrubs and shaded soil; the second of open 'illuminated' soil. Data collected at two different sites in Nevada (U.S.) during the summers of 1991 and 1992 were used to evaluate model performance. The results show that the present model is sufficiently general to yield satisfactory results for both sites.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: CNES, Proceedings of 6th International Symposium on Physical Measurements and Signatures in Remote Sensing; p 777-784
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
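A hedged sketch of combining component sensible heat fluxes by fractional cover, in the spirit of the three-component model above; the component temperatures, resistances, and cover fractions are invented, and the paper's actual resistance network is not reproduced.

```python
# Area-weighted composite of component sensible heat fluxes (illustrative values only).
RHO_CP = 1200.0           # volumetric heat capacity of air, J m-3 K-1 (approx.)

def component_flux(t_surface, t_air, r_ah):
    """Sensible heat flux H = rho*cp*(Ts - Ta)/r_ah for one surface component."""
    return RHO_CP * (t_surface - t_air) / r_ah

components = [
    # (fractional cover, surface temperature C, aerodynamic resistance s m-1)
    (0.20, 32.0, 60.0),    # shrubs
    (0.10, 35.0, 80.0),    # shaded soil
    (0.70, 48.0, 40.0),    # open, illuminated soil
]
t_air = 30.0
H = sum(f * component_flux(ts, t_air, r) for f, ts, r in components)
print(f"composite sensible heat flux: {H:.0f} W m-2")
```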
  • 70
    Publication Date: 2005-09-21
    Description: Since September 1991, the ERS-1 SAR (Synthetic Aperture Radar) has collected approximately 170 frames of ocean imagery on 28 passes over the western Gulf Stream in support of ESA experiment US8-2c. SAR signatures of the north wall are seen on nearly all passes, with modulation depth varying from 3% to 35% for 100 m samples. Many small and mesoscale circulation features associated with the Stream are evident. The detailed form of the signature varies considerably, however. Narrow bright or dark linear features appear to follow streamlines, and on one occasion a strong dark line was associated with an in situ measurement of a sharp current shear. Similarly, larger spatial scale changes in backscatter over the Stream were associated with in situ measurements of atmospheric stability transitions. Physical explanations for the narrow features are not so obvious. However, the accumulation of surfactants along converging current boundaries or local short wave straining and breaking appear plausible. These preliminary results strongly suggest that a wide swath (approximately 500 km) SAR with at least 100 m resolution would be a useful adjunct to existing satellite AVHRR (Advanced Very High Resolution Radiometer) imagery.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: ESA, Proceedings of the Second ERS-1 Symposium on Space at the Service of Our Environment, Volume 1; p 547-552
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 71
    Publication Date: 2005-09-21
    Description: ERS-1 SAR (Synthetic Aperture Radar) data were used to specify Winter physical processes on the surface of the Greenland Sea, and SSM/I (Special Sensor Microwave/Imager) data were used to characterize the regional behavior of the ice cover. Examination of the SSM/I data indicated that the convective water was likely to be confined to small (less than 100 km) domains near the ice edge. Using ERS-1 SAR data crossing the ice edge, it was possible to identify ice edge features that are very similar to modeled plumes; they have a diameter of about 100 m, a spacing of about 300 m, and cover an area about 20 by 90 km. This plume observation is the first such identification of convection. The plumes are topped by ice, and the return-water areas are open, indicating a freshened top layer of the sea and emphasizing the importance of this layer to convection.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: ESA, Proceedings of the Second ERS-1 Symposium on Space at the Service of Our Environment, Volume 1; p 347-352
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 72
    Publication Date: 2011-08-24
    Description: There is renewed interest in using optical remote sensing techniques for measuring and monitoring airborne toxins. Many instruments have been used to measure traces of atmospheric molecular species. Some optical remote sensing technology is in use for measuring or monitoring toxic or pollutant gases. The examples in this article provide a number of lessons that may be of use in the new field of optical remote sensing of toxic gases. Topics covered include the following: atmospheric properties, Fourier transform spectrometry, gas-filter correlation, differential absorption laser systems, aircraft missions, laser long-path instruments, differential optical absorption spectrometry, and ozone measurement.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: American Ceramic Society Bulletin (ISSN 0002-7812); 73; 7; p. 79-82
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 73
    Publication Date: 2011-08-24
    Description: This paper addresses the control of nonlinear, nonstationary vibration of a frame-stringer structure resulting from high levels of excitation from a nearby supersonic jet exhaust. The structure exhibits periodic, chaotic, or random behaviors when forced by high-intensity sound from a supersonic jet exhaust with 'shock' loading superimposed on a broadband response. The time history of the pressure, showing the rotation and flapping of the shock structure in the jet column due to large-scale instabilities, indicates that the response is not only nonlinear but also nonstationary. The acoustic pressure radiated by the structure also contains shocks and the formation of harmonics with distance. Control of the structural response is achieved by actively forcing the structure with an actuator at the shock oscillation frequency whose amplitude is locked into a self-control cycle. Results show that the peak power level is reduced by a factor of 63, or 18 dB. As a result, new broadband components emerge with at least four harmonics. At accelerating and decelerating supersonic speeds, the exhaust from the jet induces higher transient loading on the nearby flexible structure due to the occurrence of multiple shocks from the jet.
    Keywords: CYBERNETICS
    Type: AIAA Journal (ISSN 0001-1452); 32; 7; p. 1367-1376
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 74
    Publication Date: 2011-08-24
    Description: This paper presents a novel method to design decentralized controllers for large complex flexible structures by using the idea of joint decoupling. Decoupling of joint degrees of freedom from the interior degrees of freedom is achieved by setting the joint actuator commands to cancel the internal forces exerting on the joint degrees of freedom. By doing so, the interactions between substructures are eliminated. The global structure control design problem is then decomposed into several substructure control design problems. Control commands for interior actuators are set to be localized state feedback using decentralized observers for state estimation. The proposed decentralized controllers can operate successfully at the individual substructure level as well as at the global structure level. Not only control design but also control implementation is decentralized. A two-component mass-spring-damper system is used as an example to demonstrate the proposed method.
    Keywords: CYBERNETICS
    Type: Journal of Guidance, Control, and Dynamics (ISSN 0731-5090); 17; 4; p. 676-684
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 75
    Publication Date: 2011-08-24
    Description: Global and regional temperature variations in the lower troposphere and lower stratosphere are examined for the period 1979-92 from Microwave Sounder Unit (MSU) data obtained by the Television Infrared Observation Satellite (TIROS)-N series of National Oceanic and Atmospheric Administration (NOAA) operational satellites. In the lower troposphere, globally-averaged temperature variations appear to be dominated by tropical El Nino (warm) and La Nina (cool) events and volcanic eruptions. The Pinatubo volcanic eruption in June 1991 appears to have initiated a cooling trend which persisted through the most recent data analyzed (July, 1992), and largely overwhelmed the warming from the 1991-92 El Nino. The cooling has been stronger in the Northern Hemisphere than in the Southern Hemisphere. The temperature trend over the 13.5 year satellite record is small (+0.03 C) compared to the year-to-year variability (0.2-0.4 C), making detection of any global warming signal fruitless to date. However, the future global warming trend, currently predicted to be around 0.3 C/decade, will be much easier to discern should it develop. The lower stratospheric temperature record is dominated by warm episodes from the Pinatubo eruption and the March 1982 eruption of El Chichon volcano.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: Advances in Space Research (ISSN 0273-1177); 14; 1; p. (1)69-(1)75
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 76
    Publication Date: 2011-08-24
    Description: Several types of algorithms are generally used to process digital imagery such as Landsat data. The most commonly used algorithms perform the task of registration, compression, and classification. Because there are different techniques available for performing registration, compression, and classification, imagery data users need a rationale for selecting a particular approach to meet their particular needs. This collection of registration, compression, and classification algorithms was developed so that different approaches could be evaluated and the best approach for a particular application determined. Routines are included for six registration algorithms, six compression algorithms, and two classification algorithms. The package also includes routines for evaluating the effects of processing on the image data. This collection of routines should be useful to anyone using or developing image processing software. Registration of image data involves the geometrical alteration of the imagery. Registration routines available in the evaluation package include image magnification, mapping functions, partitioning, map overlay, and data interpolation. The compression of image data involves reducing the volume of data needed for a given image. Compression routines available in the package include adaptive differential pulse code modulation, two-dimensional transforms, clustering, vector reduction, and picture segmentation. Classification of image data involves analyzing the uncompressed or compressed image data to produce inventories and maps of areas of similar spectral properties within a scene. The classification routines available include a sequential linear technique and a maximum likelihood technique. The choice of the appropriate evaluation criteria is quite important in evaluating the image processing functions. The user is therefore given a choice of evaluation criteria with which to investigate the available image processing functions. All of the available evaluation criteria basically compare the observed results with the expected results. For the image reconstruction processes of registration and compression, the expected results are usually the original data or some selected characteristics of the original data. For classification processes the expected result is the ground truth of the scene. Thus, the comparison process consists of determining what changes occur in processing, where the changes occur, how much change occurs, and the amplitude of the change. The package includes evaluation routines for performing such comparisons as average uncertainty, average information transfer, chi-square statistics, multidimensional histograms, and computation of contingency matrices. This collection of routines is written in FORTRAN IV for batch execution and has been implemented on an IBM 360 computer with a central memory requirement of approximately 662K of 8 bit bytes. This collection of image processing and evaluation routines was developed in 1979.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: MFS-25367
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
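One of the evaluation criteria listed above is the contingency matrix between a classification and ground truth. A minimal Python sketch with invented labels follows; it is not the FORTRAN evaluation code described in the record.

```python
# Sketch of a contingency-matrix evaluation of a classified image against ground truth.
import numpy as np

def contingency_matrix(truth, classified, n_classes):
    """Count how often each true class t was assigned to each classified class c."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, c in zip(truth.ravel(), classified.ravel()):
        cm[t, c] += 1
    return cm

truth      = np.array([0, 0, 1, 1, 2, 2, 2, 1])   # invented ground-truth labels
classified = np.array([0, 1, 1, 1, 2, 2, 0, 1])   # invented classifier output
cm = contingency_matrix(truth, classified, 3)
print(cm)
print("overall accuracy:", np.trace(cm) / cm.sum())   # 6/8 = 0.75
```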
  • 77
    Publication Date: 2011-08-24
    Description: A common approach to supervised classification and prediction in artificial intelligence and statistical pattern recognition is the use of decision trees. A tree is "grown" from data using a recursive partitioning algorithm to create a tree which has good prediction of classes on new data. Standard algorithms are CART (by Breiman, Friedman, Olshen, and Stone) and ID3 and its successor C4 (by Quinlan). As well as reimplementing parts of these algorithms and offering experimental control suites, IND also introduces Bayesian and MML methods and more sophisticated search in growing trees. These produce more accurate class probability estimates that are important in applications like diagnosis. IND is applicable to most data sets consisting of independent instances, each described by a fixed length vector of attribute values. An attribute value may be a number, one of a set of attribute specific symbols, or it may be omitted. One of the attributes is delegated the "target" and IND grows trees to predict the target. Prediction can then be done on new data or the decision tree printed out for inspection. IND provides a range of features and styles with convenience for the casual user as well as fine-tuning for the advanced user or those interested in research. IND can be operated in a CART-like mode (but without regression trees, surrogate splits or multivariate splits), and in a mode like the early version of C4. Advanced features allow more extensive search, interactive control and display of tree growing, and Bayesian and MML algorithms for tree pruning and smoothing. These often produce more accurate class probability estimates at the leaves. IND also comes with a comprehensive experimental control suite. IND consists of four basic kinds of routines: data manipulation routines, tree generation routines, tree testing routines, and tree display routines. The data manipulation routines are used to partition a single large data set into smaller training and test sets. The generation routines are used to build classifiers. The test routines are used to evaluate classifiers and to classify data using a classifier. And the display routines are used to display classifiers in various formats. IND is written in C-language for Sun4 series computers. It consists of several programs with controlling shell scripts. Extensive UNIX man entries are included. IND is designed to be used on any UNIX system, although it has only been thoroughly tested on SUN platforms. The standard distribution medium for IND is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation in PostScript format is included on the distribution medium. IND was developed in 1992.
    Keywords: CYBERNETICS
    Type: ARC-13188
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
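The recursive-partitioning tree growers that IND reimplements choose splits by an impurity measure such as information gain. The following Python sketch shows that split score on an invented symbolic data set; it is not IND code and omits pruning, smoothing, and the Bayesian/MML options.

```python
# Sketch of entropy-based split selection for growing a decision tree
# (data set and attribute names are invented for illustration).
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Gain from splitting on a symbolic attribute."""
    total = entropy(labels)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr_index], []).append(y)
    remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return total - remainder

rows = [("sunny", "high"), ("sunny", "low"), ("rain", "high"), ("rain", "low")]
labels = ["no", "yes", "yes", "yes"]
for i, name in enumerate(["outlook", "wind"]):
    print(name, round(information_gain(rows, labels, i), 3))
```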
  • 78
    Publication Date: 2011-08-24
    Description: The program AUTOCLASS III, Automatic Class Discovery from Data, uses Bayesian probability theory to provide a simple and extensible approach to problems such as classification and general mixture separation. Its theoretical basis is free from ad hoc quantities, and in particular free of any measures which alter the data to suit the needs of the program. As a result, the elementary classification model used lends itself easily to extensions. The standard approach to classification in much of artificial intelligence and statistical pattern recognition research involves partitioning of the data into separate subsets, known as classes. AUTOCLASS III uses the Bayesian approach in which classes are described by probability distributions over the attributes of the objects, specified by a model function and its parameters. The calculation of the probability of each object's membership in each class provides a more intuitive classification than absolute partitioning techniques. AUTOCLASS III is applicable to most data sets consisting of independent instances, each described by a fixed length vector of attribute values. An attribute value may be a number, one of a set of attribute specific symbols, or omitted. The user specifies a class probability distribution function by associating attribute sets with supplied likelihood function terms. AUTOCLASS then searches in the space of class numbers and parameters for the maximally probable combination. It returns the set of class probability function parameters, and the class membership probabilities for each data instance. AUTOCLASS III is written in Common Lisp, and is designed to be platform independent. This program has been successfully run on Symbolics and Explorer Lisp machines. It has been successfully used with the following implementations of Common LISP on the Sun: Franz Allegro CL, Lucid Common Lisp, and Austin Kyoto Common Lisp and similar UNIX platforms; under the Lucid Common Lisp implementations on VAX/VMS v5.4, VAX/Ultrix v4.1, and MIPS/Ultrix v4, rev. 179; and on the Macintosh personal computer. The minimum Macintosh required is the IIci. This program will not run under CMU Common Lisp or VAX/VMS DEC Common Lisp. A minimum of 8Mb of RAM is required for Macintosh platforms and 16Mb for workstations. The standard distribution medium for this program is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format and a 3.5 inch diskette in Macintosh format. An electronic copy of the documentation is included on the distribution medium. AUTOCLASS was developed between March 1988 and March 1992. It was initially released in May 1991. Sun is a trademark of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories. DEC, VAX, VMS, and ULTRIX are trademarks of Digital Equipment Corporation. Macintosh is a trademark of Apple Computer, Inc. Allegro CL is a registered trademark of Franz, Inc.
    Keywords: CYBERNETICS
    Type: ARC-13180
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
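The soft class-membership computation at the heart of AUTOCLASS can be sketched for a one-dimensional Gaussian mixture as below; the mixture parameters are invented, and the real program also searches over the number of classes and their parameters rather than taking them as given.

```python
# Sketch of Bayesian class-membership probabilities under a 1-D Gaussian mixture
# (parameters invented; AUTOCLASS itself is written in Common Lisp).
import numpy as np

def membership(x, weights, means, variances):
    """P(class | x) for a one-dimensional Gaussian mixture."""
    weights, means, variances = map(np.asarray, (weights, means, variances))
    likelihood = (np.exp(-0.5 * (x - means) ** 2 / variances)
                  / np.sqrt(2 * np.pi * variances))
    joint = weights * likelihood
    return joint / joint.sum()

print(membership(1.2, weights=[0.6, 0.4], means=[0.0, 3.0], variances=[1.0, 1.0]))
# mostly class 0, with a nonzero probability of class 1 rather than a hard partition
```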
  • 79
    Publication Date: 2011-08-24
    Description: The Science and Technology Laboratory Applications Software (ELAS) was originally designed to analyze and process digital imagery data, specifically remotely-sensed scanner data. This capability includes the processing of Landsat multispectral data; aircraft-acquired scanner data; digitized topographic data; and numerous other ancillary data, such as soil types and rainfall information, that can be stored in digitized form. ELAS has the subsequent capability to geographically reference this data to dozens of standard, as well as user created projections. As an integrated image processing system, ELAS offers the user of remotely-sensed data a wide range of capabilities in the areas of land cover analysis and general purpose image analysis. ELAS is designed for flexible use and operation and includes its own FORTRAN operating subsystem and an expandable set of FORTRAN application modules. Because all of ELAS resides in one "logical" FORTRAN program, data inputs and outputs, directives, and module switching are convenient for the user. There are over 230 modules presently available to aid the user in performing a wide range of land cover analyses and manipulation. The file management modules enable the user to allocate, define, access, and specify usage for all types of files (ELAS files, subfiles, external files etc.). Various other modules convert specific types of satellite, aircraft, and vector-polygon data into files that can be used by other ELAS modules. The user also has many module options which aid in displaying image data, such as magnification/reduction of the display; true color display; and several memory functions. Additional modules allow for the building and manipulation of polygonal areas of the image data. Finally, there are modules which allow the user to select and classify the image data. An important feature of the ELAS subsystem is that its structure allows new applications modules to be easily integrated in the future. ELAS has as a standard the flexibility to process data elements exceeding 8 bits in length, including floating point (noninteger) elements and 16 or 32 bit integers. Thus it is able to analyze and process "non-standard" nonimage data. The VAX (ERL-10017) and Concurrent (ERL-10013) versions of ELAS 9.0 are written in FORTRAN and ASSEMBLER for DEC VAX series computers running VMS and Concurrent computers running MTM. The Sun (SSC-00019), Masscomp (SSC-00020), and Silicon Graphics (SSC-00021) versions of ELAS 9.0 are written in FORTRAN 77 and C-LANGUAGE for Sun4 series computers running SunOS, Masscomp computers running UNIX, and Silicon Graphics IRIS computers running IRIX. The Concurrent version requires at least 15 bit addressing and a direct memory access channel. The VAX and Concurrent versions of ELAS both require floating-point hardware, at least 1Mb of RAM, and approximately 70Mb of disk space. Both versions also require a COMTAL display device in order to display images. For the Sun, Masscomp, and Silicon Graphics versions of ELAS, the disk storage required is approximately 115Mb, and a minimum of 8Mb of RAM is required for execution. The Sun version of ELAS requires either the X-Window System Version 11 Revision 4 or Sun OpenWindows Version 2. The Masscomp version requires a GA1000 display device and the associated "gp" library. The Silicon Graphics version requires Silicon Graphics' GL library. ELAS display functions will not work with a monochrome monitor. 
The standard distribution medium for the VAX version (ERL-10017) is a set of two 9-track 1600 BPI magnetic tapes in DEC VAX BACKUP format. This version is also available on a TK50 tape cartridge in DEC VAX BACKUP format. The standard distribution medium for the Concurrent version (ERL-10013) is a set of two 9-track 1600 BPI magnetic tapes in Concurrent BACKUP format. The standard distribution medium for the Sun version (SSC-00019) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Masscomp version (SSC-00020) is a .25 inch streaming magnetic tape cartridge in UNIX tar format. The standard distribution medium for the Silicon Graphics version (SSC-00021) is a .25 inch streaming magnetic IRIS tape cartridge in UNIX tar format. Version 9.0 was released in 1991. Sun4, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. MIT X Window System is licensed by Massachusetts Institute of Technology.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: SSC-00021
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 80
    Publication Date: 2011-08-24
    Description: Calibration of polarimetric radar systems is a field of research in which great progress has been made over the last few years. POLCAL (Polarimetric Radar Calibration) is a software tool intended to assist in the calibration of Synthetic Aperture Radar (SAR) systems. In particular, POLCAL calibrates Stokes matrix format data produced as the standard product by the NASA/Jet Propulsion Laboratory (JPL) airborne imaging synthetic aperture radar (AIRSAR). POLCAL was designed to be used in conjunction with data collected by the NASA/JPL AIRSAR system. AIRSAR is a multifrequency (6 cm, 24 cm, and 68 cm wavelength), fully polarimetric SAR system which produces 12 x 12 km imagery at 10 m resolution. AIRSAR was designed as a testbed for NASA's Spaceborne Imaging Radar program. While the images produced after 1991 are thought to be calibrated (phase calibrated, cross-talk removed, channel imbalance removed, and absolutely calibrated), POLCAL can and should still be used to check the accuracy of the calibration and to correct it if necessary. Version 4.0 of POLCAL is an upgrade of POLCAL version 2.0 released to AIRSAR investigators in June, 1990. New options in version 4.0 include automatic absolute calibration of 89/90 data, distributed target analysis, calibration of nearby scenes with calibration parameters from a scene with corner reflectors, altitude or roll angle corrections, and calibration of errors introduced by known topography. Many sources of error can lead to false conclusions about the nature of scatterers on the surface. Errors in the phase relationship between polarization channels result in incorrect synthesis of polarization states. Cross-talk, caused by imperfections in the radar antenna itself, can also lead to error. POLCAL reduces cross-talk and corrects phase calibration without the use of ground calibration equipment. Removing the antenna patterns during SAR processing also forms a very important part of the calibration of SAR data. Errors in the processing altitude or in the aircraft roll angle are possible causes of error in computing the antenna patterns inside the processor. POLCAL uses an altitude error correction algorithm to correctly remove the antenna pattern from the SAR images. POLCAL also uses a topographic calibration algorithm to reduce calibration errors resulting from ground topography. By utilizing the backscatter measurements from either the corner reflectors or a well-known distributed target, POLCAL can correct the residual amplitude offsets in the various polarization channels and correct for the absolute gain of the radar system. POLCAL also gives the user the option of calibrating a scene using the calibration data from a nearby site. This allows precise calibration of all the scenes acquired on a flight line where corner reflectors were present. Construction and positioning of corner reflectors is covered extensively in the program documentation. In an effort to keep the POLCAL code as transportable as possible, the authors eliminated all interactions with a graphics display system. For this reason, it is assumed that users will have their own software for doing the following: (1) synthesize an image using HH or VV polarization, (2) display the synthesized image on any display device, and (3) read the pixel locations of the corner reflectors from the image. The only input used by the software (in addition to the input Stokes matrix data file) is a small data file with the corner reflector information.
POLCAL is written in FORTRAN 77 for use on Sun series computers running SunOS and DEC VAX computers running VMS. It requires 4Mb of RAM under SunOS and 3.7Mb of RAM under VMS for execution. The standard distribution medium for POLCAL is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 9-track 1600 BPI magnetic tape in DEC VAX FILES-11 format or on a TK50 tape cartridge in DEC VAX FILES-11 format. Other distribution media may be available upon request. Documentation is included in the price of the program. POLCAL 4.0 was released in 1992 and is a copyrighted work with all copyright vested in NASA.
    Keywords: EARTH RESOURCES AND REMOTE SENSING
    Type: NPO-18954
    Format: text
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 81
    Publication Date: 2013-08-31
    Description: In this paper we study the robustness with respect to stability of the closed-loop system with collocated rate sensor using LQG (mean square rate) optimized compensators. Our main result is that the transmission zeros of the compensator are precisely the structure modes when the actuator/sensor locations are 'pinned' and/or 'clamped': i.e., motion in the direction sensed is not allowed. We have stability even under parameter mismatch, except in the unlikely situation where such a mode frequency of the assumed system coincides with an undamped mode frequency of the real system and the corresponding mode shape is an eigenvector of the compensator transfer function matrix at that frequency. For a truncated modal model - such as that of the NASA LaRC Phase Zero Evolutionary model - the transmission zeros of the corresponding compensator transfer function can be interpreted as the structure modes when motion in the directions sensed is prohibited.
    Keywords: CYBERNETICS
    Type: NASA. Langley Research Center, NASA Workshop on Distributed Parameter Modeling and Control of Flexible Aerospace Systems; p 445-463
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 82
    Publication Date: 2013-08-31
    Description: In optimal placement of actuators for stochastic systems, it is commonly assumed that the actuator noise variances are not related to the feedback matrix and the actuator locations. In this paper, we will discuss the limitation of that assumption and develop a more practical noise variance model. Various properties associated with optimal actuator placement under the assumption of this noise variance model are discovered through the analytical study of a second order system.
    Keywords: CYBERNETICS
    Type: NASA. Langley Research Center, NASA Workshop on Distributed Parameter Modeling and Control of Flexible Aerospace Systems; p 323-331
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 83
    Publication Date: 2013-08-31
    Description: Optimal regulation of hyperbolic systems in the presence of unknown disturbances is considered. Necessary conditions for determining the optimal control that tracks a desired trajectory in the presence of the worst possible perturbations are developed. The results also characterize the worst possible disturbance that the system will be able to tolerate before any degradation of the system performance. Numerical results on the control of a vibrating beam are presented.
    Keywords: CYBERNETICS
    Type: NASA. Langley Research Center, NASA Workshop on Distributed Parameter Modeling and Control of Flexible Aerospace Systems; p 317-322
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 84
    Publication Date: 2013-08-31
    Description: This paper presents the development of a general-purpose fuzzy logic (FL) control methodology for isolating the external vibratory disturbances of space-based devices. According to the desired performance specifications, a full investigation regarding the development of an FL controller was done using different scenarios, such as variances of passive reaction-compensating components and external disturbance load. It was shown that the proposed FL controller is robust in that the FL-controlled system closely follows the prespecified ideal reference model. The comparative study also reveals that the FL-controlled system achieves significant improvement in reducing vibrations over passive systems.
    Keywords: CYBERNETICS
    Type: The 28th Aerospace Mechanisms Symposium; p 159-165; NASA-CP-3260
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 85
    Publication Date: 2013-08-31
    Description: Remote viewing is critical for teleoperations, but the inherent limitations of standard video reduce the operator's effectiveness. These limitations have been compensated for in many ways, from using the operator's adaptability, to augmenting his capability with feedback from a variety of sensors and simulations. Omniview can overcome some of these limitations and improve the operator's efficiency without adding additional sensors or computational burden. It can minimize potential collisions with facility equipment, provide peripheral vision, and display multiple images simultaneously from a single input device. The Omniview technology provides electronic pan, tilt, magnify, and rotational orientation within a hemispherical field-of-view without any moving parts. Image sizes, viewing directions, scale, offset, etc., may be adjusted to fit operator needs. This paper discusses the derivation of the image transformation, the design of the electronics, and two applications to telepresence that are under development: Video Emulated Tweening (VET) and Manipulator Guidance and Positioning (ManGAP). The VET effort uses Omniview to compensate for time-delayed video in teleoperation of remote vehicles. In ManGAP, two Omniview systems are used to provide two sets of orientation vectors to points in the field-of-view (FOV). These vectors then provide absolute position information both to control the position of the telerobot and to avoid collisions with the work site equipment.
    Keywords: CYBERNETICS
    Type: NASA. Johnson Space Center, The Seventh Annual Workshop on Space Operations Applications and Research (SOAR 1993), Volume 1; p 86-93
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
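The electronic pan/tilt transformation described above maps output-view pixels back into a hemispherical source image. The following Python sketch uses a generic equidistant fisheye model (image radius proportional to the angle off axis) under assumed parameters; the actual Omniview transform is not reproduced here and may differ.

```python
# Hedged sketch of electronic pan/tilt from a hemispherical (fisheye) image using a
# generic equidistant model, r = R * theta / (pi/2); not the Omniview algorithm itself.
import math

def view_to_fisheye(u, v, pan_deg, tilt_deg, fov_deg, out_size, fish_radius, cx, cy):
    """Map an output-view pixel (u, v) to source coordinates in the fisheye image."""
    # ray through the virtual pinhole view
    f_out = (out_size / 2) / math.tan(math.radians(fov_deg) / 2)
    x, y, z = u - out_size / 2, v - out_size / 2, f_out

    # rotate the ray by pan (about the vertical axis) then tilt (about the horizontal axis)
    p, t = math.radians(pan_deg), math.radians(tilt_deg)
    x, z = x * math.cos(p) + z * math.sin(p), -x * math.sin(p) + z * math.cos(p)
    y, z = y * math.cos(t) - z * math.sin(t), y * math.sin(t) + z * math.cos(t)

    # equidistant fisheye projection: radius proportional to the angle from the axis
    theta = math.acos(z / math.sqrt(x * x + y * y + z * z))
    phi = math.atan2(y, x)
    r = fish_radius * theta / (math.pi / 2)          # 90 degrees maps to the image edge
    return cx + r * math.cos(phi), cy + r * math.sin(phi)

# centre pixel of a 45-degree view panned 30 degrees, in a 1024x1024 fisheye image
print(view_to_fisheye(256, 256, pan_deg=30, tilt_deg=0, fov_deg=45,
                      out_size=512, fish_radius=512, cx=512, cy=512))
```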
  • 86
    Publication Date: 2013-08-31
    Description: A fuzzy-neural control system simulation was developed for the control of a camera platform used to observe aircraft on final approach to an aircraft carrier. The fuzzy-neural approach to control combines the structure of a fuzzy knowledge base with a supervised neural network's ability to adapt and improve. The performance characteristics of this hybrid system were compared to those of a fuzzy system and a neural network system developed independently to determine if the fusion of these two technologies offers any advantage over the use of one or the other. The results of this study indicate that the fuzzy-neural approach to control offers some advantages over either fuzzy or neural control alone.
    Keywords: CYBERNETICS
    Type: NASA, Washington, Technology 2003: The Fourth National Technology Transfer Conference and Exposition, Volume 2; p 17-23
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 87
    Publication Date: 2013-08-31
    Description: A fuzzy classifier system that discovers rules for controlling a mathematical model of a pH titration system was developed by researchers at the U.S. Bureau of Mines (USBM). Fuzzy classifier systems successfully combine the strengths of learning classifier systems and fuzzy logic controllers. Learning classifier systems resemble familiar production rule-based systems, but they represent their IF-THEN rules by strings of characters rather than in the traditional linguistic terms. Fuzzy logic is a tool that allows for the incorporation of abstract concepts into rule-based systems, thereby allowing the rules to resemble the familiar 'rules-of-thumb' commonly used by humans when solving difficult process control and reasoning problems. Like learning classifier systems, fuzzy classifier systems employ a genetic algorithm to explore and sample new rules for manipulating the problem environment. Like fuzzy logic controllers, fuzzy classifier systems encapsulate knowledge in the form of production rules. The results presented in this paper demonstrate the ability of fuzzy classifier systems to generate a fuzzy logic-based process control system.
    Keywords: CYBERNETICS
    Type: NASA, Washington, Technology 2003: The Fourth National Technology Transfer Conference and Exposition, Volume 2; p 7-16
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
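To make the string-encoded fuzzy rules concrete, here is a toy Python sketch in which each rule is a two-character string mapping a fuzzy pH-error term to a reagent action, evaluated with triangular membership functions and weighted-average defuzzification; the rules and set definitions are invented, and the genetic-algorithm rule discovery described above is omitted.

```python
# Toy fuzzy rule evaluation with rules encoded as character strings
# (rule strings, fuzzy sets, and setpoints are invented for illustration).
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# linguistic terms for the pH error, and crisp values for the reagent-flow action
ERROR_SETS  = {"N": (-4, -2, 0), "Z": (-1, 0, 1), "P": (0, 2, 4)}
ACTION_VALS = {"N": -1.0, "Z": 0.0, "P": 1.0}

# each rule is a 2-character string: IF error is <char 0> THEN action is <char 1>
RULES = ["NP", "ZZ", "PN"]

def control_action(ph_error):
    """Weighted-average defuzzification over the firing rules."""
    num = den = 0.0
    for rule in RULES:
        strength = tri(ph_error, *ERROR_SETS[rule[0]])
        num += strength * ACTION_VALS[rule[1]]
        den += strength
    return num / den if den else 0.0

print(control_action(-1.5))   # negative (acidic) error -> positive (add base) action
```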
  • 88
    Publication Date: 2013-08-31
    Description: Neural networks are an outgrowth of interdisciplinary studies concerning the brain. These studies are guiding the field of Artificial Intelligence toward the so-called 6th Generation Computer, and enormous amounts of resources have been poured into R&D. Wavelet Transforms (WT) have replaced Fourier Transforms (FT) for wideband transient cases since the discovery of WT in 1985. The list of successful applications includes the following: earthquake prediction; radar identification; speech recognition; stock market forecasting; FBI fingerprint image compression; and telecommunication ISDN data compression.
    Keywords: CYBERNETICS
    Type: NASA, Washington, Technology 2003: The Fourth National Technology Transfer Conference and Exposition, Volume 2; p 34-39
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 89
    Publication Date: 2013-08-31
    Description: Artificial neural systems (ANS), also known as neural networks, are an attempt to develop computer systems that emulate the neural reasoning behavior of biological neural systems (e.g., the human brain); as such, they are loosely based on biological neural networks. An ANS consists of a series of nodes (neurons) and weighted connections (axons) that, when presented with a specific input pattern, can associate a specific output pattern; it is essentially a highly complex, nonlinear mathematical relationship or transform. These constructs have two properties that the authors have found useful in signal processing and process modeling: noise tolerance and complex pattern recognition. Specifically, the authors have developed a new network learning algorithm that has led to successful applications of ANSs to high-speed signal processing and to modeling highly complex processes. Two of the applications, the Weld Bead Geometry Control System and the Welding Penetration Monitoring System, are discussed in the body of this paper. (A sketch of a minimal feedforward network follows this record.)
    Keywords: CYBERNETICS
    Type: NASA, Washington, Technology 2003: The Fourth National Technology Transfer Conference and Exposition, Volume 2; p 24-33
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
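    Illustrative sketch: the authors' own learning algorithm is not described in the abstract, so the NumPy example below uses plain backpropagation as a generic stand-in. It only illustrates the structural points the record makes: nodes, weighted connections, and a nonlinear transform that associates input patterns with output patterns (here, the classic XOR mapping).

        # Minimal artificial neural system sketch: two layers of weights and a
        # sigmoid nonlinearity trained by ordinary backpropagation.
        import numpy as np

        rng = np.random.default_rng(0)

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        # Tiny pattern-association task: 2-bit input -> XOR output (a nonlinear
        # mapping that a single linear layer cannot represent).
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
        Y = np.array([[0], [1], [1], [0]], dtype=float)

        W1 = rng.normal(size=(2, 8))          # input -> hidden weights ("axons")
        W2 = rng.normal(size=(8, 1))          # hidden -> output weights

        for _ in range(10000):
            H = sigmoid(X @ W1)               # hidden node activations
            out = sigmoid(H @ W2)             # output node activations
            d_out = (out - Y) * out * (1 - out)    # output-layer error signal
            d_hid = (d_out @ W2.T) * H * (1 - H)   # back-propagated hidden error
            W2 -= 1.0 * H.T @ d_out
            W1 -= 1.0 * X.T @ d_hid

        print(np.round(out, 2))               # typically approaches [[0], [1], [1], [0]]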
  • 90
    Publication Date: 2013-08-31
    Description: Neural network capabilities include automatic and organized handling of complex information, quick adaptation to continuously changing environments, nonlinear modeling, and parallel implementation. This viewgraph presentation covers Bellcore work on applications, the computational function of a learning chip, a learning system block diagram, neural network equalization, broadband access control, calling-card fraud detection, software reliability prediction, and conclusions.
    Keywords: CYBERNETICS
    Type: JPL, A Decade of Neural Networks: Practical Applications and Prospects; p 209-218
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 91
    Publication Date: 2013-08-31
    Description: Over the past three years, our group has concentrated on applying neural network methods to the training of controllers for real-world systems. This presentation describes our approach, surveys what we have found to be important, mentions some contributions to the field, and shows representative results. Topics discussed include: (1) model studies as rehearsal for experimental studies; (2) the importance of correct derivatives; (3) effective training with second-order decoupled extended Kalman filter (DEKF) methods; (4) the efficacy of time-lagged recurrent networks; (5) liberation from the tyranny of the control cycle using asynchronous truncated backpropagation through time; and (6) multistream training for robustness. Results from model studies of automotive idle speed control serve as examples for several of these topics. (A sketch of truncated backpropagation through time follows this record.)
    Keywords: CYBERNETICS
    Type: JPL, A Decade of Neural Networks: Practical Applications and Prospects; p 191
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
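    Illustrative sketch: of the techniques listed above, only the truncated backpropagation-through-time idea is shown below, on a hypothetical one-unit linear recurrent model. It is not the group's implementation; the model, learning rate, and truncation window are assumptions made for the example.

        # Truncated BPTT sketch for a single linear recurrent unit
        # h[t] = w*h[t-1] + u*x[t]: gradients are accumulated back over a short
        # window instead of the whole history. Illustrative only.
        import numpy as np

        rng = np.random.default_rng(1)
        w, u = 0.1, 0.1                        # recurrent and input weights
        lr, window = 0.02, 5                   # learning rate, truncation depth

        x = rng.normal(size=5000)
        # Target generated by a "true" recurrent system we try to match.
        target_w, target_u = 0.8, 0.5
        y = np.zeros_like(x)
        for t in range(1, len(x)):
            y[t] = target_w * y[t - 1] + target_u * x[t]

        h = np.zeros_like(x)
        for t in range(1, len(x)):
            h[t] = w * h[t - 1] + u * x[t]
            err = h[t] - y[t]
            # Truncated BPTT: dh[t]/dw and dh[t]/du over `window` steps only.
            grad_w = grad_u = 0.0
            chain = 1.0
            for k in range(min(window, t)):
                grad_w += chain * h[t - 1 - k]
                grad_u += chain * x[t - k]
                chain *= w
            w -= lr * err * grad_w
            u -= lr * err * grad_u

        print(round(w, 2), round(u, 2))        # should drift toward roughly 0.8 and 0.5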
  • 92
    Publication Date: 2013-08-31
    Description: The research mission is the development of computer-assisted diagnostic (CAD) methods for improved diagnosis of medical images, including those from digital x-ray sensors and tomographic imaging modalities. The CAD algorithms include adaptive nonlinear filters for image noise suppression, hybrid wavelet methods for feature segmentation and enhancement, high-convergence neural networks for feature detection, and VLSI implementation of neural networks for real-time analysis. Other missions include (1) implementation of CAD methods on hospital-based picture archiving and communication systems (PACS) and information networks for central and remote diagnosis, and (2) collaboration with the defense and medical industries, NASA, and federal laboratories on dual-use technology conversion from defense or aerospace to medicine. (A sketch of simple nonlinear noise filtering follows this record.)
    Keywords: CYBERNETICS
    Type: JPL, A Decade of Neural Networks: Practical Applications and Prospects; p 163-170
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
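    Illustrative sketch: the record only names "adaptive nonlinear filters for image noise suppression" without detail, so the NumPy example below uses a plain 3x3 median filter as the simplest nonlinear noise filter of that general kind. The actual CAD filters are adaptive and more sophisticated; this is only a stand-in.

        # Plain median filtering as a minimal example of nonlinear noise
        # suppression (impulse/"salt" noise removal). NumPy only, 3x3 window.
        import numpy as np

        def median_filter_3x3(img):
            """Median-filter a 2-D array with a 3x3 window (edges replicated)."""
            padded = np.pad(img, 1, mode="edge")
            windows = np.stack([padded[r:r + img.shape[0], c:c + img.shape[1]]
                                for r in range(3) for c in range(3)])
            return np.median(windows, axis=0)

        # Toy test: constant image corrupted by isolated impulse noise.
        image = np.full((64, 64), 100.0)
        image[10, 10] = image[40, 25] = 255.0
        clean = median_filter_3x3(image)
        print(image[10, 10], clean[10, 10])    # 255.0 -> 100.0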
  • 93
    Publication Date: 2013-08-31
    Description: This viewgraph presentation covers four working analog VLSI vision chips: (1) a time-derivative retina, (2) a zero-crossing chip, (3) a resistive fuse, and (4) a figure-ground chip; work in progress on motion computation and neuromorphic systems; and conceptual and practical lessons learned.
    Keywords: CYBERNETICS
    Type: JPL, A Decade of Neural Networks: Practical Applications and Prospects; p 127-135
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 94
    Publication Date: 2013-08-31
    Description: The problem under consideration in this viewgraph presentation is to understand, predict, and control the fluid mechanics of dynamic maneuvers, unsteady boundary layers, and vortex-dominated flows. One proposed solution is the application of neural networks to closed-loop control. Neural networks offer unique opportunities: they simplify the modeling of three-dimensional, vortex-dominated, unsteady separated flow fields; they provide an effective means of controlling unsteady aerodynamics; and they support the integration of sensors, controllers, and time lags into adaptive control systems.
    Keywords: CYBERNETICS
    Type: JPL, A Decade of Neural Networks: Practical Applications and Prospects; p 107-126
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 95
    Publication Date: 2013-08-31
    Description: Irvine Sensors Corporation (ISC), working closely with JPL under BMDO/ONR sponsorship, is developing a radically new neural computing technology. Aimed primarily at discrimination and target recognition for BMDO missile interceptor applications, it appears to have near-term commercial applicability to problems such as handwriting and face recognition, to name just two. In its earliest form it will be able to perform inner-product computation using some 262,000 64x64 templates (weighted synapse arrays), where all 64^5 weights can be changed every millisecond. Internal switching provides an inherent capability to zoom, translate, or rotate the templates. The 3D silicon architecture is manufactured on a commercial, high-volume DRAM production line at very low cost, enabling its commercialization. Two technology thrusts are beginning: in the first, the 64-layer capability of 3DANN-I will be extended to 1024 layers and beyond; in the second, layer size will be shrunk to 2-3 millimeters to reduce layer costs. Our workshop goal is to expose this technology to the neural network community as an emerging tool and to gauge their interest in its future development. (A sketch of the inner-product template operation follows this record.)
    Keywords: CYBERNETICS
    Type: JPL, A Decade of Neural Networks: Practical Applications and Prospects; p 65-74
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
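    Illustrative sketch: the NumPy example below is only a software analogue of the inner-product operation the record describes: each 64x64 template is flattened and dotted with an input window, and the best-scoring template wins. The chip performs this massively in parallel across all templates; here a handful of random templates stand in for the full set, and nothing about the actual silicon is implied.

        # Software analogue of a 3DANN-style inner-product template match.
        import numpy as np

        rng = np.random.default_rng(0)
        templates = rng.normal(size=(16, 64, 64))        # 16 stand-in templates
        weights = templates.reshape(16, -1)              # flatten to 16 x 4096

        # Input window: a noisy copy of template 11.
        window = templates[11] + 0.1 * rng.normal(size=(64, 64))
        scores = weights @ window.ravel()                # 16 inner products
        print("best match:", int(np.argmax(scores)))     # expected: 11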
  • 96
    Publication Date: 2013-08-31
    Description: Electronic and optoelectronic hardware implementations of highly parallel computing architectures address several ill-defined and/or computation-intensive problems not easily solved by conventional computing techniques. The concurrent processing architectures developed are derived from a variety of advanced computing paradigms, including neural network models, fuzzy logic, and cellular automata. Hardware implementation technologies range from state-of-the-art digital/analog custom VLSI to advanced optoelectronic devices such as computer-generated holograms and e-beam-fabricated Dammann gratings. JPL's concurrent processing devices group has developed a broad technology base in hardware-implementable parallel algorithms, low-power and high-speed VLSI designs, and building-block VLSI chips, leading to application-specific high-performance embeddable processors. Application areas include high-throughput map-data classification using feedforward neural networks, a terrain-based tactical movement planner using cellular automata, resource optimization (weapon-target assignment) using a multidimensional feedback network with lateral inhibition, and classification of rocks using an inner-product scheme on thematic mapper data. In addition to addressing specific functional needs of DOD and NASA, the JPL-developed concurrent processing device technology is being customized for a variety of commercial applications (in collaboration with industrial partners) and is being transferred to U.S. industry. This viewgraph presentation focuses on two application-specific processors that solve the computation-intensive tasks of resource allocation (weapon-target assignment) and terrain-based tactical movement planning using two very different topologies. Resource allocation is implemented as an asynchronous analog competitive assignment architecture inspired by the Hopfield network; hardware realization yields a two- to four-order-of-magnitude speed-up over conventional techniques and enables multiple (many-to-many) assignments not achievable with standard statistical approaches. Tactical movement planning (finding the best path from A to B) is accomplished with a digital two-dimensional concurrent processor array; by exploiting the natural parallel decomposition of the problem in silicon, a four-order-of-magnitude speed-up over optimized software approaches has been demonstrated. (A sketch of wavefront-style path planning follows this record.)
    Keywords: CYBERNETICS
    Type: A Decade of Neural Networks: Practical Applications and Prospects; p 39-51
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
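    Illustrative sketch: the tactical movement planner in the record runs on a 2-D concurrent processor array; the Python example below shows the underlying wavefront ("grassfire") expansion idea serially. On a cellular array every cell would update simultaneously; here the parallelism is only simulated, and the terrain grid is a made-up example.

        # Serial sketch of wavefront expansion: cost values spread outward from
        # the goal cell, and the best path from A to B follows decreasing cost.
        # Illustrative only; not the JPL processor's actual logic.
        from collections import deque

        def wavefront(grid, goal):
            """grid: 2-D list, 0 = free, 1 = obstacle. Returns cost-to-goal map."""
            rows, cols = len(grid), len(grid[0])
            cost = [[None] * cols for _ in range(rows)]
            cost[goal[0]][goal[1]] = 0
            queue = deque([goal])
            while queue:
                r, c = queue.popleft()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and grid[nr][nc] == 0 and cost[nr][nc] is None):
                        cost[nr][nc] = cost[r][c] + 1     # wave spreads one step
                        queue.append((nr, nc))
            return cost

        def best_path(cost, start):
            """Greedy descent on the cost map from a reachable start to the goal."""
            rows, cols = len(cost), len(cost[0])
            path = [start]
            while cost[path[-1][0]][path[-1][1]] != 0:
                r, c = path[-1]
                neighbors = [(r + dr, c + dc)
                             for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                             if 0 <= r + dr < rows and 0 <= c + dc < cols
                             and cost[r + dr][c + dc] is not None]
                path.append(min(neighbors, key=lambda p: cost[p[0]][p[1]]))
            return path

        terrain = [[0, 0, 0, 0],
                   [1, 1, 0, 1],
                   [0, 0, 0, 0],
                   [0, 1, 1, 0]]
        cost_map = wavefront(terrain, goal=(3, 3))
        print(best_path(cost_map, start=(0, 0)))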
  • 97
    Publication Date: 2013-08-31
    Description: We have previously reported on the use of neural networks for the detection and identification of faults in complex, microprocessor-controlled powertrain systems. The data analyzed in those studies consisted of the full spectrum of signals passing between the engine and the real-time microprocessor controller. The specific task of the classification system was to classify system operation as nominal or abnormal and to identify any fault present. The primary concern in the earlier work was the identification of faults in sensors or actuators in the powertrain system as it was exercised over its full operating range. The use of data from a variety of sources, each contributing some potentially useful information to the classification task, is commonly referred to as sensor fusion and typifies the class of problems successfully addressed using neural networks. In this work we explore the application of neural networks to a different diagnostic problem, the diagnosis of faults in newly manufactured engines, and the utility of neural networks for process control. (A sketch of sensor-fusion fault classification follows this record.)
    Keywords: CYBERNETICS
    Type: JPL, A Decade of Neural Networks: Practical Applications and Prospects; p 15-22
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
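    Illustrative sketch: the record's system uses a neural network on real powertrain signals; the example below substitutes a generic nearest-centroid classifier on synthetic "fused" feature vectors purely to show the sensor-fusion classification idea (nominal vs. named faults). The feature names, values, and fault classes are invented for the illustration.

        # Generic sensor-fusion fault classifier sketch on synthetic data.
        import numpy as np

        rng = np.random.default_rng(2)

        # Synthetic fused feature vectors: [rpm_var, o2_mean, map_mean, egr_duty]
        nominal = rng.normal([1.0, 0.45, 0.9, 0.3], 0.05, size=(200, 4))
        o2_fault = rng.normal([1.0, 0.10, 0.9, 0.3], 0.05, size=(200, 4))
        egr_fault = rng.normal([1.3, 0.45, 1.1, 0.0], 0.05, size=(200, 4))

        classes = {"nominal": nominal, "O2 sensor fault": o2_fault,
                   "EGR actuator fault": egr_fault}
        centroids = {name: data.mean(axis=0) for name, data in classes.items()}

        def classify(x):
            """Assign a feature vector to the nearest class centroid."""
            return min(centroids, key=lambda name: np.linalg.norm(x - centroids[name]))

        print(classify(np.array([1.0, 0.12, 0.9, 0.3])))   # -> "O2 sensor fault"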
  • 98
    Publication Date: 2013-08-31
    Description: The Intelligent Satellite Control Software (ISACS) for the geomagnetic tail observation satellite GEOTAIL (launched in July 1992) has been successfully developed. By applying artificial intelligence (AI) technology, including an expert system, ISACS autonomously generates a tracking schedule that previously had to be prepared manually. Using ISACS, a satellite operator can autonomously generate a stored command stream covering a period of up to four days and can easily confirm its safety. The system has an additional function: diagnosing satellite problems and suggesting the necessary remedies. The workload of satellite operators has been drastically reduced since ISACS was introduced into GEOTAIL operations.
    Keywords: CYBERNETICS
    Type: JPL, Third International Symposium on Artificial Intelligence, Robotics, and Automation for Space 1994; p 397-400
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 99
    Publication Date: 2013-08-31
    Description: Time delay and limited communication capacity are the primary constraints in super-long-distance telerobotic systems such as those used for astronautical robotic tasks. Intelligent telerobotics is expected to overcome these constraints. We aim to realize such a super-long-distance telerobotic system using an object-handling knowledge base and intelligent monitoring, and we discuss the physical and technical factors involved.
    Keywords: CYBERNETICS
    Type: JPL, Third International Symposium on Artificial Intelligence, Robotics, and Automation for Space 1994; p 285-288
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
  • 100
    Publication Date: 2013-08-31
    Description: This paper reports experiments on the handling of flexible parts (e.g., wires) with a teleoperated system subject to time delay. The task is principally a peg-in-hole task involving wrapping a wire around two posts on a task board. Because the behavior of the flexible parts is difficult to predict, on-line teleoperation is indispensable for this class of task. We first propose a teleoperation system based on a predictive image display, then describe an experimental teleoperation testbed with a four-second transmission time delay. Finally, we report on wire-handling operations performed to evaluate the system. These experiments will contribute to future advanced experiments for the MITI ETS-7 mission. (A sketch of the predictive-display idea follows this record.)
    Keywords: CYBERNETICS
    Type: JPL, Third International Symposium on Artificial Intelligence, Robotics, and Automation for Space 1994; p 289-292
    Format: application/pdf
    Location Call Number Expected Availability
    BibTip Others were also interested in ...
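    Illustrative sketch: the example below shows only the predictive-display principle behind the paper's approach: the operator station integrates commands through a local model immediately, while the real remote state is observed only after the transmission delay. A hypothetical 1-D position model with a 4-second delay is used; it is not the paper's testbed or imagery pipeline.

        # Predictive-display sketch for time-delayed teleoperation.
        from collections import deque

        DT = 0.5                       # control cycle (s)
        DELAY_STEPS = int(4.0 / DT)    # 4-second transmission delay

        predicted = 0.0                # state shown immediately to the operator
        remote = 0.0                   # true state at the remote site
        pipe = deque([0.0] * DELAY_STEPS)   # commands in transit to the remote site

        commands = [1.0] * 6 + [0.0] * 10   # operator: move, then stop

        for t, cmd in enumerate(commands):
            predicted += cmd * DT           # local model updated with no delay
            pipe.append(cmd)
            delayed_cmd = pipe.popleft()    # command reaching the remote site now
            remote += delayed_cmd * DT      # remote state lags the prediction
            print(f"t={t * DT:4.1f}s  predicted={predicted:4.1f}  remote={remote:4.1f}")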