ALBERT

All Library Books, journals and Electronic Records Telegrafenberg

  • 1
    Publication Date: 2013-08-31
    Description: NASA Marshall Space Flight Center (MSFC) has developed and tested an engineering model of an automated rendezvous and docking sensor system composed of a video camera ringed with laser diodes at two wavelengths and a standard remote manipulator system target that has been modified with retro-reflective tape and 830 and 780 nm optical filters. TRW has provided additional engineering analysis, design, and manufacturing support, resulting in a robust, low-cost, automated rendezvous and docking sensor design. We have addressed the issue of space qualification using off-the-shelf hardware components. We have also addressed the performance problems of increased signal-to-noise ratio, increased range, increased frame rate, graceful degradation through component redundancy, and improved range calibration. Next year, we will build a breadboard of this sensor. The phenomenology of the background scene of a target vehicle as viewed against earth and space backgrounds under various lighting conditions will be simulated using the TRW Dynamic Scene Generator Facility (DSGF). Solar illumination angles of the target vehicle and candidate docking target ranging from eclipse to full sun will be explored. The sensor will be transportable for testing at the MSFC Flight Robotics Laboratory (EB24) using the Dynamic Overhead Telerobotic Simulator (DOTS).
    Keywords: SPACECRAFT DESIGN, TESTING AND PERFORMANCE
    Type: NASA, Washington, NASA Automated Rendezvous and Capture Review. A Compilation of the Abstracts; 3 p
    Format: application/pdf
  • 2
    Publication Date: 2013-08-31
    Description: A video-based sensor has been developed specifically for the close-range maneuvering required in the last phase of autonomous rendezvous and capture. The system is a combination of target and sensor, with the target being a modified version of the standard target used by the astronauts with the Remote Manipulator System (RMS). The system, as currently configured, works well for autonomous docking maneuvers from approximately forty feet in to soft-docking and capture. The sensor was developed specifically to track and calculate its position and attitude relative to a target consisting of three retro-reflective spots, equally spaced, with the center spot being on a pole. This target configuration was chosen for its sensitivity to small amounts of relative pitch and yaw and because it could be used with a small modification to the standard RMS target already in use by NASA.
    Keywords: SPACECRAFT INSTRUMENTATION
    Type: NASA, Washington, NASA Automated Rendezvous and Capture Review. A Compilation of the Abstracts; 2 p
    Format: application/pdf
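The three-spot target described above lends itself to a simple small-angle estimate: because the center spot is raised on a pole, any relative pitch or yaw shifts it off the midpoint of the two outer spots. A minimal Python sketch of that geometry (coordinates are assumed already scaled to target-plane units; the function name and interface are illustrative, not the flight algorithm):

```python
import math

def pitch_yaw_from_spots(left, center, right, pole_len):
    """Estimate relative pitch and yaw (radians) from the positions of the
    three retro-reflective spots. The outer spots are coplanar; the center
    spot sits on a pole of length pole_len, so target rotation displaces it
    from the midpoint of the outer pair. Small-angle geometric sketch only."""
    mid_x = (left[0] + right[0]) / 2.0
    mid_y = (left[1] + right[1]) / 2.0
    # lateral offset of the raised center spot from the outer-spot midpoint
    dx = center[0] - mid_x
    dy = center[1] - mid_y
    # clamp before asin to guard against measurement noise beyond +/-1
    yaw = math.asin(max(-1.0, min(1.0, dx / pole_len)))
    pitch = math.asin(max(-1.0, min(1.0, dy / pole_len)))
    return pitch, yaw
```

This offset-of-the-raised-spot geometry is why the abstract notes the configuration's sensitivity to small amounts of relative pitch and yaw.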
  • 3
    Publication Date: 2013-08-31
    Description: The Automated Rendezvous and Docking system (ARAD) is composed of two parts. The first part is the sensor, which consists of a video camera ringed with two wavelengths of laser diode. The second part is a standard Remote Manipulator System (RMS) target used on the Orbiter that has been modified with three circular pieces of retro-reflective tape covered by optical filters which correspond to one of the wavelengths of laser diode. The sensor is on the chase vehicle and the target is on the target vehicle. The ARAD system works by pulsing the laser diodes of one wavelength and taking a picture; the laser diodes of the second wavelength are then pulsed and a second picture is taken. One picture is subtracted from the other and the resultant picture is thresholded. All adjacent pixels above threshold are grouped into blobs, and the X and Y centroid of each blob is calculated. The blob centroids are then checked to distinguish the target from noise. The three target spots are then windowed and tracked. The three target spot centroids are used to evaluate the roll, yaw, pitch, range, azimuth, and elevation. From these, a guidance routine can guide the chase vehicle to dock with the target vehicle with the correct orientation.
    Keywords: SPACECRAFT INSTRUMENTATION
    Type: NASA. Lyndon B. Johnson Space Center, Fifth Annual Workshop on Space Operations Applications and Research (SOAR 1991), Volume 1; p 214-219
    Format: application/pdf
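The pulse-subtract-threshold-blob pipeline described in this abstract can be sketched in a few lines of Python. This is a minimal illustration, not the flight implementation: the blob grouping here is a simple 4-connected flood fill, and all names and thresholds are made up for the example.

```python
import numpy as np

def find_target_spots(img_on, img_off, threshold):
    """Subtract the two laser-illuminated frames, threshold the result,
    and return the (x, y) centroids of connected above-threshold regions
    ("blobs"). A plain flood fill stands in for the sensor's blob logic."""
    diff = img_on.astype(int) - img_off.astype(int)
    mask = diff > threshold
    visited = np.zeros_like(mask)
    centroids = []
    for y, x in zip(*np.nonzero(mask)):
        if visited[y, x]:
            continue
        # flood fill: collect one blob of adjacent above-threshold pixels
        stack, pixels = [(y, x)], []
        while stack:
            cy, cx = stack.pop()
            if (0 <= cy < mask.shape[0] and 0 <= cx < mask.shape[1]
                    and mask[cy, cx] and not visited[cy, cx]):
                visited[cy, cx] = True
                pixels.append((cy, cx))
                stack += [(cy + 1, cx), (cy - 1, cx),
                          (cy, cx + 1), (cy, cx - 1)]
        ys, xs = zip(*pixels)
        centroids.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return centroids
```

Because only the retro-reflective spots respond strongly at the filtered wavelength, the subtracted image suppresses the background scene and the surviving blobs are the candidate target spots to be checked against the known target geometry.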
  • 4
    Publication Date: 2019-06-28
    Description: A Global Positioning System Synchronized Active Light Autonomous Docking System (GPSSALADS) for automatically docking a chase vehicle with a target vehicle comprising at least one active light emitting target which is operatively attached to the target vehicle. The target includes a three-dimensional array of concomitantly flashing lights which flash at a controlled common frequency. The GPSSALADS further comprises a visual tracking sensor operatively attached to the chase vehicle for detecting and tracking the target vehicle. Its performance is synchronized with the flash frequency of the lights by a synchronization means which is comprised of first and second internal clocks operatively connected to the active light target and visual tracking sensor, respectively, for providing timing control signals thereto, respectively. The synchronization means further includes first and second Global Positioning System receivers operatively connected to the first and second internal clocks, respectively, for repeatedly providing simultaneous synchronization pulses to the internal clocks, respectively. In addition, the GPSSALADS includes a docking process controller means which is operatively attached to the chase vehicle and is responsive to the visual tracking sensor for producing commands for the guidance and propulsion system of the chase vehicle.
    Keywords: Aircraft Communications and Navigation
    Type: US-Patent-Class-364-459 , US-Patent-Class-364-424.02 , Int-Patent-Class-B64G-1/64
    Format: application/pdf
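The synchronization idea in this patent abstract — two GPS-disciplined clocks letting the tracker predict exactly when the target lights will flash — can be sketched as a small timing helper. The flash rate and phase here are illustrative parameters, not values from the patent:

```python
def next_flash_epoch(gps_time_s, flash_hz, phase_s=0.0):
    """Given a reading from a GPS-disciplined clock, return the time of
    the next target flash. Because both vehicles discipline their clocks
    with simultaneous GPS synchronization pulses, the tracking sensor can
    open its exposure window only around these epochs and reject light
    that is not flashing in step with the target."""
    period = 1.0 / flash_hz
    elapsed = (gps_time_s - phase_s) % period
    return gps_time_s + (period - elapsed) % period
```

Gating the sensor exposure to these predicted epochs is what allows the flashing target to be picked out of an unsynchronized background, and (as the companion Tech Brief below notes) to do so at reduced optical power.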
  • 5
    Publication Date: 2019-06-28
    Description: Proposed optoelectronic system for guiding vehicle in approaching and docking with another vehicle includes active optical targets (flashing lights) on approached vehicle synchronized with sensor and image-processing circuitry on approaching vehicle. Conceived for use in automated approach and docking of two spacecraft. Also applicable on Earth to manually controlled and automated approach and docking of land vehicles, aircraft, boats, and submersible vehicles, using GPS or terrestrial broadcast time signals for synchronization. Principal advantage: optical power reduced, with consequent enhancement of safety.
    Keywords: ELECTRONIC SYSTEMS
    Type: MFS-28853 , Laser Tech. Brief.; 2; 1; P. 96
    Format: text
  • 6
    Publication Date: 2019-07-27
    Description: NASA's Marshall Space Flight Center flew on the STS-87 mission an active sensor system, the Video Guidance Sensor (VGS), to demonstrate its functioning in space and to collect performance data. The VGS was designed to provide near-range sensor data as part of an automatic rendezvous and docking system. The sensor determines the relative positions and attitudes between the active sensor and the passive target. The VGS uses laser diodes to illuminate retro-reflectors in the target, a solid-state camera to detect the return from the target, and a frame grabber and digital signal processor to convert the video information into the relative positions and attitudes. The system is designed to operate with the target within a relative azimuth of +/- 9.5 degrees and a relative elevation of +/- 7.5 degrees. The system will acquire and track the target within that field-of-view anywhere from 1.5 meters to 110 meters range, and is designed to acquire at relative attitudes of +/- 10 degrees in pitch and yaw and at any roll angle. The data is output from the sensor at 5 Hz, and the target and sensor software have been designed to permit two independent sensors to operate simultaneously (in order to allow for redundancy). The data from the flight experiment includes raw video data from the VGS camera, relative position and attitude measurements from the VGS to the target, independent hand-held laser ranges from the Shuttle Aft Flight Deck to the target, and Remote Manipulator System position data to correlate with the VGS data. The experiment was quite successful and returned much useful information. The experience gained from the design and flight of this experiment will lead to improved video sensors in the future.
    Keywords: Instrumentation and Photography
    Type: AeroSense; 13-17. 1998; Orlando, FL; United States
    Format: text
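The operating limits quoted in this abstract amount to an acquisition envelope that a relative state either satisfies or not. A minimal sketch of that check, using only the numbers stated above (the function itself is illustrative, not part of the VGS software):

```python
def vgs_can_acquire(range_m, az_deg, el_deg, pitch_deg, yaw_deg):
    """Check a relative state against the VGS acquisition envelope quoted
    in the abstract: range 1.5-110 m, azimuth within +/-9.5 deg, elevation
    within +/-7.5 deg, pitch and yaw within +/-10 deg (any roll angle)."""
    return (1.5 <= range_m <= 110.0
            and abs(az_deg) <= 9.5
            and abs(el_deg) <= 7.5
            and abs(pitch_deg) <= 10.0
            and abs(yaw_deg) <= 10.0)
```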
  • 7
    Publication Date: 2019-07-17
    Description: NASA's Marshall Space Flight Center (MSFC) has, for the last ten years, developed various video-based sensors for use in automated docking systems. The latest generation of sensor will operate at rates of up to 100 Hz, determining the relative position (X, Y, and Z) and attitude (Roll, Pitch, and Yaw) between the sensor and a small 3-dimensional target, making it suitable for applications in robotic sensing. The Advanced Video Guidance Sensor (AVGS) is designed to track multiple targets at different ranges and determine the position and attitude of each one. The previous generation of video sensor, the Video Guidance Sensor (VGS), was flown twice on the Space Shuttle to test its performance on orbit. One of the tests performed was determining the relative positions and attitudes between the VGS and its target, which was moved to various positions using the Remote Manipulator System (RMS). The RMS position data and VGS measured data were analyzed after the flights, with good correlation between the position and attitude data of the two data sets. The RMS test gives a good indication of the suitability of the AVGS as a sensor for end-effector position and attitude determination.
    Keywords: Cybernetics, Artificial Intelligence and Robotics
    Type: Artificial Intelligence, Robotics, and Automation in Space; Jun 18, 2001 - Jun 22, 2001; Montreal; Canada
    Format: text
  • 8
    Publication Date: 2019-07-17
    Description: Engineers at the Marshall Space Flight Center (MSFC) have been developing and testing video-based sensors for automated spacecraft guidance for several years. The next generation of Video Guidance Sensor (VGS) is being designed to be faster and more capable than ever. It will have applications to relative position measurement in any field of endeavor. The system works by sequentially firing two different wavelengths of laser diodes at the target (which has retroreflectors) and processing the two images. Since the target only reflects one wavelength, it shows up well in one image and not at all in the other. Because the target's dimensions are known, the relative positions and attitudes of the target and the sensor can be computed from the spots reflected from the target. The current sensor operates at 5 Hz at ranges from 1 to 110 meters with a 20 deg. field-of-view. The Video Guidance Sensor (VGS) developed over the past several years has performed well in testing and met the objective of being used as the terminal guidance sensor for an automated rendezvous and capture system. The first VGS was successfully tested in closed-loop 3-degree-of-freedom (3-DOF) tests in 1989 and then in 6-DOF open-loop tests in 1992 and closed-loop tests in 1993-4. Development and testing continued, and in 1995 approval was given to test the VGS in an experiment on the Space Shuttle. The VGS flew in 1997 and in 1998, performing well during both flight experiments. During the development and testing before, during, and after the flight experiments, numerous areas for improvement were found. The next generation of VGS is being designed to operate at up to 100 Hz tracking rates and at ranges from 0.5 to 200 meters. In addition to its use as a spacecraft guidance sensor, it could be used as an alignment aid for an operator of a remote system (giving position and attitude feedback data), as a feedback system for a robotic arm, or for automated vehicle guidance. 
The next generation VGS, with its higher tracking rates, smaller size, and lower power, could be used in more places than the original VGS, and by using LEDs instead of laser diodes, the system would be eye-safe at any range. More potential applications include aerial station keeping (keeping two or more autonomous aircraft within particular relative positions), underwater robotics, and the guidance of ground vehicles in predefined areas equipped with sets of targets.
    Keywords: Aircraft Communications and Navigation
    Type: Advances in Navigation Guidance and Control Technology Workshop; Nov 01, 2000 - Nov 02, 2000; Redstone Arsenal, AL; United States
    Format: text
  • 9
    Publication Date: 2019-07-12
    Description: Embedded software has been developed specifically for controlling an Advanced Video Guidance Sensor (AVGS). A Video Guidance Sensor is an optoelectronic system that provides guidance for automated docking of two vehicles. Such a system includes pulsed laser diodes and a video camera, the output of which is digitized. From the positions of digitized target images and known geometric relationships, the relative position and orientation of the vehicles are computed. The present software consists of two subprograms running in two processors that are parts of the AVGS. The subprogram in the first processor receives commands from an external source, checks the commands for correctness, performs commanded non-image-data-processing control functions, and sends image data processing parts of commands to the second processor. The subprogram in the second processor processes image data as commanded. Upon power-up, the software performs basic tests of functionality, then effects a transition to a standby mode. When a command is received, the software goes into one of several operational modes (e.g. acquisition or tracking). The software then returns, to the external source, the data appropriate to the command.
    Keywords: Space Communications, Spacecraft Communications, Command and Tracking
    Type: MFS-31865-1 , NASA Tech Briefs, October 2006; 6
    Format: application/pdf
  • 10
    Publication Date: 2019-07-18
    Description: This paper describes the current developments in video-based sensors at the Marshall Space Flight Center. The Advanced Video Guidance Sensor is the latest in a line of video-based sensors designed for use in automated docking systems. The X-33, X-34, X-38, and X-40 are all designed to be unpiloted vehicles; such vehicles will require a sensor system that will provide adequate data for the vehicle to accomplish its mission. One of the primary tasks planned for reusable launch vehicles is to resupply the space station. In order to approach the space station in a self-guided manner, the vehicle must have a reliable and accurate sensor system to provide relative position and attitude information between the vehicle and the space station. The Advanced Video Guidance Sensor is being designed and built to meet this requirement, as well as requirements for other vehicles docking to a variety of target spacecraft. The Advanced Video Guidance Sensor is being designed to allow range and bearing information to be measured at ranges up to 2 km. The sensor will measure 6-degree-of-freedom information (relative positions and attitudes) from approximately 40 meters all the way in to final contact (approximately 1 meter range). The sensor will have a data output rate of 20 Hz during tracking mode, and will be able to acquire a target within one half of a second. The prototype of the sensor will be near completion at the time of the conference.
    Keywords: Spacecraft Design, Testing and Performance
    Type: 20th Digital Avionics Systems Conference; Oct 14, 2001 - Oct 18, 2001; Daytona Beach, FL; United States
    Format: text