NTRS - NASA Technical Reports Server

Advanced Multimodal Solutions for Information Presentation

High-workload, fast-paced, and degraded sensory environments are the likeliest candidates to benefit from multimodal information presentation. For example, during EVA (Extra-Vehicular Activity) and telerobotic operations, the sensory restrictions associated with a space environment provide a major challenge to maintaining the situation awareness (SA) required for safe operations. Multimodal displays hold promise to enhance situation awareness and task performance by utilizing different sensory modalities and maximizing their effectiveness based on appropriate interaction between modalities. During EVA, the visual and auditory channels are likely to be the most utilized, with tasks such as monitoring the visual environment, attending to visual and auditory displays, and maintaining multichannel auditory communications. Previous studies have shown that, compared to unimodal displays (spatial auditory or 2D visual), bimodal presentation of information can improve operator performance during simulated extravehicular activity on planetary surfaces for tasks as diverse as orientation, localization, and docking, particularly when the visual environment is degraded or workload is increased. Tactile displays offer a third sensory channel that may both offload information-processing effort and provide a means to capture attention when urgently required. For example, recent studies suggest that including tactile cues may result in increased orientation and alerting accuracy, improved task response time, and decreased workload, as well as provide self-orientation cues in microgravity on the ISS (International Space Station).

An important overall issue is that context-dependent factors such as task complexity, sensory degradation, peripersonal vs. extrapersonal space operations, workload, experience level, and operator fatigue tend to vary greatly in complex real-world environments, and it will be difficult to design a multimodal interface that performs well under all conditions. As a possible solution, adaptive systems have been proposed in which the information presented to the user changes as a function of task/context-dependent factors. However, this presupposes that adequate methods for detecting and/or predicting such factors are developed. Further, research on adaptive systems for aviation suggests that they can sometimes serve to increase workload and reduce situational awareness. It will be critical to develop multimodal display guidelines that include consideration of smart systems that can select the best display method for a particular context/situation.

The scope of the current work is an analysis of potential multimodal display technologies for long-duration missions and, in particular, will focus on their potential role in EVA activities. The review will address multimodal (combined visual, auditory, and/or tactile) displays investigated by NASA, industry, and the DoD (Dept. of Defense). It also considers the need for adaptive information systems to accommodate a variety of operational contexts such as crew status (e.g., fatigue, workload level) and task environment (e.g., EVA, habitat, rover, spacecraft). Current approaches to guidelines and best practices for combining modalities for the most effective information displays are also reviewed. Potential issues in developing interface guidelines for the Exploration Information System (EIS) are briefly considered.
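To make the adaptive-system idea concrete, the following is a minimal illustrative sketch (not taken from the presentation) of a rule-based selector that chooses display modalities from context factors. The function name, inputs, and thresholds are all hypothetical; a real system would need validated methods for detecting or predicting these factors, as the abstract notes.

```python
# Hypothetical sketch of context-driven modality selection.
# Inputs are assumed to be normalized estimates in [0, 1];
# thresholds are arbitrary placeholders, not validated values.

def select_modalities(visual_degradation: float,
                      workload: float,
                      urgent: bool) -> set:
    """Return the set of display modalities for the current context."""
    modalities = {"visual"}
    # Offload to the auditory channel when the visual channel is
    # degraded or the operator is heavily loaded.
    if visual_degradation > 0.5 or workload > 0.5:
        modalities.add("auditory")
    # Tactile cues serve as a third channel: used to capture attention
    # for urgent events, or when both visual degradation and workload
    # are high at the same time.
    if urgent or (visual_degradation > 0.5 and workload > 0.5):
        modalities.add("tactile")
    return modalities
```

A fixed rule table like this would face exactly the problem the abstract raises: context factors vary greatly in real environments, and poorly tuned switching can itself raise workload and reduce situation awareness.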
Document ID
20180001232
Acquisition Source
Ames Research Center
Document Type
Presentation
Authors
Wenzel, Elizabeth M.
(NASA Ames Research Center, Moffett Field, CA, United States)
Godfroy-Cooper, Martine
(San Jose State Univ., Moffett Field, CA, United States)
Date Acquired
February 15, 2018
Publication Date
January 22, 2018
Subject Category
Man/System Technology And Life Support
Report/Patent Number
ARC-E-DAA-TN51912
Meeting Information
Meeting: NASA Human Research Program Investigators' Workshop (HRP IWS 2018)
Location: Galveston, TX
Country: United States
Start Date: January 22, 2018
End Date: January 25, 2018
Sponsors: NASA Johnson Space Center
Funding Number(s)
WBS: 344494.01.01.10
CONTRACT_GRANT: NNX17AE07A
Distribution Limits
Public
Copyright
Public Use Permitted.
Keywords
multimodal display
human-computer interface