Presentation + Paper
31 May 2022 Factors affecting human understanding of augmented reality visualization of changes detected by an autonomous mobile robot
Abstract
In manned-unmanned teaming scenarios, autonomous unmanned robotic platforms with advanced sensing and compute capabilities will have the ability to perform online change detection. This change detection will consist of metric comparisons of sensor-based spatial information against information collected previously, for the purpose of identifying changes in the environment that could indicate anything from adversarial activity to natural phenomena that could affect the mission. The prior information may be drawn from a variety of sources, such as satellite imagery, IoT devices, other manned-unmanned teams, or the same robotic platform on a prior mission. While these robotic platforms will be superior to their human operators at detecting changes, human teammates will for the foreseeable future exceed the abilities of autonomy at interpreting those changes, particularly with respect to mission relevance and situational context. For this reason, the ability of a robot to intelligently convey such information so as to maximize human understanding is essential. In this work, we build upon previous work that presented a mixed reality interface for conveying change detection information from an autonomous robot to a human. We discuss factors affecting human understanding of augmented reality visualization of detected changes, based upon multiple user studies in which a user interacts with this system. We believe our findings will inform the creation of AR-based communication strategies for manned-unmanned teams performing multi-domain operations.
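The abstract describes change detection as a metric comparison of current sensor-based spatial information against previously collected data. The paper's actual method is not specified here; the following is a minimal illustrative sketch, assuming the comparison is performed as a diff of voxel occupancy grids built from two point clouds (the voxel size and function names are hypothetical).

```python
import numpy as np

def voxelize(points, voxel_size=0.5):
    """Map an (N, 3) array of 3D points to a set of occupied voxel indices."""
    return set(map(tuple, np.floor(points / voxel_size).astype(int)))

def detect_changes(prior_points, current_points, voxel_size=0.5):
    """Return voxels that appeared or disappeared between two scans."""
    prior = voxelize(prior_points, voxel_size)
    current = voxelize(current_points, voxel_size)
    return {
        "added": current - prior,    # occupied now, empty before
        "removed": prior - current,  # occupied before, empty now
    }

# Example: one new obstacle appears between a prior mission and the current one.
prior = np.array([[0.1, 0.1, 0.0], [2.2, 0.1, 0.0]])
current = np.array([[0.1, 0.1, 0.0], [2.2, 0.1, 0.0], [5.1, 5.2, 0.0]])
changes = detect_changes(prior, current)
print(changes["added"])  # {(10, 10, 0)}
```

The `added` and `removed` voxel sets would then be candidates for AR visualization to the human teammate, who supplies the interpretation of their mission relevance.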
Conference Presentation
© (2022) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Christopher Reardon, Jason Gregory, Kerstin Haring, and John G. Rogers III "Factors affecting human understanding of augmented reality visualization of changes detected by an autonomous mobile robot", Proc. SPIE 12125, Virtual, Augmented, and Mixed Reality (XR) Technology for Multi-Domain Operations III, 1212503 (31 May 2022); https://doi.org/10.1117/12.2619038
KEYWORDS
Visualization
Augmented reality
Environmental sensing
Sensors
Mobile robots