20 August 1993 Active sensor fusion for mobile robot exploration and navigation
Robert Mandelbaum, Max Mintz
Proceedings Volume 2059, Sensor Fusion VI; (1993) https://doi.org/10.1117/12.150231
Event: Optical Tools for Manufacturing and Advanced Automation, 1993, Boston, MA, United States
Abstract
In this paper we present a paradigm for active sensor fusion and feature integration for the purposes of exploring a static environment, and subsequently navigating therein. We describe the feature grid representation of the environment, which is extensible to a wide range of sensor modalities, allows efficient access to information, and supports inter-agent cooperation. In particular, we have developed a testbed with which we investigate the fusion of data from acoustic range sensors with data from a structured-light sensor capable of delineating object extent. The acoustic sensors are employed for primary detection and localization of an object, as well as for extraction of geometric features such as planar surfaces and corners. Once an object has been identified, an active sensing strategy is invoked, causing the mobile robot to follow a trajectory which brings the object into view of the structured-light sensor. In this way, relatively accurate information can be accumulated regarding object position, extent and orientation.
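The abstract does not give implementation details of the feature grid, but the idea of a spatial grid whose cells accumulate labelled features from multiple sensor modalities can be sketched as follows. This is a minimal illustration under assumed details: the class and method names (`FeatureGrid`, `update`, `query`), the feature labels, the confidence scores, and the max-confidence fusion rule are all hypothetical, not taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Cell:
    # Per-cell feature evidence, e.g. {"plane": 0.9, "corner": 0.4}.
    # Labels and confidence values in [0, 1] are hypothetical choices.
    features: dict = field(default_factory=dict)

class FeatureGrid:
    """Sketch of a feature grid: a 2-D array of cells, each holding
    labelled features (e.g. planar surfaces, corners, object extent)
    contributed by different sensors (sonar, structured light)."""

    def __init__(self, width, height, resolution=0.1):
        self.resolution = resolution  # metres per cell (assumed)
        self.cells = [[Cell() for _ in range(width)] for _ in range(height)]

    def _index(self, x, y):
        # Map a world position (metres) to grid indices.
        return int(y / self.resolution), int(x / self.resolution)

    def update(self, x, y, feature, confidence):
        """Fuse a new observation into the cell at (x, y). Keeping the
        maximum confidence seen so far is one simple combination rule;
        the paper does not specify its actual fusion scheme."""
        r, c = self._index(x, y)
        cell = self.cells[r][c]
        cell.features[feature] = max(cell.features.get(feature, 0.0), confidence)

    def query(self, x, y):
        """Return a copy of the feature evidence at world position (x, y)."""
        r, c = self._index(x, y)
        return dict(self.cells[r][c].features)
```

For example, a coarse sonar detection (`update(1.0, 2.0, "plane", 0.6)`) followed by a structured-light confirmation (`update(1.0, 2.0, "plane", 0.9)`) leaves the cell holding the higher-confidence estimate, consistent with the paper's strategy of refining sonar hypotheses with the more accurate structured-light sensor.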
© (1993) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Robert Mandelbaum and Max Mintz "Active sensor fusion for mobile robot exploration and navigation", Proc. SPIE 2059, Sensor Fusion VI, (20 August 1993); https://doi.org/10.1117/12.150231
CITATIONS
Cited by 3 scholarly publications.
KEYWORDS: Sensors, Sensor fusion, Feature extraction, Raster graphics, Environmental sensing, Reflectivity, Active sensors