Neural sensor fusion for spatial visualization on a mobile robot
9 October 1998
Siegfried Martens, Gail A. Carpenter, Paolo Gaudiano
Proceedings Volume 3523, Sensor Fusion and Decentralized Control in Robotic Systems (1998); https://doi.org/10.1117/12.326991
Event: Photonics East (ISAM, VVDC, IEMB), 1998, Boston, MA, United States
Abstract
An ARTMAP neural network is used to integrate visual and ultrasonic sensory information on a B14 mobile robot. Training samples for the network are acquired without human intervention: as the robot travels in a straight line, sensory snapshots are retrospectively associated with the distance to the wall reported by on-board odometry. The goal is to produce a more accurate measure of distance than the raw sensors provide. The network effectively combines sensory sources both within and between modalities, and the improved distance percept is used to build occupancy-grid visualizations of the robot's environment. The resulting maps expose specific problems in processing raw sensory information and demonstrate the benefits of a neural network system for sensor fusion.
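The occupancy-grid step the abstract mentions can be illustrated with a minimal log-odds update sketch. This is not the paper's implementation: the cell size, the log-odds increments, and the idea that the fused distance percept arrives as a single range value per ray are all assumptions made for illustration.

```python
import math

def update_grid(grid, pose, bearing, fused_dist, cell=0.1,
                l_occ=0.85, l_free=-0.4):
    """Update a log-odds occupancy grid along one sensor ray.

    grid: dict mapping (ix, iy) -> accumulated log-odds of occupancy
    pose: (x, y) robot position in metres
    bearing: ray direction in radians
    fused_dist: distance-to-obstacle estimate in metres (e.g. the
                improved percept from a fused ultrasonic/visual reading)
    """
    x, y = pose
    steps = int(fused_dist / cell)
    for i in range(steps + 1):
        d = i * cell
        ix = round((x + d * math.cos(bearing)) / cell)
        iy = round((y + d * math.sin(bearing)) / cell)
        # cells traversed before the measured range are evidence of
        # free space; the cell at the measured range is evidence of
        # occupancy (the wall)
        delta = l_occ if i == steps else l_free
        grid[(ix, iy)] = grid.get((ix, iy), 0.0) + delta
    return grid

grid = {}
update_grid(grid, pose=(0.0, 0.0), bearing=0.0, fused_dist=1.0)
```

Repeating this update for each ray as the robot moves accumulates evidence per cell; thresholding the log-odds then yields the kind of environment map the paper visualizes, with map quality tracking the accuracy of the distance percept.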
© (1998) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Siegfried Martens, Gail A. Carpenter, and Paolo Gaudiano "Neural sensor fusion for spatial visualization on a mobile robot", Proc. SPIE 3523, Sensor Fusion and Decentralized Control in Robotic Systems, (9 October 1998); https://doi.org/10.1117/12.326991
CITATIONS
Cited by 5 scholarly publications.
KEYWORDS
Sensors
Visualization
Neural networks
Sensor fusion
Distance measurement
Infrared sensors
Ultrasonics