Paper
9 May 2006 Vehicle 3D pose tracking using distributed aperture sensors
Taragay Oskiper, Rakesh Kumar, John Fields, Supun Samarasekera
Abstract
In this paper, we present solutions for tracking the 3D pose (location and orientation) of a robot or vehicle undergoing general motion (6 degrees of freedom: rotation and translation) based on video streams captured by a distributed aperture passive sensor system. A novel algorithm for multi-camera visual odometry is described. Previously published methods for visual odometry have used video streams from one, two, or three cameras in monocular, binocular, or trinocular configurations. In this paper, we present general methods and results for visual odometry with a fixed or known configuration of an arbitrary number of cameras. The images from the different cameras may have no overlap whatsoever. The relative pose and configuration of the cameras comprising the distributed aperture system are assumed to be pre-calibrated and known at any time instant. We demonstrate that the vehicle pose can be tracked accurately and robustly using the distributed aperture system.
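The abstract notes that the camera extrinsics are pre-calibrated, which is what lets per-camera motion estimates be expressed in a common vehicle frame even when the camera views do not overlap. The minimal sketch below (not the authors' algorithm; function names and conventions are assumptions for illustration) shows the standard conjugation step: a frame-to-frame motion M estimated in a camera's coordinate frame maps to the vehicle frame via the fixed vehicle-to-camera extrinsic X as X⁻¹ M X.

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation R
    and a 3-vector translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def camera_to_vehicle_motion(T_cam_motion, T_vehicle_to_cam):
    """Map a camera-frame motion estimate into the vehicle frame.

    With X the fixed vehicle-to-camera extrinsic (known from
    calibration) and M the camera's estimated frame-to-frame motion,
    the corresponding vehicle motion is the conjugation X^{-1} M X.
    """
    X = T_vehicle_to_cam
    return np.linalg.inv(X) @ T_cam_motion @ X

# Example: a camera yawed 90 degrees relative to the vehicle observes a
# unit translation along its own x-axis; after conjugation the motion
# appears along the vehicle's -y axis, with no rotation.
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
X = make_pose(Rz90, np.zeros(3))
M = make_pose(np.eye(3), np.array([1.0, 0.0, 0.0]))
T_vehicle = camera_to_vehicle_motion(M, X)
```

Because each camera's motion lands in the same vehicle frame, estimates from an arbitrary number of non-overlapping cameras can then be fused into a single 6-DoF pose update.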
© (2006) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Taragay Oskiper, Rakesh Kumar, John Fields, and Supun Samarasekera "Vehicle 3D pose tracking using distributed aperture sensors", Proc. SPIE 6230, Unmanned Systems Technology VIII, 62301X (9 May 2006); https://doi.org/10.1117/12.666407
CITATIONS
Cited by 1 scholarly publication.
KEYWORDS
Cameras
Imaging systems
Visualization
Sensors
Global Positioning System
Polishing
Video
