An architecture for sensor fusion in a mobile robot

S Shafer, A Stentz, C Thorpe - Proceedings. 1986 IEEE International Conference on Robotics and Automation, 1986 - ieeexplore.ieee.org
This paper describes sensor fusion in the context of an autonomous mobile robot. The requirements of a complex mission, real-world operation, and real-time control dictate many facets of the system architecture. The hardware architecture must include both general-purpose and special-purpose computers, and multiple sensors of various modalities (vision, range, etc.). The software architecture must allow modular development of a parallel system that supports many perceptual modalities and navigation planning tasks, but at the same time enforces global consistency regarding position and orientation of the vehicle and sensors.

We are building such a system at CMU, called the NAVLAB system, based on a commercial truck with computer controls and studded with cameras and other sensors. This paper describes the software architecture of the NAVLAB, consisting of two parts: a "whiteboard" system called CODGER that is similar to a blackboard but supports parallelism in the knowledge source modules, and an organized collection of perceptual and navigational modules tied together by the CODGER system.

In general, the system philosophy is to provide as much top-down guidance as possible, and to exploit sensor modality differences to produce complementary rather than competing perceptual processes in the system. In this way, the limitations of each sensor modality are compensated for as much as possible by other sensors or by higher level knowledge. The NAVLAB is being produced as part of the DARPA Strategic Computing Initiative, in conjunction with the Autonomous Land Vehicle project.
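The "whiteboard" idea described above can be illustrated with a minimal sketch: knowledge-source modules run in parallel and interact only through a shared datastore, posting result tokens and blocking on queries until matching data appears, rather than calling each other directly. All names here (`Whiteboard`, `post`, `query`, the token fields) are hypothetical; CODGER's actual interface is far richer, with geometric and temporal matching.

```python
import threading
import queue

class Whiteboard:
    """Hypothetical minimal whiteboard: a shared token store with
    blocking queries, so modules synchronize on data, not on calls."""

    def __init__(self):
        self._lock = threading.Lock()
        self._tokens = []    # shared data items ("tokens")
        self._waiters = []   # (predicate, queue) pairs for pending queries

    def post(self, token):
        """A perception module stores a result; wake any matching waiters."""
        with self._lock:
            self._tokens.append(token)
            for pred, q in self._waiters:
                if pred(token):
                    q.put(token)

    def query(self, pred):
        """Block until some token satisfies pred."""
        with self._lock:
            for t in self._tokens:
                if pred(t):
                    return t
            q = queue.Queue()
            self._waiters.append((pred, q))
        return q.get()  # wait outside the lock for a matching post

wb = Whiteboard()
result = []

def vision_module():
    # e.g. a road-edge detection, stamped with a vehicle pose (x, y, heading)
    wb.post({"type": "road_edge", "pose": (10.0, 2.5, 0.1)})

def planner_module():
    token = wb.query(lambda t: t["type"] == "road_edge")
    result.append(token["pose"])

p = threading.Thread(target=planner_module)
v = threading.Thread(target=vision_module)
p.start(); v.start()
p.join(); v.join()
print(result[0])  # the pose posted by the vision module
```

The design choice this illustrates is the one the abstract emphasizes: because modules share one store with globally consistent pose data attached to tokens, sensor modalities can run concurrently and complement one another instead of competing.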