€74.99
incl. VAT
Free shipping*
Ships in 6-10 days
  • Paperback

Product description
Rough terrain robotics is a fast-evolving field of research, and considerable effort is being devoted to enabling a greater level of autonomy for outdoor vehicles. This book demonstrates how the accuracy of 3D position tracking can be improved by treating rover locomotion in rough terrain as a holistic problem. Although the selection of appropriate sensors is crucial to accurately track the rover's position, it is not the only aspect to consider. Indeed, an ill-suited locomotion concept severely degrades the signal-to-noise ratio of the sensors, which leads to poor motion estimates. In this work, a mechanical structure allowing smooth motion across obstacles with limited wheel slip is used. In particular, it enables the use of odometry and inertial sensors to improve the position estimation in rough terrain. A method for computing 3D motion increments based on the wheel encoders and chassis state sensors is developed. Because it accounts for the kinematics of the rover, this method provides better results than the standard approach. To further improve the accuracy of the position tracking and the rover's climbing performance, a controller minimizing wheel slip is developed. The algorithm runs online and can be adapted to any kind of passive wheeled rover. Finally, sensor fusion using 3D-Odometry, inertial sensors and visual motion estimation based on stereo vision is presented. The experimental results demonstrate how each sensor contributes to increasing the accuracy and robustness of the 3D position estimation.
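
To give a flavour of the ideas summarised above, the sketch below illustrates, in simplified form, how body-frame odometry increments might be rotated into the world frame using an IMU attitude and combined with a second, independent motion estimate such as stereo visual odometry. This is not the book's actual algorithm: all function names, variable names and numerical values (e.g. `propagate_position`, `fuse_increments`, the variances) are hypothetical, and the variance-weighted average stands in for the more complete filtering approach the book describes.

    import numpy as np


    def propagate_position(position, imu_rotation, body_frame_increment):
        """Rotate a body-frame odometry increment into the world frame
        and add it to the current 3D position estimate.

        position             -- current (x, y, z) estimate, world frame
        imu_rotation         -- 3x3 rotation matrix, body -> world, from the IMU
        body_frame_increment -- (dx, dy, dz) translation derived from wheel
                                encoders and chassis state sensors, body frame
        """
        return position + imu_rotation @ body_frame_increment


    def fuse_increments(odometry_increment, visual_increment,
                        odometry_var, visual_var):
        """Variance-weighted average of two independent estimates of the same
        body-frame motion increment (wheel odometry vs. stereo visual odometry).
        A full implementation would use a proper sensor-fusion filter; this
        only shows the basic weighting idea."""
        w_odo = visual_var / (odometry_var + visual_var)
        w_vis = odometry_var / (odometry_var + visual_var)
        return w_odo * odometry_increment + w_vis * visual_increment


    if __name__ == "__main__":
        position = np.zeros(3)
        R = np.eye(3)                        # rover level, facing world x-axis
        odo = np.array([0.10, 0.00, 0.02])   # wheel-odometry increment (m)
        vis = np.array([0.09, 0.01, 0.01])   # visual-odometry increment (m)
        # On rough terrain, wheel slip inflates the odometry variance, so the
        # visual estimate is trusted more here (illustrative values only).
        increment = fuse_increments(odo, vis, odometry_var=4e-4, visual_var=1e-4)
        position = propagate_position(position, R, increment)
        print(position)

The point of the weighting is the one the description makes: when the locomotion concept keeps wheel slip low, the odometry variance stays small and the encoder-based increments carry real weight in the fused estimate instead of being drowned out by noise.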