Mobile robots are playing an increasingly important role in our world. Remotely operated vehicles are in everyday use for hazardous tasks such as charting and cleaning up hazardous waste spills, constructing tunnels and high-rise buildings, and inspecting underwater oil drilling platforms. A whole host of further applications, however, awaits robots capable of autonomous operation with little or no intervention by human operators. Such robots of the future will explore distant planets, map the ocean floor, study the flow of pollutants and carbon dioxide through our atmosphere and oceans, work in underground mines, and perform other jobs we cannot even imagine; perhaps even drive our cars and walk our dogs.

The biggest technical obstacles to building mobile robots are vision and navigation: enabling a robot to see the world around it, to plan and follow a safe path through its environment, and to execute its tasks. At the Carnegie Mellon Robotics Institute, we are studying these problems both in isolation and by building complete systems. Since 1980, we have developed a series of small indoor mobile robots, some experimental and others for practical applications. Our outdoor autonomous mobile robot research started in 1984, navigating the campus sidewalk network with a small outdoor vehicle called the Terregator. In 1985, with the advent of DARPA's Autonomous Land Vehicle Project, we constructed a computer-controlled van carrying onboard sensors and researchers. In the fall of 1987, we began development of a six-legged Planetary Rover.
Product Details
The Springer International Series in Engineering and Computer Science, Volume 93
1. Introduction. 1.1 Mobile Robots. 1.2 Overview. 1.3 Acknowledgments.
2. Color Vision for Road Following. 2.1 Introduction. 2.2 SCARF. 2.3 UNSCARF. 2.4 Results and Conclusions. 2.5 References.
3. Explicit Models for Robot Road Following. 3.1 Implicit Models Considered Harmful. 3.2 Systems, Models, and Assumptions. 3.3 FERMI. 3.4 References.
4. An Approach to Knowledge-Based Interpretation of Outdoor Natural Color Road Scenes. 4.1 Abstract. 4.2 Introduction. 4.3 Related Work. 4.4 Adjustable Explicit Scene Models and the Interpretation Cycle. 4.5 System Overview. 4.6 Results of the Road Scene Interpretation. 4.7 The Road Scene Interpretation System in Detail. 4.8 Future Work. 4.9 Conclusion. 4.10 Acknowledgement. 4.11 References.
5. Neural Network Based Autonomous Navigation. 5.1 Introduction. 5.2 Network Architecture. 5.3 Training and Performance. 5.4 Network Representation. 5.5 Discussion and Extensions. 5.6 Conclusion. 5.7 References.
6. Car Recognition for the CMU Navlab. 6.1 Introduction. 6.2 Related Work. 6.3 The LASSIE Object Recognition Program. 6.4 Results. 6.5 Directions for Future Work. 6.6 Summary. 6.7 References.
7. Building and Navigating Maps of Road Scenes Using Active Range and Reflectance Data. 7.1 Introduction. 7.2 Following Roads Using Active Reflectance Images. 7.3 Building Maps from Range and Reflectance Images. 7.4 Map-Based Road Following. 7.5 Conclusion. 7.6 References.
8. 3-D Vision Techniques for Autonomous Vehicles. 8.1 Introduction. 8.2 Active Range and Reflectance Sensing. 8.3 Terrain Representations. 8.4 Combining Multiple Terrain Maps. 8.5 Combining Range and Intensity Data. 8.6 Conclusion. 8.7 References.
9. The CODGER System for Mobile Robot Navigation. 9.1 Introduction. 9.2 Overview of the CODGER System. 9.3 Data Storage and Transfer. 9.4 Geometric Representation and Reasoning. 9.5 Conclusions. 9.6 References.
10. The Driving Pipeline: A Driving Control Scheme for Mobile Robots. 10.1 Introduction. 10.2 Processing Steps and Driving Unit. 10.3 Continuous Motion, Adaptive Control, and the Driving Pipeline. 10.4 The Driving Pipeline in Action: Experimental Results. 10.5 Conclusion. 10.6 References.
11. Multi-Resolution Constraint Modeling for Mobile Robot Planning. 11.1 Introduction. 11.2 The Local Navigation Problem. 11.3 Finding Trajectories. 11.4 Experiments. 11.5 Conclusions. 11.6 Acknowledgements. 11.7 References.
12. Navlab: An Autonomous Navigation Testbed. 12.1 Introduction. 12.2 Controller. 12.3 Vehicle Shell. 12.4 Locomotion. 12.5 Electrical System. 12.6 Telemetry. 12.7 Perceptive Sensing and Computing. 12.8 Conclusion.
13. Vehicle and Path Models for Autonomous Navigation. 13.1 Introduction. 13.2 Vehicle Representation. 13.3 Path Representation. 13.4 Path Tracking. 13.5 Results. 13.6 Conclusions. 13.7 References.
14. The Warp Machine on Navlab. 14.1 Introduction. 14.2 History of the Warp Machine on Navlab. 14.3 FIDO. 14.4 SCARF. 14.5 ALVINN. 14.6 Evaluation of the Warp Machine on Navlab. 14.7 Conclusions. 14.8 References.
15. Outdoor Visual Navigation for Autonomous Robots. 15.1 Introduction. 15.2 Example Systems. 15.3 Discussion and Conclusions. 15.4 Acknowledgements. 15.5 References.