29,99 €
incl. VAT
Free shipping*
Ready to ship in 6-10 days
  • Paperback book

Product Description

Generative Adversarial Networks (GANs) have found widespread application in Computer Vision, yet in space science and planetary exploration the door remains open for major advances. We introduce tools to handle planetary data from the Chang'E-4 mission and present a framework for Neural Style Transfer using cycle-consistency from rendered images.
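The style-transfer framework mentioned above relies on a cycle-consistency constraint between the rendered and real image domains. As a rough illustration only (not the book's actual models), a minimal PyTorch sketch of such a cycle-consistency loss could look as follows; the `TinyGenerator` networks and the weight `lam` are placeholder assumptions chosen for brevity.

```python
# Minimal sketch of a CycleGAN-style cycle-consistency loss between a
# "rendered" domain and a "real" domain. The generators are hypothetical
# placeholders for illustration, not the architectures used in the book.
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Placeholder image-to-image generator (assumption, for illustration)."""
    def __init__(self, channels=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

G = TinyGenerator()  # rendered -> real
F = TinyGenerator()  # real -> rendered
l1 = nn.L1Loss()

def cycle_consistency_loss(rendered, real, lam=10.0):
    # Translate each domain to the other and back; the reconstruction
    # should match the original image: ||F(G(x)) - x||_1 and vice versa.
    return lam * (l1(F(G(rendered)), rendered) + l1(G(F(real)), real))

rendered = torch.rand(1, 3, 64, 64) * 2 - 1
real = torch.rand(1, 3, 64, 64) * 2 - 1
print(cycle_consistency_loss(rendered, real).item())
```

In a full CycleGAN-style setup this term would be combined with adversarial losses from a discriminator in each domain.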

We also introduce a new real-time pipeline for Simultaneous Localization and Mapping (SLAM) and Visual Inertial Odometry (VIO) in the context of planetary rovers. We leverage prior information about the location of the lander to propose an object-level SLAM approach that jointly optimizes the pose and shape of the lander together with the camera trajectories of the rover. As a further refinement step, we propose to use interpolation between adjacent temporal samples, namely synthesizing non-existing images, to improve the overall accuracy of the system.
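To give a feel for what an object-level prior on the lander contributes, here is a hedged toy sketch (2-D positions only, SciPy least squares) of jointly refining rover poses and the lander location while anchoring the lander to a known landing-site prior. The measurement model, noise levels, and variable layout are illustrative assumptions, not the pipeline described in the book.

```python
# Toy object-level prior: jointly optimize rover positions and the lander
# location, with an extra residual tying the lander to the known landing site.
import numpy as np
from scipy.optimize import least_squares

np.random.seed(0)

# Toy ground truth: 6 rover positions and the lander location (2-D).
true_poses = np.cumsum(np.tile([1.0, 0.2], (6, 1)), axis=0)
true_lander = np.array([8.0, 1.0])

# Simulated measurements (assumed noise levels, for illustration).
odo = np.diff(true_poses, axis=0) + 0.05 * np.random.randn(5, 2)        # relative motion
ranges = np.linalg.norm(true_poses - true_lander, axis=1) \
         + 0.05 * np.random.randn(6)                                    # rover-to-lander ranges
lander_prior = true_lander + np.array([0.3, -0.2])                      # coarse landing-site prior

def residuals(x):
    poses, lander = x[:12].reshape(6, 2), x[12:]
    r_odo = (np.diff(poses, axis=0) - odo).ravel()            # odometry factors
    r_rng = np.linalg.norm(poses - lander, axis=1) - ranges   # lander observation factors
    r_pri = lander - lander_prior                             # prior on the lander location
    return np.concatenate([r_odo, r_rng, r_pri])

# Initialize poses by dead-reckoning the odometry and the lander from its prior.
poses0 = np.vstack([np.zeros(2), np.cumsum(odo, axis=0)])
x0 = np.concatenate([poses0.ravel(), lander_prior])
sol = least_squares(residuals, x0)
print("refined lander estimate:", sol.x[12:])
```

The same idea extends to full 6-DoF poses and a shape model of the lander, where the prior keeps the object-level estimate from drifting along with the rover trajectory.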

The experiments are conducted in the context of the Iris Lunar Rover, a nano-rover scheduled for deployment on lunar terrain in 2021 as Carnegie Mellon's flagship and the first unmanned American rover on the Moon.

About the Author
J. de Curtò holds a double master's degree in Telecommunications from the Universitat Politècnica de Catalunya and the Universitat Autònoma de Barcelona. De Curtò pursued graduate studies in Electrical and Computer Engineering at the City University of Hong Kong and at Carnegie Mellon, and has held research engagements at, among others, ETH Zürich and CUHK.