
Chiodini, Sebastiano (2017) Visual odometry and vision system measurements based algorithms for rover navigation. [Ph.D. thesis]

Full text available as:

PDF Document (Ph.D. thesis), 19 MB

Abstract (English)

Planetary exploration rovers should be capable of operating autonomously, even over long traverses, with minimal human input. Ground-control operations must be minimized in order to reduce traverse time, optimize the resources allocated to telecommunications, and maximize the scientific output of the mission.

Knowing the goal position and taking the vehicle dynamics into account, control algorithms have to provide the appropriate inputs to the actuators. Path-planning algorithms use three-dimensional models of the surrounding terrain in order to avoid obstacles safely. Moreover, the rovers of the sample-return missions planned for the coming years will have to demonstrate the capability to return to a previously visited place, either to acquire further scientific data or to deliver a sample to an ascent vehicle.

Motion measurement is a fundamental task in rover control, and the planetary environment raises some specific issues. Wheel odometry suffers from large uncertainty due to wheel slippage on sandy surfaces, inertial measurement has drift problems, and GPS-like positioning systems are not available on other planetary bodies. Vision systems have proven to be reliable and accurate motion-tracking methods. One of these methods is stereo Visual Odometry: stereo processing estimates the three-dimensional location of landmarks observed by a pair of cameras by means of triangulation, and matching the point clouds of two subsequent frames yields the stereo-camera motion. Thanks to Visual SLAM (Simultaneous Localization and Mapping) techniques, a rover is able to reconstruct a consistent map of the environment and to localize itself with respect to this map. SLAM has two main advantages: it builds a map of the environment, and it tracks motion more accurately, thanks to the solution of a large minimization problem that involves multiple camera poses and the measurements of the map landmarks.
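The triangulation step mentioned above can be illustrated with a minimal sketch for an ideal rectified stereo pair under a pinhole camera model (the focal length, baseline, and principal point below are illustrative values, not the parameters of any camera used in the thesis):

```python
import numpy as np

def triangulate(uL, vL, uR, f, b, cx, cy):
    """Triangulate a landmark from a rectified stereo pair.

    uL, vL : pixel coordinates of the feature in the left image
    uR     : column of the same feature in the right image
    f, b   : focal length [px] and stereo baseline [m]
    cx, cy : principal point [px]
    Returns the 3-D point in the left-camera frame [m].
    """
    d = uL - uR               # disparity (positive for points in front)
    Z = f * b / d             # depth from similar triangles
    X = (uL - cx) * Z / f
    Y = (vL - cy) * Z / f
    return np.array([X, Y, Z])

# Example: f = 500 px, baseline 0.12 m, principal point (320, 240);
# a 10 px disparity corresponds to a depth of 500 * 0.12 / 10 = 6 m.
p = triangulate(350.0, 250.0, 340.0, 500.0, 0.12, 320.0, 240.0)
```

Repeating this for every matched feature pair produces the point cloud whose frame-to-frame alignment gives the camera motion.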

After rover touchdown, one of the key tasks required of the operations center is the accurate measurement of the rover position in inertial and body-fixed coordinate systems, such as the J2000 frame and the Mars Body-Fixed (MBF) frame. For engineering and science operations, high-precision global localization and detailed Digital Elevation Models (DEMs) of the landing site are crucial.

The first part of this dissertation treats the problem of localizing a rover with respect to geo-referenced, ortho-rectified satellite images and with respect to a Digital Elevation Model (DEM) built from such images. A sensitivity analysis of the outputs of the Visual Position Estimator for Rover (VIPER) algorithm is presented. By comparing the local skyline, extracted from a panoramic image, with skylines rendered from the DEM, the algorithm retrieves the camera position and orientation relative to the DEM map. This algorithm has been proposed as part of the localization procedure of the Rover Operation Control Center (ROCC), located at ALTEC, to localize the ExoMars 2020 rover after landing and to initialize and verify the outputs of rover guidance and navigation. Images from the Mars Exploration Rover mission and HiRISE DEMs have been used to test the algorithm's performance.
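The skyline-comparison idea behind this kind of localization can be sketched as a toy example (function and variable names are hypothetical, not VIPER's actual interface): skylines are modeled as horizon-elevation profiles sampled over azimuth bins, each candidate DEM position contributes one rendered profile, and the best circular shift acts as a heading estimate.

```python
import numpy as np

def match_skyline(observed, candidates):
    """Return (pose, heading_bin, error) of the candidate DEM pose whose
    rendered skyline best matches the observed one.

    observed   : 1-D horizon-elevation profile over azimuth bins
    candidates : mapping pose_label -> rendered profile, same binning
    """
    best = None
    n = observed.size
    for pose, rendered in candidates.items():
        for shift in range(n):  # each shift is a heading hypothesis
            err = float(np.sum((observed - np.roll(rendered, shift)) ** 2))
            if best is None or err < best[2]:
                best = (pose, shift, err)
    return best

# Toy check: the observed skyline is candidate "A" rotated by 5 bins.
az = np.linspace(0.0, 2.0 * np.pi, 72, endpoint=False)
skyline_a = np.sin(az) + 2.0
skyline_b = 0.5 * np.sin(2.0 * az) + 2.0
observed = np.roll(skyline_a, 5)
pose, heading, err = match_skyline(observed, {"A": skyline_a, "B": skyline_b})
```

A real implementation would also have to cope with rendering error, skyline-extraction noise, and ambiguous horizons, which is precisely what the sensitivity analysis in the thesis addresses.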

During the rover traverse, Visual Odometry methods can be used to refine the path estimate. The second part of this dissertation presents an experimental analysis of how the landmark distribution in a scene, as observed by a stereo camera, affects Visual Odometry measurement performance. Translational and rotational tests have been performed at many different positions in an indoor environment. The implemented Visual Odometry algorithm first guesses the motion with a linear 3D-to-3D method embedded in a RANdom SAmple Consensus (RANSAC) process that removes outliers; the motion estimate is then computed from the inliers by minimizing the Euclidean distance between the triangulated landmarks.
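The closed-form 3D-to-3D alignment at the core of such a pipeline is commonly the SVD method of Arun et al., which a RANSAC loop would call first on minimal samples and then on the inlier set; a minimal sketch (not the thesis code) is:

```python
import numpy as np

def rigid_transform_3d(P, Q):
    """Least-squares R, t such that Q ~ R @ P + t for 3xN point sets,
    via the SVD of the cross-covariance matrix (Arun et al., 1987)."""
    cP = P.mean(axis=1, keepdims=True)
    cQ = Q.mean(axis=1, keepdims=True)
    H = (P - cP) @ (Q - cQ).T                     # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guards against a reflection solution.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    t = cQ - R @ cP
    return R, t

# Recover a known motion from noiseless "triangulated" landmarks.
rng = np.random.default_rng(0)
P = rng.uniform(-5.0, 5.0, size=(3, 20))
a = 0.3                                           # yaw angle [rad]
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
t_true = np.array([[0.5], [-0.2], [1.0]])
R_est, t_est = rigid_transform_3d(P, R_true @ P + t_true)
```

With noisy triangulated points the same call returns the least-squares motion over the RANSAC inliers, which is exactly the Euclidean-distance minimization described above.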

The last part of this dissertation has been developed in collaboration with the NASA Jet Propulsion Laboratory and presents an innovative visual localization method for hopping and tumbling platforms. These new mobility systems for the exploration of comets, asteroids, and other small Solar System bodies require new approaches to localization. The choice of a monocular onboard camera for perception is constrained by the rover's limited weight and size. Visual localization near the surface of small bodies is difficult because of large scale changes, frequent occlusions, high-contrast and rapidly changing shadows, and relatively featureless terrain.

A synergistic localization and mapping approach between the mother spacecraft and the deployed hopping/tumbling daughter-craft rover has been studied and developed. We evaluated several open-source visual SLAM algorithms; among them, ORB-SLAM2 was chosen and adapted for this application. The possibility of saving the map built from orbiter observations and re-loading it for rover localization has been introduced. Moreover, it is now possible to fuse the map with pose measurements from other orbiter sensors.

The accuracy of the collaborative localization method has been estimated: a series of realistic images of an asteroid mockup was captured, and a Vicon system was used to provide the ground-truth trajectory. In addition, we evaluated the robustness of the method to illumination changes.
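Accuracy against a motion-capture ground truth is commonly summarized as an absolute trajectory error; a minimal version of that metric (not necessarily the exact one used in the thesis, and assuming both position tracks are already time-aligned and expressed in a common frame, which a Vicon-to-camera calibration would provide) is:

```python
import numpy as np

def ate_rmse(estimated, ground_truth):
    """Root-mean-square absolute trajectory error between two
    time-aligned Nx3 position tracks in a common frame [m]."""
    diff = np.asarray(estimated) - np.asarray(ground_truth)
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))

# A track with a constant 10 cm offset along x has an ATE of 0.10 m.
gt = np.column_stack([np.linspace(0.0, 1.0, 50), np.zeros(50), np.zeros(50)])
est = gt + np.array([0.10, 0.0, 0.0])
```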

Abstract (Italian)

Martian rovers and, more generally, robots for the exploration of asteroids and small celestial bodies require a high level of autonomy. Control by a human operator must be reduced to a minimum, in order to reduce traverse time, optimize the resources allocated to telecommunications, and maximize the scientific output of the mission.

Knowing the goal position and taking the vehicle dynamics into account, control algorithms provide the appropriate inputs to the actuators. Path-planning algorithms, exploiting three-dimensional models of the surrounding terrain, avoid obstacles with wide safety margins. Moreover, the rovers of the sample-return missions planned for the coming years must demonstrate the capability to return to a previously visited place, either to sample scientific data or to bring the collected samples back to an ascent vehicle.

In all of these tasks, motion estimation is fundamental, and on other planets it has its own peculiarities. Wheel odometry suffers from large uncertainty due to wheel slippage on sandy or slippery surfaces; inertial navigation systems, given the slow dynamics of a rover, drift too much for an accurate attitude estimate; and no global positioning system analogous to GPS is available.

Camera-based motion estimation systems have proven, since NASA's MER missions, to be reliable and accurate. One of these systems is stereo visual odometry, in which the motion is estimated by computing the roto-translation between two point clouds measured at two subsequent instants; each point cloud is generated by triangulating salient points found in the two images. Simultaneous Localization and Mapping (SLAM) techniques give a rover the ability to build a map of the surrounding environment and to localize itself with respect to it. SLAM techniques have two advantages: the construction of the map and a more accurate trajectory estimate, thanks to the solution of minimization problems that involve the estimation of several poses and landmarks at the same time.

Right after landing, one of the main tasks of the rover operations control center is the accurate computation of the lander/rover position with respect to an inertial reference frame and to a planet-fixed reference frame, such as the J2000 frame and the Mars Body-Fixed (MBF) frame. For both scientific and engineering operations, accurate localization with respect to satellite images and to three-dimensional models of the landing zone is fundamental.

The first part of the thesis treats the problem of localizing a rover with respect to a geo-referenced, ortho-rectified satellite image and with respect to a Digital Elevation Model (DEM) built from satellite images. The analysis of a modified version of the Visual Position Estimator for Rover (VIPER) algorithm has been carried out: the algorithm finds the position and attitude of a rover with respect to a DEM by comparing the local skyline with the skylines computed at a priori positions on the DEM. These analyses were carried out in collaboration with ALTEC S.p.A., with the aim of defining the operations that the Rover Operation Control Center (ROCC) will have to perform to localize the ExoMars 2020 rover. Once the localization operations have been completed, these methods can be used again to verify and correct the trajectory estimate.

The second part of the dissertation presents a stereo visual odometry method for rovers and an analysis of how the distribution of the triangulated landmarks affects the motion estimate. To this end, laboratory tests were performed while varying the distance of the scene. The implemented visual odometry algorithm is a 3D-to-3D method with outlier removal through a RANdom SAmple Consensus (RANSAC) procedure; the motion estimate is obtained by minimizing the Euclidean distance between the two point clouds.

The last part of this dissertation was developed in collaboration with the Jet Propulsion Laboratory (NASA) and presents a localization system for hopping/tumbling rovers for the exploration of comets and asteroids. Such innovative systems require new approaches to localization. Given the limited space, weight, power, and computational resources available, the localization system is based on a monocular camera. Visual localization in the proximity of a comet, moreover, has some peculiarities that make it harder: the large scale changes that occur while the platform moves, the frequent occlusions of the field of view, the sharp shadows that change with the rotation period of the asteroid, and the visual appearance of the terrain, which is homogeneous in the visible band.

A collaborative visual SLAM system between the tumbling/hopping rover and the "mother" spacecraft, which carries the rover to its release orbit, has been proposed. A survey of the most recent open-source visual SLAM algorithms was carried out and, after a careful analysis, ORB-SLAM2 was chosen and modified to cope with this application. The possibility of saving the map built by the orbiter, which is then used by the rover for its own localization, has been introduced; it is also possible to fuse the map built by the orbiter with attitude measurements from other sensors on board the orbiter.

The accuracy of this method was evaluated using a sequence of images collected in a representative environment, together with an external reference system. Simulations of the asteroid-mapping phase and of the localization of the hopping/tumbling platform were performed and, finally, we evaluated how to improve the performance of this method under changing illumination conditions.

EPrint type: Ph.D. thesis
Tutor: Debei, Stefano
Supervisor: Pertile, Marco
Ph.D. course: Ciclo 29 > Corsi 29 > SCIENZE TECNOLOGIE E MISURE SPAZIALI
Thesis deposit date: 31 January 2017
Publication year: 2017
Key words: Visual Odometry, SLAM, Computer Vision, Field Robotics, Rover, Solar System Exploration
MIUR scientific-disciplinary sectors: Area 09 - Ingegneria industriale e dell'informazione > ING-IND/12 Misure meccaniche e termiche
Reference structure: Centri > Centro Interdipartimentale di ricerca di Studi e attività spaziali "G. Colombo" (CISAS)
ID code: 10175
Deposited on: 15 Nov 2017 09:47

Bibliografia

I riferimenti della bibliografia possono essere cercati con Cerca la citazione di AIRE, copiando il titolo dell'articolo (o del libro) e la rivista (se presente) nei campi appositi di "Cerca la Citazione di AIRE".
Le url contenute in alcuni riferimenti sono raggiungibili cliccando sul link alla fine della citazione (Vai!) e tramite Google (Ricerca con Google). Il risultato dipende dalla formattazione della citazione.

[1] John P. Grotzinger, Joy Crisp, Ashwin R. Vasavada, Robert C. Anderson, Charles J. Baker, Robert Barry, David F. Blake, Pamela Conrad, Kenneth S. Edgett, Bobak Ferdowski, Ralf Gellert, John B. Gilbert, Matt Golombek, Javier Gómez-Elvira, Donald M. Hassler, Louise Jandura, Maxim Litvak, Paul Mahaffy, Justin Maki, Michael Meyer, Michael C. Malin, Igor Mitrofanov, John J. Simmonds, David Vaniman, Richard V. Welch, and Roger C. Wiens. Mars Science Laboratory Mission and Science Investigation. Space Science Reviews, 170(1):5–56, 2012. Cerca con Google

[2] J. Vago, O. Witasse, H. Svedhem, P. Baglioni, A. Haldemann, G. Gianfiglio, T. Blancquaert, D. McCoy, and R. de Groot. ESA ExoMars program: The next step in exploring Mars. Solar System Research, 49(7):518–528, 2015. Cerca con Google

[3] L. Pratt, D. Beaty, and A. Allwood. The mars astrobiology explorer-cacher (MAX-C): A potential rover mission for 2018. Astrobiology, 10(2):127–163, 2010. Cerca con Google

[4] National Research Council. Vision and Voyages for Planetary Science in the Decade 2013-2022. The National Academies Press, Washington, DC, 2011. Cerca con Google

[5] Joseph A. Starek, Behçet Açikme¸se, Issa A. Nesnas, and Marco Pavone. Spacecraft Autonomy Challenges for Next-Generation Space Missions, pages 1–48. Springer Berlin Heidelberg, Berlin, Heidelberg, 2016. Cerca con Google

[6] David W. Dunham, Robert W. Farquhar, James V. McAdams, Mark Holdridge, Robert Nelson, Karl Whittenburg, Peter Antreasian, Steven Chesley, Clifford Helfrich, William M. Owen, Bobby Williams, Joseph Veverka, and Ann Harch. Implementation of the First Asteroid Landing. Icarus, 159(2):433 – 438, 2002. Cerca con Google

[7] Hajime Yano, T. Kubota, H. Miyamoto, T. Okada, D. Scheeres, Y. Takagi, K. Yoshida, M. Abe, S. Abe, O. Barnouin-Jha, A. Fujiwara, S. Hasegawa, T. Hashimoto, M. Ishiguro, M. Kato, J. Kawaguchi, T. Mukai, J. Saito, S. Sasaki, and M. Yoshikawa. Touchdown of the Hayabusa Spacecraft at the Muses Sea on Itokawa. Science, 312(5778):1350–1353, 2006. Cerca con Google

[8] Jens Biele, Stephan Ulamec, Michael Maibaum, Reinhard Roll, Lars Witte, Eric Jurado, Pablo Muñoz, Walter Arnold, Hans-Ulrich Auster, Carlos Casas, Claudia Faber, Cinzia Fantinati, Felix Finke, Hans-Herbert Fischer, Koen Geurts, Carsten Güttler, Philip Heinisch, Alain Herique, Stubbe Hviid, Günter Kargl, Martin Knapmeyer, Jörg Knollenberg, Wlodek Kofman, Norbert Kömle, Ekkehard Kührt, Valentina Lommatsch, Stefano Mottola, Ramon Pardo de Santayana, Emile Remetean, Frank Scholten, Klaus J. Seidensticker, Holger Sierks, and Tilman Spohn. The landing(s) of Philae and inferences about comet surface mechanical properties. Science, 349(6247), 2015. Cerca con Google

[9] Mission complete Rosetta’s journey ends in daring descent to comet. Accessed: 13/10/2016. Cerca con Google

[10] R.Z. Sagdeev and A.V. Zakharov. Brief history of the Phobos mission. Nature, 341(6243):581–585, 1989. Cerca con Google

[11] Yuichi Tsuda, Makoto Yoshikawa, Masanao Abe, Hiroyuki Minamino, and Satoru Nakazawa. System design of the Hayabusa 2—asteroid sample return mission to 1999 JU3. Acta Astronautica, 91:356 – 362, 2013. Cerca con Google

[12] Mark Maimone, Yang Cheng, and Larry Matthies. Two years of visual odometry on the mars exploration rovers. Journal of Field Robotics, 24(3):169–186, 2007. Cerca con Google

[13] The Rover Team. The pathfinder microrover. Journal of Geophysical Research: Planets, 102(E2):3989–4001, 1997. Cerca con Google

[14] E. Gat, R. Desai, R. Ivlev, J. Loch, and D. P. Miller. Behavior control for robotic exploration of planetary surfaces. IEEE Transactions on Robotics and Automation, 10(4):490–503, Aug 1994. Cerca con Google

[15] R. Volpe. Mars rover navigation results using sun sensor heading determination. In Intelligent Robots and Systems, 1999. IROS ’99. Proceedings. 1999 IEEE/RSJ International Conference on, volume 1, pages 460–467 vol.1, 1999. Cerca con Google

[16] Rongxing Li, Kaichang Di, Larry H Matthies, William M Folkner, Raymond E Arvidson, and Brent A Archinal. Rover localization and landing-site mapping technology for the 2003 mars exploration rover mission. Photogrammetric Engineering & Remote Sensing, 70(1):77–90, 2004. Cerca con Google

[17] Rongxing Li, Steven W Squyres, Raymond E Arvidson, Brent A Archinal, Jim Bell, Yang Cheng, Larry Crumpler, David J Des Marais, Kaichang Di, Todd A Ely, et al. Initial results of rover localization and topographic mapping for the 2003 mars exploration rover mission. Photogrammetric Engineering & Remote Sensing, 71(10):1129–1142, 2005. Cerca con Google

[18] Rongxing Li, Brent A. Archinal, Raymond E. Arvidson, Jim Bell, Philip Christensen, Larry Crumpler, David J. Des Marais, Kaichang Di, Tom Duxbury, Matt Golombek, John Grant, Ronald Greeley, Joe Guinn, Andrew Johnson, Randolph L. Kirk, Mark Maimone, Larry H. Matthies, Mike Malin, Tim Parker, Mike Sims, Shane Thompson, Steven W. Squyres, and Larry A. Soderblom. Spirit rover localization and topographic mapping at the landing site of gusev crater, mars. Journal of Geophysical Research: Planets, 111(E2):n/a–n/a, 2006. E02S06. Cerca con Google

[19] Rongxing Li, Raymond E. Arvidson, Kaichang Di, Matt Golombek, Joe Guinn, Andrew Johnson, Mark Maimone, Larry H. Matthies, Mike Malin, Tim Parker, Steven W. Squyres, and Wesley A. Watters. Opportunity rover localization and topographic mapping at the landing site of Meridiani Planum, Mars. Journal of Geophysical Research: Planets, 112(E2):n/a–n/a, 2007. E02S90. Cerca con Google

[20] T Parker, M Malin, M Golombek, T Duxbury, A Johnson, J Guinn, T McElrath, R Kirk, B Archinal, L Soderblom, et al. Localization, localization, localization. In Lunar and Planetary Science Conference, volume 35, 2004. Cerca con Google

[21] Jeffrey J Biesiadecki, P Chris Leger, and Mark W Maimone. Tradeoffs between directed and autonomous driving on the Mars Exploration Rovers. The International Journal of Robotics Research, 26(1):91–104, 2007. Cerca con Google

[22] Mark W Maimone, P Chris Leger, and Jeffrey J Biesiadecki. Overview of the Mars Exploration Rovers’ autonomous mobility and vision capabilities. In IEEE international conference on robotics and automation (ICRA) space robotics workshop, 2007. Cerca con Google

[23] J Maki, D Thiessen, A Pourangi, P Kobzeff, T Litwin, L Scherr, S Elliott, A Dingizian, and M Maimone. The Mars Science Laboratory engineering cameras. Space science reviews, 170(1-4):77–93, 2012. Cerca con Google

[24] Raymond E. Arvidson, Karl D. Iagnemma, Mark Maimone, Abigail A. Fraeman, Feng Zhou, Matthew C. Heverly, Paolo Bellutta, David Rubin, Nathan T. Stein, John P. Grotzinger, and Ashwin R. Vasavada. Mars Science Laboratory Curiosity Rover megaripple crossings up to sol 710 in Gale Crater. Journal of Field Robotics, pages n/a–n/a, 2016. Cerca con Google

[25] Mark Maimone. Curiouser and curiouser: Surface robotic technology driving Mars rover Curiosity’s exploration of Gale crater. In Robotics and Automation Workshop: on Planetary Rovers (ICRA Workshop), 2013 IEEE International Conference on. IEEE, 2013. Cerca con Google

[26] Andrew E Johnson, Steven B Goldberg, Yang Cheng, and Larry H Matthies. Robust and efficient stereo feature tracking for visual odometry. In Robotics and Automation, 2008. ICRA 2008. IEEE International Conference on, pages 39–46. IEEE, 2008. Cerca con Google

[27] Tetsuo Yoshimitsu, Takashi Kubota, Ichiro Nakatani, Tadashi Adachi, and Hiroaki Saito. Micro-hopping robot for asteroid exploration. Acta Astronautica, 52(2–6):441 – 446, 2003. Selected Proceedings of the 4th {IAA} International conference on L ow Cost Planetary Missions. Cerca con Google

[28] Tetsuo Yoshimitsu, Takashi Kubota, and Ichiro Nakatani. MINERVA rover which became a small artificial solar satellite. 2006. Cerca con Google

[29] R Pardo de Santayana, M Lauer, P Muñoz, and F Castellini. Surface Characterization and Optical Navigation at the Rosetta Flyby of Asteroid Lutetia. In Proceedings of the 24th International Symposium on Space Flight Dynamics (ISSFD), Laurel, MD, 2014. Cerca con Google

[30] R Pardo de Santayana and M Lauer. Optical measurements for Rosetta navigation near the comet. In Proceedings of the 25 th International Symposium on Space Flight Dynamics (ISSFD), Munich, Germany, 2015. Cerca con Google

[31] Mathias Lauer, Ulrich Herfort, Dave Hocken, and Sabine Kielbassa. Optical measurements for the flyby navigation of Rosetta at asteroid Steins. In Proceedings 21st International Symposium on Space Flight Dynamics–21st ISSFD. Toulouse, France, 2009. Cerca con Google

[32] Robert G Reid, Loris Roveda, Issa AD Nesnas, and Marco Pavone. Contact dynamics of internally-actuated platforms for the exploration of small solar system bodies. In Proceedings of the 12th International Symposium on Artificial Intelligence, Robotics and Automation in Space (i-SAIRAS2014), Saint-Hubert, Canada, page 9, 2014. Cerca con Google

[33] B Hockman, Andreas Frick, Issa AD Nesnas, and Marco Pavone. Design, control, and experimentation of internally-actuated rovers for the exploration of low-gravity planetary bodies. In Field and Service Robotics, pages 283–298. Springer, 2016. Cerca con Google

[34] Marco Pavone, Julie Castillo, Jeffrey A. Hoffman, and Issa Nesnas. Spacecraft/Rover Hybrids for the Exploration of Small Solar System Bodies. Technical report, Final report for NASA NIAC Program, 2012. Cerca con Google

[35] R. I. Hartley and A. Zisserman. Multiple View Geometry in Computer Vision. Cambridge University Press, ISBN: 0521540518, second edition, 2004. Cerca con Google

[36] Laurent Kneip. Real-Time Scalable Structure from Motion: From Fundamental Geometric Vision to Collaborative Mapping. PhD thesis, University of Zurich, 2012. Cerca con Google

[37] Sameer Agarwal, Noah Snavely, Ian Simon, Steven M Seitz, and Richard Szeliski. Building rome in a day. In 2009 IEEE 12th international conference on computer vision, pages 72–79. IEEE, 2009. Cerca con Google

[38] Stefano Debei, Alessio Aboudan, Giacomo Colombatti, and Marco Pertile. Lutetia surface reconstruction and uncertainty analysis. Planetary and Space Science, 71(1):64–72, 2012. Cerca con Google

[39] Susie Green, Andrew Bevan, and Michael Shapland. A comparative assessment of structure from motion methods for archaeological research. Journal of Archaeological Science, 46:173 – 181, 2014. Cerca con Google

[40] D. Scaramuzza and F. Fraundorfer. Visual Odometry [Tutorial]. IEEE Robotics Automation Magazine, 18(4):80–92, Dec 2011. Cerca con Google

[41] F. Fraundorfer and D. Scaramuzza. Visual Odometry : Part II: Matching, Robustness, Optimization, and Applications. Robotics Automation Magazine, IEEE, 19(2):78–90, June 2012. Cerca con Google

[42] Hugh Durrant-Whyte and Tim Bailey. Simultaneous localization and mapping: part I. IEEE robotics & automation magazine, 13(2):99–110, 2006. Cerca con Google

[43] OpenCV library, http://opencv.org/. Vai! Cerca con Google

[44] Frédéric Devernay and Olivier Faugeras. Straight lines have to be straight. Machine Vision and Applications, 13(1):14–24, 2001. Cerca con Google

[45] Zhengyou Zhang. A flexible new technique for camera calibration. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 22(11):1330–1334, Nov 2000. Cerca con Google

[46] Chris Harris and Mike Stephens. A combined corner and edge detector. In In Proc. of Fourth Alvey Vision Conference, pages 147–151, 1988. Cerca con Google

[47] Jianbo Shi and Carlo Tomasi. Good features to track. In Computer Vision and Pattern Recognition, 1994. Proceedings CVPR’94., 1994 IEEE Computer Society Conference on, pages 593–600. IEEE, 1994. Cerca con Google

[48] Edward Rosten and Tom Drummond. Machine Learning for High-Speed Corner Detection, pages 430–443. Springer Berlin Heidelberg, Berlin, Heidelberg, 2006. Cerca con Google

[49] David G. Lowe. Distinctive image features from scale-invariant keypoints. Interna- tional Journal of Computer Vision, 60:91–110, 2004. Cerca con Google

[50] Herbert Bay, Tinne Tuytelaars, and Luc Van Gool. Surf: Speeded up robust features. In European conference on computer vision, pages 404–417. Springer, 2006. Cerca con Google

[51] Motilal Agrawal, Kurt Konolige, and Morten Rufus Blas. CenSurE: Center Surround Extremas for Realtime Feature Detection and Matching, pages 102–115. Springer Berlin Heidelberg, Berlin, Heidelberg, 2008. Cerca con Google

[52] Michael Calonder, Vincent Lepetit, Christoph Strecha, and Pascal Fua. BRIEF: Binary Robust Independent Elementary Features, pages 778–792. Springer Berlin Heidelberg, Berlin, Heidelberg, 2010. Cerca con Google

[53] S. Leutenegger, M. Chli, and R. Y. Siegwart. BRISK: Binary robust invariant scalable keypoints. In 2011 International Conference on Computer Vision, pages 2548–2555, Nov 2011. Cerca con Google

[54] R. Mur-Artal, J. M. M. Montiel, and J. D. Tardós. ORB-SLAM: A versatile and Accurate Monocular SLAM System. IEEE Transactions on Robotics, 31(5):1147– 1163, Oct 2015. Cerca con Google

[55] Ethan Rublee, Vincent Rabaud, Kurt Konolige, and Gary Bradski. ORB: An efficient alternative to SIFT or SURF. In 2011 International conference on computer vision, pages 2564–2571. IEEE, 2011. Cerca con Google

[56] Hans P Moravec. Obstacle avoidance and navigation in the real world by a seeing robot rover. Technical report, DTIC Document, 1980. Cerca con Google

[57] L. Matthies and S.A. Shafer. Error modeling in stereo navigation. Robotics and Automation, IEEE Journal of, 3(3):239–248, June 1987. Cerca con Google

[58] Clark F. Olson, Larry H. Matthies, Marcel Schoppers, and Mark W. Maimone. Rover navigation using stereo ego-motion. Robotics and Autonomous Systems, 43(4):215 – 229, 2003. Cerca con Google

[59] Yang Cheng, M.W. Maimone, and L. Matthies. Visual odometry on the Mars exploration rovers - a tool to ensure accurate driving and science imaging. Robotics Automation Magazine, IEEE, 13(2):54–62, June 2006. Cerca con Google

[60] D. Nister, O. Naroditsky, and J. Bergen. Visual odometry. In Computer Vision and Pattern Recognition, 2004. CVPR 2004. Proceedings of the 2004 IEEE Computer Society Conference on, volume 1, pages I–652–I–659 Vol.1, June 2004. Cerca con Google

[61] L. Kneip and H. Li. Efficient computation of relative pose for multi-camera systems. In 2014 IEEE Conference on Computer Vision and Pattern Recognition, pages 446–453, June 2014. Cerca con Google

[62] Andrew I Comport, Ezio Malis, and Patrick Rives. Accurate quadrifocal tracking for robust 3d visual odometry. In Proceedings 2007 IEEE International Conference on Robotics and Automation, pages 40–45. IEEE, 2007. Cerca con Google

[63] K.S. Arun, T.S. Huang, and S.D. Blostein. Least-squares fitting of two 3-D point sets. Pattern Analysis and Machine Intelligence, IEEE Transactions on, PAMI-9(5):698– 700, Sept 1987. Cerca con Google

[64] Berthold K. P. Horn, H.M. Hilden, and Shariar Negahdaripour. Closed-form solution of absolute orientation using orthonormal matrices. JOURNAL OF THE OPTICAL SOCIETY AMERICA, 5(7):1127–1135, 1988. Cerca con Google

[65] Yi Ma, Stefano Soatto, Jana Kosecka, and Shankar Sastry. An invitation to 3-d vision: from images to geometric models, volume 26. Springer Verlag, 2004. Cerca con Google

[66] D. Nister. An efficient solution to the five-point relative pose problem. Pattern Analysis and Machine Intelligence, IEEE Transactions on, 26(6):756–770, June 2004. Cerca con Google

[67] L. Kneip, D. Scaramuzza, and R. Siegwart. A novel parametrization of the perspectivethree- point problem for a direct computation of absolute camera position and orientation. In Computer Vision and Pattern Recognition (CVPR), 2011 IEEE Conference on, Cerca con Google

pages 2969–2976, June 2011. Cerca con Google

[68] Vincent Lepetit, Francesc Moreno-Noguer, and Pascal Fua. EPnP: An accurate O(n) solution to the PnP problem. International Journal of Computer Vision, 81(2):155, 2008. Cerca con Google

[69] Martin A Fischler and Robert C Bolles. Random sample consensus: a paradigm for model fitting with applications to image analysis and automated cartography. Communications of the ACM, 24(6):381–395, 1981. Cerca con Google

[70] Joint Committee for Guides in Metrology. JCGM 100: Evaluation of measurement data - guide to the expression of uncertainty in measurement. Technical report, JCGM, 2008. Cerca con Google

[71] R. Kümmerle, G. Grisetti, H. Strasdat, K. Konolige, and W. Burgard. g2o: A general framework for graph optimization. In Robotics and Automation (ICRA), 2011 IEEE International Conference on, pages 3607–3613, May 2011. Cerca con Google

[72] Manolis IA Lourakis and Antonis A Argyros. SBA: A software package for generic sparse bundle adjustment. ACM Transactions on Mathematical Software (TOMS), 36(1):2, 2009. Cerca con Google

[73] Jose-Luis Blanco. A tutorial on SE(3) transformation parameterizations and onmanifold optimization. University of Malaga, Tech. Rep, 2010. Cerca con Google

[74] Sameer Agarwal, Keir Mierle, and Others. Ceres Solver. http://ceres-solver.org. Vai! Cerca con Google

[75] T. Bailey and H. Durrant-Whyte. Simultaneous localization and mapping (SLAM): part II. IEEE Robotics Automation Magazine, 13(3):108–117, Sept 2006. Cerca con Google

[76] C. Cadena, L. Carlone, H. Carrillo, Y. Latif, D. Scaramuzza, J. Neira, I. Reid, and J. J. Leonard. Past, Present, and Future of Simultaneous Localization and Mapping: Toward the Robust-Perception Age. IEEE Transactions on Robotics, 32(6):1309–1332, Dec 2016. Cerca con Google

[77] Bruno Siciliano and Oussama Khatib. Springer handbook of robotics. Springer, 2016. Cerca con Google

[78] Michael Montemerlo, Sebastian Thrun, Daphne Koller, Ben Wegbreit, et al. Fast-SLAM: A factored solution to the simultaneous localization and mapping problem. In Aaai/iaai, pages 593–598, 2002. Cerca con Google

[79] G. Klein and D. Murray. Parallel Tracking and Mapping for Small AR Workspaces. In Mixed and Augmented Reality, 2007. ISMAR 2007. 6th IEEE and ACM International Symposium on, pages 225–234, Nov 2007. Cerca con Google

[80] L. Riazuelo, Javier Civera, and J.M.M. Montiel. C2TAM: A cloud framework for cooperative tracking and mapping. Robotics and Autonomous Systems, 62(4):401 – 413, 2014. Cerca con Google

[81] Matia Pizzoli, Christian Forster, and Davide Scaramuzza. REMODE: Probabilistic, monocular dense reconstruction in real time. In Robotics and Automation (ICRA), 2014 IEEE International Conference on, pages 2609–2616. IEEE, 2014. Cerca con Google

[82] R. A. Newcombe, S. J. Lovegrove, and A. J. Davison. DTAM: Dense tracking and mapping in real-time. In 2011 International Conference on Computer Vision, pages 2320–2327, Nov 2011. Cerca con Google

[83] Jakob Engel, Thomas Schöps, and Daniel Cremers. LSD-SLAM: Large-Scale Direct Monocular SLAM, pages 834–849. Springer International Publishing, Cham, 2014. Cerca con Google

[84] Alejo Concha and Javier Civera. DPPTAM: Dense piecewise planar tracking and mapping from a monocular sequence. In Intelligent Robots and Systems (IROS), 2015 IEEE/RSJ International Conference on, pages 5686–5693. IEEE, 2015.

[85] Christian Forster, Matia Pizzoli, and Davide Scaramuzza. SVO: Fast semi-direct monocular visual odometry. In Robotics and Automation (ICRA), 2014 IEEE International Conference on, pages 15–22. IEEE, 2014.

[86] D. Gálvez-López and J. D. Tardós. Bags of Binary Words for Fast Place Recognition in Image Sequences. IEEE Transactions on Robotics, 28(5):1188–1197, Oct 2012.

[87] R. Prakash, P. D. Burkhart, A. Chen, K. A. Comeaux, C. S. Guernsey, D. M. Kipp, L. V. Lorenzoni, G. F. Mendeck, R. W. Powell, T. P. Rivellini, A. M. S. Martin, S. W. Sell, A. D. Steltzner, and D. W. Way. Mars Science Laboratory entry, descent, and landing system overview. In 2008 IEEE Aerospace Conference, pages 1–18, March 2008.

[88] Rongxing Li, Shaojun He, Yunhang Chen, Min Tang, Pingbo Tang, Kaichang Di, Larry Matthies, Raymond E. Arvidson, Steven W. Squyres, Larry S. Crumpler, Tim Parker, and Michael Sims. MER Spirit rover localization: Comparison of ground image– and orbital image–based methods and science applications. Journal of Geophysical Research: Planets, 116(E7), 2011. Article E00F16.

[89] David E. Smith, Maria T. Zuber, Sean C. Solomon, Roger J. Phillips, James W. Head, James B. Garvin, W. Bruce Banerdt, Duane O. Muhleman, Gordon H. Pettengill, Gregory A. Neumann, et al. The global topography of Mars and implications for surface evolution. Science, 284(5419):1495–1503, 1999.

[90] Douglass A. Alexander, Robert G. Deen, Paul M. Andres, Payam Zamani, Helen B. Mortensen, Amy C. Chen, Michael K. Cayanan, Jeffrey R. Hall, Vadim S. Klochko, Oleg Pariser, Carol L. Stanley, Charles K. Thompson, and Gary M. Yagi. Processing of Mars Exploration Rover imagery for science and operations planning. Journal of Geophysical Research: Planets, 111(E2), 2006. Article E02S02.

[91] Rongxing Li, Kaichang Di, Jue Wang, Xutong Niu, Sanchit Agarwal, Evgenia Brodyagina, Erik Oberg, and Ju Won Hwangbo. A WebGIS for spatial data processing, analysis, and distribution for the MER 2003 mission. Photogrammetric Engineering & Remote Sensing, 73(6):671–680, 2007.

[92] Alfred S. McEwen, Eric M. Eliason, James W. Bergstrom, Nathan T. Bridges, Candice J. Hansen, W. Alan Delamere, John A. Grant, Virginia C. Gulick, Kenneth E. Herkenhoff, Laszlo Keszthelyi, et al. Mars Reconnaissance Orbiter's High Resolution Imaging Science Experiment (HiRISE). Journal of Geophysical Research: Planets, 112(E5), 2007.

[93] Fengliang Xu. Mapping and localization for extraterrestrial robotic explorations. PhD thesis, Citeseer, 2004.

[94] F. Stein and G. Medioni. Map-based localization using the panoramic horizon. IEEE Transactions on Robotics and Automation, 11(6):892–896, Dec 1995.

[95] Patrick J.F. Carle, Paul T. Furgale, and Timothy D. Barfoot. Long-range rover localization by matching LIDAR scans to orbital elevation maps. Journal of Field Robotics, 27(3):344–370, 2010.

[96] R. L. Kirk, E. Howington-Kraus, M. R. Rosiek, J. A. Anderson, B. A. Archinal, K. J. Becker, D. A. Cook, D. M. Galuszka, P. E. Geissler, T. M. Hare, I. M. Holmberg, L. P. Keszthelyi, B. L. Redding, W. A. Delamere, D. Gallagher, J. D. Chapel, E. M. Eliason, R. King, and A. S. McEwen. Ultrahigh resolution topographic mapping of Mars with MRO HiRISE stereo images: Meter-scale slopes of candidate Phoenix landing sites. Journal of Geophysical Research: Planets, 113(E3), March 2008.

[97] R. L. Kirk, E. Howington-Kraus, M. R. Rosiek, J. A. Anderson, B. A. Archinal, K. J. Becker, D. A. Cook, D. M. Galuszka, P. E. Geissler, T. M. Hare, I. M. Holmberg, L. P. Keszthelyi, B. L. Redding, W. A. Delamere, D. Gallagher, J. D. Chapel, E. M. Eliason, R. King, and A. S. McEwen. Ultrahigh resolution topographic mapping of Mars with MRO HiRISE stereo images: Meter-scale slopes of candidate Phoenix landing sites. Journal of Geophysical Research: Planets, 113(E3), 2008.

[98] M. Pertile, S. Debei, and E. Lorenzini. Uncertainty analysis of a stereo system performing ego-motion measurements in a simulated planetary environment. Journal of Physics: Conference Series, 459(1):012056, 2013.

[99] Paul Furgale, Pat Carle, John Enright, and Timothy D. Barfoot. The Devon Island rover navigation dataset. The International Journal of Robotics Research, 2012.

[100] S. Chiodini, M. Pertile, and S. Debei. Visual odometry system performance for different landmark average distances. In 2016 IEEE Metrology for Aerospace (MetroAeroSpace), pages 382–387, June 2016.

[101] Richard Hartley and Andrew Zisserman. Multiple view geometry in computer vision. Cambridge University Press, 2003.

[102] M. Pertile, S. Chiodini, and S. Debei. Comparison of visual odometry systems suitable for planetary exploration. In Metrology for Aerospace (MetroAeroSpace), 2014 IEEE, pages 232–237, May 2014.

[103] Marco Pertile, Sebastiano Chiodini, Stefano Debei, and Enrico Lorenzini. Uncertainty comparison of three visual odometry systems in different operative conditions. Measurement, 78:388–396, 2016.

[104] V. Peretroukhin, J. Kelly, and T.D. Barfoot. Optimizing camera perspective for stereo visual odometry. In Computer and Robot Vision (CRV), 2014 Canadian Conference on, pages 1–7, May 2014.

[105] G. Dubbelman and F.C.A. Groen. Bias reduction for stereo based motion estimation with applications to large scale visual odometry. In Computer Vision and Pattern Recognition, 2009. CVPR 2009. IEEE Conference on, pages 2222–2229, June 2009.

[106] Joint Committee for Guides in Metrology. JCGM 101: Evaluation of measurement data - Supplement 1 to the "Guide to the expression of uncertainty in measurement" - Propagation of distributions using a Monte Carlo method. Technical report, JCGM, 2008.

[107] J. Delaune, G. Le Besnerais, T. Voirin, J.L. Farges, and C. Bourdarias. Visual–inertial navigation for pinpoint planetary landing using scale-based landmark matching. Robotics and Autonomous Systems, 78:63–82, 2016.

[108] Hauke Strasdat, J.M.M. Montiel, and Andrew J. Davison. Scale drift-aware large scale monocular SLAM. Robotics: Science and Systems VI, 2010.

[109] Berthold K.P. Horn. Closed-form solution of absolute orientation using unit quaternions. JOSA A, 4(4):629–642, 1987.
