Matching of ground-based LiDAR and aerial image data for mobile robot localization in densely forested environments
- Resource Type
- Conference
- Authors
- Hussein, Marwan; Renner, Matthew; Watanabe, Masaaki; Iagnemma, Karl
- Source
- 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 1432-1437, Nov. 2013
- Subject
- Robotics and Control Systems
- Vegetation
- Laser radar
- Accuracy
- Global Positioning System
- Estimation
- Iterative closest point algorithm
- Image resolution
- Language
- ISSN
- 2153-0858
- 2153-0866
We present a vision-based method for the autonomous geolocation of ground vehicles and unmanned mobile robots in forested environments. The method estimates the global horizontal position of a vehicle solely by finding a geometric match between a map of observed tree stems, scanned in 3D by sensors onboard the vehicle, and another stem map generated from the structure of tree crowns observed in overhead imagery of the forest canopy. The method can be used in real time as a complement to the Global Positioning System (GPS) in areas where signal coverage is inadequate due to attenuation by the forest canopy, or due to intentional denial of access. The method presented in this paper has two significant properties: i) it does not require a priori knowledge of the area surrounding the robot, and ii) it uses the geometry of detected tree stems as the only input to determine horizontal geoposition.
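The subject terms list the iterative closest point (ICP) algorithm, which is a standard way to register one 2D stem map to another. The sketch below is not the authors' implementation; it is a minimal illustration, under assumed inputs, of how a ground-scanned stem map (one x-y point per detected stem) might be rigidly aligned to a canopy-derived stem map via nearest-neighbour ICP with a Kabsch/SVD transform estimate. The function name `icp_2d` and the toy coordinates are hypothetical.

```python
import numpy as np

def icp_2d(source, target, iters=20):
    """Align a 2D source stem map to a target stem map with rigid ICP.

    source, target: (N, 2) and (M, 2) arrays of stem positions (metres).
    Returns the accumulated rotation R (2x2), translation t (2,), and the
    transformed source points. Assumes rough initial overlap, as ICP does.
    """
    src = source.copy().astype(float)
    R_total = np.eye(2)
    t_total = np.zeros(2)
    for _ in range(iters):
        # Brute-force nearest-neighbour correspondences.
        d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        matched = target[d.argmin(axis=1)]
        # Best-fit rigid transform between matched centroids (Kabsch).
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:          # guard against a reflection
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        # Apply the incremental transform and accumulate it.
        src = src @ R.T + t
        R_total = R @ R_total
        t_total = R @ t_total + t
    return R_total, t_total, src

# Toy usage: a canopy-derived map and a perturbed onboard scan of it.
canopy = np.array([[0.0, 0.0], [5.0, 1.0], [2.0, 7.0], [8.0, 4.0], [3.0, 3.0]])
th = 0.05  # small heading error (rad)
Rm = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
onboard = canopy @ Rm.T + np.array([0.3, -0.2])
R, t, aligned = icp_2d(onboard, canopy)
```

The recovered `R` and `t` give the vehicle's pose correction relative to the georeferenced canopy map; in practice the matching must also tolerate missing and spurious stems, which plain nearest-neighbour ICP does not address.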