SimpleMapping: Real-Time Visual-Inertial Dense Mapping with Deep Multi-View Stereo
- Resource Type
- Conference
- Authors
- Xin, Yingye; Zuo, Xingxing; Lu, Dongyue; Leutenegger, Stefan
- Source
- 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), pp. 273-282, Oct. 2023
- Subject
- Computing and Processing; General Topics for Engineers; Signal Processing and Analysis; Tracking loops; Three-dimensional displays; Simultaneous localization and mapping; Pose estimation; Real-time systems; 6-DOF; Robustness; Real-Time System; Dense Mapping; Depth Completion; Multi-view Stereo
- Language
- ISSN
- 2473-0726
We present a real-time visual-inertial dense mapping method capable of performing incremental 3D mesh reconstruction with high quality using only sequential monocular images and inertial measurement unit (IMU) readings. 6-DoF camera poses are estimated by a robust feature-based visual-inertial odometry (VIO), which also generates noisy sparse 3D map points as a by-product. We propose a sparse point aided multi-view stereo neural network (SPA-MVSNet) that can effectively leverage the informative but noisy sparse points from the VIO system. The sparse depth from VIO is first completed by a single-view depth completion network. This dense depth map, although naturally limited in accuracy, is then used as a prior to guide our MVS network in cost volume generation and regularization for accurate dense depth prediction. Depth maps predicted by the MVS network for keyframe images are incrementally fused into a global map using TSDF-Fusion. We extensively evaluate both the proposed SPA-MVSNet and the entire dense mapping system on several public datasets as well as our own dataset, demonstrating the system's strong generalization capabilities and its ability to deliver high-quality 3D reconstruction online. Our proposed dense mapping system achieves a 39.7% improvement in F-score over existing systems when evaluated on the challenging scenarios of the EuRoC dataset.
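The final stage of the pipeline described above, incrementally fusing predicted keyframe depth maps into a global volume via TSDF-Fusion, can be sketched as follows. This is a generic, minimal NumPy illustration of projective TSDF integration, not the authors' implementation; the function name `integrate_depth` and all parameter names are hypothetical.

```python
import numpy as np

def integrate_depth(tsdf, weights, depth, K, T_wc, origin, voxel_size, trunc):
    """Fuse one depth map into a TSDF volume (per-voxel projective update)."""
    nx, ny, nz = tsdf.shape
    ii, jj, kk = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz),
                             indexing="ij")
    # World coordinates of all voxel centers.
    pts_w = origin + (np.stack([ii, jj, kk], axis=-1) + 0.5) * voxel_size

    # Transform voxel centers into the camera frame and project with K.
    T_cw = np.linalg.inv(T_wc)
    pts_c = pts_w @ T_cw[:3, :3].T + T_cw[:3, 3]
    z = pts_c[..., 2]
    z_safe = np.where(z > 1e-6, z, 1.0)          # avoid divide-by-zero
    u = np.round(K[0, 0] * pts_c[..., 0] / z_safe + K[0, 2]).astype(int)
    v = np.round(K[1, 1] * pts_c[..., 1] / z_safe + K[1, 2]).astype(int)
    h, w = depth.shape
    valid = (z > 1e-6) & (u >= 0) & (u < w) & (v >= 0) & (v < h)

    # Signed distance along the viewing ray, truncated and normalized.
    d = np.zeros_like(z)
    d[valid] = depth[v[valid], u[valid]]
    sdf = d - z
    update = valid & (d > 0) & (sdf > -trunc)
    sdf_t = np.clip(sdf / trunc, -1.0, 1.0)

    # Weighted running average per voxel (weight 1 per observation).
    w_new = weights + update
    tsdf_new = np.where(update,
                        (tsdf * weights + sdf_t) / np.maximum(w_new, 1.0),
                        tsdf)
    return tsdf_new, np.where(update, w_new, weights)
```

Calling `integrate_depth` once per MVS keyframe accumulates observations into the volume; the reconstructed mesh is then the zero level set of the TSDF (e.g. extracted with marching cubes). Real systems use voxel hashing or block allocation instead of a dense grid, but the per-voxel update rule is the same.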