Dense Map Representations

Scalable Volumetric TSDF and Occupancy Mapping (SRL at Imperial College)

More recently, we have been exploring alternative map representations, such as octrees that encode Truncated Signed Distance Fields (TSDF) or occupancy values volumetrically. These lend themselves to efficient memory usage, fast access and spatial scalability, and can be interfaced directly with robotic motion planning. Furthermore, the hierarchical structure allows for adaptive-resolution mapping: we represent fine details when the camera is close to structure, but retain the ability to map more coarsely when it is far away. This minimises aliasing artefacts and yields speed gains in both tracking and mapping.
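
To make the volumetric fusion and adaptive-resolution idea concrete, the sketch below shows a weighted TSDF update for a single voxel together with a simple depth-based choice of octree integration level. It is only a minimal, self-contained illustration under assumed names and thresholds, not the actual implementation used in the papers listed below.

// Minimal sketch (illustrative, not library code) of a weighted TSDF update
// for one voxel, plus a depth-based choice of octree level for adaptive resolution.
#include <algorithm>
#include <cmath>
#include <cstdio>

struct TsdfVoxel {
  float sdf = 0.f;     // truncated signed-distance estimate
  float weight = 0.f;  // accumulated integration weight
};

// Fuse a new truncated signed-distance sample into a voxel via a running
// average; the new sample carries unit weight and weights are capped.
void integrate(TsdfVoxel& v, float sample_sdf, float truncation, float max_weight = 100.f) {
  const float d = std::clamp(sample_sdf, -truncation, truncation);
  v.sdf = (v.sdf * v.weight + d) / (v.weight + 1.f);
  v.weight = std::min(v.weight + 1.f, max_weight);
}

// Pick an octree level for integration: finest when the surface is close to
// the camera, coarser as depth grows (illustrative heuristic, assumed values).
int integration_level(float depth_m, int max_level) {
  const float finest_range = 1.5f;  // assumed range still mapped at the finest level
  const int coarsen = static_cast<int>(std::floor(std::log2(std::max(depth_m / finest_range, 1.f))));
  return std::max(max_level - coarsen, 0);
}

int main() {
  TsdfVoxel v;
  integrate(v, 0.04f, 0.1f);   // sample 4 cm in front of the surface, 10 cm truncation
  integrate(v, -0.02f, 0.1f);  // later sample 2 cm behind the surface
  std::printf("sdf=%.3f  weight=%.1f  level at 4 m depth = %d\n",
              v.sdf, v.weight, integration_level(4.f, 8));
}

The level heuristic coarsens by one octree level each time the measured depth doubles beyond an assumed near range, reflecting the intuition that distant, noisier measurements should not be fused into the finest voxels.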


2019
Conference and Workshop Papers
Adaptive-resolution octree-based volumetric SLAM (E Vespa, N Funk, PH Kelly and S Leutenegger), In 2019 International Conference on 3D Vision (3DV), 2019.
2018
Journal Articles
Efficient octree-based volumetric SLAM supporting signed-distance and occupancy mapping (E Vespa, N Nikolov, M Grimm, L Nardi, PH Kelly and S Leutenegger), In IEEE Robotics and Automation Letters, IEEE, volume 3, 2018.

Dense RGB-D Surfel Mapping (Dyson Robotics Lab at Imperial College)

In joint work with Thomas Whelan, Renato Salas-Moreno, Ben Glocker and Andrew Davison, we perform RGB-D SLAM with both local and large-scale loop closures, where a dense surfel map is aligned and deformed in real time in order to continuously improve reconstruction consistency.
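
For intuition, a surfel map stores oriented discs whose attributes are refined by confidence-weighted averaging as new depth measurements are associated with them. The sketch below illustrates just that fusion step; the struct and field names are assumptions for illustration rather than the ElasticFusion code, and the data association, deformation graph and loop-closure machinery described above are omitted.

// Minimal sketch of confidence-weighted surfel fusion (illustrative only).
#include <array>
#include <cmath>
#include <cstdio>

struct Surfel {
  std::array<float, 3> position{};
  std::array<float, 3> normal{};
  float radius = 0.f;
  float confidence = 0.f;
};

// Fuse an associated measurement into an existing surfel by weighted averaging.
void fuse(Surfel& s, const Surfel& measurement) {
  const float w0 = s.confidence, w1 = measurement.confidence, w = w0 + w1;
  for (int i = 0; i < 3; ++i) {
    s.position[i] = (w0 * s.position[i] + w1 * measurement.position[i]) / w;
    s.normal[i]   = (w0 * s.normal[i]   + w1 * measurement.normal[i])   / w;
  }
  const float n = std::sqrt(s.normal[0] * s.normal[0] +
                            s.normal[1] * s.normal[1] +
                            s.normal[2] * s.normal[2]);
  for (int i = 0; i < 3; ++i) s.normal[i] /= n;  // re-normalise the averaged normal
  s.radius = (w0 * s.radius + w1 * measurement.radius) / w;
  s.confidence = w;
}

int main() {
  Surfel model{{0.f, 0.f, 1.00f}, {0.f, 0.f, 1.f}, 0.01f, 2.f};
  Surfel meas {{0.f, 0.f, 1.02f}, {0.f, 0.f, 1.f}, 0.01f, 1.f};
  fuse(model, meas);
  std::printf("fused depth = %.4f m, confidence = %.1f\n", model.position[2], model.confidence);
}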

As an extension, we also propose to use inertial measurements in the tracking step of ElasticFusion, combining them in a probabilistically meaningful way with the photometric and geometric cost terms. Moreover, since acceleration measurements render the gravity direction globally observable, we can include additional (soft) constraints in the map deformations so that they remain consistent with gravity. We have also worked on integrating wheel odometry and arm kinematics from a mobile manipulator robot.
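
The combination of heterogeneous measurements can be illustrated with an information-weighted (inverse-covariance) sum of the individual error terms, which is the standard way to make photometric, geometric and inertial residuals commensurable. The sketch below is purely illustrative: the term names, noise values and scalar form are assumptions and do not reproduce the actual tracking objective used in the papers.

// Minimal sketch of inverse-covariance weighting of heterogeneous cost terms.
#include <cstdio>

struct CostTerm {
  double residual_sq;  // sum of squared residuals of this term
  double sigma;        // assumed measurement standard deviation
};

// Information-weighted total cost: each term contributes r^2 / sigma^2, so a
// calibrated noise model balances measurements with very different units.
double combined_cost(const CostTerm& photometric, const CostTerm& geometric, const CostTerm& inertial) {
  auto weighted = [](const CostTerm& t) { return t.residual_sq / (t.sigma * t.sigma); };
  return weighted(photometric) + weighted(geometric) + weighted(inertial);
}

int main() {
  const CostTerm photometric{4.0e2, 10.0};  // intensity residuals (grey levels), assumed values
  const CostTerm geometric{2.5e-3, 0.01};   // point-to-plane ICP residuals (metres), assumed values
  const CostTerm inertial{1.0e-2, 0.05};    // preintegrated IMU residuals, assumed values
  std::printf("combined cost E = %.2f\n", combined_cost(photometric, geometric, inertial));
}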


2019
Conference and Workshop Papers
KO-Fusion: dense visual SLAM with tightly-coupled kinematic and odometric tracking (C Houseago, M Bloesch and S Leutenegger), In 2019 International Conference on Robotics and Automation (ICRA), 2019.
2017
Conference and Workshop Papers
Dense RGB-D-inertial SLAM with map deformations (T Laidlow, M Bloesch, W Li and S Leutenegger), In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2017.
2016
Journal Articles
ElasticFusion: Real-time dense SLAM and light source estimation (T Whelan, RF Salas-Moreno, B Glocker, AJ Davison and S Leutenegger), In The International Journal of Robotics Research, SAGE Publications, volume 35, 2016.
2015
Conference and Workshop Papers
ElasticFusion: Dense SLAM without a pose graph (T Whelan, S Leutenegger, RF Salas-Moreno, B Glocker and AJ Davison), In Robotics: Science and Systems, 2015.