
Dense Map Representations

Scalable Volumetric TSDF and Occupancy Mapping (SRL at Imperial College)

More recently, we have been exploring alternative map representations, such as octrees that encode Truncated Signed Distance Fields (TSDFs) or occupancy values in a volumetric manner. These lend themselves to efficient memory usage, fast access and spatial scalability, and they can be interfaced directly with robotic motion planning. Furthermore, the hierarchical structure allows for adaptive-resolution mapping: we would like to represent fine details when the camera is close to structure, but maintain the ability to map more coarsely when it is far away. This way, aliasing artefacts are minimised and gains in both tracking and mapping speed are obtained.
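A minimal sketch of the two ideas above, not the SRL implementation: pick a coarser octree level as the observed surface gets farther from the camera (matching voxel size to the back-projected pixel footprint), and fuse depth into the chosen voxels with the standard weighted-average TSDF update. All names and parameter values (focal_px, finest_voxel_size, truncation, ...) are illustrative assumptions.

```python
import numpy as np

def octree_level_for_depth(depth_m, focal_px, finest_voxel_size=0.002, max_level=8):
    """Choose a coarser octree level as the measured surface gets farther away.

    The back-projected footprint of one pixel at distance d is roughly d / f;
    we pick the level whose voxel size best matches that footprint, so close-up
    structure is mapped finely and far-away structure coarsely.
    """
    pixel_footprint = depth_m / focal_px  # metres per pixel at this depth
    level = int(np.floor(np.log2(max(pixel_footprint / finest_voxel_size, 1.0))))
    return min(level, max_level)          # 0 = finest resolution

def update_tsdf(tsdf, weight, sdf_sample, truncation=0.05, max_weight=100.0):
    """Weighted running average of truncated signed distance values for one voxel."""
    sdf_clamped = np.clip(sdf_sample, -truncation, truncation)
    new_weight = min(weight + 1.0, max_weight)
    new_tsdf = (tsdf * weight + sdf_clamped) / new_weight
    return new_tsdf, new_weight

if __name__ == "__main__":
    # A surface seen at 0.4 m is integrated at the finest level; farther away, coarser levels are used.
    for d in (0.4, 2.5, 8.0):
        print(f"depth {d:4.1f} m -> octree level {octree_level_for_depth(d, focal_px=525.0)}")
    print(update_tsdf(tsdf=0.0, weight=0.0, sdf_sample=0.03))
```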

Current collaborators:

Former collaborators:

Collaboration within the ORCA Hub:

Dense RGB-D Surfel Mapping (Dyson Robotics Lab at Imperial College)

In joint work between Thomas Whelan, myself, Renato Salas-Moreno, Ben Glocker and Andrew Davison, we perform RGB-D SLAM with both local and large-scale loop closures, where a dense surfel map is aligned and deformed in real time in order to continuously improve reconstruction consistency.
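The non-rigid map correction can be pictured as an embedded deformation graph warping the surfels after a loop closure: each surfel is moved by a weighted blend of the affine transformations attached to its nearest graph nodes. The sketch below assumes Sumner-style inverse-distance blending weights and made-up node transforms; it is an illustration, not the ElasticFusion code.

```python
import numpy as np

def deform_surfels(positions, node_pos, node_R, node_t, k=4):
    """Warp surfel positions with a deformation graph.

    positions : (N, 3) surfel positions
    node_pos  : (M, 3) graph node positions g_j
    node_R    : (M, 3, 3) per-node rotations R_j
    node_t    : (M, 3) per-node translations t_j
    """
    warped = np.empty_like(positions)
    for i, v in enumerate(positions):
        d = np.linalg.norm(node_pos - v, axis=1)   # distance to every graph node
        nn = np.argsort(d)[:k]                     # k nearest graph nodes
        w = 1.0 / (d[nn] + 1e-6)
        w /= w.sum()                               # normalised blending weights
        # v' = sum_j w_j [ R_j (v - g_j) + g_j + t_j ]
        warped[i] = sum(
            w_j * (node_R[j] @ (v - node_pos[j]) + node_pos[j] + node_t[j])
            for w_j, j in zip(w, nn)
        )
    return warped
```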

As an extension, we also propose to use inertial measurements in the tracking step of ElasticFusion, where they can be combined in a probabilistically meaningful way with the photometric and geometric cost terms. Furthermore, since acceleration measurements render the gravity direction globally observable, we can include additional (soft) constraints in the map deformations so that they remain consistent with gravity. We have also worked on integrating wheel odometry and arm kinematics from a mobile manipulator robot.
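One way to make such a combination probabilistically meaningful is to weight each residual block by the inverse of its measurement covariance, and to express the gravity constraint as a small soft residual. The sketch below assumes placeholder noise values and residual shapes; it only illustrates the weighting and the gravity-alignment term, not the published system.

```python
import numpy as np

def combined_cost(r_photo, r_geom, r_inertial,
                  sigma_photo=5.0, sigma_geom=0.01, sigma_inertial=0.05):
    """Sum of squared residuals, each block scaled by 1/sigma^2 (its information)."""
    return (np.sum((r_photo / sigma_photo) ** 2)
            + np.sum((r_geom / sigma_geom) ** 2)
            + np.sum((r_inertial / sigma_inertial) ** 2))

def gravity_alignment_residual(R_wb, accel_body, g_world=np.array([0.0, 0.0, -9.81])):
    """Soft constraint: the (low-pass filtered) accelerometer reading, rotated into the
    world frame, should agree with gravity; the residual is the angle between the two."""
    g_pred = R_wb @ (-accel_body)  # stationary accelerometer measures -gravity in the body frame
    cos_a = np.dot(g_pred, g_world) / (np.linalg.norm(g_pred) * np.linalg.norm(g_world))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))
```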

Current collaborators:

Former collaborators: