KO-Fusion: dense visual SLAM with tightly-coupled kinematic and odometric tracking (bibtex)
by C Houseago, M Bloesch and S Leutenegger
Reference:
KO-Fusion: dense visual SLAM with tightly-coupled kinematic and odometric tracking (C Houseago, M Bloesch and S Leutenegger), In 2019 International Conference on Robotics and Automation (ICRA), 2019. 
Bibtex Entry:
@inproceedings{houseago2019ko,
 title = {KO-Fusion: dense visual SLAM with tightly-coupled kinematic and odometric tracking},
 author = {C Houseago and M Bloesch and S Leutenegger},
 booktitle = {2019 International Conference on Robotics and Automation (ICRA)},
 pages = {4054--4060},
 year = {2019},
 organization = {IEEE},
 keywords = {multisensorslam, denseslam, elasticfusion},
}

Dense Map Representations

Scalable Volumetric TSDF and Occupancy Mapping (SRL at Imperial College)

More recently, we have been exploring alternative map representations, such as octrees that encode Truncated Signed Distance Fields (TSDF) or occupancy values volumetrically. These lend themselves to efficient memory usage, fast access, and spatial scalability, and they can be interfaced directly with robotic motion planning. Furthermore, the hierarchical structure allows for adaptive-resolution mapping: we would like to represent fine details when the camera is close to the structure, while retaining the ability to map more coarsely when it is far away. In this way, aliasing artefacts are minimised, and speed gains are obtained in both tracking and mapping.
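To give a flavour of the underlying fusion rule, here is a minimal sketch of the standard weighted-running-average TSDF update that volumetric approaches of this kind build on. This is an illustrative toy (function name and array layout are our own, not the SRL octree implementation), and it ignores the octree structure entirely, operating on a flat voxel array:

```python
import numpy as np

def integrate_tsdf(tsdf, weights, sdf_obs, trunc=0.1, max_weight=100.0):
    """Fuse one frame of signed-distance observations into a voxel grid
    via the classic weighted running average. `sdf_obs` holds the
    per-voxel signed distance to the observed surface for this frame."""
    # Distances beyond the truncation band carry no surface information,
    # so clip them to [-trunc, trunc].
    d = np.clip(sdf_obs, -trunc, trunc)
    w_new = 1.0  # per-observation weight (could depend on depth/angle)
    fused = (weights * tsdf + w_new * d) / (weights + w_new)
    # Cap the accumulated weight so the map can still adapt to change.
    new_weights = np.minimum(weights + w_new, max_weight)
    return fused, new_weights
```

In an octree, the same update would simply be applied at the tree level chosen for the current camera distance, which is what yields the adaptive resolution described above.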

Current collaborators:

Former collaborators:

Collaboration within the ORCA Hub:

Dense RGB-D Surfel Mapping (Dyson Robotics Lab at Imperial College)

In joint work with Thomas Whelan, Renato Salas-Moreno, Ben Glocker and Andrew Davison, we perform RGB-D SLAM with both local and large-scale loop closures, where a dense surfel map is aligned and deformed in real time in order to continuously improve reconstruction consistency.
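The core of a surfel map is that each surface element keeps a confidence that controls how strongly new measurements update it. The following is a simplified, hypothetical sketch of such a confidence-weighted surfel update (not the actual ElasticFusion update rule; names and the scalar-confidence model are our own):

```python
import numpy as np

def fuse_surfel(pos, normal, conf, meas_pos, meas_normal, meas_conf):
    """Confidence-weighted averaging of a surfel's position and normal
    with a new measurement; the confidence accumulates over time."""
    total = conf + meas_conf
    new_pos = (conf * pos + meas_conf * meas_pos) / total
    # Average the normals with the same weights, then renormalise.
    n = conf * normal + meas_conf * meas_normal
    new_normal = n / np.linalg.norm(n)
    return new_pos, new_normal, total
```

A real system would additionally store radius and colour per surfel, and the loop-closure machinery would deform the surfel positions themselves via a deformation graph.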

As an extension, we also propose to use inertial measurements in the tracking step of ElasticFusion; these can be combined in a probabilistically meaningful way with the photometric and geometric cost terms. Moreover, since the availability of acceleration measurements renders the gravity direction globally observable, we can include additional (soft) constraints in the map deformations so that they remain consistent with gravity. In related work, we have also integrated wheel odometry and arm kinematics from a mobile manipulator robot.
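"Probabilistically meaningful" combination here amounts to weighting each modality's residuals by its (assumed) measurement noise, i.e. an inverse-variance weighted sum of squared residuals. A minimal sketch, assuming per-modality noise standard deviations (the sigma values and residual vectors below are placeholders, not quantities from the paper):

```python
import numpy as np

def combined_cost(r_photo, r_geom, r_inertial,
                  sigma_photo, sigma_geom, sigma_inertial):
    """Inverse-variance weighted sum of squared residuals from three
    modalities: photometric, geometric (e.g. ICP point-to-plane) and
    inertial. Each residual is scaled by its assumed noise std before
    squaring, so better-calibrated sensors dominate the estimate."""
    return (np.sum((r_photo / sigma_photo) ** 2)
            + np.sum((r_geom / sigma_geom) ** 2)
            + np.sum((r_inertial / sigma_inertial) ** 2))
```

Minimising a cost of this form over the camera pose is equivalent to maximum-likelihood fusion under independent Gaussian noise, which is what makes the combination principled rather than an ad-hoc weighted blend.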

Current collaborators:

Former collaborators: