A synchronized visual-inertial sensor system with FPGA pre-processing for accurate real-time SLAM (bibtex)
by J Nikolic, J Rehder, M Burri, P Gohl, S Leutenegger, PT Furgale and R Siegwart
Reference:
A synchronized visual-inertial sensor system with FPGA pre-processing for accurate real-time SLAM (J Nikolic, J Rehder, M Burri, P Gohl, S Leutenegger, PT Furgale and R Siegwart), In 2014 IEEE international conference on robotics and automation (ICRA), 2014. 
Bibtex Entry:
@inproceedings{nikolic2014synchronized,
 title = {A synchronized visual-inertial sensor system with FPGA pre-processing for accurate real-time SLAM},
 author = {J Nikolic and J Rehder and M Burri and P Gohl and S Leutenegger and PT Furgale and R Siegwart},
 booktitle = {2014 IEEE international conference on robotics and automation (ICRA)},
 pages = {431--437},
 year = {2014},
 organization = {IEEE},
 keywords = {multisensorslam},
}

Multi-Sensor SLAM

Keyframe-Based Visual-Inertial Odometry and SLAM Using Nonlinear Optimisation

Here, we fuse inertial measurements with visual measurements: owing to the complementary characteristics of these two sensing modalities, their combination has become a popular choice for accurate SLAM in mobile robotics. While the problem has historically been addressed with filtering, advances in visual estimation suggest that non-linear optimisation offers superior accuracy while remaining computationally tractable thanks to the sparsity of the underlying problem. Inspired by these findings, we formulate a probabilistic cost function that combines reprojection errors of landmarks with inertial error terms. We ensure real-time operation by limiting the optimisation to a bounded window of keyframes and applying various marginalisation strategies. Keyframes may be spaced arbitrarily in time, while older measurements are retained as linearised error terms.
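The combined objective described above can be sketched as follows. The notation here is illustrative (symbol names are assumptions, not taken from the papers): with estimated states x, the cost sums weighted visual reprojection residuals e_r over cameras i, keyframes k, and visible landmarks j, plus inertial residuals e_s between consecutive keyframes, each weighted by an information matrix W:

```latex
J(\mathbf{x}) \;=\;
\underbrace{\sum_{i}\sum_{k}\sum_{j \in \mathcal{J}(i,k)}
  {\mathbf{e}_{r}^{i,j,k}}^{\top}\,\mathbf{W}_{r}^{i,j,k}\,\mathbf{e}_{r}^{i,j,k}}_{\text{visual reprojection terms}}
\;+\;
\underbrace{\sum_{k=1}^{K-1}
  {\mathbf{e}_{s}^{k}}^{\top}\,\mathbf{W}_{s}^{k}\,\mathbf{e}_{s}^{k}}_{\text{inertial terms}}
```

Minimising J over the bounded keyframe window is a sparse non-linear least-squares problem; marginalised-out states contribute the linearised error terms mentioned above.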

Former collaborators:

Optical Flow and SLAM with Event Cameras (Imperial College)

Event cameras are novel camera systems that sense intensity changes independently per pixel and report these events of change (brighter or darker by a specific amount) with very accurate timestamps. As such, they are inspired by the biological retina and offer the potential to overcome the difficulties with motion blur and limited dynamic range that standard frame-based cameras face.

We have been looking at two different challenges. First, we reconstruct both video and optical flow directly from the events; this approach should be able to deal with arbitrary scene content. Second, we tackle the reconstruction of semi-dense depth and intensity keyframes together with general camera motion, where the scene is assumed to be static, which amounts to SLAM with an event camera.
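To make the event-camera data model concrete, here is a minimal sketch of the usual per-pixel brightness model: each event (x, y, t, p) reports that the log-intensity at pixel (x, y) changed by a contrast threshold C with polarity p at time t. Accumulating events therefore yields a (noisy) log-intensity change map, which is the starting point for the video-reconstruction work described above. The event list and the value of C below are hypothetical, for illustration only:

```python
import numpy as np

C = 0.2        # contrast threshold (hypothetical value; device-dependent)
H, W = 4, 4    # tiny sensor for illustration

# Accumulated per-pixel log-intensity change since the start.
log_intensity = np.zeros((H, W))

# Hypothetical event stream: (x, y, timestamp [s], polarity in {+1, -1}).
events = [
    (0, 0, 0.010, +1),
    (1, 2, 0.020, -1),
    (0, 0, 0.030, +1),
]

# Each event adds a signed contrast step at its pixel.
for x, y, t, p in events:
    log_intensity[y, x] += p * C

print(log_intensity[0, 0])  # two positive events at (0, 0): 0.4
print(log_intensity[2, 1])  # one negative event at (1, 2): -0.2
```

Real reconstruction methods additionally regularise this accumulation (the raw sum drifts with sensor noise) and estimate optical flow or camera motion jointly, but the event-to-brightness relation is as above.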

Former collaborators: