Seminar: Multi-Modal Perception for Mobile Robotics

Seminar Description

Accurate and robust state estimation of the robot's environment is at the core of many robotic tasks such as obstacle avoidance or planning. These tasks are essential for robotic applications in real-world scenarios, such as industrial inspection or agricultural robotics. Recent advances have brought new types of sensors into the field, each unlocking new ways to improve the performance of the perception system, but also coming with its own set of challenges. Finding ways to fuse them wisely with conventional sensors is a key challenge in current robotics research.

In this seminar, we will cover topics from the research areas introduced above. The presented works will range from classical, widely used publications to very recent developments driven by advances in deep learning. We will discuss how new sensor types (e.g., event cameras) or the fusion of additional sensor modalities (e.g., LiDAR, GPS) can enhance the performance of classical vision-only approaches.

Organization

General

Preliminary Meeting

Important Note

We require some form of pre-registration (in addition to the registration via the TUM matching system). Students who are unable to attend the preliminary meeting should therefore send us an e-mail expressing their interest in the seminar, so that we can provide them with the necessary information and material for registering for the seminar.

Material

Course-related content can be found here.