Catch Me, If You Can! A Mediated Perception Approach Towards Fully Autonomous Drone Racing

Florian Ölsner, Stefan Milz
Proceedings of the NeurIPS 2019 Competition and Demonstration Track, PMLR 123:90-99, 2020.

Abstract

Automated flight, e.g., first-person-view drone racing, is a challenging task involving many sub-problems such as monocular object detection, 3D pose estimation, mapping, optimal path planning, and collision avoidance. To address this problem, we propose an intuitive solution for the NeurIPS 2019 Game of Drones competition, in particular the perception-focused tier. We formulate a modular system composed of three layers: machine-learning-based perception, mapping, and planning. At its core is a robust gate detection for target guidance, accompanied by monocular depth estimation for collision avoidance. The estimated targets are used to create and update the 3D gate positions within a map. Rule-based trajectory planning is finally used for optimal flying. Our approach runs in real time on a state-of-the-art GPU and robustly navigates different simulated race tracks under challenging conditions, e.g., high speeds, confusing gate positioning, and irregular gate shapes. Our approach ranked 3rd on the final leaderboard. In this paper we present our system design in detail and provide additional experimental results.
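
To make the three-layer structure described above concrete, the following is a minimal, self-contained sketch of how a perception → mapping → planning loop of this kind could be wired together. All class names, method names, and the smoothing rule are hypothetical illustrations chosen for this sketch; they are not the authors' implementation and not the Game of Drones / AirSim API.

```python
# Hypothetical sketch of a "perception -> mapping -> planning" loop.
# Names and the update rule are illustrative assumptions, not the paper's code.
from dataclasses import dataclass, field

import numpy as np


@dataclass
class GateObservation:
    """One gate detection lifted to a 3D world-frame position via monocular depth."""
    gate_id: int
    position: np.ndarray


@dataclass
class GateMap:
    """Running estimate of each gate's 3D position."""
    positions: dict = field(default_factory=dict)

    def update(self, obs: GateObservation, alpha: float = 0.3) -> None:
        # Exponential smoothing over repeated observations of the same gate
        # (a simple stand-in for whatever filtering the full system uses).
        if obs.gate_id in self.positions:
            self.positions[obs.gate_id] = (
                (1 - alpha) * self.positions[obs.gate_id] + alpha * obs.position
            )
        else:
            self.positions[obs.gate_id] = obs.position


def plan_waypoint(drone_pos: np.ndarray, gate_map: GateMap, next_gate: int) -> np.ndarray:
    """Rule-based planning: fly straight towards the next mapped gate center."""
    target = gate_map.positions.get(next_gate)
    if target is None:
        return drone_pos  # no estimate yet: hold position (placeholder rule)
    return target


if __name__ == "__main__":
    gate_map = GateMap()
    # Two noisy observations of gate 0, e.g. from successive camera frames.
    gate_map.update(GateObservation(0, np.array([10.0, 0.0, -2.0])))
    gate_map.update(GateObservation(0, np.array([10.4, 0.2, -2.1])))
    waypoint = plan_waypoint(np.zeros(3), gate_map, next_gate=0)
    print("fly towards:", waypoint)
```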

Cite this Paper


BibTeX
@InProceedings{pmlr-v123-olsner20a,
  title     = {Catch Me, If You Can! A Mediated Perception Approach Towards Fully Autonomous Drone Racing},
  author    = {\"Olsner, Florian and Milz, Stefan},
  booktitle = {Proceedings of the NeurIPS 2019 Competition and Demonstration Track},
  pages     = {90--99},
  year      = {2020},
  editor    = {Escalante, Hugo Jair and Hadsell, Raia},
  volume    = {123},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--14 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v123/olsner20a/olsner20a.pdf},
  url       = {https://proceedings.mlr.press/v123/olsner20a.html},
  abstract  = {Automated flight, e.g. first person view drone racing is a challenging task involving many sub-problems like monocular object detection, 3D pose estimation, mapping, optimal path planning and collision avoidance. Treating this problem, we propose an intuitive solution for the NeurIPS (2019) Game of Drones competition, especially the perception focused tier. We formulate a modular system composed of three layers: machine learning based perception, mapping and planning. Fundamental is a robust gate detection for target guidance accompanied with a monocular depth estimation for collision avoidance. The estimated targets are used to create and update the 3D gate positions within a map. Rule based trajectory planning is finally used for optimal flying. Our approach runs in real-time on a state of the art GPU and is able to robustly navigate through different simulated race tracks under challenging conditions, e.g. high speeds, confusing gate positioning and irregular shapes. Our approach ranks on the 3rd place on the final leader board. In this paper we present our system design in detail and provide additional experimental results.}
}
Endnote
%0 Conference Paper
%T Catch Me, If You Can! A Mediated Perception Approach Towards Fully Autonomous Drone Racing
%A Florian Ölsner
%A Stefan Milz
%B Proceedings of the NeurIPS 2019 Competition and Demonstration Track
%C Proceedings of Machine Learning Research
%D 2020
%E Hugo Jair Escalante
%E Raia Hadsell
%F pmlr-v123-olsner20a
%I PMLR
%P 90--99
%U https://proceedings.mlr.press/v123/olsner20a.html
%V 123
%X Automated flight, e.g. first person view drone racing is a challenging task involving many sub-problems like monocular object detection, 3D pose estimation, mapping, optimal path planning and collision avoidance. Treating this problem, we propose an intuitive solution for the NeurIPS (2019) Game of Drones competition, especially the perception focused tier. We formulate a modular system composed of three layers: machine learning based perception, mapping and planning. Fundamental is a robust gate detection for target guidance accompanied with a monocular depth estimation for collision avoidance. The estimated targets are used to create and update the 3D gate positions within a map. Rule based trajectory planning is finally used for optimal flying. Our approach runs in real-time on a state of the art GPU and is able to robustly navigate through different simulated race tracks under challenging conditions, e.g. high speeds, confusing gate positioning and irregular shapes. Our approach ranks on the 3rd place on the final leader board. In this paper we present our system design in detail and provide additional experimental results.
APA
Ölsner, F. & Milz, S. (2020). Catch Me, If You Can! A Mediated Perception Approach Towards Fully Autonomous Drone Racing. Proceedings of the NeurIPS 2019 Competition and Demonstration Track, in Proceedings of Machine Learning Research 123:90-99. Available from https://proceedings.mlr.press/v123/olsner20a.html.
