A direct visual servoing‐based framework for the 2016 IROS Autonomous Drone Racing Challenge
Published in: Journal of Field Robotics, Vol. 35, No. 1, pp. 146–166
Format: Journal Article
Language: English
Published: Hoboken: Wiley Subscription Services, Inc., 01.01.2018
Summary: This paper presents a framework for navigating in obstacle‐dense environments as posed in the 2016 International Conference on Intelligent Robots and Systems (IROS) Autonomous Drone Racing Challenge. Our framework is based on direct visual servoing and leg‐by‐leg planning to navigate a complex environment filled with many similar frame‐shaped obstacles to fly through. Our indoor navigation method relies on velocity measurements from an optical flow sensor, since position measurements from GPS or external cameras are not available. For precision navigation through a sequence of obstacles, a center point–matching method is used with depth information from the onboard stereo camera. The guidance points are generated directly in three‐dimensional space from the two‐dimensional image data to avoid accumulating error from sensor drift. The proposed framework is implemented on a quadrotor‐based aerial vehicle, which carries an onboard vision‐processing computer for self‐contained operation. Using the proposed method, our drone finished in first place in the world‐premiere IROS Autonomous Drone Racing Challenge.
ISSN: 1556-4959, 1556-4967
DOI: 10.1002/rob.21743
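The guidance-point generation described in the summary — back-projecting a detected gate center from two-dimensional image coordinates into three-dimensional space using stereo depth — can be sketched with a standard pinhole camera model. The intrinsic matrix values and pixel coordinates below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def backproject(u, v, depth, K):
    """Back-project pixel (u, v) at the given depth (meters) into the
    camera frame using the pinhole model: P = depth * K^-1 @ [u, v, 1]."""
    pixel_h = np.array([u, v, 1.0])          # homogeneous pixel coordinates
    return depth * (np.linalg.inv(K) @ pixel_h)

# Illustrative intrinsics (fx, fy, cx, cy are assumed, not from the paper)
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Gate center detected at pixel (400, 260), stereo depth 3.0 m
guidance_point = backproject(400, 260, 3.0, K)
print(guidance_point)  # 3-D guidance point (x, y, z) in the camera frame
```

Generating the guidance point directly from a single image-plus-depth measurement, rather than integrating velocity estimates, is what keeps sensor drift from accumulating into the target position.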