Efficient Optical Flow and Stereo Vision for Velocity Estimation and Obstacle Avoidance on an Autonomous Pocket Drone

Bibliographic Details
Published in IEEE Robotics and Automation Letters, Vol. 2, No. 2, pp. 1070-1076
Main Authors McGuire, Kimberly, de Croon, Guido, De Wagter, Christophe, Tuyls, Karl, Kappen, Hilbert
Format Journal Article
Language English
Published IEEE 01.04.2017
Subjects
ISSN 2377-3766
EISSN 2377-3766
DOI 10.1109/LRA.2017.2658940

Abstract Micro Aerial Vehicles (MAVs) are very suitable for flying in indoor environments, but autonomous navigation is challenging due to their strict hardware limitations. This paper presents a highly efficient computer vision algorithm called Edge-FS for the determination of velocity and depth. It runs at 20 Hz on a 4 g stereo camera with an embedded STM32F4 microprocessor (168 MHz, 192 kB) and uses edge distributions to calculate optical flow and stereo disparity. The stereo-based distance estimates are used to scale the optical flow in order to retrieve the drone's velocity. The velocity and depth measurements are used for fully autonomous flight of a 40 g pocket drone relying only on on-board sensors. This method allows the MAV to control its velocity and avoid obstacles.
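The abstract describes the essence of Edge-FS: compress each image into a one-dimensional edge distribution (a per-column edge histogram), match these histograms over time to obtain optical flow and between the left and right camera to obtain stereo disparity, and use the stereo-derived depth to scale the flow into a metric velocity. The Python sketch below illustrates that idea under strong simplifying assumptions (a single global 1-D shift per frame; the focal length and baseline values are placeholders); it is not the authors' embedded STM32F4 implementation.

```python
import numpy as np

def edge_histogram(gray):
    """Per-column edge distribution: sum of absolute horizontal gradients."""
    grad = np.abs(np.diff(gray.astype(np.float32), axis=1))
    return grad.sum(axis=0)

def best_shift(hist_a, hist_b, max_shift=20):
    """Pixel shift that best aligns hist_b with hist_a (1-D SAD search)."""
    n = len(hist_a)
    best_s, best_cost = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        lo, hi = max(0, s), min(n, n + s)
        cost = np.mean(np.abs(hist_a[lo:hi] - hist_b[lo - s:hi - s]))
        if cost < best_cost:
            best_s, best_cost = s, cost
    return best_s

def velocity_and_depth(prev_left, curr_left, curr_right, dt,
                       focal_px=300.0, baseline_m=0.06):
    """Lateral velocity (m/s) and depth (m) from edge-histogram flow and disparity.

    focal_px and baseline_m are hypothetical values, not the actual
    parameters of the 4 g stereo camera used in the paper.
    """
    # temporal shift of the left-camera edge histogram -> optical flow (pixels)
    flow_px = best_shift(edge_histogram(prev_left), edge_histogram(curr_left))
    # left/right shift of the edge histograms -> stereo disparity (pixels)
    disparity_px = best_shift(edge_histogram(curr_left), edge_histogram(curr_right))
    if disparity_px == 0:
        return None, None          # no usable disparity: the flow cannot be scaled
    depth_m = focal_px * baseline_m / abs(disparity_px)
    # pixel flow -> angular rate (rad/s), then scaled by depth to metres per second
    velocity_mps = (flow_px / focal_px) * depth_m / dt
    return velocity_mps, depth_m
```

Feeding two consecutive left-camera frames plus the matching right frame with dt = 0.05 s (the reported 20 Hz rate) yields one velocity/depth pair per frame; the actual Edge-FS method is considerably more refined than this single global-shift approximation.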
Author De Wagter, Christophe
Tuyls, Karl
Kappen, Hilbert
de Croon, Guido
McGuire, Kimberly
Author_xml – sequence: 1
  givenname: Kimberly
  surname: McGuire
  fullname: McGuire, Kimberly
  email: k.n.mcguire@tudelft.nl
  organization: Fac. of Aerosp. Eng., Delft Univ. of Technol., Delft, Netherlands
– sequence: 2
  givenname: Guido
  surname: de Croon
  fullname: de Croon, Guido
  email: g.c.h.e.decroon@tudelft.nl
  organization: Fac. of Aerosp. Eng., Delft Univ. of Technol., Delft, Netherlands
– sequence: 3
  givenname: Christophe
  surname: De Wagter
  fullname: De Wagter, Christophe
  email: c.dewagter@tudelft.nl
  organization: Fac. of Aerosp. Eng., Delft Univ. of Technol., Delft, Netherlands
– sequence: 4
  givenname: Karl
  surname: Tuyls
  fullname: Tuyls, Karl
  email: k.tuyls@liverpool.ac.uk
  organization: Dept. of Comput. Sci., Univ. of Liverpool, Liverpool, UK
– sequence: 5
  givenname: Hilbert
  surname: Kappen
  fullname: Kappen, Hilbert
  email: B.Kappen@science.ru.nl
  organization: Fac. of Sci., Radboud Univ. of Nijmegen, Nijmegen, Netherlands
CODEN IRALC6
ContentType Journal Article
DOI 10.1109/LRA.2017.2658940
DatabaseName IEEE Xplore (IEEE)
IEEE All-Society Periodicals Package (ASPP) 1998–Present
IEEE Electronic Library (IEL)
CrossRef
DatabaseTitle CrossRef
Discipline Engineering
EISSN 2377-3766
EndPage 1076
ExternalDocumentID 10_1109_LRA_2017_2658940
7833065
Genre orig-research
ISSN 2377-3766
IsDoiOpenAccess false
IsOpenAccess true
IsPeerReviewed true
IsScholarly true
Issue 2
Language English
License https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html
OpenAccessLink http://resolver.tudelft.nl/uuid:caf22050-b8fd-469f-9b2d-337fcd7b4571
PageCount 7
PublicationCentury 2000
PublicationDate 2017-April
PublicationDateYYYYMMDD 2017-04-01
PublicationDecade 2010
PublicationTitle IEEE robotics and automation letters
PublicationTitleAbbrev LRA
PublicationYear 2017
Publisher IEEE
StartPage 1070
SubjectTerms Aerial systems: Perception and autonomy
autonomous vehicle navigation
Cameras
Drones
Image edge detection
micro/nano robots
Navigation
Optical imaging
Optical sensors
visual-based navigation
Title Efficient Optical Flow and Stereo Vision for Velocity Estimation and Obstacle Avoidance on an Autonomous Pocket Drone
URI https://ieeexplore.ieee.org/document/7833065
Volume 2