Automatic generation and detection of highly reliable fiducial markers under occlusion
| Published in | Pattern Recognition, Vol. 47, No. 6, pp. 2280-2292 |
|---|---|
| Main Authors | Garrido-Jurado, S.; Muñoz-Salinas, R.; Madrid-Cuevas, F.J.; Marín-Jiménez, M.J. |
| Format | Journal Article |
| Language | English |
| Published | Kidlington: Elsevier Ltd, 01.06.2014 |
Summary: This paper presents a fiducial marker system especially appropriate for camera pose estimation in applications such as augmented reality and robot localization. Three main contributions are presented. First, we propose an algorithm for generating configurable marker dictionaries (in size and number of bits) following a criterion that maximizes the inter-marker distance and the number of bit transitions. In the process, we derive the maximum theoretical inter-marker distance that dictionaries of square binary markers can have. Second, a method for automatically detecting the markers and correcting possible errors is proposed. Third, a solution to the occlusion problem in augmented reality applications is shown. To this end, multiple markers are combined with an occlusion mask computed by color segmentation. The experiments conducted show that our proposal obtains dictionaries with higher inter-marker distances and lower false negative rates than state-of-the-art systems, and provides an effective solution to the occlusion problem.
Highlights:
• We propose an algorithm for generating configurable marker dictionaries.
• We derive the maximum theoretical inter-marker distance.
• A method for automatically detecting the markers and correcting errors is proposed.
• A solution to the occlusion problem in augmented reality applications is shown.
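The dictionary-generation criterion described in the summary rests on the inter-marker distance of square binary markers: the smallest Hamming distance between any two markers, taking their four 90-degree rotations into account. The sketch below illustrates that notion only; the marker size, the random example dictionary, and all function names are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the inter-marker distance criterion for square binary
# markers (illustrative; not the paper's code).
import numpy as np


def marker_distance(a: np.ndarray, b: np.ndarray) -> int:
    """Minimum Hamming distance between marker a and any 90-degree rotation of b."""
    return min(int(np.sum(a != np.rot90(b, k))) for k in range(4))


def dictionary_distance(markers) -> int:
    """Inter-marker distance of a dictionary: the minimum pairwise distance,
    also counting each marker's distance to its own rotations so that the
    orientation of a detected marker stays unambiguous."""
    pairwise = min(
        marker_distance(markers[i], markers[j])
        for i in range(len(markers))
        for j in range(i + 1, len(markers))
    )
    self_rot = min(
        min(int(np.sum(m != np.rot90(m, k))) for k in range(1, 4))
        for m in markers
    )
    return min(pairwise, self_rot)


if __name__ == "__main__":
    # 6x6-bit markers and a 4-marker dictionary are arbitrary example choices.
    rng = np.random.default_rng(0)
    dictionary = [rng.integers(0, 2, size=(6, 6)) for _ in range(4)]
    print("inter-marker distance:", dictionary_distance(dictionary))
```

A generation procedure of the kind summarized above would, roughly, keep proposing candidate markers and accept one only if it does not reduce this distance below a target value, while also favoring codes with many bit transitions per row.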
ISSN: 0031-3203, 1873-5142
DOI: 10.1016/j.patcog.2014.01.005