Real-time markerless tracking for augmented reality: the virtual visual servoing framework
Published in | IEEE Transactions on Visualization and Computer Graphics, Vol. 12, No. 4, pp. 615-628 |
---|---|
Main Authors | Comport, A.I.; Marchand, E.; Pressigout, M.; Chaumette, F. |
Format | Journal Article |
Language | English |
Published | United States: IEEE (The Institute of Electrical and Electronics Engineers, Inc.), 01.07.2006 |
Subjects | |
Online Access | Get full text |
Abstract | Tracking is a central research subject in a real-time augmented reality context. The main requirements for trackers are high accuracy and low latency at a reasonable cost. To address these issues, a real-time, robust, and efficient 3D model-based tracking algorithm is proposed for a "video see-through" monocular vision system. Tracking objects in the scene amounts to computing the pose between the camera and the objects; virtual objects can then be projected into the scene using that pose. In this paper, nonlinear pose estimation is formulated by means of a virtual visual servoing approach. In this context, the derivation of point-to-curve interaction matrices is given for different 3D geometrical primitives, including straight lines, circles, cylinders, and spheres. A local moving-edges tracker is used to provide real-time tracking of points normal to the object contours. Robustness is obtained by integrating an M-estimator into the visual control law via an iteratively reweighted least-squares implementation. This approach is then extended to address the 3D model-free augmented reality problem. The method presented in this paper has been validated on several complex image sequences, including outdoor environments. Results show the method to be robust to occlusion, changes in illumination, and mistracking. |
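The robust estimation step the abstract names (an M-estimator folded into the control law through iteratively reweighted least squares) can be illustrated in isolation. Below is a minimal sketch of the IRLS mechanism with a Tukey biweight applied to a plain linear system with outliers; it is not the paper's actual pose-estimation law, and all function names are hypothetical.

```python
# Minimal IRLS sketch with a Tukey biweight M-estimator.  Illustrative only:
# the paper applies this mechanism inside a virtual visual servoing control
# law; here it is shown on a plain over-determined system A x ~= b.
import numpy as np

def tukey_weights(residuals, c=4.685):
    """Tukey biweight: zero weight for |r| > c*sigma, smooth taper inside."""
    sigma = 1.4826 * np.median(np.abs(residuals)) + 1e-12  # robust scale (MAD)
    u = residuals / (c * sigma)
    w = (1.0 - u**2) ** 2
    w[np.abs(u) >= 1.0] = 0.0
    return w

def irls(A, b, n_iter=20):
    """Solve A x ~= b robustly by iteratively reweighted least squares."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]        # ordinary LS initialization
    for _ in range(n_iter):
        r = A @ x - b
        sw = np.sqrt(tukey_weights(r))              # sqrt(w) row scaling
        x = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)[0]
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 100)
    A = np.column_stack([t, np.ones_like(t)])
    b = 2.0 * t + 1.0 + 0.01 * rng.standard_normal(100)
    b[::10] += 5.0          # 10% gross outliers (e.g. mistracked edge points)
    x = irls(A, b)
    print(np.round(x, 2))   # slope and intercept recovered close to [2, 1]
```

An ordinary least-squares fit would be pulled toward the outliers; the Tukey weights drive their influence to exactly zero, which is the behavior the abstract credits for robustness to occlusion and mistracking.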
Author | Comport, A.I.; Marchand, E.; Pressigout, M.; Chaumette, F. |
Author_xml | – sequence: 1 givenname: A.I. surname: Comport fullname: Comport, A.I. organization: IRISA-INRIA Rennes, France – sequence: 2 givenname: E. surname: Marchand fullname: Marchand, E. organization: IRISA-INRIA Rennes, France – sequence: 3 givenname: M. surname: Pressigout fullname: Pressigout, M. organization: IRISA-INRIA Rennes, France – sequence: 4 givenname: F. surname: Chaumette fullname: Chaumette, F. organization: IRISA-INRIA Rennes, France |
BackLink | https://www.ncbi.nlm.nih.gov/pubmed/16805268 (view this record in MEDLINE/PubMed); https://inria.hal.science/inria-00161250 (view this record in HAL) |
CODEN | ITVGEA |
ContentType | Journal Article |
Copyright | Copyright The Institute of Electrical and Electronics Engineers, Inc. (IEEE) 2006 Distributed under a Creative Commons Attribution 4.0 International License |
DOI | 10.1109/TVCG.2006.78 |
Discipline | Engineering Computer Science |
EISSN | 1941-0506 |
EndPage | 628 |
Genre | orig-research Evaluation Studies Journal Article |
ISSN | 1077-2626 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 4 |
Keywords | model-based tracking; model-free tracking; real-time; augmented reality; robust estimators; virtual visual servoing |
Language | English |
License | https://ieeexplore.ieee.org/Xplorehelp/downloads/license-information/IEEE.html Distributed under a Creative Commons Attribution 4.0 International License: http://creativecommons.org/licenses/by/4.0 |
ORCID | 0000-0001-7096-5236 0000-0002-1238-4385 0000-0001-9969-8258 |
OpenAccessLink | https://inria.hal.science/inria-00161250 |
PMID | 16805268 |
PageCount | 14 |
PublicationDate | 2006-07-01 |
PublicationPlace | United States |
PublicationTitle | IEEE transactions on visualization and computer graphics |
PublicationTitleAbbrev | TVCG |
PublicationTitleAlternate | IEEE Trans Vis Comput Graph |
PublicationYear | 2006 |
Publisher | IEEE The Institute of Electrical and Electronics Engineers, Inc. (IEEE) Institute of Electrical and Electronics Engineers |
StartPage | 615 |
SubjectTerms | Algorithms; Augmented reality; Cameras; Computer Graphics; Computer Science; Computer Systems; Computer Vision and Pattern Recognition; Costs; Delay; Feedback; Image Enhancement - methods; Image Interpretation, Computer-Assisted - methods; Image Processing; Imaging, Three-Dimensional - methods; Information Storage and Retrieval - methods; Layout; Machine vision; Mathematical models; model-based tracking; model-free tracking; Real time; Real time systems; Robotics; Robust control; robust estimators; Robustness; Signal Processing, Computer-Assisted; Studies; Three dimensional; Tracking; User-Computer Interface; virtual visual servoing; Vision systems; Visual; Visual servoing |
Title | Real-time markerless tracking for augmented reality: the virtual visual servoing framework |
URI | https://ieeexplore.ieee.org/document/1634325 https://www.ncbi.nlm.nih.gov/pubmed/16805268 https://www.proquest.com/docview/865235335 https://www.proquest.com/docview/28085043 https://www.proquest.com/docview/68574744 https://www.proquest.com/docview/896172517 https://inria.hal.science/inria-00161250 |
Volume | 12 |