Real-time tracking of facial features in unconstrained video


Bibliographic Details
Format: Patent
Language: English
Published: 16.03.2021
Online Access: Get full text


Abstract A method for locating and tracking facial features in an unconstrained video sequence includes: in a face-detecting process, delineating, with region-bounding coordinates, the face of the subject within an image selected from the sequence; detecting, in the selected image, a small set of landmarks, corresponding to the face of the subject, using a convolutional neural network trained to take as input an image region corresponding to the face of the subject and to return a set of coordinates at computational speeds approximating real time; projectively fitting a three-dimensional character model to the detected landmarks and using the fitted model to estimate physical locations of additional landmarks, so as to provide a complete hypothesized set of facial landmarks; and, in a feature-tracker process, updating the hypothesized set of facial landmarks to improve convergence between predicted feature locations and their actual physical locations, based on data sampled from the selected image.
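The abstract describes a pipeline: detect the face, locate a sparse set of landmarks with a CNN, projectively fit a 3D character model to those landmarks, then project the fitted model to hypothesize the remaining landmarks. The fitting-and-hypothesis step can be sketched as below; the affine (weak-perspective) camera model, the least-squares fit, and all function names are illustrative assumptions, not the patent's actual method.

```python
import numpy as np

def fit_projection(model_pts3d, detected_pts2d):
    """Least-squares fit of a 2x4 affine camera matrix P so that
    P @ [X, Y, Z, 1]^T approximates [u, v]^T for each detected landmark.

    model_pts3d:    (n, 3) 3D character-model points for the sparse landmarks
    detected_pts2d: (n, 2) corresponding 2D detections from the CNN
    """
    n = model_pts3d.shape[0]
    homog = np.hstack([model_pts3d, np.ones((n, 1))])   # (n, 4) homogeneous
    # Solve homog @ P^T ~= detected_pts2d in the least-squares sense.
    sol, *_ = np.linalg.lstsq(homog, detected_pts2d, rcond=None)
    return sol.T                                        # (2, 4) camera matrix

def hypothesize_landmarks(P, extra_pts3d):
    """Project additional 3D model points with the fitted camera to
    obtain hypothesized 2D locations for landmarks the CNN did not detect."""
    n = extra_pts3d.shape[0]
    homog = np.hstack([extra_pts3d, np.ones((n, 1))])
    return (P @ homog.T).T                              # (n, 2) image points
```

In this sketch the feature-tracker stage would then refine the hypothesized points against image data; that refinement is omitted here.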
ContentType Patent
CorporateAuthor Image Metrics, Ltd
DBID EFH
DatabaseName USPTO Issued Patents
Database_xml – sequence: 1
  dbid: EFH
  name: USPTO Issued Patents
  url: http://www.uspto.gov/patft/index.html
  sourceTypes: Open Access Repository
DeliveryMethod fulltext_linktorsrc
ExternalDocumentID 10949649
GroupedDBID EFH
IEDL.DBID EFH
IngestDate Sun Mar 05 22:34:44 EST 2023
IsOpenAccess true
IsPeerReviewed false
IsScholarly false
Language English
LinkModel DirectLink
OpenAccessLink https://image-ppubs.uspto.gov/dirsearch-public/print/downloadPdf/10949649
ParticipantIDs uspatents_grants_10949649
PatentNumber 10949649
PublicationCentury 2000
PublicationDate 20210316
PublicationDateYYYYMMDD 2021-03-16
PublicationDecade 2020
PublicationYear 2021
References Xu et al. (2012/0130717) 20120500
Blanz et al. (6556196) 20030400
Partis et al. (2013/0148852) 20130600
Jo et al. (2012/0308123) 20121200
Marquardt (5659625) 19970800
Grau (2018/0158246) 20180600
Yang et al. (6580810) 20030600
Tuzel et al. (9633250) 20170400
Suzuki et al. (7916904) 20110300
Lee (2014/0307063) 20141000
Kinoshita (2008/0130961) 20080600
Rogers et al. (9104908) 20150800
Geng et al. (2006/0023923) 20060200
Wang (2019/0026538) 20190100
Patton et al. (6396599) 20020500
Walker, Jr. et al. (7619638) 20091100
Zhou (2015/0347822) 20151200
Whitehill (2015/0324632) 20151100
Fan (2018/0300880) 20181000
Accomazzi et al. (2005/0169536) 20050800
Gleeson-May (2018/0012092) 20180100
Srirangam Narashiman (2019/0279009) 20190900
Bourdev (2015/0139485) 20150500
Jones et al. (7099510) 20060800
Takano et al. (8064648) 20111100
Havaldar (2011/0110561) 20110500
Fu (2019/0014884) 20190100
Chen et al. (2014/0035901) 20140200
Rubinstenn et al. (7634103) 20091200
Utsugi (6502583) 20030100
Black et al. (2010/0111370) 20100500
Goto (8107672) 20120100
Zhang (2019/0138787) 20190500
Park et al. (2009/0153569) 20090600
Epps et al. (2010/0302258) 20101200
Corazza et al. (2013/0235045) 20130900
Ishikawa (5933527) 19990800
Legagneur et al. (8498456) 20130700
Georgescu (2019/0206056) 20190700
Yano et al. (8082926) 20111200
Li (2019/0147224) 20190500
McNamara et al. (2010/0271368) 20101000
Tomono et al. (5016282) 19910500
Linford et al. (5854850) 19981200
Lu (2020/0202854) 20200600
Rad (2017/0262962) 20170900
Ye et al. (2014/0022249) 20140100
Erdem (7127081) 20061000
Brailovskiy (10489912) 20191100
Takano et al. (6091836) 20000700
Saita et al. (6719565) 20040400
Li (2006/0133672) 20060600
SourceID uspatents
SourceType Open Access Repository
Title Real-time tracking of facial features in unconstrained video
URI https://image-ppubs.uspto.gov/dirsearch-public/print/downloadPdf/10949649
hasFullText 1
inHoldings 1
linkProvider USPTO