A Meta-Learning Approach for Few-Shot Face Forgery Segmentation and Classification
Published in | Sensors (Basel, Switzerland) Vol. 23; no. 7; p. 3647 |
Main Authors | Lin, Yih-Kai; Yen, Ting-Yu |
Format | Journal Article |
Language | English |
Published | Switzerland: MDPI AG, 31.03.2023 |
Subjects | digital forensics; segmentation; face forgery detection; U-Net; meta-learning; few-shot learning |
Abstract | Existing forged-image detection technology is good at detecting known forgery methods: it trains neural networks on many original images and the corresponding forged images created with those methods. However, it performs poorly when it encounters unseen forgery methods. One recently suggested approach to this problem is to use a hand-crafted generator of forged images to create a range of fake images, which can then be used to train the neural network. However, this approach has limited detection performance when it encounters forging techniques that the hand-crafted generator has not accounted for. To overcome the limitations of existing methods, in this paper we adopt a meta-learning approach to develop a highly adaptive detector for identifying new forging techniques. The proposed method trains a forged-image detector with meta-learning techniques, making it possible to fine-tune the detector with only a few new forged samples. The method feeds a small number of forged images to the detector and lets the detector adjust its weights based on the statistical features of those images, allowing it to detect forged images with similar characteristics. The proposed method achieves significant improvements in detecting forgeries, with IoU gains ranging from 35.4% to 127.2% and AUC gains ranging from 2.0% to 48.9%, depending on the forgery method. These results show that the proposed method substantially improves detection performance with only a small number of samples and outperforms current state-of-the-art methods in most scenarios. |
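The abstract describes meta-training a detector so that a handful of forged samples from an unseen forgery method is enough to adapt it. The following is only an illustrative sketch of that idea, assuming a MAML-style inner-loop fine-tuning step on a stand-in segmentation network; the class names, sample counts, and hyperparameters are hypothetical and are not taken from the paper.

```python
# Hypothetical sketch of few-shot adaptation for a forgery-segmentation
# detector (not the authors' code): clone a meta-trained model and take a
# few gradient steps on a small support set from a new forgery method.
import copy
import torch
import torch.nn as nn


class TinySegNet(nn.Module):
    """Stand-in for a U-Net-style detector that outputs per-pixel forgery logits."""

    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),  # one logit per pixel
        )

    def forward(self, x):
        return self.net(x)


def adapt_few_shot(meta_model, support_imgs, support_masks, steps=5, inner_lr=1e-3):
    """Clone the meta-trained detector and fine-tune it on a few new forged samples."""
    model = copy.deepcopy(meta_model)
    opt = torch.optim.SGD(model.parameters(), lr=inner_lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(model(support_imgs), support_masks)
        loss.backward()
        opt.step()
    return model


if __name__ == "__main__":
    meta_model = TinySegNet()  # pretend this was meta-trained on known forgery methods
    support_imgs = torch.rand(4, 3, 64, 64)  # 4 forged images from an unseen method
    support_masks = torch.randint(0, 2, (4, 1, 64, 64)).float()  # ground-truth forged regions
    adapted = adapt_few_shot(meta_model, support_imgs, support_masks)
    query = torch.rand(1, 3, 64, 64)
    pred_mask = torch.sigmoid(adapted(query)) > 0.5  # binary forgery-region prediction
    print(pred_mask.shape)  # torch.Size([1, 1, 64, 64])
```

In the actual method, a meta-training stage would optimize the initial weights so that this few-step adaptation generalizes; the sketch only shows the adaptation side.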
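The IoU figures quoted in the abstract (improvements of up to 127.2%) only make sense as relative gains over the un-adapted detector, since IoU itself is bounded by 1. A small sketch with made-up numbers, assuming that reading:

```python
# Hedged illustration (not from the paper): segmentation IoU of binary masks
# and a relative improvement over a baseline IoU.
import numpy as np


def iou(pred, target):
    """Intersection-over-Union of two binary masks."""
    pred, target = pred.astype(bool), target.astype(bool)
    union = np.logical_or(pred, target).sum()
    if union == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return np.logical_and(pred, target).sum() / union


pred = np.zeros((8, 8), dtype=int)
pred[2:6, 2:6] = 1
target = np.zeros((8, 8), dtype=int)
target[3:7, 3:7] = 1
print(f"example IoU: {iou(pred, target):.2f}")  # 9 overlapping pixels / 23 in the union ~ 0.39

# Made-up baseline and adapted scores, interpreted as a relative gain.
baseline_iou, adapted_iou = 0.22, 0.50
relative_gain = (adapted_iou - baseline_iou) / baseline_iou * 100
print(f"relative IoU improvement: {relative_gain:.1f}%")  # 127.3% with these made-up values
```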
Audience | Academic |
Author | Lin, Yih-Kai; Yen, Ting-Yu |
AuthorAffiliation | Department of Computer Science and Artificial Intelligence, National Pingtung University, No. 4-18 Minsheng Road, Pingtung City 90003, Taiwan |
CitedBy_id | 10.3390/s23146430; 10.3390/s23218763 (Crossref) |
ContentType | Journal Article |
Copyright | COPYRIGHT 2023 MDPI AG 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License. 2023 by the authors. 2023 |
DOI | 10.3390/s23073647 |
Discipline | Engineering |
EISSN | 1424-8220 |
ExternalDocumentID | oai_doaj_org_article_ab6f715f076643b0aaaf3ac43fc6f659 PMC10099107 A746948554 37050708 10_3390_s23073647 |
Genre | Journal Article |
GrantInformation | Ministry of Science and Technology, grant MOST-109-2221-E-153-003 |
ISSN | 1424-8220 |
IsDoiOpenAccess | true |
IsOpenAccess | true |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 7 |
Keywords | digital forensics; segmentation; face forgery detection; U-Net; meta-learning; few-shot learning |
Language | English |
License | https://creativecommons.org/licenses/by/4.0 Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
ORCID | 0000-0001-6509-0017 |
OpenAccessLink | http://journals.scholarsportal.info/openUrl.xqy?doi=10.3390/s23073647 |
PMID | 37050708 |
PublicationCentury | 2000 |
PublicationDate | 20230331 |
PublicationDateYYYYMMDD | 2023-03-31 |
PublicationDecade | 2020 |
PublicationPlace | Switzerland |
PublicationTitle | Sensors (Basel, Switzerland) |
PublicationTitleAlternate | Sensors (Basel) |
PublicationYear | 2023 |
Publisher | MDPI AG; MDPI |
StartPage | 3647 |
SubjectTerms | Analysis; Classification; Deep learning; Detectors; digital forensics; face forgery detection; few-shot learning; Forgery; Medical imaging equipment; meta-learning; Methods; Neural networks; segmentation; Sensors; Technology application; U-Net |
URI | https://www.ncbi.nlm.nih.gov/pubmed/37050708 https://www.proquest.com/docview/2799782443 https://www.proquest.com/docview/2800625074 https://pubmed.ncbi.nlm.nih.gov/PMC10099107 https://doaj.org/article/ab6f715f076643b0aaaf3ac43fc6f659 |
Volume | 23 |