Gradual Training Method for Denoising Auto Encoders

Bibliographic Details
Published in: arXiv.org
Main Authors: Kalmanovich, Alexander; Chechik, Gal
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 11.04.2015
Subjects: Coders; Datasets; Noise reduction; Training
Online Access: Get full text (https://www.proquest.com/docview/2081145194)

Abstract: Stacked denoising auto encoders (DAEs) are well known to learn useful deep representations, which can be used to improve supervised training by initializing a deep network. We investigate a training scheme for deep DAEs in which layers are added gradually and keep adapting as additional layers are added. We show that in the regime of mid-sized datasets, this gradual training provides a small but consistent improvement over stacked training, in both reconstruction quality and classification error, on the MNIST and CIFAR datasets.
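
The gradual scheme described in the abstract lends itself to a short code sketch. Below is a hypothetical PyTorch illustration, not the authors' implementation; the layer sizes, Gaussian input corruption, sigmoid activations, optimizer, and epoch count are all illustrative assumptions. The point of contrast with stacked training is that the optimizer at every growth stage updates all layers, so earlier layers keep adapting as new ones are added.

import torch
import torch.nn as nn

def gradual_train(data_loader, layer_sizes, epochs_per_stage=5, noise_std=0.3, lr=1e-3):
    """Grow a deep DAE one layer at a time, re-training all layers at each stage (a sketch)."""
    encoders, decoders = nn.ModuleList(), nn.ModuleList()
    for out_dim in layer_sizes[1:]:
        in_dim = layer_sizes[len(encoders)]
        # Add a new encoder layer and a mirrored decoder layer.
        encoders.append(nn.Sequential(nn.Linear(in_dim, out_dim), nn.Sigmoid()))
        decoders.insert(0, nn.Sequential(nn.Linear(out_dim, in_dim), nn.Sigmoid()))
        # Gradual training: optimize ALL layers at every stage.
        # (Stacked training would instead optimize only the newest pair.)
        opt = torch.optim.Adam(list(encoders.parameters()) + list(decoders.parameters()), lr=lr)
        for _ in range(epochs_per_stage):
            for x, _ in data_loader:            # assumes an (input, label) loader, e.g. MNIST
                x = x.view(x.size(0), -1)
                h = x + noise_std * torch.randn_like(x)   # corrupt the input
                for enc in encoders:
                    h = enc(h)
                for dec in decoders:
                    h = dec(h)
                loss = nn.functional.mse_loss(h, x)       # reconstruct the clean input
                opt.zero_grad()
                loss.backward()
                opt.step()
    return encoders  # e.g. use as the initialization of a supervised deep network

For MNIST one might call gradual_train(loader, [784, 500, 250]); restricting the optimizer at each stage to the newest encoder/decoder pair would recover standard stacked training.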
Copyright: 2015. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.