Beyond Point Estimate: Inferring Ensemble Prediction Variation from Neuron Activation Strength in Recommender Systems

Bibliographic Details
Published in: arXiv.org
Main Authors: Chen, Zhe; Wang, Yuyan; Lin, Dong; Derek Zhiyuan Cheng; Hong, Lichan; Chi, Ed H; Cui, Claire
Format: Paper
Language: English
Published: Ithaca: Cornell University Library, arXiv.org, 17.08.2020
Subjects: Activation; Artificial neural networks; Benchmarks; Buckets; Machine learning; Randomness; Recommender systems; Training
Online Access: Get full text (https://www.proquest.com/docview/2435008176)

Abstract: Despite the impressive prediction performance of deep neural networks (DNNs) across domains, it is now well known that a set of DNN models trained with the same model specification and the same data can produce very different predictions. The ensemble method is a state-of-the-art benchmark for prediction uncertainty estimation; however, ensembles are expensive to train and serve for web-scale traffic. In this paper, we seek to advance the understanding of the prediction variation estimated by the ensemble method. Through empirical experiments on two widely used recommender-system benchmark datasets, MovieLens and Criteo, we observe that prediction variation comes from multiple sources of randomness, including training data shuffling and random parameter initialization. By introducing more randomness into model training, we notice that the ensemble's mean predictions tend to become more accurate while prediction variation tends to increase. Moreover, we propose to infer prediction variation from neuron activation strength and demonstrate the strong predictive power of activation-strength features. Our experimental results show that the average R-squared is as high as 0.56 on MovieLens and 0.81 on Criteo. Our method performs especially well when detecting the lowest and highest variation buckets, with 0.92 and 0.89 AUC respectively. Our approach provides a simple way to estimate prediction variation, which opens up new opportunities for future work in many interesting areas (e.g., model-based reinforcement learning) without relying on serving expensive ensemble models.
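The core setup described in the abstract is to regress the prediction variation of an ensemble (the spread of predictions across members trained under different random seeds and data shuffles) on neuron activation strength features taken from a single model. The snippet below is a minimal sketch of that setup on synthetic data, not the paper's implementation: the data generation, the Ridge regressor, and all variable names are illustrative assumptions.

    # Minimal illustrative sketch (not the paper's code): regress per-example
    # ensemble prediction variation on neuron activation strength features.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.metrics import r2_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-ins: activation strengths of one hidden layer for N examples,
    # and E ensemble predictions whose per-example spread depends on those activations.
    N, H, E = 5000, 64, 10
    activations = rng.gamma(2.0, 1.0, size=(N, H))              # non-negative "strengths"
    true_spread = 0.05 + 0.02 * activations[:, :8].mean(axis=1)
    ensemble_preds = 0.5 + rng.normal(size=(N, E)) * true_spread[:, None]

    # Target: ensemble prediction variation, i.e. the std-dev across ensemble members.
    pred_variation = ensemble_preds.std(axis=1)

    # Fit a simple regressor from activation features to the variation and report R^2.
    X_tr, X_te, y_tr, y_te = train_test_split(activations, pred_variation, random_state=0)
    reg = Ridge(alpha=1.0).fit(X_tr, y_tr)
    print("held-out R^2:", round(r2_score(y_te, reg.predict(X_te)), 3))

The same features and target could also feed a classifier over variation buckets (e.g. lowest vs. highest quantile of pred_variation), which is the AUC-style evaluation the abstract reports.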
Copyright: 2020. This work is published under http://creativecommons.org/licenses/by-nc-sa/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
EISSN: 2331-8422
Genre: Working Paper/Pre-Print