Bayesian Deep Net GLM and GLMM
Published in | Journal of Computational and Graphical Statistics, Vol. 29, No. 1, pp. 97-113
Main Authors | Tran, M.-N.; Nguyen, N.; Nott, D.; Kohn, R.
Format | Journal Article
Language | English
Published | Alexandria: Taylor & Francis, 02.01.2020
Online Access | https://www.tandfonline.com/doi/abs/10.1080/10618600.2019.1637747
Abstract | Deep feedforward neural networks (DFNNs) are a powerful tool for functional approximation. We describe flexible versions of generalized linear and generalized linear mixed models incorporating basis functions formed by a DFNN. The consideration of neural networks with random effects is not widely used in the literature, perhaps because of the computational challenges of incorporating subject specific parameters into already complex models. Efficient computational methods for high-dimensional Bayesian inference are developed using Gaussian variational approximation, with a parsimonious but flexible factor parameterization of the covariance matrix. We implement natural gradient methods for the optimization, exploiting the factor structure of the variational covariance matrix in computation of the natural gradient. Our flexible DFNN models and Bayesian inference approach lead to a regression and classification method that has a high prediction accuracy, and is able to quantify the prediction uncertainty in a principled and convenient way. We also describe how to perform variable selection in our deep learning method. The proposed methods are illustrated in a wide range of simulated and real-data examples, and compare favorably to a state of the art flexible regression and classification method in the statistical literature, the Bayesian additive regression trees (BART) method. User-friendly software packages in Matlab, R, and Python implementing the proposed methods are available at https://github.com/VBayesLab.
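As a reading aid for the abstract, the display below sketches the kind of model and variational family being described. The notation is assumed here rather than taken from the article: a DFNN supplies the basis functions for a GLM-type linear predictor, and the joint posterior over the network weights and regression coefficients is approximated by a Gaussian whose covariance has a factor (low-rank plus diagonal) structure.

    % Schematic only; the symbols are this record's assumptions, not the article's notation.
    \[
      \eta_i = \beta_0 + \beta^{\top}\varphi(x_i; w), \qquad
      y_i \mid x_i \sim \mathrm{ExpFam}\bigl(g^{-1}(\eta_i)\bigr),
    \]
    \[
      q_{\lambda}(\theta) = \mathcal{N}(\mu, \Sigma), \qquad
      \Sigma = B B^{\top} + D^{2},
    \]
    % \varphi(\cdot; w): output of the DFNN's last hidden layer with weights w;
    % \theta collects (w, \beta_0, \beta), plus random effects in the GLMM case;
    % B is a p-by-k factor-loading matrix with k much smaller than p, and D is diagonal.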
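The abstract also refers to reparameterization-gradient and natural-gradient updates that exploit this factor structure. The sketch below is a generic, minimal illustration of one such update, not the authors' implementation (their Matlab, R, and Python packages are at https://github.com/VBayesLab); it uses a plain rather than a natural gradient, and the names sample_theta, neg_log_q_grad, elbo_gradient_step, and log_post_grad are hypothetical.

    import numpy as np

    def sample_theta(mu, B, d, rng):
        # theta = mu + B eps + d * z gives theta ~ N(mu, B B^T + diag(d)^2).
        eps = rng.standard_normal(B.shape[1])
        z = rng.standard_normal(B.shape[0])
        return mu + B @ eps + d * z, eps, z

    def neg_log_q_grad(theta, mu, B, d):
        # Gradient of -log q(theta) is Sigma^{-1} (theta - mu); the Woodbury
        # identity reduces it to a k x k solve, which is where the factor
        # structure Sigma = B B^T + D^2 pays off computationally.
        r, Dinv2 = theta - mu, 1.0 / d**2
        BD = B * Dinv2[:, None]                      # D^{-2} B
        M = np.eye(B.shape[1]) + B.T @ BD            # I_k + B^T D^{-2} B
        return Dinv2 * r - BD @ np.linalg.solve(M, BD.T @ r)

    def elbo_gradient_step(log_post_grad, mu, B, d, rng, lr=1e-3):
        # One stochastic ascent step on the ELBO E_q[log p(y, theta) - log q(theta)]
        # using the reparameterization theta = mu + B eps + d * z.
        theta, eps, z = sample_theta(mu, B, d, rng)
        g = log_post_grad(theta) + neg_log_q_grad(theta, mu, B, d)
        return mu + lr * g, B + lr * np.outer(g, eps), d + lr * g * z

    # Hypothetical usage with a standard-normal target, log p(theta) = -0.5 * ||theta||^2:
    rng = np.random.default_rng(0)
    p, k = 10, 2
    mu, B, d = np.zeros(p), 0.1 * rng.standard_normal((p, k)), np.ones(p)
    for _ in range(2000):
        mu, B, d = elbo_gradient_step(lambda th: -th, mu, B, d, rng, lr=1e-2)

In this toy setting the iterates drift toward mu = 0 with an implied covariance near the identity; in the article's setting log_post_grad would be the gradient of the log joint of the DFNN-based GLM or GLMM.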
Author | Tran, M.-N. (Discipline of Business Analytics, The University of Sydney Business School and ACEMS; minh-ngoc.tran@sydney.edu.au); Nguyen, N. (Discipline of Business Analytics, The University of Sydney Business School and ACEMS); Nott, D. (Department of Statistics and Applied Probability, National University of Singapore); Kohn, R. (School of Economics, UNSW Business School and ACEMS)
ContentType | Journal Article |
Copyright | 2019 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America
DOI | 10.1080/10618600.2019.1637747 |
Discipline | Statistics; Mathematics
EISSN | 1537-2715 |
EndPage | 113 |
Genre | Article |
GrantInformation | Australian Research Council Center of Excellence, grant CE140100049
ISSN | 1061-8600 |
IsPeerReviewed | true |
IsScholarly | true |
Issue | 1 |
Language | English |
PageCount | 17 |
PublicationDate | 2020-01-02 |
PublicationPlace | Alexandria |
PublicationTitle | Journal of computational and graphical statistics |
PublicationYear | 2020 |
Publisher | Taylor & Francis; Taylor & Francis Ltd
StartPage | 97 |
SubjectTerms | Approximation; Artificial neural networks; Basis functions; Bayesian analysis; Classification; Computer simulation; Covariance matrix; Deep learning; Factor models; Machine learning; Methods; Neural networks; Optimization; Parameterization; Regression analysis; Reparameterization gradient; Statistical analysis; Statistical inference; Statistical models; Stochastic optimization; Variable selection; Variational approximation
Title | Bayesian Deep Net GLM and GLMM |
URI | https://www.tandfonline.com/doi/abs/10.1080/10618600.2019.1637747 https://www.proquest.com/docview/2391195387 |
Volume | 29 |