Discrete Sparse Coding

Bibliographic Details
Published in: Neural Computation, Vol. 29, No. 11, pp. 2979-3013
Main Authors: Georgios Exarchakis (georgios.exarchakis@uol.de), Jörg Lücke
Format: Journal Article
Language: English
Publisher: MIT Press, One Rogers Street, Cambridge, MA 02142-1209, USA
Published: November 1, 2017
ISSN: 0899-7667
EISSN: 1530-888X
DOI: 10.1162/neco_a_01015

Abstract
Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) apply the algorithm on the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.
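The two ingredients the abstract names, a generative model whose latents take values from a finite set with a learned sparse prior, and a truncated E-step that sums over only the sparsest latent states, can be sketched as follows. This is a minimal illustration, not the authors' code; all dimensions, value sets, and probabilities below are assumptions chosen for the example.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

H, D = 5, 8                           # latent and observed dimensions (illustrative)
values = np.array([0.0, 1.0, 2.0])    # finite set of possible latent values
pi = np.array([0.9, 0.07, 0.03])      # prior over values; large P(0) makes latents sparse
W = rng.standard_normal((D, H))       # dictionary / generative fields
sigma = 0.5                           # observation noise std

# Generate one observation: s_h drawn i.i.d. from the discrete prior, y = W s + noise
s_true = values[rng.choice(len(values), size=H, p=pi)]
y = W @ s_true + sigma * rng.standard_normal(D)

def log_joint(s):
    """log p(y, s): factorized discrete prior plus isotropic Gaussian likelihood."""
    idx = np.searchsorted(values, s)          # map each latent value to its prior index
    log_prior = np.sum(np.log(pi[idx]))
    resid = y - W @ s
    return log_prior - 0.5 * np.sum(resid**2) / sigma**2

# Expectation-truncation idea: instead of all len(values)**H states,
# enumerate only states with at most gamma non-zero latents.
gamma = 2
states = []
for k in range(gamma + 1):
    for support in itertools.combinations(range(H), k):
        for vals in itertools.product(values[1:], repeat=k):
            s = np.zeros(H)
            s[list(support)] = vals
            states.append(s)

best = max(states, key=log_joint)     # MAP state within the truncated set
print(len(states))                    # 51 states evaluated vs 3**5 = 243 in the full space
```

In the actual algorithm the truncated set is chosen per data point via a selection function and the E-step computes expectations over it, rather than just the MAP state; the sketch only shows why truncation helps: the enumerated set grows combinatorially in gamma, not exponentially in H.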
Copyright: MIT Press Journals, The, November 2017
PMID: 28957027
Genre: Journal Article; Research Support, Non-U.S. Gov't
Open access: https://direct.mit.edu/neco/article-pdf/29/11/2979/1026079/neco_a_01015.pdf
Peer reviewed: Yes
Page count: 35
Subjects: Algorithms; Approximation; Cells; Coding; Coding theory; Conditional probability; Continuity (mathematics); Experiments; Letters; Mathematical models; Neurons; Probability; Statistical analysis; Waveforms
Full text: https://direct.mit.edu/neco/article/doi/10.1162/neco_a_01015