Covariance-regularized regression and classification for high dimensional problems

Bibliographic Details
Published in Journal of the Royal Statistical Society. Series B, Statistical methodology, Vol. 71, No. 3, pp. 615-636
Main Authors Witten, Daniela M.; Tibshirani, Robert
Format Journal Article
Language English
Published Oxford, UK: Blackwell Publishing Ltd, 01.06.2009
Series Journal of the Royal Statistical Society Series B
ISSN 1369-7412
EISSN 1467-9868
DOI 10.1111/j.1467-9868.2009.00699.x

Abstract We propose covariance-regularized regression, a family of methods for prediction in high dimensional settings that uses a shrunken estimate of the inverse covariance matrix of the features to achieve superior prediction. An estimate of the inverse covariance matrix is obtained by maximizing the log-likelihood of the data, under a multivariate normal model, subject to a penalty; it is then used to estimate coefficients for the regression of the response onto the features. We show that ridge regression, the lasso and the elastic net are special cases of covariance-regularized regression, and we demonstrate that certain previously unexplored forms of covariance-regularized regression can outperform existing methods in a range of situations. The covariance-regularized regression framework is extended to generalized linear models and linear discriminant analysis, and is used to analyse gene expression data sets with multiple class and survival outcomes.
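A minimal illustrative sketch of the idea summarized in the abstract (not the authors' exact procedure): first compute a penalized maximum-likelihood estimate of the inverse covariance (precision) matrix of the features under a multivariate normal model, then form regression coefficients from that estimate together with the sample covariance between the features and the response. The use of scikit-learn's GraphicalLasso as the penalized estimator, the penalty level alpha, and the omission of the paper's final rescaling and cross-validation steps are assumptions made for this example.

# Sketch of covariance-regularized regression, assuming an L1 (graphical-lasso)
# penalty on the precision matrix; penalty level and scaling are illustrative.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
n, p = 50, 200                              # high dimensional setting: p >> n
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 2.0
y = X @ beta_true + rng.standard_normal(n)

# Centre the data so covariances are simple cross-products.
Xc = X - X.mean(axis=0)
yc = y - y.mean()

# Step 1: penalized maximum-likelihood (graphical-lasso) estimate of the
# inverse covariance matrix of the features.
theta_hat = GraphicalLasso(alpha=0.5, max_iter=200).fit(Xc).precision_

# Step 2: regression coefficients from the regularized precision matrix and the
# sample covariance between each feature and the response: beta = Theta * s_xy.
s_xy = Xc.T @ yc / n
beta_hat = theta_hat @ s_xy

# The paper additionally rescales the coefficients (with tuning chosen by
# cross-validation); here the unscaled estimate is reported.
print(np.round(beta_hat[:10], 2))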
ContentType Journal Article
Copyright Copyright 2009 The Royal Statistical Society and Blackwell Publishing Ltd.
Discipline Statistics
Mathematics
EISSN 1467-9868
EndPage 636
Genre article
Journal Article
Feature
GrantInformation NHLBI NIH HHS, grant N01 HV028183; NIBIB NIH HHS, grant R01 EB001988
ISSN 1369-7412
Issue 3
Keywords Ridge regression
Rank statistic
Generalized linear model
Covariance regularization
Prediction theory
Multivariate analysis
Stochastic process
Linear model
Penalty method
Parametric method
Covariance
Survival function
Inverse matrix
n « p
Log likelihood
Classification
Ill posed problem
Variable selection
Censored data
Discriminant analysis
Numerical linear algebra
Prediction
Regression
Statistical estimation
Covariance matrix
Survival
Regularization method
Statistical method
Statistical regression
Regression coefficient
Selection problem
Numerical analysis
Filtering theory
Regularization
Language English
License https://academic.oup.com/journals/pages/open_access/funder_policies/chorus/standard_publication_model
CC BY 4.0
OpenAccessLink https://academic.oup.com/jrsssb/article-pdf/71/3/615/49686350/jrsssb_71_3_615.pdf
PMID 20084176
PageCount 22
PublicationDate June 2009
PublicationPlace Oxford, UK
PublicationSeriesTitle Journal of the Royal Statistical Society Series B
PublicationTitle Journal of the Royal Statistical Society. Series B, Statistical methodology
PublicationTitleAlternate J R Stat Soc Series B Stat Methodol
PublicationYear 2009
Publisher Blackwell Publishing Ltd
StartPage 615
SubjectTerms Classification
Coefficients
Covariance
Covariance matrices
Covariance regularization
data collection
Datasets
Discriminant analysis
Estimating techniques
Estimation
Estimation methods
Estimators
Exact sciences and technology
Gene expression
General topics
Generalized linear models
Least squares
Linear inference, regression
Linear models
Linear regression
Mathematical procedures
Mathematics
Modeling
Multivariate analysis
n≪p
prediction
Probability and statistics
Probability theory and stochastic processes
Regression
Regression analysis
Regression coefficients
Regulation
Sciences and techniques of general use
Statistical methods
Statistics
Stochastic processes
Studies
Variable selection
Title Covariance-regularized regression and classification for high dimensional problems
URI https://api.istex.fr/ark:/67375/WNG-RBPDSSKB-P/fulltext.pdf
https://www.jstor.org/stable/40247591
https://onlinelibrary.wiley.com/doi/abs/10.1111%2Fj.1467-9868.2009.00699.x
https://www.ncbi.nlm.nih.gov/pubmed/20084176
http://econpapers.repec.org/article/blajorssb/v_3a71_3ay_3a2009_3ai_3a3_3ap_3a615-636.htm
https://www.proquest.com/docview/200865166
https://www.proquest.com/docview/1835471747
https://www.proquest.com/docview/37166786
https://www.proquest.com/docview/46300280
https://pubmed.ncbi.nlm.nih.gov/PMC2806603
Volume 71