Dimension reduction estimation for central mean subspace with missing multivariate response
Published in | Journal of multivariate analysis, Vol. 174, Article 104542 |
Main Authors | Fan, Guo-Liang; Xu, Hong-Xia; Liang, Han-Ying |
Format | Journal Article |
Language | English |
Published | Elsevier Inc., 01.11.2019 |
Subjects | Central mean subspace; High dimensionality; Missing data; Multivariate response; Sufficient dimension reduction |
Online Access | https://dx.doi.org/10.1016/j.jmva.2019.104542 |
ISSN | 0047-259X |
EISSN | 1095-7243 |
DOI | 10.1016/j.jmva.2019.104542 |
Abstract | Multivariate response data often arise in practice and they are frequently subject to missingness. Under this circumstance, the standard sufficient dimension reduction (SDR) methods cannot be used directly. To reduce the dimension and estimate the central mean subspace, a profile least squares estimation method is proposed based on an inverse probability weighted technique. The profile least squares method does not need any distributional assumptions on the covariates and hence differs from existing SDR methods. The resulting estimator of the central mean subspace is proved to be asymptotically normal and root n consistent under some mild conditions. The structural dimension is determined by a BIC-type criterion and the consistency of its estimator is established. Comprehensive simulations and a real data analysis show that the proposed method works promisingly. |
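To make the abstract's terminology concrete, the LaTeX sketch below states the standard definition of the central mean subspace and a generic inverse probability weighted profile least squares criterion of the kind the abstract describes. It is an illustrative reconstruction based on the abstract and standard missing-data conventions, not the authors' exact estimator; the missingness indicator δ, the selection probability π, and the link function g are assumed notation, not taken from the paper.

```latex
% Illustrative sketch only. The central mean subspace definition is the standard one;
% the weighted objective and the notation g, pi, delta are generic assumptions about
% an IPW profile least squares criterion, not the authors' exact formulation.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
The central mean subspace $\mathcal{S}_{E(Y\mid X)}$ is the smallest subspace
$\operatorname{span}(B)$ of the predictor space such that
\[
  E(Y \mid X) = E\bigl(Y \mid B^\top X\bigr),
\]
so that the low-dimensional projection $B^\top X$ carries all the information
in $X$ about the conditional mean of the multivariate response $Y$.

Let $\delta_i = 1$ if $Y_i$ is observed and $\delta_i = 0$ otherwise, and let
$\pi_i$ denote the selection probability under a missing-at-random mechanism.
A generic inverse probability weighted profile least squares criterion is
\[
  \widehat{B} = \operatorname*{arg\,min}_{B}
  \sum_{i=1}^{n} \frac{\delta_i}{\widehat{\pi}_i}
  \bigl\lVert Y_i - \widehat{g}_B\bigl(B^\top X_i\bigr) \bigr\rVert^2 ,
\]
where $\widehat{g}_B$ is a nonparametric (e.g., local linear) estimate of the
link function, profiled out for each candidate $B$. The structural dimension
$d$ is then chosen by minimizing a BIC-type criterion that adds to the weighted
residual sum of squares a penalty increasing in $d$.
\end{document}
```

Profiling out the link nonparametrically before minimizing over B avoids distributional assumptions on the covariates, which matches the abstract's remark that the method differs in this respect from inverse-regression-based SDR methods.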
Author affiliations | Fan, Guo-Liang: School of Economics and Management, Shanghai Maritime University, Shanghai 201306, PR China; Xu, Hong-Xia: Department of Mathematics, Shanghai Maritime University, Shanghai 201306, PR China; Liang, Han-Ying (hyliang@tongji.edu.cn): School of Mathematical Sciences, Tongji University, Shanghai 200092, PR China |
Copyright | 2019 Elsevier Inc. |
Discipline | Mathematics |