An RKHS-based approach to double-penalized regression in high-dimensional partially linear models


Bibliographic Details
Published in: Journal of Multivariate Analysis, Vol. 168, pp. 201-210
Main Authors: Cui, Wenquan; Cheng, Haoyang; Sun, Jiajing
Format: Journal Article
Language: English
Published: Elsevier Inc., 01.11.2018

Summary: We study simultaneous variable selection and estimation in high-dimensional partially linear models under the assumption that the nonparametric component is from a reproducing kernel Hilbert space (RKHS) and that the vector of regression coefficients for the parametric component is sparse. A double penalty is used to deal with the problem. The estimate of the nonparametric component is subject to a roughness penalty based on the squared semi-norm on the RKHS, and a penalty with oracle properties is used to achieve sparsity in the parametric component. Under regularity conditions, we establish the consistency and rate of convergence of the parametric estimation together with the consistency of variable selection. The proposed estimators of the non-zero coefficients are also shown to have the asymptotic oracle property. Simulations and empirical studies illustrate the performance of the method.
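The double-penalized setup described above can be illustrated with a minimal sketch. It fits the partially linear model y = Xβ + f(t) + ε by alternating between a kernel-ridge update for f (the roughness penalty on the RKHS norm) and a coordinate-descent update for β. This is an assumption-laden illustration, not the authors' algorithm: a plain lasso penalty stands in for the oracle-property penalty used in the paper (e.g. SCAD-type), a Gaussian kernel and all tuning constants (`lam_f`, `lam_beta`, `gamma`) are arbitrary choices for the demo, and the backfitting scheme is a generic way to handle the two penalties jointly.

```python
import numpy as np

def gaussian_kernel(t, s, gamma=50.0):
    # Gaussian (RBF) kernel matrix between 1-d covariate vectors t and s.
    d = t[:, None] - s[None, :]
    return np.exp(-gamma * d ** 2)

def soft_threshold(z, thr):
    # Proximal operator of the l1 penalty (lasso shrinkage).
    return np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)

def double_penalized_plm(y, X, t, lam_f=1.0, lam_beta=0.1, n_iter=50):
    """Backfitting sketch for y = X beta + f(t) + eps with a double penalty:
    - f-step: kernel ridge regression, i.e. roughness penalty lam_f * ||f||_K^2,
      using the representer theorem f = K @ alpha;
    - beta-step: one sweep of coordinate descent for the lasso penalty
      (stand-in for an oracle-property penalty such as SCAD)."""
    n, p = X.shape
    K = gaussian_kernel(t, t)
    beta = np.zeros(p)
    f = np.zeros(n)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        # f-step: penalized smoother on the partial residual y - X beta.
        # Minimizing ||r - K a||^2 + lam_f * a'K a gives (K + lam_f I) a = r.
        alpha = np.linalg.solve(K + lam_f * np.eye(n), y - X @ beta)
        f = K @ alpha
        # beta-step: coordinate descent with soft-thresholding on y - f.
        r = y - f - X @ beta
        for j in range(p):
            r += X[:, j] * beta[j]                      # drop j's contribution
            beta[j] = soft_threshold(X[:, j] @ r, n * lam_beta) / col_sq[j]
            r -= X[:, j] * beta[j]                      # restore with new value
    return beta, f
```

On simulated data with a sparse coefficient vector and a smooth f, the beta-step sets most coefficients exactly to zero while the f-step tracks the nonparametric trend, mirroring the simultaneous selection-and-estimation behavior the summary describes.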
ISSN: 0047-259X; 1095-7243
DOI: 10.1016/j.jmva.2018.07.013