Generalized Sparse Multiple Kernel Learning
To address the problems that the L1-norm multiple kernel learning (MKL) method may discard useful information and degrade generalization performance when it yields a sparse solution for the kernel weights, while the Lp-norm MKL method produces considerable redundant information and is sensitive to noise when its kernel-weight solution is non-sparse, this paper proposes a generalized sparse multiple kernel learning (GSMKL) method. The algorithm is an elastic-net-style regularized MKL method that mixes the L1 norm and the Lp norm (p > 1): it can flexibly adjust the sparsity of the kernel weights while also encouraging a grouping effect among them, and the L1-norm and Lp-norm MKL methods can be regarded as special cases of it. The mixed constraint introduced by the method is nonlinear; it is approximated by a second-order Taylor expansion, and the resulting optimization problem is solved with semi-infinite programming. Experimental results show that the improved method achieves good classification performance while dynamically adjusting the sparsity and also supports the grouping effect, which verifies that the method is effective and feasible.
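The abstract compresses the method's ingredients into a few sentences; the sketch below in Python is meant only to make them concrete, not to reproduce the paper's algorithm. It illustrates combining base kernels with a weight vector, an elastic-net-type constraint mixing the L1 norm and the Lp norm (p > 1), and a second-order Taylor expansion that turns the nonlinear Lp part of the constraint into a quadratic surrogate. The function names, the particular constraint form lam*||d||_1 + (1-lam)*||d||_p^p <= 1, and the expansion point are assumptions of this sketch; the actual GSMKL solver based on semi-infinite programming is omitted.

```python
import numpy as np

# Minimal sketch of the elastic-net-type MKL constraint described in the
# abstract. Names and the exact constraint form are assumptions, not the
# paper's implementation.

def combined_kernel(base_kernels, d):
    """Weighted combination of base kernel matrices: K = sum_m d_m * K_m."""
    return sum(w * K for w, K in zip(d, base_kernels))

def mixed_constraint(d, lam, p):
    """Elastic-net-type constraint value mixing the L1 and Lp (p > 1) norms:
    lam * ||d||_1 + (1 - lam) * ||d||_p^p  (assumed form)."""
    d = np.asarray(d, dtype=float)
    return lam * np.sum(np.abs(d)) + (1.0 - lam) * np.sum(np.abs(d) ** p)

def taylor2_lp_term(d, d0, p):
    """Second-order Taylor approximation of sum_m d_m^p around d0 (d0 > 0),
    replacing the nonlinear Lp part of the constraint with a quadratic one."""
    d, d0 = np.asarray(d, float), np.asarray(d0, float)
    f0 = np.sum(d0 ** p)
    grad = p * d0 ** (p - 1)
    hess_diag = p * (p - 1) * d0 ** (p - 2)
    step = d - d0
    return f0 + grad @ step + 0.5 * np.sum(hess_diag * step ** 2)

# Toy usage: three weights over three hypothetical base kernels.
d = np.array([0.6, 0.3, 0.1])
print(mixed_constraint(d, lam=0.5, p=2.0))            # mixed L1/Lp value
print(taylor2_lp_term(d, d0=np.full(3, 1 / 3), p=2.0))  # quadratic surrogate of sum d_m^p
```

Setting lam = 1 recovers an L1-style constraint (sparse weights) and lam = 0 an Lp-style constraint (non-sparse weights), which is consistent with the abstract's remark that both earlier methods are special cases.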
Published in: 计算机应用研究 (Application Research of Computers), Vol. 33, no. 1, pp. 21-27
Format: Journal Article
Language: Chinese
Published: 2016
Author affiliations: School of Internet of Things Engineering, Jiangnan University, Wuxi Jiangsu 214122, China; School of Mathematics & Computational Science, Anqing Teachers College, Anqing Anhui 246133, China
ISSN: 1001-3695
DOI: 10.3969/j.issn.1001-3695.2016.01.005
Bibliography: 51-1196/TP
Authors: Zhang Renfeng, Wu Xiaojun, Chen Sugen (1. School of Internet of Things Engineering, Jiangnan University, Wuxi Jiangsu 214122, China; 2. School of Mathematics & Computational Science, Anqing Teachers College, Anqing Anhui 246133, China)
Keywords: multiple kernel learning (MKL); sparsity; grouping effect; classification