Element Selection with Wide Class of Optimization Criteria Using Non-Convex Sparse Optimization

Bibliographic Details
Published in: ICASSP 2023 - 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 1-5
Main Authors: Kawamura, Taiga; Ueno, Natsuki; Ono, Nobutaka
Format: Conference Proceeding
Language: English
Published: IEEE, 04.06.2023
Summary: Element selection techniques for high-dimensional features have various applications in machine learning. The problem of element selection is typically solved by greedy methods or convex relaxation methods. However, these algorithms are applicable only to a specific class of optimization criteria, such as minimization of the squared error loss between the original and restored data. To overcome this limitation, we propose an element selection algorithm based on non-convex sparse optimization that can be used with a wider class of optimization criteria than conventional algorithms. In the proposed method, an algorithm based on the alternating direction method of multipliers (ADMM) is derived by reformulating the element selection problem as a matrix optimization on a non-convex set. A numerical experiment demonstrated the effectiveness of the proposed method for element selection with a non-squared error loss compared with the conventional greedy method.
ISSN: 2379-190X
DOI: 10.1109/ICASSP49357.2023.10095559
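
The summary above describes the core idea only at a high level: recast element selection as optimization over a non-convex set of selection variables and handle that constraint with an ADMM split. The sketch below illustrates that general style of splitting, not the paper's actual algorithm: it uses a binary selection vector instead of the paper's selection matrix, a caller-supplied differentiable loss with an inexact gradient-based x-update, and a toy pseudo-Huber (non-squared) criterion in the demo. All function names, the rho/lr/iteration settings, and the demo loss are illustrative assumptions.

```python
import numpy as np


def project_topk_binary(v, k):
    """Euclidean projection onto the non-convex set
    {z in {0,1}^d : sum(z) = k}: keep the k largest entries of v."""
    z = np.zeros_like(v)
    z[np.argsort(v)[-k:]] = 1.0
    return z


def admm_element_selection(grad_f, d, k, rho=1.0, n_iter=300, n_inner=20,
                           lr=0.05, seed=0):
    """ADMM-style sketch for  min_w f(w)  s.t.  w is binary with exactly k ones.

    Splitting:  min_{x,z} f(x) + i_C(z)  s.t.  x = z,  where C is the
    non-convex set above.  The x-update takes a few gradient steps on the
    augmented objective, the z-update is the (non-convex) projection, and
    u is the scaled dual variable.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.0, 1.0, d)
    z = project_topk_binary(x, k)
    u = np.zeros(d)
    for _ in range(n_iter):
        for _ in range(n_inner):                  # approximate x-update
            x = x - lr * (grad_f(x) + rho * (x - z + u))
        z = project_topk_binary(x + u, k)         # non-convex z-update
        u = u + x - z                             # scaled dual update
    return z


def make_pseudo_huber_grad(X, delta=0.1):
    """Gradient of a toy non-squared criterion: the mean pseudo-Huber loss of
    the residual between the data X and its element-masked copy X * w."""
    n = X.shape[0]

    def grad_f(w):
        r = X * (1.0 - w)                         # residual per sample/feature
        return -(r / np.sqrt(r ** 2 + delta ** 2) * X).sum(axis=0) / n

    return grad_f


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 30)) * rng.uniform(0.1, 3.0, 30)
    k = 5
    z = admm_element_selection(make_pseudo_huber_grad(X), d=30, k=k)
    # For this separable toy loss the ideal choice is the k features whose
    # removal costs the most; the ADMM iterate only approximates that selection.
    print("selected elements:", np.flatnonzero(z))
```

The z-update projection onto the binary, cardinality-constrained set is what departs from standard convex ADMM, so convergence here is heuristic rather than guaranteed; the quality of the result depends on the penalty parameter rho and the accuracy of the inexact x-update.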