Noise reduction method for nonlinear signal based on maximum variance unfolding and its application to fault diagnosis
Published in | Science China Technological Sciences, Vol. 53, No. 8, pp. 2122-2128
---|---
Format | Journal Article
Language | English
Published | Heidelberg: SP Science China Press, 01.08.2010
Summary | A new noise reduction method for nonlinear signals based on maximum variance unfolding (MVU) is proposed. The noisy signal is first embedded into a high-dimensional phase space using phase space reconstruction theory, and the manifold learning algorithm MVU is then used to perform nonlinear dimensionality reduction on the phase-space data, separating the low-dimensional manifold representing the attractor from the noise subspace. Finally, the noise-reduced signal is obtained by reconstructing the signal from the low-dimensional manifold. Simulation results for the Lorenz system show that the proposed MVU-based noise reduction method outperforms the KPCA-based method and offers simple parameter estimation and low parameter sensitivity. The method is applied to fault detection on a vibration signal from the rotor-stator of an aero-engine with a slight rubbing fault. The denoised results show that slight rubbing features overwhelmed by noise can be effectively extracted by the proposed noise reduction method.
ISSN | 1674-7321, 1862-281X
DOI | 10.1007/s11431-009-3172-8
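The denoising pipeline described in the summary (delay embedding into phase space, dimensionality reduction, reconstruction of the signal from the low-dimensional manifold) can be sketched as below. This is a minimal illustration, not the paper's method: MVU itself requires a semidefinite-programming solver, so PCA via SVD stands in for the reduction step, and the test signal, delay parameters `m`, `tau`, and retained dimension `k` are illustrative assumptions.

```python
import numpy as np

def delay_embed(x, m, tau):
    # Phase space reconstruction (Takens' delay embedding):
    # each row is a delay vector [x[i], x[i+tau], ..., x[i+(m-1)*tau]].
    n = len(x) - (m - 1) * tau
    return np.array([x[i:i + n] for i in range(0, m * tau, tau)]).T

def denoise(x, m=10, tau=1, k=3):
    # 1) Embed the noisy signal into an m-dimensional phase space.
    X = delay_embed(x, m, tau)
    # 2) Project onto the k leading principal directions (PCA stands in
    #    for MVU here; the paper's method would use MVU at this step).
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    Xd = U[:, :k] @ np.diag(s[:k]) @ Vt[:k] + mean
    # 3) Reconstruct a 1-D signal by averaging every estimate of each
    #    sample across the overlapping delay vectors.
    out = np.zeros(len(x))
    cnt = np.zeros(len(x))
    for j in range(m):
        idx = np.arange(Xd.shape[0]) + j * tau
        out[idx] += Xd[:, j]
        cnt[idx] += 1
    return out / np.maximum(cnt, 1)

# Illustrative signal (not from the paper): a sine buried in noise.
t = np.linspace(0, 8 * np.pi, 800)
clean = np.sin(t)
noisy = clean + 0.3 * np.random.default_rng(0).normal(size=t.size)
den = denoise(noisy, m=20, tau=1, k=2)
```

The embedding dimension `m` and delay `tau` would normally be chosen by standard criteria (e.g. false nearest neighbours and mutual information); here they are fixed by hand for brevity.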