Sparse LMS for system identification
Abstract
We propose a new approach to adaptive system identification when the system model is sparse. The approach applies L1 relaxation, common in compressive sensing, to improve the performance of LMS-type adaptive methods. This results in two new algorithms, the zero-attracting LMS (ZA-LMS) and the reweighted zero-attracting LMS (RZA-LMS). The ZA-LMS is derived by incorporating an L1-norm penalty on the coefficients into the quadratic LMS cost function, which introduces a zero attractor into the LMS iteration. The zero attractor promotes sparsity in the filter taps during adaptation and therefore accelerates convergence when identifying sparse systems. We prove that the ZA-LMS can achieve lower mean square error than the standard LMS. To further improve the filtering performance, the RZA-LMS is developed using a reweighted zero attractor. Numerical results show that the RZA-LMS outperforms the ZA-LMS. Experiments demonstrate the advantages of the proposed filters in both convergence rate and steady-state behavior under sparsity assumptions on the true coefficient vector. The RZA-LMS is also shown to be robust as the number of non-zero taps increases.
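For a quick sense of the two recursions before downloading the code, the Matlab sketch below runs the ZA-LMS and RZA-LMS updates on a toy sparse system. The step size mu, attractor strength rho, reweighting constant eps_rza, and the toy system itself are illustrative assumptions, not the settings of the distributed code or of the experiments in the figures.

% Sketch of the ZA-LMS and RZA-LMS coefficient updates (illustrative only).
% Assumed parameters: step size mu, zero-attractor strength rho = mu*gamma,
% reweighting constant eps_rza; the sparse system below is a toy example.

N   = 16;             % number of filter taps
T   = 2000;           % number of iterations
mu  = 0.05;           % LMS step size
rho = 5e-4;           % zero-attractor strength
eps_rza = 10;         % RZA-LMS reweighting constant

w_true = zeros(N,1);  % sparse "true" system: a single non-zero tap
w_true(5) = 1;

w_za  = zeros(N,1);   % ZA-LMS estimate
w_rza = zeros(N,1);   % RZA-LMS estimate
x_buf = zeros(N,1);   % regressor of the N most recent input samples

for n = 1:T
    x_buf = [randn; x_buf(1:N-1)];            % white input
    d = w_true' * x_buf + sqrt(1e-3)*randn;   % noisy system output

    % ZA-LMS: standard LMS update plus the zero attractor -rho*sign(w)
    % that comes from the L1 penalty on the coefficients
    e_za = d - w_za' * x_buf;
    w_za = w_za + mu*e_za*x_buf - rho*sign(w_za);

    % RZA-LMS: the attractor is reweighted so large taps are shrunk far
    % less than taps that are already near zero
    e_rza = d - w_rza' * x_buf;
    w_rza = w_rza + mu*e_rza*x_buf - rho*sign(w_rza)./(1 + eps_rza*abs(w_rza));
end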
Paper
Yilun Chen, Yuantao Gu, and Alfred O. Hero III, “Sparse LMS for system identification,” in Proc. IEEE Int. Conf. on Acoustics, Speech, and Signal Processing (ICASSP), Taipei, Taiwan, Apr. 2009. (.pdf)
Matlab Code
Download (.rar)
Figures
Figure 1. Tracking and steady-state behavior of order-16 adaptive filters driven by a white input signal.
Figure 2. Tracking and steady-state behavior of order-16 adaptive filters driven by a correlated input signal.
Figure 3. Tracking and steady-state behavior of order-256 adaptive filters driven by a white input signal.