Second-order H∞ optimal LMS and NLMS algorithms based on a second-order Markov model
Abstract
It is shown that two algorithms, obtained by simplifying a Kalman filter designed for a second-order Markov model, are H∞ suboptimal. Like the least mean squares (LMS) and normalized LMS (NLMS) algorithms, these second-order algorithms can be viewed as approximate solutions to stochastic or deterministic least-squares minimization problems. It is proved that second-order LMS and NLMS are exact solutions that keep the maximum energy gain from the disturbances to the predicted and filtered errors, respectively, below one. These algorithms are implemented in two steps. The first step operates like the conventional LMS/NLMS algorithm, and the second step estimates the weight increment vector and predicts the weights for the next iteration. This second step applies simple smoothing to the increment of the estimated weights in order to estimate their speed. The algorithms are also cost-effective, robust, and attractive for improving the tracking performance of smoothly time-varying models.
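The following is a minimal sketch of the two-step structure described above: a conventional NLMS-style correction followed by smoothing of the weight increment to estimate the weight speed and predict the weights for the next iteration. The function name, the step size `mu`, the smoothing factor `beta`, and the regularization `eps` are illustrative assumptions, not values prescribed by the paper.

```python
import numpy as np

def second_order_nlms(x, d, n_taps, mu=0.5, beta=0.9, eps=1e-8):
    """Sketch of a two-step second-order NLMS tracker (assumed form).

    Step 1: conventional NLMS correction of the predicted weights.
    Step 2: smooth the weight increment to estimate the weight "speed"
            and predict the weights for the next iteration.
    """
    w_pred = np.zeros(n_taps)   # predicted weights for the current iteration
    w_prev = np.zeros(n_taps)   # previous filtered weight estimate
    speed = np.zeros(n_taps)    # smoothed weight increment (estimated speed)
    y = np.zeros(len(d))        # filter output
    e = np.zeros(len(d))        # prediction error

    for k in range(n_taps - 1, len(d)):
        u = x[k - n_taps + 1:k + 1][::-1]        # current regressor (most recent first)
        y[k] = w_pred @ u
        e[k] = d[k] - y[k]

        # Step 1: NLMS-style correction gives the filtered weight estimate.
        w_filt = w_pred + mu * e[k] * u / (u @ u + eps)

        # Step 2: simple smoothing of the weight increment estimates the
        # weight speed; adding it predicts the weights for the next iteration.
        speed = beta * speed + (1.0 - beta) * (w_filt - w_prev)
        w_prev = w_filt
        w_pred = w_filt + speed

    return y, e
```

For a slowly drifting channel, this prediction step is what distinguishes the sketch from plain NLMS: the smoothed increment extrapolates the weight trajectory, which is the mechanism the abstract credits for the improved tracking of smoothly time-varying models.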