Hello. I agree with Omar Gerek's description. To add some details: RLS is an adaptive filtering method for parameter estimation in a deterministic system with parameter vector x(k), given noisy observations y(k) = H(k)x(k) + w(k) for k = 1..K, where w(k) is an i.i.d. white-noise sequence with zero mean and covariance R (when R is unknown, it is usually taken as the identity matrix).
The RLS parameter estimator is an online implementation of least squares that is, as its name suggests, recursive. The current parameter estimate xhat(k) is predicted and corrected using only the latest measurement, rather than going all the way back to time 1 and solving the full LS problem again. The form of the recursion is:
xhat(k+1)=xhat(k)+W(k+1)(y(k+1)-H(k+1)xhat(k)) where W(k+1) is a specific gain term for RLS. Because the gain varies with k, it is an adaptive estimator. Other adaptive estimators can be obtained by varying this gain term.
Specifically, W(k+1) = P(k)H^T * inv(H P(k) H^T + R) and P(k+1) = (I - W(k+1)H)P(k), with H = H(k+1). The Kalman filter is closely related to the RLS recursion, but there you also have to include the dynamical system in the state prediction step.
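A minimal Python sketch of this recursion, assuming the measurement model above with R taken as the identity when unknown (the function name rls_update and the toy example are my own, not from any particular library):

```python
import numpy as np

def rls_update(xhat, P, H, y, R):
    """One RLS step for the model y(k) = H(k) x + w(k)."""
    S = H @ P @ H.T + R                      # innovation covariance H P H^T + R
    W = P @ H.T @ np.linalg.inv(S)           # gain W(k+1) = P H^T inv(H P H^T + R)
    xhat = xhat + W @ (y - H @ xhat)         # correct with the innovation y - H xhat
    P = (np.eye(len(xhat)) - W @ H) @ P      # covariance update (I - W H) P
    return xhat, P

# Toy run: estimate a fixed 2-parameter vector from noisy scalar observations.
rng = np.random.default_rng(0)
x_true = np.array([1.5, -0.7])
xhat, P, R = np.zeros(2), 1e3 * np.eye(2), np.eye(1)   # large initial P = "know nothing"
for k in range(200):
    H = rng.standard_normal((1, 2))                    # regressor row for this step
    y = H @ x_true + rng.standard_normal(1)            # noisy measurement
    xhat, P = rls_update(xhat, P, H, y, R)
print(xhat)   # close to x_true
```

Initializing P with a large diagonal expresses that nothing is known about x at the start; as measurements arrive, P shrinks and the gain W becomes smaller.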
RLS-based identification is one case of adaptive identification. Compared with LMS-based methods, RLS converges rather fast, and there are even faster variants (FLS, etc.).
Recursive identification methods can be computed by a 'simple modification' of the batch estimate, form the central part of adaptive systems, have small memory requirements, are easily modified into real-time algorithms, and are used in fault detection to find out whether the system has changed significantly.
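For contrast with the LMS-based methods mentioned above, here is a minimal sketch of the standard LMS update rule (the name lms_update and the step size mu = 0.01 are illustrative choices, not from the answers above). Each LMS step is only O(n), versus O(n^2) for an RLS step, which is the price paid for RLS's faster convergence.

```python
import numpy as np

def lms_update(w, x, d, mu=0.01):
    """One LMS step: w <- w + mu * e * x, where e = d - w^T x is the a-priori error."""
    e = d - w @ x              # prediction error with the current weights
    return w + mu * e * x, e   # gradient-descent correction, O(n) per step

# Toy run: the same identification problem as above, now with LMS.
rng = np.random.default_rng(2)
w, x_true = np.zeros(2), np.array([1.5, -0.7])
for k in range(5000):                           # LMS typically needs many more steps than RLS
    x = rng.standard_normal(2)
    d = x_true @ x + 0.1 * rng.standard_normal()
    w, _ = lms_update(w, x, d)
print(w)   # converges toward x_true, but more slowly than the RLS run above
```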
RLS (Recursive Least Squares) can be used for a system where the current state is obtained by solving A*x = b in the least-squares sense. Since we keep getting new observations from the sensors, instead of making the A matrix bigger and re-solving, we update the inverse of the relevant matrix directly; a quick sketch of this idea follows below.
For more details, please have a look at the attached PDF.
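A quick sketch of that idea, assuming the normal-equations form x = inv(A^T A) A^T b and using the Sherman-Morrison identity to fold one new row a and observation b_new into inv(A^T A) instead of re-solving the stacked system (the function name add_row is mine):

```python
import numpy as np

def add_row(P, q, a, b_new):
    """Incorporate one new equation a^T x = b_new without rebuilding A.

    P = inv(A^T A), q = A^T b; the current LS solution is x = P @ q.
    Sherman-Morrison updates P for the rank-1 term a a^T.
    """
    Pa = P @ a
    P = P - np.outer(Pa, Pa) / (1.0 + a @ Pa)   # new inv(A^T A + a a^T)
    q = q + b_new * a                           # A^T b gains the term b_new * a
    return P, q

# Usage: start from an initial batch, then stream in new rows one at a time.
rng = np.random.default_rng(1)
A0, b0 = rng.standard_normal((5, 3)), rng.standard_normal(5)
P = np.linalg.inv(A0.T @ A0)
q = A0.T @ b0
for _ in range(100):
    a, b_new = rng.standard_normal(3), rng.standard_normal()
    P, q = add_row(P, q, a, b_new)
x = P @ q   # same answer as solving the full stacked least-squares problem
```

This rank-1 inverse update is exactly the mechanism behind the RLS covariance recursion given in the first answer, with P playing the role of the inverse information matrix.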