Fixed-Budget Kernel Recursive Least-Squares

Steven Van Vaerenbergh$\ast$, Ignacio Santamaría$\ast$, Weifeng Liu$\dag$ and José C. Príncipe$\ddag$
$\ast$Dept. of Communications Engineering, University of Cantabria, Santander, Spain
$\dag$Forecasting Team, Amazon.com, 701 5th Ave, Seattle, WA 98104 USA
$\ddag$Dept. of Electrical and Computer Engineering, University of Florida, Gainesville, FL 32611 USA
Email: {steven,nacho} #at# gtas.dicom.unican.es, weifeng #at# ieee.org, principe #at# cnel.ufl.edu


Abstract:

We present a kernel-based recursive least-squares (KRLS) algorithm on a fixed memory budget, capable of recursively learning a nonlinear mapping and tracking changes over time. In order to deal with the growing support inherent to online kernel methods, the proposed method uses a combined strategy of growing and pruning the support. In contrast to a previous sliding-window based technique, the presented algorithm does not prune the oldest data point at every time instant but instead aims to prune the least significant data point. We also introduce a label update procedure to equip the algorithm with tracking capability. Simulations show that the proposed method obtains better performance than state-of-the-art kernel adaptive filtering techniques given similar memory requirements.
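To illustrate the grow-and-prune strategy described above, the following is a minimal Python/NumPy sketch of a fixed-budget kernel least-squares filter. It is not the paper's exact algorithm: the class and parameter names, the regularizer `c`, and the significance criterion $|\alpha_i| / [(K + cI)^{-1}]_{ii}$ used for pruning are illustrative assumptions, and for clarity the regularized solution is recomputed at every step rather than via the recursive rank-one updates that make KRLS efficient.

```python
import numpy as np

def rbf(X, Y, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of X and Y
    d = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d / (2.0 * sigma**2))

class FixedBudgetKLS:
    """Sketch only: grow the dictionary with each new sample, then prune
    the *least significant* point (not simply the oldest) once the
    memory budget is exceeded."""

    def __init__(self, budget=30, sigma=1.0, c=1e-3):
        self.budget, self.sigma, self.c = budget, sigma, c
        self.X = None              # dictionary inputs
        self.y = np.empty(0)       # stored labels
        self.alpha = np.empty(0)   # kernel expansion coefficients

    def _refit(self):
        # Regularized least-squares solution over the current dictionary
        Kinv = np.linalg.inv(rbf(self.X, self.X, self.sigma)
                             + self.c * np.eye(len(self.y)))
        self.alpha = Kinv @ self.y
        return Kinv

    def predict(self, x):
        if self.X is None:
            return 0.0
        return float(rbf(np.atleast_2d(x), self.X, self.sigma) @ self.alpha)

    def update(self, x, y):
        # Grow: append the new data point to the dictionary
        x = np.atleast_2d(x)
        self.X = x if self.X is None else np.vstack([self.X, x])
        self.y = np.append(self.y, y)
        Kinv = self._refit()
        if len(self.y) > self.budget:
            # Prune: discard the point with the smallest significance
            i = np.argmin(np.abs(self.alpha) / np.diag(Kinv))
            keep = np.arange(len(self.y)) != i
            self.X, self.y = self.X[keep], self.y[keep]
            self._refit()
```

As a usage example, feeding the filter 200 noise-free samples of a sine wave leaves a dictionary of exactly `budget` points that still approximates the function well, which is the point of pruning by significance rather than by age.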


\begin{keywords}
Kernel methods, machine learning, recursive least-squares, nonlinear filtering, fixed budget
\end{keywords}


Last modified: 2010-08-07