
Introduction

Wiener systems are well-known nonlinear models consisting of a possibly time-varying linear filter followed by a memoryless nonlinearity. These structures have been successfully applied in areas such as chemical and biological modeling, signal processing and communications. A considerable amount of research has been carried out over recent decades on identifying and/or inverting Wiener systems. Proposed techniques include neural network models, orthonormal functions, higher-order input-output cross-correlations and many others (see, for instance, [1,2,3,4,5]). However, most of these techniques are black-box models: they do not exploit the specific cascade structure of Wiener systems in the identification/equalization procedure. A recent exception is the work by Aschbacher and Rupp [6], which jointly identifies the inverse nonlinearity and the linear filter of the Wiener model. Inspired by this work, in this paper we propose kernel canonical correlation analysis (K-CCA) as a technique that exploits the structure of Wiener systems in supervised identification/equalization algorithms.
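As an illustration of the cascade structure described above, the following minimal sketch simulates a Wiener system: a linear convolution stage followed by a sample-by-sample nonlinearity. The filter taps and the tanh saturation are hypothetical choices for the example only.

```python
import numpy as np

def wiener_system(x, h, f):
    """Linear filter h followed by a memoryless nonlinearity f."""
    v = np.convolve(x, h, mode="full")[: len(x)]  # linear (possibly time-varying) stage
    return f(v)                                   # memoryless nonlinearity, sample by sample

rng = np.random.default_rng(0)
x = rng.standard_normal(200)
h = np.array([1.0, 0.5, -0.2])     # example filter taps (hypothetical)
y = wiener_system(x, h, np.tanh)   # tanh as an example saturation
```

Identification must recover both `h` and (the inverse of) `f` from input-output pairs `(x, y)`, which is exactly where the cascade structure can be exploited.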

Canonical correlation analysis (CCA) is a well-known technique in multivariate statistical analysis that has been widely used in economics, meteorology and many modern information processing fields, such as communication theory, statistical signal processing and blind source separation (BSS). The concept of CCA was first introduced by H. Hotelling [7] as a way of measuring the linear relationship between two multidimensional sets of variables, and it was later extended to more than two data sets [8].
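As a concrete illustration of Hotelling's idea, the following sketch (with synthetic data chosen only for the example) computes the largest canonical correlation between two data sets. It uses a standard QR-based formulation: the singular values of the cross-product of orthonormal bases of the two (centered) data sets are the canonical correlations.

```python
import numpy as np

def first_canonical_correlation(X, Y):
    """Largest canonical correlation between data sets X, Y (rows = samples)."""
    X = X - X.mean(axis=0)                 # center each data set
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)                # orthonormal basis of each column space
    Qy, _ = np.linalg.qr(Y)
    s = np.linalg.svd(Qx.T @ Qy, compute_uv=False)
    return s[0]                            # canonical correlations = singular values

rng = np.random.default_rng(1)
z = rng.standard_normal((500, 1))          # latent component shared by both sets
X = np.hstack([z, rng.standard_normal((500, 1))])
Y = np.hstack([z + 0.1 * rng.standard_normal((500, 1)),
               rng.standard_normal((500, 1))])
rho = first_canonical_correlation(X, Y)    # high: the sets share one component
```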

Several extensions of CCA that account for nonlinear relationships between two data sets have recently been proposed [9,10]. Among them, one of the most promising approaches is K-CCA [11,12]. K-CCA searches for nonlinear relationships between data sets by first applying a nonlinear kernel transformation that maps the input data into a high-dimensional (often infinite-dimensional) feature space, and then performing conventional (linear) CCA in that space. In the proposed approach for supervised equalization of Wiener systems, given a set of input-output patterns, K-CCA maximizes the correlation between the linearly transformed input and the nonlinearly transformed output (i.e., we use a linear kernel for the input data and a nonlinear kernel for the output). In this way, K-CCA exploits the cascade structure of Wiener systems and provides estimates of both the linear filter and the inverse nonlinearity.
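The idea of combining a linear kernel on the input with a nonlinear kernel on the output can be sketched as follows. This is only an illustrative batch K-CCA solve via a regularized generalized eigenproblem, not the online algorithm developed in this paper; the kernel choices and the regularization constant are assumptions of the example.

```python
import numpy as np
from scipy.linalg import eigh

def center_kernel(K):
    """Center a kernel matrix in feature space: K <- HKH, H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_cca_rho(Kx, Ky, reg=1e-2):
    """Largest regularized kernel canonical correlation of two kernel matrices."""
    n = Kx.shape[0]
    Kx, Ky = center_kernel(Kx), center_kernel(Ky)
    Z = np.zeros((n, n))
    A = np.block([[Z, Kx @ Ky], [Ky @ Kx, Z]])
    B = np.block([[Kx @ Kx + reg * np.eye(n), Z],
                  [Z, Ky @ Ky + reg * np.eye(n)]])
    return eigh(A, B, eigvals_only=True)[-1]   # largest generalized eigenvalue

rng = np.random.default_rng(2)
x = rng.standard_normal(60)
y = np.tanh(x)                               # output of a memoryless nonlinearity
xc = x - x.mean()
Kx = np.outer(xc, xc)                        # linear kernel on the input
D = (y[:, None] - y[None, :]) ** 2
Ky = np.exp(-D / 0.5)                        # Gaussian kernel on the output
rho = kernel_cca_rho(Kx, Ky)                 # high when a nonlinear relation exists
```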

In their original forms, most kernel algorithms cannot operate online because of difficulties such as the growing dimensionality of the kernel matrix and the need to avoid overfitting. Recently, a kernel RLS algorithm was proposed that deals with both difficulties [13]: a sparsification procedure limits the kernel matrix size and reduces the order of the problem. In [14] we presented a different kernel RLS approach in which conventional regularization avoids overfitting and a sliding-window procedure fixes the kernel matrix size, allowing the algorithm to operate online in time-varying environments. In this paper we extend the basic sliding-window kernel RLS algorithm to K-CCA. In addition, we also extend to K-CCA a recent reformulation of CCA as a pair of coupled iterative regression problems [,], which allows us to avoid any generalized eigenvalue decomposition in the computation of the K-CCA solution for each window.
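The sliding-window idea can be sketched as follows. This is only an illustration of the windowing and regularization described above (the class name, Gaussian kernel and parameter values are assumptions of the example); the actual algorithm in [14] updates the regularized kernel matrix inverse recursively rather than re-solving the system at every step.

```python
import numpy as np

class SlidingWindowKernelRegressor:
    """Kernel ridge regression over a sliding window of the last M samples.

    The window keeps the kernel matrix at a fixed M x M size, and the
    ridge term c provides the regularization that prevents overfitting.
    """
    def __init__(self, M=30, c=1e-2, sigma=0.5):
        self.M, self.c, self.sigma = M, c, sigma
        self.xs, self.ys = [], []

    def update(self, x_n, y_n):
        self.xs.append(x_n)
        self.ys.append(y_n)
        if len(self.xs) > self.M:     # slide: drop the oldest sample
            self.xs.pop(0)
            self.ys.pop(0)
        X = np.array(self.xs)
        K = np.exp(-(X[:, None] - X[None, :]) ** 2 / (2 * self.sigma ** 2))
        # Regularized solve; the online algorithm updates this recursively.
        self.alpha = np.linalg.solve(K + self.c * np.eye(len(X)), np.array(self.ys))
        self.X = X

    def predict(self, x_new):
        k = np.exp(-(x_new - self.X) ** 2 / (2 * self.sigma ** 2))
        return float(k @ self.alpha)

# Track a simple nonlinearity sample by sample
model = SlidingWindowKernelRegressor()
for x_n in np.linspace(-2, 2, 80):
    model.update(x_n, np.tanh(x_n))
```

Because only the newest M samples are retained, the per-step cost stays bounded and the model can follow time-varying behavior, at the price of forgetting older data.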

The rest of this paper is organized as follows: in Section II CCA and K-CCA are briefly reviewed. In Section III the proposed procedure to identify/equalize Wiener systems based on K-CCA is described. In Section IV the online version of this algorithm is derived, followed by simulation results and comparisons in Section V. Finally, Section VI summarizes the main conclusions of this work.


Steven Van Vaerenbergh
Last modified: 2006-04-05