

Regularization

A second method regularizes the solution by imposing a restriction on the norm of $ \tilde{\mathbf{h}}_k$ [11,12] or $ \boldsymbol{\alpha}_k$ [21]. Here, we apply the classical regularization technique from the framework of LS regression, which yields the following CCA-GEV problem

$\displaystyle \frac{1}{2} \begin{bmatrix}\mathbf{X}_1^T \mathbf{X}_1 & \mathbf{X}_1^T \mathbf{X}_2\\ \mathbf{X}_2^T \mathbf{X}_1 & \mathbf{X}_2^T \mathbf{X}_2 \end{bmatrix} \tilde{\mathbf{h}} = \beta \begin{bmatrix}\mathbf{X}_1^T \mathbf{X}_1 + c \mathbf{I} & \mathbf{0}\\ \mathbf{0} & \mathbf{X}_2^T \mathbf{X}_2 + c \mathbf{I} \end{bmatrix} \tilde{\mathbf{h}},$    
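The regularized CCA-GEV problem above can be solved directly as a symmetric-definite generalized eigenvalue problem. Below is a minimal NumPy/SciPy sketch on synthetic two-view data; the data model, dimensions, and the value of the regularization constant $ c$ are illustrative assumptions, not taken from the text.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
N, d = 200, 5

# Hypothetical two views driven by a common latent signal s.
s = rng.standard_normal((N, 1))
X1 = s @ rng.standard_normal((1, d)) + 0.1 * rng.standard_normal((N, d))
X2 = s @ rng.standard_normal((1, d)) + 0.1 * rng.standard_normal((N, d))

c = 1e-3  # regularization constant (illustrative value)

# Left-hand matrix of the CCA-GEV problem (symmetric).
A = 0.5 * np.block([[X1.T @ X1, X1.T @ X2],
                    [X2.T @ X1, X2.T @ X2]])
# Regularized right-hand matrix (block-diagonal, positive definite for c > 0).
B = np.block([[X1.T @ X1 + c * np.eye(d), np.zeros((d, d))],
              [np.zeros((d, d)),          X2.T @ X2 + c * np.eye(d)]])

# Generalized eigendecomposition; eigenvalues beta are returned ascending.
beta, H = eigh(A, B)
h = H[:, -1]            # eigenvector for the largest beta
h1, h2 = h[:d], h[d:]   # the two canonical weight vectors
```

The projections `X1 @ h1` and `X2 @ h2` corresponding to the largest $ \beta$ are maximally correlated; for $ c = 0$ the largest eigenvalue equals $ (1+\rho)/2$, with $ \rho$ the first canonical correlation.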

or, in the case of K-CCA

$\displaystyle \frac{1}{2} \begin{bmatrix}\mathbf{K}_1 & \mathbf{K}_2\\ \mathbf{K}_1 & \mathbf{K}_2 \end{bmatrix} \boldsymbol{\alpha} = \beta \begin{bmatrix}\mathbf{K}_1 + c \mathbf{I} & \mathbf{0}\\ \mathbf{0} & \mathbf{K}_2 + c \mathbf{I} \end{bmatrix} \boldsymbol{\alpha}.$ (7)

Therefore, the regularized version of CCA (or K-CCA) can be reformulated as two coupled LS (or kernel-LS) regression problems.
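The coupled-LS interpretation can be made concrete by alternating two ridge regressions, each view's projection regressed onto the other's, which acts as a power iteration converging to the first regularized canonical pair. This is a minimal sketch under the same assumed synthetic-data setup as above, not the author's algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
N, d = 200, 4
s = rng.standard_normal((N, 1))
X1 = s @ rng.standard_normal((1, d)) + 0.1 * rng.standard_normal((N, d))
X2 = s @ rng.standard_normal((1, d)) + 0.1 * rng.standard_normal((N, d))

c = 1e-3                      # regularization constant (illustrative value)
h1 = rng.standard_normal(d)   # random initialization
h2 = rng.standard_normal(d)

for _ in range(50):
    # Coupled LS: ridge-regress each view's output onto the other's.
    h1 = np.linalg.solve(X1.T @ X1 + c * np.eye(d), X1.T @ (X2 @ h2))
    h2 = np.linalg.solve(X2.T @ X2 + c * np.eye(d), X2.T @ (X1 @ h1))
    # Normalize to avoid the trivial all-zero solution.
    h1 /= np.linalg.norm(h1)
    h2 /= np.linalg.norm(h2)
```

Each combined update multiplies $ \tilde{\mathbf{h}}_1$ by $ (\mathbf{X}_1^T\mathbf{X}_1 + c\mathbf{I})^{-1}\mathbf{X}_1^T\mathbf{X}_2(\mathbf{X}_2^T\mathbf{X}_2 + c\mathbf{I})^{-1}\mathbf{X}_2^T\mathbf{X}_1$, so the iteration is a power method whose fixed point is the dominant eigenvector of the regularized CCA-GEV problem.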



Steven Van Vaerenbergh
Last modified: 2006-04-05