
Static Wiener System

Simulations are carried out to illustrate the performance of the proposed equalization algorithm. The performance is evaluated on a Wiener system consisting of the non-minimum phase linear filter $ H(z) = 1 + 0.8668 z^{-1} -0.4764 z^{-2} + 0.2070 z^{-3}$ and the nonlinearity $ f(x) = \tanh(x)$. The input signal is zero-mean white Gaussian noise with unit variance. The output is corrupted by additive white Gaussian noise, corresponding to an SNR of $ 25$ dB. Given $ x_2[n]$, the desired output of the equalizer is a delayed version $ x_1[n-d]$ of the input signal, with $ d=\frac{1}{2}(L+L_W)$, where $ L$ is the length of the linear filter $ \hat{H}(z)$ and $ L_W$ is the length of the equalizer $ W(z)$.
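
As a rough illustration of this setup, the following sketch generates data from such a Wiener system, assuming a NumPy implementation; the variable names, the number of samples and the integer rounding of the delay $ d$ are illustrative choices, not taken from the paper.

# Sketch (not the authors' code): synthetic data for the static Wiener system described above.
import numpy as np

rng = np.random.default_rng(0)
n_samples = 2000

# Non-minimum phase linear filter H(z) followed by the static nonlinearity f(x) = tanh(x).
h = np.array([1.0, 0.8668, -0.4764, 0.2070])
x1 = rng.standard_normal(n_samples)            # zero-mean white Gaussian input, unit variance
linear_out = np.convolve(x1, h)[:n_samples]    # output of H(z)
clean = np.tanh(linear_out)                    # Wiener system output before noise

# Additive white Gaussian noise at 25 dB SNR.
snr_db = 25.0
noise_power = np.var(clean) / 10 ** (snr_db / 10)
x2 = clean + np.sqrt(noise_power) * rng.standard_normal(n_samples)

# Desired equalizer output: the input delayed by d = (L + L_W) / 2 (rounded to an integer here).
L, L_W = 4, 15
d = (L + L_W) // 2
desired = np.roll(x1, d)   # x1[n - d]; the first d samples wrap around and would be discarded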

Equalization is performed by the proposed online K-CCA method, using a Gaussian kernel with $ \sigma = 0.2$ and regularization constant $ c=0.1$. The filter $ W(z)$ has length $ L_W=15$ and the RLS forgetting factor is $ 0.99$. For comparison, two other equalization methods are included. The first is the gradient identification method proposed in [6], to which we added the same RLS block for equalization as in the presented K-CCA method. The second is a time-delay MLP with $ L_W=15$ inputs (i.e. a time delay equal to the equalizer length), $ 15$ neurons in its hidden layer and learning rate $ \mu=0.01$. The MLP does not take the system structure into account; its equalization results are included only to highlight the advantage of the other two methods, which do exploit the Wiener system structure. All three methods were trained adaptively on a training data set, while at every iteration the equalization capability of each method was tested on a separate test data set generated by the same Wiener system. Fig. 4 compares the mean square error (MSE) curves of these three methods, averaged over $ 50$ Monte Carlo simulations.
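
The equalizer $ W(z)$ itself is adapted with an exponentially weighted RLS recursion. As a minimal sketch, assuming only the parameters stated above ($ L_W=15$, forgetting factor $ 0.99$), a generic RLS block of this kind could look as follows; this is not the authors' K-CCA implementation, whose kernelized identification step is omitted here.

# Sketch of a generic exponentially weighted RLS block for the equalizer W(z).
import numpy as np

def rls_equalizer(u, desired, L_W=15, lam=0.99, delta=1e2):
    """Adapt an FIR equalizer w (length L_W) so that w^T u[n] tracks desired[n]."""
    w = np.zeros(L_W)
    P = delta * np.eye(L_W)                       # inverse correlation matrix estimate
    errors = np.zeros(len(u))
    for n in range(L_W, len(u)):
        u_n = u[n - L_W + 1:n + 1][::-1]          # current regressor (most recent sample first)
        k = P @ u_n / (lam + u_n @ P @ u_n)       # gain vector
        e = desired[n] - w @ u_n                  # a priori error
        w = w + k * e
        P = (P - np.outer(k, u_n @ P)) / lam      # update with forgetting factor lam
        errors[n] = e
    return w, errors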

Figure 4: Wiener system equalization MSE for the presented online K-CCA method, the gradient identification method from [6] and a time-delay MLP. The MLP does not make use of the system structure and hence achieves a worse result. The K-CCA method needs an initialization period of the length of its window ($ N=150$), after which the MSE drops quickly and reaches convergence. The steepness of this slope is mainly determined by the speed of the RLS algorithm.

Fig. 5 shows the coefficients estimated online by the K-CCA algorithm after processing $ 1000$ samples, when the length of the linear filter is overestimated as $ L=10$ instead of the correct $ L=4$. Fig. 6 compares the MSE curves obtained for different values of $ L$ when the correct value is $ L=4$. Note that overestimating $ L$ has only a minimal effect on the algorithm's performance.

Figure 5: The $ 4$ true coefficients of the filter $ H(z)$ versus the $ 10$ coefficients of the estimated filter $ \hat{H}(z)$. The $ 6$ additional estimated coefficients are very close to zero.

Figure 6: MSE curves for equalization of a Wiener system with linear filter length $ 4$, for different values of $ L$, the length of the linear filter used in system identification. In the ideal case ($ L=4$) the filter length is known. The MSE curves for $ L=5$ through $ L=13$ show very similar equalizer performance in all cases. The curves were averaged over $ 50$ simulations.

A parameter with a stronger influence on the performance of the K-CCA algorithm is the length $ N$ of the sliding window. Fig. 7 shows MSE curves for different window lengths, for the given setup. A longer window corresponds to a larger kernel matrix, which in turn yields a better representation of the inverse nonlinearity $ g(\cdot)$ and hence a lower equalization error. The curves were averaged over $ 50$ simulations.
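
As an illustration of this dependence, the sketch below builds the regularized Gaussian kernel matrix for a window of $ N$ samples; only the parameters $ \sigma=0.2$ and $ c=0.1$ come from the setup above, and the helper function itself is an assumption, not part of the presented algorithm.

# Sketch: the sliding-window length N fixes the size of the regularized Gaussian kernel matrix.
import numpy as np

def gaussian_kernel_matrix(window, sigma=0.2, c=0.1):
    """N x N Gaussian kernel matrix for the N scalar samples currently in the window."""
    x = np.asarray(window).reshape(-1, 1)
    sq_dists = (x - x.T) ** 2
    K = np.exp(-sq_dists / (2 * sigma ** 2))
    return K + c * np.eye(len(x))                 # regularization constant c on the diagonal

K_small = gaussian_kernel_matrix(np.random.randn(75))    # N = 75
K_large = gaussian_kernel_matrix(np.random.randn(150))   # N = 150: richer representation of g(.)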

Figure 7: Influence of the window length $ N$ on the MSE curves of the online K-CCA algorithm. Note the initialization period of length $ N$, needed to replace the initialization data in the kernel matrix with actual data.

In a second setup, the same Wiener system is used with a BPSK input signal. After training the K-CCA algorithm online on $ 1000$ symbols, its bit error rate (BER) was computed on a test data set. The resulting BER curve is shown in Fig. 8.
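
A minimal sketch of how such a BER could be measured from the equalizer output on the test set is given below; the function and its arguments are hypothetical and only assume hard sign decisions on the equalized BPSK symbols.

# Sketch: BER of hard sign decisions against the transmitted +/-1 BPSK symbols.
import numpy as np

def bpsk_ber(y_hat, symbols, delay):
    """Compare sign decisions on the equalizer output with the symbols sent 'delay' steps earlier."""
    decisions = np.sign(y_hat[delay:])
    reference = symbols[:len(decisions)]
    return np.mean(decisions != reference)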

Figure 8: BER curve for the online K-CCA algorithm using BPSK input symbols.

