Maximum Variance Unfolding (Semidefinite Embedding)
The main proposal of the technique is to learn a suitable kernel matrix from the given data, subject to several constraints.
The constraints on the kernel are as follows.
1. Positive semidefiniteness
Kernel PCA is a kind of spectral decomposition in Hilbert space. Positive semidefiniteness allows the kernel matrix to be interpreted as storing the inner products of vectors in a Hilbert space. Equivalently, it means that all eigenvalues of the kernel matrix are non-negative.
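This property is easy to verify numerically. Below is a minimal numpy sketch; the matrix `K` is an arbitrary toy Gram matrix used for illustration, not output of the algorithm.

```python
import numpy as np

# Toy example: any Gram matrix K = X X^T is positive
# semidefinite by construction.
X = np.random.randn(5, 3)
K = X @ X.T

# A symmetric matrix is PSD iff all of its eigenvalues are
# non-negative (up to numerical tolerance).
eigvals = np.linalg.eigvalsh(K)
assert np.all(eigvals >= -1e-10)
```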
2. Centering
As in kernel PCA, the feature vectors are required to be centered. The condition is given by
[math]\displaystyle{ \sum_i \Phi(x_i) =0 . }[/math]
Equivalently,
[math]\displaystyle{ 0 = \left|\sum_i \Phi(x_i)\right|^2 = \sum_{ij}\Phi(x_i)\cdot\Phi(x_j)=\sum_{ij}K_{ij}. }[/math]
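A kernel matrix that does not satisfy this condition can be centered explicitly with the centering matrix [math]\displaystyle{ H = I - \frac{1}{n}\mathbf{1}\mathbf{1}^T }[/math], since [math]\displaystyle{ HKH }[/math] stores the inner products of the centered feature vectors. A small numpy sketch of this standard step:

```python
import numpy as np

n = 5
X = np.random.randn(n, 3)
K = X @ X.T

# Centering matrix H = I - (1/n) 1 1^T.
H = np.eye(n) - np.ones((n, n)) / n
K_centered = H @ K @ H

# The centered kernel satisfies sum_ij K_ij = 0.
assert abs(K_centered.sum()) < 1e-10
```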
3. Isometry
The local distance between a pair of data points [math]\displaystyle{ x_i, x_j }[/math] that are neighbours under the neighbourhood relation [math]\displaystyle{ \eta }[/math] should be preserved by the mapping; that is, the distance between [math]\displaystyle{ \Phi(x_i) }[/math] and [math]\displaystyle{ \Phi(x_j) }[/math] in the new space must equal the original distance. In other words, [math]\displaystyle{ \forall \eta_{ij}\gt 0 }[/math],
[math]\displaystyle{ |\Phi(x_i) - \Phi(x_j)|^2 = |x_i - x_j|^2. }[/math]
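Putting the three constraints together, the kernel can be learned by semidefinite programming: maximum variance unfolding maximizes the total variance of the embedding, which under the centering constraint equals the trace of [math]\displaystyle{ K }[/math]. Below is a hedged sketch using cvxpy as the SDP solver interface (an assumed dependency); the toy data and neighbourhood size `k` are illustrative only.

```python
import numpy as np
import cvxpy as cp

# Toy data and a k-nearest-neighbour graph.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
n, k = X.shape[0], 4
D2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
neighbours = np.argsort(D2, axis=1)[:, 1:k + 1]  # skip self

# Learn the kernel matrix K subject to the three constraints.
K = cp.Variable((n, n), PSD=True)       # 1. positive semidefiniteness
constraints = [cp.sum(K) == 0]          # 2. centering: sum_ij K_ij = 0
for i in range(n):
    for j in neighbours[i]:
        # 3. isometry: |Phi(x_i) - Phi(x_j)|^2 = |x_i - x_j|^2,
        # written in terms of the inner products stored in K.
        constraints.append(K[i, i] + K[j, j] - 2 * K[i, j] == D2[i, j])

# MVU objective: maximize total variance = trace(K).
prob = cp.Problem(cp.Maximize(cp.trace(K)), constraints)
prob.solve()

# Final embedding from the top eigenvectors of the learned kernel
# (the kernel PCA step).
w, V = np.linalg.eigh(K.value)
Y = V[:, -2:] * np.sqrt(np.maximum(w[-2:], 0))  # 2-D embedding
```

The equality constraints inside the loop encode the isometry condition via the identity [math]\displaystyle{ |\Phi(x_i)-\Phi(x_j)|^2 = K_{ii} + K_{jj} - 2K_{ij} }[/math], so the whole problem stays linear in the entries of [math]\displaystyle{ K }[/math] and remains a valid SDP.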