Maximum Variance Unfolding AKA Semidefinite Embedding
The main proposal of the technique is to learn a suitable kernel from the given data, subject to several constraints.
The constraints on the kernel are as follows.
1. Positive semidefiniteness
Kernel PCA is a kind of spectral decomposition in Hilbert space. Positive semidefiniteness allows the kernel matrix to be interpreted as storing the inner products of vectors in a Hilbert space. Furthermore, positive semidefiniteness means that all eigenvalues of the kernel matrix are non-negative.
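A minimal sketch (not part of the original notes; plain Python with NumPy, and the helper name is hypothetical) of how this constraint can be checked numerically for a candidate kernel matrix: symmetrize the matrix and verify that every eigenvalue is non-negative up to a small tolerance.

<pre>
import numpy as np

def is_positive_semidefinite(K, tol=1e-10):
    # Symmetrize first to guard against round-off asymmetry.
    K_sym = (K + K.T) / 2.0
    # eigvalsh returns the real eigenvalues of a symmetric matrix.
    eigenvalues = np.linalg.eigvalsh(K_sym)
    return bool(np.all(eigenvalues >= -tol))

# A Gram matrix of inner products is positive semidefinite by construction.
X = np.random.randn(5, 3)
K = X @ X.T
print(is_positive_semidefinite(K))  # True
</pre>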
2. Centering
As in kernel PCA, the data must be centered in feature space. The condition is given by
[math]\displaystyle{ \sum_i \Phi(x_i) =0 . }[/math]
Equivalently,
[math]\displaystyle{ 0 = \left|\sum_i \Phi(x_i)\right|^2 = \sum_{ij}\Phi(x_i)\cdot\Phi(x_j) = \sum_{ij}K_{ij}. }[/math]
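The sketch below (an illustration only, assuming the feature map Φ is the identity so that K = XXᵀ) verifies both forms of the condition numerically: for the uncentered Gram matrix, Σ_ij K_ij equals |Σ_i Φ(x_i)|², and double-centering the Gram matrix makes Σ_ij K_ij vanish.

<pre>
import numpy as np

n = 6
X = np.random.randn(n, 3)
K = X @ X.T                            # Gram matrix of the (uncentered) points

# sum_ij K_ij equals |sum_i Phi(x_i)|^2 when Phi is the identity map.
print(abs(K.sum() - np.linalg.norm(X.sum(axis=0))**2))   # ~0

# Double-centering K with H = I - (1/n) 1 1^T corresponds to centering the
# points in feature space, after which the constraint sum_ij K_ij = 0 holds.
H = np.eye(n) - np.ones((n, n)) / n
K_centered = H @ K @ H
print(abs(K_centered.sum()))                              # ~0
</pre>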