Measuring statistical dependence with Hilbert-Schmidt norms

The Hilbert-Schmidt Independence Criterion (HSIC) [1] is another popular kernel-based approach for detecting dependence. It is based on the eigenspectrum of covariance operators in reproducing kernel Hilbert spaces (RKHSs). The approach is simple and requires no user-defined regularisation, and its empirical estimate enjoys exponential convergence guarantees, so convergence is fast.
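
In practice HSIC is estimated from Gram matrices of the two samples: the biased empirical estimator of [1] is <math>\mathrm{HSIC}_b = (m-1)^{-2}\,\mathrm{tr}(KHLH)</math>, where <math>K</math> and <math>L</math> are the Gram matrices of the <math>m</math> paired observations and <math>H = I - m^{-1}\mathbf{1}\mathbf{1}^T</math> is the centering matrix. The following Python sketch illustrates this estimator; the Gaussian kernel and the fixed bandwidths are illustrative choices rather than anything prescribed by the estimator itself.

<pre>
import numpy as np

def gram_rbf(X, sigma):
    # Gaussian (RBF) Gram matrix: k(x, x') = exp(-||x - x'||^2 / (2 sigma^2))
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic_biased(X, Y, sigma_x=1.0, sigma_y=1.0):
    # Biased empirical HSIC: (m - 1)^{-2} * tr(K H L H)
    m = X.shape[0]
    H = np.eye(m) - np.ones((m, m)) / m          # centering matrix
    K = gram_rbf(X, sigma_x)
    L = gram_rbf(Y, sigma_y)
    return np.trace(K @ H @ L @ H) / (m - 1) ** 2

# Toy check: dependent samples should give a larger value than independent ones.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
Y_dep = X + 0.1 * rng.normal(size=(200, 1))      # Y depends on X
Y_ind = rng.normal(size=(200, 1))                # Y independent of X
print(hsic_biased(X, Y_dep), hsic_biased(X, Y_ind))
</pre>
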
== Background ==
Before HSIC was proposed, there were already a few kernel-based methods for detecting independence. Bach and Jordan [3] proposed a regularised correlation operator, derived from the covariance and cross-covariance operators, and used its largest singular value as a statistic to test independence. Gretton et al. used the largest singular value of the cross-covariance operator itself, which resulted in the constrained covariance (COCO). HSIC extends COCO by using the entire spectrum of the cross-covariance operator, so that it detects when all of the singular values are zero rather than looking only at the largest one.
== Cross-Covariance Operators ==
The cross-covariance operator was first proposed by Baker (1973) [4]. It can be used to measure the relationship between probability measures on two RKHSs. Let <math>H_1</math> and <math>H_2</math> be two RKHSs with inner products <math>\langle\cdot,\cdot\rangle_1</math> and <math>\langle\cdot,\cdot\rangle_2</math>. A probability measure <math>\mu_i</math> on <math>H_i</math>, <math>i=1,2</math>, that satisfies

<math>\int_{H_i}\|x\|_i^2\,d\mu_i(x)<\infty</math>

defines an operator <math>R_i</math> in <math>H_i</math> by

<math>\langle R_i u, v\rangle_i = \int_{H_i}\langle x-m_i, u\rangle_i\,\langle x-m_i, v\rangle_i\,d\mu_i(x),</math>

where <math>m_i</math> is the mean of <math>\mu_i</math>. The operator <math>R_i</math> is called a covariance operator; when <math>u</math> and <math>v</math> belong to two different RKHSs, the corresponding operator is called a cross-covariance operator.
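
In a finite-dimensional space the covariance operator reduces to the ordinary covariance matrix, so the defining identity above can be checked numerically. The short Python sketch below is only an illustration (not taken from [1] or [4]): it verifies on a random sample that <math>\langle R u, v\rangle</math> agrees with the sample average of <math>\langle x-m, u\rangle\langle x-m, v\rangle</math>.

<pre>
import numpy as np

rng = np.random.default_rng(1)
# Sample from a measure mu on R^3 with correlated coordinates.
X = rng.normal(size=(5000, 3)) @ rng.normal(size=(3, 3))
m_hat = X.mean(axis=0)                    # empirical mean element m
Xc = X - m_hat
R_hat = Xc.T @ Xc / X.shape[0]            # empirical covariance operator (a 3x3 matrix)

u = rng.normal(size=3)
v = rng.normal(size=3)

lhs = u @ R_hat @ v                       # <R u, v>
rhs = np.mean((Xc @ u) * (Xc @ v))        # average of <x - m, u> <x - m, v>
print(np.isclose(lhs, rhs))               # True (up to floating-point error)
</pre>
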
  
  

== References ==

[1] Gretton, Arthur, et al. "Measuring statistical dependence with Hilbert-Schmidt norms." Algorithmic learning theory. Springer Berlin Heidelberg, 2005.

[2] Fukumizu, Kenji, Francis R. Bach, and Michael I. Jordan. "Kernel dimension reduction in regression." The Annals of Statistics 37.4 (2009): 1871-1905.

[3] Bach, Francis R., and Michael I. Jordan. "Kernel independent component analysis." The Journal of Machine Learning Research 3 (2003): 1-48.

[4] Baker, Charles R. "Joint measures and cross-covariance operators." Transactions of the American Mathematical Society 186 (1973): 273-289.