# Measuring Statistical Dependence with the Hilbert-Schmidt Norm

"Hilbert-Schmist Norm of the Cross-Covariance operator" is proposed as an independence criterion in reproducing kernel Hilbert spaces (RKHSs). The measure is refereed to as Hilbert-Schmidt Independence Criterion, or HSIC. An empirical estimate of this measure is introduced which may be well used in many practical application such as independent Component Analysis (ICA), Maximum Variance Unfolding (MVU), feature extraction, feature selection, ... .

## RKHS Theory

Let $\mathcal{F}$ be a Hilbert space of functions from $\mathcal{X}$ to $\mathbb{R}$. We assume $\mathcal{F}$ is a reproducing kernel Hilbert space, i.e., for all $x\in \mathcal{X}$, the corresponding Dirac evaluation operator $\delta_x:\mathcal{F} \rightarrow \mathbb{R}$ is a bounded (or, equivalently, continuous) linear functional. We denote the kernel of this operator by $k(x,x')=\langle \phi(x),\phi(x') \rangle_{\mathcal{F}}$, where $k:\mathcal{X}\times \mathcal{X}\rightarrow \mathbb{R}$ is a positive definite function and $\,\!\phi$ is the feature map of $\mathcal{F}$. Similarly, we consider a second RKHS $\mathcal{G}$ with domain $\mathcal{Y}$, kernel $l(\cdot,\cdot)$ and feature map $\,\!\psi$. We assume both $\mathcal{F}$ and $\mathcal{G}$ are separable, i.e., each has a complete orthonormal basis.

### Hilbert-Schmidt Norm

For a linear operator $C:\mathcal{G}\rightarrow \mathcal{F}$, provided the sum converges, the Hilbert-Schmidt (HS) norm is defined as:
$\|C\|^2_{HS}:=\sum_{i,j}\langle Cv_i,u_j \rangle^2_{\mathcal{F}}$
where $\,\!u_j$ and $\,\!v_i$ are orthonormal bases of $\mathcal{F}$ and $\mathcal{G}$, respectively. It is easy to see that the Frobenius norm on matrices is a special case of this norm.
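As a concrete finite-dimensional sketch (not part of the original text): summing the squared inner products $\langle Cv_i,u_j\rangle^2$ over the standard bases reproduces the squared Frobenius norm of a matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
C = rng.standard_normal((3, 4))  # a linear operator from R^4 to R^3

# Sum <C v_i, u_j>^2 over the standard (orthonormal) bases of R^4 and R^3
hs_sq = sum(np.dot(C @ v, u) ** 2 for v in np.eye(4) for u in np.eye(3))

# ...which coincides with the squared Frobenius norm of C
frob_sq = np.linalg.norm(C, 'fro') ** 2
```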

### Hilbert-Schmidt Operator

A Hilbert-Schmidt Operator is a linear operator for which the Hilbert-Schmidt norm (introduced above) exists.

### Tensor Product Operator

We may employ any $f\in \mathcal{F}$ and $g\in \mathcal{G}$ to define a tensor product operator $f\otimes g:\mathcal{G}\rightarrow\mathcal{F}$ as follows:
$(f\otimes g)h:=f\langle g,h\rangle_{\mathcal{G}} \quad$ for all $h\in\mathcal{G}$
Using the definition of the HS norm introduced above, one can readily show that the norm of $f\otimes g$ equals
$\|f\otimes g\|^2_{HS}=\|f\|^2_{\mathcal{F}}\; \|g\|^2_{\mathcal{G}}$
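A quick finite-dimensional sketch of both identities (hypothetical setup, with vectors standing in for RKHS elements): the tensor product acts as an outer product, and its HS (Frobenius) norm factorizes into the product of the two squared norms.

```python
import numpy as np

rng = np.random.default_rng(1)
f = rng.standard_normal(3)
g = rng.standard_normal(4)
h = rng.standard_normal(4)

T = np.outer(f, g)             # f ⊗ g represented as the outer product f g^T
lhs = T @ h                    # (f ⊗ g) h
rhs = f * np.dot(g, h)         # f <g, h>

hs_sq = np.linalg.norm(T, 'fro') ** 2   # ||f ⊗ g||_HS^2
prod = np.dot(f, f) * np.dot(g, g)      # ||f||^2 ||g||^2
```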

## Cross-Covariance Operator

### Mean

The mean elements of $\mathcal{F}$ and $\mathcal{G}$ are defined as those elements of these spaces for which

$\langle\mu_x,f \rangle_{\mathcal{F}}=\mathbf{E}_x[\langle\phi(x),f \rangle_{\mathcal{F}}]=\mathbf{E}_x[f(x)]$
$\langle\mu_y,g \rangle_{\mathcal{G}}=\mathbf{E}_y[\langle\psi(y),g \rangle_{\mathcal{G}}]=\mathbf{E}_y[g(y)]$
Based on this, $\|\mu_x\|^2_{\mathcal{F}}$ may be calculated by applying expectation twice as follows:
$\|\mu_x\|^2_{\mathcal{F}}=\mathbf{E}_{x,x'}[\langle \phi(x),\phi(x')\rangle_{\mathcal{F}}]=\mathbf{E}_{x,x'}[k(x,x')]$
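An empirical sketch of this identity (the Gaussian RBF kernel and sample data are illustrative assumptions): $\|\mu_x\|^2_{\mathcal{F}}$ is estimated by the double average of the kernel matrix over the sample.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian RBF kernel matrix: k(x, x') = exp(-||x - x'||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

rng = np.random.default_rng(2)
X = rng.standard_normal((500, 2))  # 500 samples of x

# ||mu_x||^2 = E_{x,x'}[k(x, x')], estimated by averaging all kernel entries
mu_norm_sq = rbf_kernel(X).mean()
```

Since the RBF kernel takes values in $(0,1]$, the estimate must lie in that interval as well.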

### Cross-covariance Operator

Now we are in a position to define the cross-covariance operator as follows:
$C_{xy}:=\underbrace{\mathbf{E}_{x,y}[\phi(x)\otimes\psi(y)]}_{:=\tilde{C}_{xy}}-\underbrace{\mu_x\otimes\mu_y}_{:=M_{xy}}$
We will use $\tilde{C}_{xy}$ and $\,\!M_{xy}$ as the basis of our measure of dependence.

## Hilbert-Schmidt Independence Criterion

Definition (HSIC). Given separable RKHSs $\mathcal{F}$ and $\mathcal{G}$ and a joint probability distribution $p_{xy}$, we define the Hilbert-Schmidt Independence Criterion (HSIC) as the squared HS norm of the associated cross-covariance operator $C_{xy}$:
$\text{HSIC}(p_{xy},\mathcal{F},\mathcal{G}):=\|C_{xy}\|^2_{HS}$

### HSIC in terms of kernels

To compute HSIC we need to express it in terms of kernel functions. It can be shown that this is achieved via the following identity:
$\text{HSIC}(p_{xy},\mathcal{F},\mathcal{G})=\mathbf{E}_{x,x',y,y'}[k(x,x')l(y,y')]+\mathbf{E}_{x,x'}[k(x,x')]\mathbf{E}_{y,y'}[l(y,y')]-2\mathbf{E}_{x,y}[\mathbf{E}_{x'}[k(x,x')]\mathbf{E}_{y'}[l(y,y')]]$

Here $(x,y)$ and $(x',y')$ denote independent pairs drawn from $p_{xy}$, so each term is a population expectation of kernel values.
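This identity has a direct finite-sample counterpart: expanding the trace formula used in the empirical criterion below gives $\textbf{tr}(KHLH)=\textbf{tr}(KL)-\frac{2}{m}\mathbf{1}^\top KL\mathbf{1}+\frac{1}{m^2}(\mathbf{1}^\top K\mathbf{1})(\mathbf{1}^\top L\mathbf{1})$, whose three terms mirror the three expectations above. A numerical sketch (random symmetric PSD matrices standing in for kernel matrices):

```python
import numpy as np

rng = np.random.default_rng(4)
m = 50
A = rng.standard_normal((m, m)); K = A @ A.T  # symmetric PSD stand-in for K
B = rng.standard_normal((m, m)); L = B @ B.T  # symmetric PSD stand-in for L

H = np.eye(m) - np.ones((m, m)) / m           # centering matrix
one = np.ones(m)

lhs = np.trace(K @ H @ L @ H)
rhs = (np.trace(K @ L)                                   # ~ E[k l]
       - (2.0 / m) * (one @ K @ L @ one)                 # ~ -2 E[E[k] E[l]]
       + (one @ K @ one) * (one @ L @ one) / m ** 2)     # ~ E[k] E[l]
```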

## Empirical Criterion

### Definition (Empirical HSIC)

Let $Z:=\{(x_1,y_1),\ldots,(x_m,y_m)\}\subseteq \mathcal{X}\times\mathcal{Y}$ be a series of $\,\!m$ independent observations drawn from $p_{xy}$. An estimator of HSIC is given by
$\text{HSIC}(Z,\mathcal{F},\mathcal{G}):=(m-1)^{-2}\textbf{tr}(KHLH)$
where $H, K, L\in\mathbb{R}^{m\times m}, K_{ij}:=k(x_i,x_j), L_{ij}:=l(y_i,y_j) \,\, \text{and} \, \, H_{ij}:=\delta_{ij}-m^{-1}$.
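A minimal sketch of this estimator (assuming Gaussian RBF kernels for both spaces; kernel choice, bandwidth, and the test data are illustrative assumptions):

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian RBF kernel matrix on the rows of X
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def empirical_hsic(X, Y, sigma=1.0):
    # HSIC(Z, F, G) = (m - 1)^{-2} tr(K H L H)
    m = X.shape[0]
    H = np.eye(m) - np.ones((m, m)) / m  # H_ij = delta_ij - 1/m
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    return np.trace(K @ H @ L @ H) / (m - 1) ** 2

rng = np.random.default_rng(3)
x = rng.standard_normal((200, 1))
hsic_dep = empirical_hsic(x, x ** 2)                         # y depends on x
hsic_ind = empirical_hsic(x, rng.standard_normal((200, 1)))  # y independent of x
```

On such data the estimate for the dependent pair should be markedly larger than for the independent pair, whose value decays at rate $O(m^{-1})$.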

### Bias of Estimator

It may be shown that the bias of the above empirical estimate is of order $\,\!O(m^{-1})$:
Theorem: Let $\mathbf{E}_Z$ denote the expectation taken over $\,\!m$ independent copies $\,\!(x_i,y_i)$ drawn from $p_{\,\!xy}$. Then
$\text{HSIC}(p_{xy},\mathcal{F},\mathcal{G})=\mathbf{E}_Z[\text{HSIC}(Z,\mathcal{F},\mathcal{G})]+O(m^{-1})$.

## Bound on Empirical HSIC

Theorem: Assume that $\,\!k$ and $\,\!l$ are bounded almost everywhere by 1, and are non-negative. Then for $\,\!m\gt 1$ and all $\,\!\delta\gt 0$, with probability at least $\,\!1-\delta$, for all $\,\!p_{xy}$:
$|\text{HSIC}(p_{xy},\mathcal{F},\mathcal{G})-\text{HSIC}(Z,\mathcal{F},\mathcal{G})|\leq\sqrt{\frac{\log(6/\delta)}{\alpha^2 m}}+\frac{C}{m}$
where $\,\!\alpha\gt 0.24$ and $\,\!C$ are constants.

## Independence Test using HSIC

Theorem: Denote by $\mathcal{F}$, $\mathcal{G}$ RKHSs with universal kernels $\,\!k$, $\,\!l$ on the compact domains $\mathcal{X}$ and $\mathcal{Y}$ respectively. We assume without loss of generality that $\|f\|_{\infty}\leq 1$ and $\|g\|_{\infty}\leq 1$ for all $f \in \mathcal{F}$ and $g \in \mathcal{G}$. Then $\|C_{xy}\|_{HS} =0$ if and only if $\,\!x$ and $\,\!y$ are independent.
Based on this result, to maximize the dependence between two random variables we should maximize the empirical estimate, i.e., $\,\!\textbf{tr}(KHLH)$.
It can be shown that if at least one of the kernel matrices $\,\!K$ or $\,\!L$ is already centered, we may drop the centering matrices $\,\!H$ and simply use the objective $\,\!\textbf{tr}(KL)$.
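A numerical sketch of this simplification (random PSD matrices as hypothetical kernel matrices): if $K$ is replaced by its centered version $HKH$, the two objectives coincide, because $H$ is idempotent.

```python
import numpy as np

rng = np.random.default_rng(5)
m = 40
A = rng.standard_normal((m, m)); K = A @ A.T   # stand-in kernel matrix K
B = rng.standard_normal((m, m)); L = B @ B.T   # stand-in kernel matrix L
H = np.eye(m) - np.ones((m, m)) / m            # centering matrix; H @ H == H

Kc = H @ K @ H                                 # K already centered: H Kc H == Kc
with_H = np.trace(Kc @ H @ L @ H)              # tr(K H L H) with centered K
without_H = np.trace(Kc @ L)                   # tr(K L)
```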