Kernel Dimension Reduction in Regression
Introduction
The problem of sufficient dimension reduction (SDR) for regression is to find a subspace of the covariate space such that the response is conditionally independent of the covariates given the projection onto that subspace. Most existing methods characterize this conditional independence through the conditional mean of the covariates, an approach that is valid only when the distribution of X is elliptical. This paper shows that the conditional independence assertion can instead be characterized in terms of conditional covariance operators on reproducing kernel Hilbert spaces (RKHS). It is among the first papers to measure independence in an RKHS; other related independence measures include RHIC and dCor.
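To make the idea concrete, below is a minimal numerical sketch (Python/NumPy) of a trace-type criterion built from centered kernel Gram matrices, of the general form used in kernel dimension reduction: it compares a projection direction through which Y actually depends on X with an irrelevant direction. The Gaussian kernel, the regularization value, the function names, and the toy data are illustrative assumptions and are not taken from the original article.

import numpy as np

def centered_gram(X, sigma):
    """Centered Gaussian (RBF) Gram matrix for the rows of X."""
    sq = np.sum(X**2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T   # pairwise squared distances
    K = np.exp(-D / (2.0 * sigma**2))
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n             # centering matrix
    return H @ K @ H

def kdr_objective(B, X, Y, sigma_z=1.0, sigma_y=1.0, eps=1e-3):
    """Trace criterion Tr[G_Y (G_Z + n*eps*I)^{-1}] with Z = X @ B.

    Smaller values indicate that Y depends on X (approximately) only
    through Z, i.e. the columns of B span a candidate SDR subspace.
    """
    n = X.shape[0]
    Gz = centered_gram(X @ B, sigma_z)
    Gy = centered_gram(Y, sigma_y)
    return np.trace(np.linalg.solve(Gz + n * eps * np.eye(n), Gy))

# Toy example: Y depends on X only through its first coordinate.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
Y = np.sin(X[:, :1]) + 0.1 * rng.normal(size=(100, 1))
B_good = np.array([[1.0], [0.0], [0.0]])   # informative direction
B_bad = np.array([[0.0], [0.0], [1.0]])    # irrelevant direction
print(kdr_objective(B_good, X, Y), kdr_objective(B_bad, X, Y))

In the kernel dimension reduction framework an objective of this general form is minimized over projection matrices B to estimate the dimension-reduction subspace; the two fixed directions above merely illustrate that the criterion is smaller for the informative one.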