Regression on Manifold using Kernel Dimension Reduction

===Introduction===
This paper <ref>[http://www.machinelearning.org/proceedings/icml2007/papers/491.pdf] Jens Nilsson, Fei Sha, Michael I. Jordan, Regression on Manifold using Kernel Dimension Reduction, 2007 - cs.utah.edu</ref> introduces a new algorithm for discovering a manifold that best preserves the information relevant to a non-linear regression. The approach introduced by the authors combines the machinery of Kernel Dimension Reduction (KDR) with Laplacian Eigenmaps by optimizing cross-covariance operators in kernel feature space.
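As background for the cross-covariance optimization mentioned above (the notation here follows the standard KDR literature rather than this summary, so treat it as a sketch): KDR measures how much information about the response <math>Y</math> is lost by projecting the covariates <math>X</math> onto a subspace, using the conditional covariance operator in a reproducing kernel Hilbert space,

<math>\Sigma_{YY|X} = \Sigma_{YY} - \Sigma_{YX}\,\Sigma_{XX}^{-1}\,\Sigma_{XY},</math>

where <math>\Sigma_{YX}</math>, <math>\Sigma_{XX}</math>, and <math>\Sigma_{XY}</math> are (cross-)covariance operators on the kernel feature spaces. KDR then seeks the projection of the covariates that minimizes this operator (e.g., its trace), so that the projected covariates retain as much predictive information about <math>Y</math> as possible.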
 
Two main challenges that commonly arise in supervised learning are choosing a manifold to represent the covariate vectors and choosing a function to represent the decision boundary for classification (i.e., the regression surface). Because of these two complexities, most research in supervised learning has focused on learning linear manifolds. The authors draw on methodologies developed in Sufficient Dimension Reduction (SDR) and Kernel Dimension Reduction (KDR) to derive a new algorithm called ''Manifold Kernel Dimension Reduction (mKDR)''.
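The manifold-learning half of mKDR builds on Laplacian Eigenmaps. As a rough illustration of that ingredient only (this is not the authors' mKDR algorithm, and the neighborhood size and kernel bandwidth below are illustrative choices), a minimal Laplacian Eigenmaps embedding can be sketched with NumPy:

```python
import numpy as np

def laplacian_eigenmaps(X, n_components=2, n_neighbors=5, sigma=1.0):
    """Embed rows of X into n_components dimensions via Laplacian Eigenmaps."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    # k-nearest-neighbor graph with heat-kernel edge weights.
    W = np.zeros((n, n))
    nbrs = np.argsort(d2, axis=1)[:, 1:n_neighbors + 1]  # skip self (column 0)
    for i in range(n):
        W[i, nbrs[i]] = np.exp(-d2[i, nbrs[i]] / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)  # symmetrize the graph
    deg = W.sum(axis=1)
    # Normalized graph Laplacian L = I - D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(deg, 1e-12))
    L_sym = np.eye(n) - (d_inv_sqrt[:, None] * W) * d_inv_sqrt[None, :]
    vals, vecs = np.linalg.eigh(L_sym)
    # Drop the trivial eigenvector (eigenvalue ~ 0); rescale back by D^{-1/2}.
    return d_inv_sqrt[:, None] * vecs[:, 1:n_components + 1]
```

The low-dimensional coordinates returned here play the role of the manifold representation that mKDR couples with the KDR objective; in this sketch the eigenproblem is solved densely, whereas practical implementations use sparse solvers.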
 
===Sufficient Dimension Reduction===
(this section will be updated shortly)

Revision as of 19:00, 20 July 2009

An algorithm for finding a new linear map for dimension reduction.


===Kernel Dimension Reduction===

(this section will be updated shortly)

===Manifold Learning===

(this section will be updated shortly)

===Manifold Kernel Dimension Reduction===

(this section will be updated shortly)

===Examples===

(this section will be updated shortly)

===Summary===

(this section will be updated shortly)

===Further Research===