Large-Scale Supervised Sparse Principal Component Analysis

1. Introduction

This summary first reviews the drawbacks of most existing techniques and then outlines the contribution of this paper.

1 Drawbacks of Existing Techniques

Existing techniques include ad-hoc methods (e.g., factor rotation techniques and simple thresholding), greedy algorithms, SCoTLASS, the regularized SVD method, SPCA, and the generalized power method. These methods are based on non-convex optimization, so they do not guarantee a global optimum.
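
To illustrate why simple thresholding is considered ad hoc, here is a minimal sketch (not from the paper; the function name and the choice of k are hypothetical) that computes the leading principal component of the sample covariance and keeps only its k largest-magnitude loadings:

<pre>
import numpy as np

def thresholded_pc(X, k):
    """Ad-hoc sparse PCA baseline: take the leading principal component
    of the sample covariance, keep only the k largest loadings in
    absolute value, and renormalize."""
    # Sample covariance of the centered data (rows = samples, columns = features).
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / X.shape[0]
    # Leading eigenvector of the covariance matrix (eigh sorts eigenvalues ascending).
    eigvals, eigvecs = np.linalg.eigh(cov)
    v = eigvecs[:, -1]
    # Zero out all but the k largest-magnitude entries, then renormalize.
    keep = np.argsort(np.abs(v))[-k:]
    sparse_v = np.zeros_like(v)
    sparse_v[keep] = v[keep]
    return sparse_v / np.linalg.norm(sparse_v)

# Example: a sparse loading vector with 5 non-zero entries from random data.
rng = np.random.default_rng(0)
print(thresholded_pc(rng.standard_normal((200, 20)), k=5))
</pre>

Because the thresholding step ignores the covariance structure, the resulting vector can explain far less variance than a sparse vector obtained by solving an explicit optimization problem, which motivates the convex relaxation discussed next.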

A semidefinite relaxation method called DSPCA guarantees global convergence and performs better than the algorithms above; however, it is computationally expensive.
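
For reference, the semidefinite program that DSPCA solves can be written in its standard penalized form (the notation below is added for clarity and is not taken from this summary):

<math>
\max_{X \succeq 0,\; \mathbf{Tr}(X) = 1} \; \mathbf{Tr}(\Sigma X) - \lambda \|X\|_1,
</math>

where <math>\Sigma</math> is the sample covariance matrix, <math>\lambda \ge 0</math> controls the level of sparsity, and <math>\|X\|_1 = \sum_{ij} |X_{ij}|</math>; a sparse loading vector is then recovered from the dominant eigenvector of the optimal <math>X</math>.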

2 Contribution of This Paper

This paper solves DSPCA in a computationally cheaper way, making it well suited to large-scale data sets. It applies a block coordinate ascent algorithm with computational complexity