
Presented by

Yan Yu Chen, Qisi Deng, Hengxin Li, Bochao Zhang

Introduction

In the past two decades, support vector machines (SVM) [1] and their variants [2]–[4] have been used extensively in classification applications due to their surprising classification capability. Least-squares support vector machine (LS-SVM) and proximal support vector machine (PSVM) have been widely used in binary classification applications. The conventional LS-SVM and PSVM cannot be applied directly to regression or multiclass classification, although variants of LS-SVM and PSVM have been proposed to handle such cases.

Motivation

There are several issues with backpropagation (BP) learning algorithms:

(1) When the learning rate η is too small, the learning algorithm converges very slowly; when η is too large, the algorithm becomes unstable and diverges (see the sketch after this list).

(2) Another peculiarity of the error surface that impacts the performance of the BP learning algorithm is the presence of local minima [6]. It is undesirable for the learning algorithm to stop at a local minimum that lies far above the global minimum.

(3) A neural network may be over-trained by BP algorithms and thus obtain worse generalization performance. Validation and suitable stopping criteria are therefore required in the cost-function minimization procedure.

(4) Gradient-based learning is very time-consuming in most applications.
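
To make issue (1) concrete, here is a minimal Python sketch (not from the paper) of gradient descent on the toy objective f(w) = w², whose gradient is 2w. The step size eta stands in for the BP learning rate η, and the function name gradient_descent is chosen here purely for illustration.

    # Toy illustration of issue (1): gradient descent on f(w) = w^2, gradient f'(w) = 2w.
    # The step size eta plays the role of the BP learning rate.
    def gradient_descent(eta, w0=1.0, steps=20):
        w = w0
        for _ in range(steps):
            w = w - eta * 2 * w        # w_{t+1} = w_t - eta * f'(w_t)
        return w

    w_small = gradient_descent(eta=0.01)   # converges, but very slowly (|w| is still about 0.67)
    w_large = gradient_descent(eta=1.10)   # |1 - 2*eta| > 1, so the iterates diverge
    print(abs(w_small), abs(w_large))

With eta = 0.01 the iterate has barely moved toward the minimum after 20 steps, while with eta = 1.10 it grows without bound, mirroring the slow-convergence/divergence trade-off described above.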

Previous Work

Model Architecture

Conclusion

Critiques

References

  • [1] G.-B. Huang, Q.-Y. Zhu, and C.-K. Siew, “Extreme learning machine: A new learning scheme of feedforward neural networks,” in Proc. IJCNN, Budapest, Hungary, Jul. 25–29, 2004, vol. 2, pp. 985–990.