User contributions for Njmurdoc
26 November 2009
- 16:43, 26 November 2009 +209 stat841 →Kernel Trick
- 16:40, 26 November 2009 +164 N File:Sep2.png After a simple transformation to Euclidean distance from the origin, it is possible to put a hyperplane through the data that separates it perfectly into two classes. current
- 16:35, 26 November 2009 +74 N File:Unsep.png A set of points in 2d which no hyperplane can separate with good accuracy. current
- 16:26, 26 November 2009 +164 stat841 →Kernel Trick
- 16:20, 26 November 2009 +66 N File:Point 2d.png In 2 dimensions, a point is trapped by a rectangle (shown in red). current
- 16:19, 26 November 2009 +72 N File:Point 3d.png A point moving in 3d space can easily escape a 2d prison (shown in red). current
- 12:22, 26 November 2009 +87 stat841 →Using support vectors
- 12:19, 26 November 2009 +6 m stat841 →Quadprog example
- 12:15, 26 November 2009 +2 m stat841 →Using support vectors
- 12:12, 26 November 2009 0 m stat841 →Examining K.K.T. conditions
- 12:07, 26 November 2009 +118 stat841 →Does SVM find a global minimum?
- 11:53, 26 November 2009 +18 m stat841 →Applying KKT conditions
- 11:51, 26 November 2009 +440 stat841 →Applying KKT conditions
- 11:34, 26 November 2009 +131 stat841 →SVM: non-separable case
21 November 2009
- 15:34, 21 November 2009 +292 stat841 →Minimizing MSE
- 15:21, 21 November 2009 +245 stat841 →Stein's Lemma
17 November 2009
- 23:24, 17 November 2009 +77 stat841 →Writing Lagrangian Form of Support Vector Machine
- 23:23, 17 November 2009 −13 m stat841 →Optimizing The Support Vector Machine - November 16th, 2009: changing title
- 23:22, 17 November 2009 +327 stat841 →Writing Lagrangian Form of Support Vector Machine
- 23:04, 17 November 2009 +170 stat841 →Writing Lagrangian Form of Support Vector Machine
- 22:59, 17 November 2009 +457 stat841 →Maximizing the Support Vector Machine
- 11:45, 17 November 2009 0 m stat841 →Regularization for Neural Network — Weight Decay: w -> u fix
- 00:08, 17 November 2009 +7 m stat841 →Maximizing the Support Vector Machine
- 00:07, 17 November 2009 +7 m stat841 →Maximizing the Support Vector Machine: missing </math>
- 00:06, 17 November 2009 +2 m stat841 →Optimizing The Support Vector Machine - November 16th, 2009 (in progress)
- 00:00, 17 November 2009 +2,093 stat841 →Optimizing The Support Vector Machine - November 16th, 2009 (in progress)
16 November 2009
- 23:34, 16 November 2009 +1,217 stat841 Adding Nov. 16th, just starting
- 23:15, 16 November 2009 +96 stat841 →RBF Network for classification (A probabilistic point of view)
- 23:09, 16 November 2009 −6 stat841 →RBF Network for classification (A probabilistic point of view): fixed y_k|x from x|y_k
13 November 2009
- 14:26, 13 November 2009 0 m stat841 →Introduction
- 14:25, 13 November 2009 +8 m stat841 →Introduction: readability
- 14:24, 13 November 2009 +243 stat841 →Introduction
- 14:10, 13 November 2009 +231 stat841 →Introduction
11 November 2009
- 12:43, 11 November 2009 +233 m stat841 →K-fold Cross-validation
8 November 2009
- 16:29, 8 November 2009 +1 stat841 →Example of under and overfitting in R: fixed example values to match image.
- 16:27, 8 November 2009 +1 m stat841 →Example of under and overfitting in R: wrong image
- 16:26, 8 November 2009 +2,759 stat841 →Complexity Control - Nov 2, 2009
- 16:24, 8 November 2009 +292 N File:Curvefitting-rex2.png Fitting degree 1, 2, and 10 polynomials, and trigonometric functions, to a randomly generated quadratic sample. Black points represent training data, teal represent test data. Y values follow the same function, but x values of training and test data have means 0 and 1 to help separate them. current
- 16:01, 8 November 2009 +253 N File:Curvefitting-rex.png Fitting degree 1, 2, and 10 polynomials to a randomly generated sample. Black points represent training data, teal represent test data. Y values follow the same function, but x values of training and test data have means 0 and 1 to help separate them. current
- 14:58, 8 November 2009 +42 stat841 →Complexity Control October 30, 2009
3 November 2009
- 11:29, 3 November 2009 +20 m stat841 →Back-propagation
- 11:23, 3 November 2009 +692 stat841 →Back-propagation: weight initialization
31 October 2009
- 01:07, 31 October 2009 −1 m stat841 →Neural Networks (NN) - October 28, 2009: spelling
12 October 2009
- 18:17, 12 October 2009 +285 stat841 →Example in R
- 18:15, 12 October 2009 +244 stat841 →Example in R: adding comments to plot section.
9 October 2009
- 15:18, 9 October 2009 −1 m stat841 →Example in R: typo.
- 15:14, 9 October 2009 +22 stat841 →Introduction to Fisher's Discriminant Analysis - October 7, 2009: made a section title
- 15:13, 9 October 2009 +1,444 stat841 →Introduction to Fisher's Discriminant Analysis - October 7, 2009
- 14:45, 9 October 2009 +101 N File:Pca-fda1 low.png Multivariate normal distributions centred at (1,1) and (5,3), showing PCA and FDA primary dimensions. current
- 13:42, 9 October 2009 −1 statf09841Scribe No edit summary