User contributions for Aghodsib
15 January 2025
- 09:45, 15 January 2025 diff hist +2 stat940W25 →Step 4: Conclusion
- 09:44, 15 January 2025 diff hist −316 stat940W25 →Step 4: Conclusion
- 09:43, 15 January 2025 diff hist −24 stat940W25 →Solution
- 09:43, 15 January 2025 diff hist −25 stat940W25 →Question
- 09:43, 15 January 2025 diff hist +100 stat940W25 →Step 4: Conclusion
- 09:41, 15 January 2025 diff hist 0 stat940W25 →Step 4: Conclusion
- 09:39, 15 January 2025 diff hist +3 stat940W25 →Step 4: Conclusion
- 09:38, 15 January 2025 diff hist +5 stat940W25 →Step 4: Conclusion
- 09:37, 15 January 2025 diff hist +6 stat940W25 →Step 4: Conclusion
- 09:36, 15 January 2025 diff hist +107 stat940W25 →Step 4: Conclusion
- 09:33, 15 January 2025 diff hist +34 stat940W25 No edit summary
- 09:31, 15 January 2025 diff hist +540 stat940W25 →Exercise 2.1
13 January 2025
- 15:20, 13 January 2025 diff hist +14 stat940W25 →Exercise 1.1
- 15:20, 13 January 2025 diff hist +2 stat940W25 →Proof
- 15:19, 13 January 2025 diff hist −18 stat940W25 →Exercise 1.1
- 15:18, 13 January 2025 diff hist −91 stat940W25 →Exercise 1.1
- 15:16, 13 January 2025 diff hist +6 stat940W25 →Notes on Exercises
- 15:14, 13 January 2025 diff hist +61 stat940W25 →Exercise 1.1
- 15:13, 13 January 2025 diff hist −1 stat940W25 →Exercise 1.1
- 15:06, 13 January 2025 diff hist +456 stat940W25 No edit summary
12 January 2025
- 22:12, 12 January 2025 diff hist +4,479 stat940W25 No edit summary
- 22:11, 12 January 2025 diff hist +100,752 stat940W25 No edit summary Tag: Manual revert
- 22:07, 12 January 2025 diff hist −9 stat940W25 No edit summary Tag: Reverted
- 22:07, 12 January 2025 diff hist +13 stat940W25 No edit summary Tag: Reverted
- 22:04, 12 January 2025 diff hist +550 stat940W25 No edit summary Tag: Reverted
- 22:01, 12 January 2025 diff hist +4 stat940W25 No edit summary Tag: Reverted
- 22:00, 12 January 2025 diff hist −6 stat940W25 No edit summary Tag: Reverted
- 21:59, 12 January 2025 diff hist +48 stat940W25 No edit summary Tag: Reverted
- 21:58, 12 January 2025 diff hist +8 stat940W25 No edit summary Tag: Reverted
- 21:56, 12 January 2025 diff hist −104,967 stat940W25 Replaced content with " == Lecture 1: Perceptron == <div style="border: 2px solid #0073e6; background-color: #f0f8ff; padding: 10px; margin: 10px 0; border-radius: 5px;"> <b>Exercise:</b> == Question == Prove that the Perceptron Learning Algorithm converges in a finite number of steps if the dataset is linearly separable. '''Hint:''' Assume that the dataset \( \{(\mathbf{x}_i, y_i)\}_{i=1}^N \) is linearly separable, where \( \mathbf{x}_i \in \mathbb{R}^d \) are the input vectors, and \..." Tags: Replaced Reverted
- 21:53, 12 January 2025 diff hist +3,375 stat940W25 →Lecture 1: Perceptron Tag: Reverted
- 21:53, 12 January 2025 diff hist +27 stat940W25 No edit summary Tag: Reverted
- 21:47, 12 January 2025 diff hist +37 stat940W25 No edit summary Tag: Reverted
- 21:41, 12 January 2025 diff hist +20 N Template:ExerciseTemplate Created page with "this is an excersise" current
- 21:41, 12 January 2025 diff hist −33 stat940W25 No edit summary Tag: Reverted
- 21:39, 12 January 2025 diff hist +10 stat940W25 No edit summary Tag: Reverted
- 21:38, 12 January 2025 diff hist +60 stat940W25 →Regularization in Deep Learning Tag: Reverted
- 21:34, 12 January 2025 diff hist +131 stat940W25 No edit summary Tag: Reverted
- 20:52, 12 January 2025 diff hist +58 stat940W25 No edit summary
- 20:51, 12 January 2025 diff hist −61 main Page No edit summary current
- 20:51, 12 January 2025 diff hist +7 main Page No edit summary Tag: Manual revert
- 20:50, 12 January 2025 diff hist +105,229 N stat940W25 Created page with "== Regularization in Deep Learning == === Introduction === Regularization is a fundamental concept in machine learning, particularly in deep learning, where models with a high number of parameters are prone to overfitting. Overfitting occurs when a model learns the noise in the training data rather than the underlying distribution, leading to poor generalization on unseen data. Regularization techniques aim to constrain the model’s capacity, thus preventing overfitti..."
- 20:49, 12 January 2025 diff hist −7 main Page No edit summary Tag: Reverted
- 20:40, 12 January 2025 diff hist 0 main Page No edit summary Tag: Reverted
- 20:40, 12 January 2025 diff hist 0 main Page No edit summary
- 20:39, 12 January 2025 diff hist +10 main Page No edit summary
- 20:35, 12 January 2025 diff hist −8 stat940F24 No edit summary current Tag: Manual revert
- 20:35, 12 January 2025 diff hist +60 main Page No edit summary
11 January 2025
- 19:23, 11 January 2025 diff hist +16 N stat940F25 Created page with "===Motivation===" current
- 13:33, 11 January 2025 diff hist +62 main Page No edit summary