Gradient Episodic Memory for Continual Learning
Presented by
- Yu Xuan Lee
- Tsen Yee Heng
Background and Introduction
Supervised learning consists of a training set [math]\displaystyle{ D_{tr}=(x_i,y_i)^n_{i=1} }[/math], where [math]\displaystyle{ x_i \in \mathcal{X} }[/math] and [math]\displaystyle{ y_i \in \mathcal{Y} }[/math]. Empirical Risk Minimization (ERM) is a common supervised learning method that minimizes a loss function by making multiple passes over the training set:
[math]\displaystyle{ \frac{1}{|D_{tr}|}\textstyle \sum_{(x_i,y_i) \in D_{tr}} \ell (f(x_i),y_i) }[/math]
where [math]\displaystyle{ \ell :\mathcal {Y} \times \mathcal {Y} \to [0, \infty) }[/math] is the loss function.
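As a concrete illustration of this objective, below is a minimal NumPy sketch of ERM for a linear model with squared loss. The synthetic data, model, and learning rate are hypothetical choices for illustration, not taken from the paper.

<pre>
import numpy as np

# Hypothetical training set D_tr = {(x_i, y_i)}: synthetic data for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                # inputs x_i
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=100)  # labels y_i with small noise

# Linear predictor f(x) = w . x with squared loss l(f(x), y) = (f(x) - y)^2.
w = np.zeros(5)
lr = 0.01

# ERM: multiple passes over the training set, descending the empirical risk.
for epoch in range(100):
    grad = 2 * X.T @ (X @ w - y) / len(X)    # gradient of the average loss
    w -= lr * grad

print("empirical risk:", np.mean((X @ w - y) ** 2))
</pre>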
Unlike the usual machine learning setting, humans observe data sequentially, encounter related tasks recurrently, and retain only a limited amount of what they see. The iid assumption underlying ERM therefore does not hold. When ERM is applied in this sequential setting, it suffers from "catastrophic forgetting", the problem of losing previously acquired knowledge upon acquiring new knowledge. To overcome this problem, Gradient Episodic Memory (GEM) is introduced to alleviate forgetting of previously acquired knowledge while solving new problems more efficiently.
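To give a sense of how GEM achieves this, the following is a minimal sketch of its core rule, assuming the gradients have already been computed: the gradient on the current task is kept as-is only if it does not increase the loss on an episodic memory of a past task (checked via a dot product); otherwise it is projected. The single-constraint closed form below is a simplification of the quadratic program the paper solves when several past tasks are involved.

<pre>
import numpy as np

def gem_update_direction(g, g_mem):
    # g: gradient of the loss on the current task.
    # g_mem: gradient of the loss on the episodic memory of a past task.
    if g @ g_mem >= 0:
        return g  # no interference with the past task: keep g unchanged
    # Otherwise, project g onto the half-space {v : v . g_mem >= 0},
    # so the update no longer increases the loss on the memory.
    return g - (g @ g_mem) / (g_mem @ g_mem) * g_mem

# Hypothetical gradients for illustration.
g = np.array([1.0, -2.0])
g_mem = np.array([0.5, 1.0])
print(gem_update_direction(g, g_mem))  # [1.6, -0.8], orthogonal to g_mem
</pre>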
Framework for Continual Learning
Task Descriptor
Task descriptors are integers [math]\displaystyle{ t_i=i \in \mathbb{Z} }[/math] which occur in a collection [math]\displaystyle{ t_1,...,t_n \in \mathcal{T} }[/math]. [math]\displaystyle{ t_i }[/math] could also be a structured object containing a description of how to solve the [math]\displaystyle{ i }[/math]-th task. With more information in [math]\displaystyle{ t }[/math], zero-shot learning becomes possible, since the relation between tasks can be inferred from a new task descriptor alone.
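For instance, with integer descriptors as above, the continuum of data a continual learner observes can be viewed as a stream of triplets [math]\displaystyle{ (x_i, t_i, y_i) }[/math]. The following sketch of this format is hypothetical; the toy examples and task count are illustrative only.

<pre>
# Hypothetical continuum of data: triplets (x, t, y) observed one at a time,
# where the integer task descriptor t identifies which task the example
# belongs to (here, two toy tasks with 2-dimensional inputs).
continuum = [
    ([0.2, 0.7], 1, 0),   # example from task t = 1
    ([0.9, 0.1], 1, 1),
    ([0.4, 0.4], 2, 0),   # the stream switches to task t = 2
    ([0.8, 0.6], 2, 1),
]

for x, t, y in continuum:
    # A continual learner sees each triplet once, in order.
    print(f"task {t}: input {x} -> label {y}")
</pre>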