RoBERTa


Presented by

Danial Maleki

Introduction

Self-training methods in the NLP (Natural Language Processing) domain, such as ELMo[1], GPT[2], BERT[3], XLM[4], and XLNet[5], have shown significant improvements, but determining which aspects of these methods contribute the most to their performance is challenging. RoBERTa is a replication study of BERT pretraining that investigates the effects of hyperparameter tuning and training set size.
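As a concrete illustration (not part of the original paper), a pretrained RoBERTa model can be loaded and run through the Hugging Face transformers library; the checkpoint name "roberta-base" below is an assumption about which released checkpoint one might use.

# Minimal sketch: load a pretrained RoBERTa checkpoint and encode a sentence.
# Assumes the Hugging Face transformers library is installed and the
# "roberta-base" checkpoint is the model of interest.
from transformers import RobertaTokenizer, RobertaModel

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaModel.from_pretrained("roberta-base")

# Tokenize an example sentence and obtain contextual representations.
inputs = tokenizer("RoBERTa is a replication study of BERT pretraining.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)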

Background