RoBERTa, short for "Robustly Optimized BERT Pretraining Approach," is an improved version of BERT (Bidirectional Encoder Representations from Transformers) that brought a major breakthrough in natural language processing (NLP); it was developed by Facebook AI. The masked language model task is the key to BERT and RoBERTa, and the original RoBERTa article explains it in Section 4.1.
Although BERT preceded RoBERTa, this observation can be understood as somewhat applicable to RoBERTa as well, since the two models are very similar.
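As a concrete illustration of the masked language model task, here is a minimal sketch using the Hugging Face transformers library (an assumption for illustration; the original papers do not use this API). During pretraining, a fraction of input tokens is replaced by a special mask token and the model is trained to recover them; the same head can then predict a plausible token for any masked position at inference time.

```python
# A minimal sketch of masked-language-model inference with a pretrained
# RoBERTa model via Hugging Face transformers. The model name and example
# sentence are illustrative, not from the original papers.
import torch
from transformers import RobertaTokenizer, RobertaForMaskedLM

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForMaskedLM.from_pretrained("roberta-base")

# RoBERTa uses "<mask>" as its mask token.
text = "The masked language model task is the key to <mask> and RoBERTa."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the masked position and take the highest-scoring token for it.
mask_index = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
predicted_id = logits[0, mask_index].argmax(dim=-1)
print(tokenizer.decode(predicted_id))  # plausibly predicts "BERT"
```

Note that this shows only the fill-in-the-blank objective; actual pretraining randomly selects about 15% of input tokens for masking and computes a cross-entropy loss over the original tokens at those positions, with RoBERTa re-sampling the mask pattern on every pass (dynamic masking) rather than fixing it in advance.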