mGPT: Few-Shot Learners Go Multilingual

By Oleh Shliazhko and others at Higher School of Economics and University of Oslo
Recent studies report that autoregressive language models can successfully solve many NLP tasks via zero- and few-shot learning paradigms, opening up new possibilities for using pre-trained language models. This paper introduces two autoregressive GPT-like models with 1.3 billion and 13 billion parameters trained on 60 languages from 25...
October 12, 2023
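As a minimal sketch of the zero-/few-shot paradigm the abstract refers to, the snippet below assembles a few-shot prompt from labeled demonstrations for an autoregressive model to continue. The task, labels, and template are illustrative assumptions, not the paper's exact evaluation setup.

```python
def build_few_shot_prompt(examples, query, instruction="Classify the sentiment."):
    """Render demonstrations as input/label pairs; an autoregressive LM is
    expected to continue the pattern and emit a label for the final query.
    (Illustrative format, not the template used in the mGPT evaluation.)"""
    lines = [instruction]
    for text, label in examples:
        lines.append(f"Text: {text}\nLabel: {label}")
    # The query ends with a bare "Label:" so the model's completion is the prediction.
    lines.append(f"Text: {query}\nLabel:")
    return "\n\n".join(lines)

demos = [
    ("I loved this film.", "positive"),
    ("Terrible service.", "negative"),
]
prompt = build_few_shot_prompt(demos, "The soup was delicious.")
print(prompt)
```

In a zero-shot setting the `examples` list is simply empty, leaving only the instruction and the query for the model to complete.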