MuPT: A Generative Symbolic Music Pretrained Transformer

By Xingwei Qu and others at the University of Manchester and the University of Waterloo
In this paper, we explore the application of Large Language Models (LLMs) to the pre-training of music. While the prevalent use of MIDI in music modeling is well-established, our findings suggest that LLMs are inherently more compatible with ABC Notation, which aligns more closely with their design and strengths, thereby...
September 10, 2024
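To make the abstract's ABC-vs-MIDI point concrete, here is a minimal sketch (not the paper's code; the tune fragment, the character-level tokenization, and all variable names are illustrative assumptions). Because an ABC score is plain text, the ordinary text-tokenization pipeline used for LLM pretraining applies to it directly, whereas binary MIDI event streams would first need a custom decoding step and vocabulary.

```python
# Illustrative sketch only: shows that an ABC score is already plain text,
# so standard LLM-style text tokenization applies without any MIDI decoding.

# A short tune fragment written in ABC notation (hypothetical example).
abc_tune = """X:1
T:Example Tune
M:4/4
K:C
C D E F | G A B c | c B A G | F E D C |"""

# Character-level tokenization, the simplest stand-in for an LLM tokenizer.
tokens = list(abc_tune)

# Induce a tiny vocabulary from the text and map tokens to integer ids,
# exactly as one would for any other text corpus.
vocab = sorted(set(tokens))
token_to_id = {tok: i for i, tok in enumerate(vocab)}
ids = [token_to_id[tok] for tok in tokens]

print(f"{len(tokens)} tokens, vocabulary size {len(vocab)}")
print(ids[:16])  # first few token ids fed to a language model
```

In practice a subword tokenizer (e.g. BPE) would replace the character-level step, but the point stands: no format conversion separates the score from the model's input.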