Hazards from Increasingly Accessible Fine-Tuning of Downloadable Foundation Models

By Alan Chan and others at Université de Montréal and the University of Cambridge
Public release of the weights of pretrained foundation models, otherwise known as downloadable access \citep{solaiman_gradient_2023}, enables fine-tuning without the prohibitive expense of pretraining. Our work argues that increasingly accessible fine-tuning of downloadable models may increase hazards. First, we highlight research to improve the accessibility of fine-tuning. We split our discussion...
December 22, 2023