Scaling Language Model Size in Cross-Device Federated Learning

By Jae Hun Ro and others
Most studies in cross-device federated learning focus on small models, due to the server-client communication and on-device computation bottlenecks. In this work, we leverage various techniques for mitigating these bottlenecks to train larger language models in cross-device federated learning. With systematic applications of partial model training, quantization, efficient transfer learning, ...
June 24, 2022
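As a rough illustration of one of the communication-reduction techniques the abstract names, the sketch below shows uniform 8-bit quantization of a client's model update before upload. The function names and scheme are hypothetical assumptions for illustration, not the paper's exact method.

```python
# Hypothetical sketch: uniformly quantize a client update to 8-bit
# integer levels before sending it to the server, then dequantize
# server-side. This is an illustrative scheme, not the paper's method.

def quantize(update, bits=8):
    """Map each float to an integer level in [0, 2**bits - 1]."""
    lo, hi = min(update), max(update)
    # Avoid division by zero when all values are identical.
    scale = (hi - lo) / (2 ** bits - 1) or 1.0
    levels = [round((x - lo) / scale) for x in update]
    return levels, lo, scale

def dequantize(levels, lo, scale):
    """Reconstruct approximate floats from integer levels."""
    return [lo + q * scale for q in levels]

update = [0.03, -0.12, 0.08, 0.0]          # toy model update
levels, lo, scale = quantize(update)        # 8-bit payload + metadata
restored = dequantize(levels, lo, scale)    # server-side reconstruction
```

Each restored value lies within half a quantization step of the original, so the payload shrinks from 32-bit floats to 8-bit levels at a bounded accuracy cost.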