
Knowledge Distillation via Weighted Ensemble of Teaching Assistants

By Durga Prasad Ganta and others
Knowledge distillation in machine learning is the process of transferring knowledge from a large model, called the teacher, to a smaller model, called the student. It is one of the techniques for compressing a large network (the teacher) into a smaller network (the student) that can be deployed on small devices...
June 23, 2022
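To make the idea concrete, below is a minimal PyTorch sketch of the standard soft-target distillation loss, together with a hypothetical helper that blends the logits of several teaching assistants into a single weighted soft target. The function names, the temperature and alpha defaults, and the accuracy-based weighting are illustrative assumptions, not the paper's exact method.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Standard Hinton-style distillation loss: KL divergence between
    temperature-softened teacher and student distributions, blended with
    the usual hard-label cross-entropy."""
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale by T^2 so soft-target gradients keep their magnitude
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1 - alpha) * hard_loss

def weighted_ta_logits(ta_logits_list, weights):
    """Combine the logits of several teaching assistants into one soft
    target via a weighted average. How the weights are chosen (e.g. by
    each TA's validation accuracy) is an assumption in this sketch."""
    w = torch.tensor(weights, dtype=torch.float32)
    w = w / w.sum()  # normalize so the weights form a convex combination
    return sum(wi * logits for wi, logits in zip(w, ta_logits_list))
```

In use, the student would be trained against `weighted_ta_logits(...)` in place of a single teacher's logits inside `distillation_loss`, which is the sense in which the ensemble of teaching assistants stands in for the teacher.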