

Description
The book provides timely coverage of the paradigm of knowledge distillation, an efficient way of performing model compression. Knowledge distillation is positioned in the general setting of transfer learning, which effectively learns a lightweight student model from a lar...