Seminar by Cédric Maron: "Knowledge distillation based on monoclass teachers for edge infrastructure"
Thursday, April 4, 2024
at 11:00 AM
Room F021b
Building F
Laboratoire Hubert Curien
18, rue du Professeur Benoît Lauras
42000 Saint-Etienne
"Knowledge distillation based on monoclass teachers for edge infrastructure"
Abstract
With the growing interest in neural network compression, several methods aiming to improve network accuracy have emerged. Data augmentation enhances model robustness and generalization by increasing the diversity of the training dataset. Knowledge distillation transfers knowledge from a teacher network to a student network. Distillation is generally carried out on high-end GPUs, because teacher architectures are often too heavy to run on the limited computing resources available at the edge. This paper proposes a new distillation method adapted to an edge computing infrastructure: by employing multiple small monoclass teachers, the proposed distillation method becomes applicable even within the constrained computing resources of the edge. The proposed method is evaluated against classical knowledge distillation based on a larger teacher network, using different data augmentation methods and different amounts of training data.
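As background, the sketch below illustrates the classical single-teacher knowledge distillation loss (soft-target distillation) that serves as the baseline the proposed monoclass-teacher method is compared against. It is a minimal illustration only: the temperature, loss weighting, and function names are assumptions for the example, not details of the seminar's method.

```python
# Minimal sketch of classical knowledge distillation (soft-target loss).
# Hyperparameters (temperature, alpha) are illustrative assumptions.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Combine hard-label cross-entropy with softened teacher targets."""
    # Soften both distributions with the temperature, then match them
    # with KL divergence (scaled by T^2 to keep gradients comparable).
    soft_targets = F.softmax(teacher_logits / temperature, dim=1)
    log_student = F.log_softmax(student_logits / temperature, dim=1)
    kd = F.kl_div(log_student, soft_targets,
                  reduction="batchmean") * temperature ** 2
    # Standard supervised loss on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```

In this classical setup the teacher must run alongside the student during training, which is why distillation is usually confined to high-end GPUs; the seminar's contribution is to replace the single large teacher with several small monoclass teachers that fit within edge resources.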
This seminar will be held in English.
Cisco Webex link: https://ujmstetienne.webex.com/ujmstetienne/j.php?MTID=m2420edd420397f35c746e069c49a4e78