Middle-level Distillation Method Based on Multi-source Information Matching Guided by Auxiliary Classification

Yifan Zhang

Abstract

In traditional knowledge distillation (KD), the output of the large model (the pseudo
label) supervises the small model, so the small model fits both the ground-truth label and the teacher's
pseudo label. Later work proposed feature losses as middle-level supervision to further mine teacher
information. However, feature losses do not perform as strongly as output-level KD, their definition and
implementation are more complicated, and they are not robust to different teacher model structures. We propose
a simple, novel, and effective middle-level supervision method, auxiliary middle-level distillation (AID),
which effectively enhances the performance of the learner/student model (S model for short) trained under the
lecturer/teacher model (T model for short). Specifically, we use an auxiliary branch as a transformation of the
student network, and we match multi-source information, including feature distance, logits distance, and
sample-pair relationships, to shrink the gap between the S model's features and the T model's high-level
features. Orthogonalization and logit normalization techniques allow the auxiliary branch to transfer feature
knowledge more effectively. To our knowledge, our method is the first in KD research to use multi-source
information matching for middle-level supervision. We achieve excellent results on common benchmarks: on
CIFAR-100 and CIFAR-10, the accuracy of 11 models increases by 5.46% and 2.49% on average, respectively. On
ImageNet, AID achieves 1.57 times compression and 1.81 times acceleration without loss of accuracy. With many
well-performing models as teachers, our method improves student performance more effectively.
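
The abstract does not include an implementation, so the following is only a minimal PyTorch sketch of what the described multi-source matching could look like: a feature-distance term, a temperature-scaled logits term, and a sample-pair relation term combined into one loss. The function name `aid_matching_loss`, the loss weights, and the concrete choice of each term (L2 on normalized features, KL on softened logits, MSE on cosine-similarity matrices) are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def aid_matching_loss(s_feat, t_feat, s_logits, t_logits,
                      w_feat=1.0, w_logit=1.0, w_pair=1.0, tau=4.0):
    """Hypothetical multi-source matching loss in the spirit of AID.

    s_feat, t_feat: (B, D) features from the student's auxiliary branch
    and the teacher's high-level layer (assumed already projected to a
    common dimension D).
    s_logits, t_logits: (B, C) classification logits.
    """
    # 1) Feature distance: L2 between normalized feature vectors.
    feat_loss = F.mse_loss(F.normalize(s_feat, dim=1),
                           F.normalize(t_feat, dim=1))

    # 2) Logits distance: temperature-scaled KL divergence, as in
    #    standard output-level KD.
    logit_loss = F.kl_div(F.log_softmax(s_logits / tau, dim=1),
                          F.softmax(t_logits / tau, dim=1),
                          reduction="batchmean") * tau * tau

    # 3) Sample-pair relationship: match the batch-wise cosine
    #    similarity matrices of student and teacher features.
    s_sim = F.normalize(s_feat, dim=1) @ F.normalize(s_feat, dim=1).t()
    t_sim = F.normalize(t_feat, dim=1) @ F.normalize(t_feat, dim=1).t()
    pair_loss = F.mse_loss(s_sim, t_sim)

    return w_feat * feat_loss + w_logit * logit_loss + w_pair * pair_loss
```

In training, this loss would be added to the usual cross-entropy on ground-truth labels; the orthogonalization of the auxiliary branch mentioned in the abstract is not shown here.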

Article Details

How to Cite
Yifan Zhang. (2021). Middle-level Distillation Method Based on Multi-source Information Matching Guided by Auxiliary Classification. CONVERTER, 2021(7), 911-921. Retrieved from http://converter-magazine.info/index.php/converter/article/view/579
Section: Articles