Feature-map-level Online Adversarial Knowledge Distillation

Inseop Chung, Seonguk Park, Jangho Kim, Nojun Kwak
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:2006-2015, 2020.

Abstract

Feature maps contain rich information about image intensity and spatial correlation. However, previous online knowledge distillation methods utilize only the class probabilities. Thus, in this paper, we propose an online knowledge distillation method that transfers not only the knowledge of the class probabilities but also that of the feature map using the adversarial training framework. We train multiple networks simultaneously by employing discriminators to distinguish the feature map distributions of different networks. Each network has its corresponding discriminator, which classifies the feature map from its own network as fake while classifying that of the other network as real. By training a network to fool its corresponding discriminator, it can learn the other network's feature map distribution. We show that our method performs better than conventional direct alignment methods such as L1 and is more suitable for online distillation. We also propose a novel cyclic learning scheme for training more than two networks together. We have applied our method to various network architectures on the classification task and observed a significant improvement in performance, especially when training a pair of a small network and a large one.
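The cyclic learning scheme mentioned above can be illustrated with a small sketch. The paper does not spell out the transfer direction here, so the convention below (network i learns the feature-map distribution of network (i + 1) mod K, giving every network exactly one peer and closing the loop) is an assumption for illustration only; the function name is hypothetical.

```python
def cyclic_peer_assignment(num_networks):
    """Illustrative sketch of a cyclic distillation scheme: network i is
    paired with network (i + 1) % K as its peer, so each network has
    exactly one peer to learn from and the pairs form a closed cycle.
    The pairing direction is an assumption, not taken from the paper."""
    if num_networks < 2:
        raise ValueError("online distillation needs at least two networks")
    return [(i, (i + 1) % num_networks) for i in range(num_networks)]

# For three networks the transfer pattern forms a single closed loop:
print(cyclic_peer_assignment(3))  # [(0, 1), (1, 2), (2, 0)]
```

With only two networks this degenerates to mutual distillation, (0, 1) and (1, 0), which is the standard two-network online setting; the cyclic scheme is what generalizes it to more than two networks.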

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-chung20a,
  title     = {Feature-map-level Online Adversarial Knowledge Distillation},
  author    = {Chung, Inseop and Park, Seonguk and Kim, Jangho and Kwak, Nojun},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {2006--2015},
  year      = {2020},
  editor    = {Hal Daumé III and Aarti Singh},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/chung20a/chung20a.pdf},
  url       = {http://proceedings.mlr.press/v119/chung20a.html},
  abstract  = {Feature maps contain rich information about image intensity and spatial correlation. However, previous online knowledge distillation methods only utilize the class probabilities. Thus in this paper, we propose an online knowledge distillation method that transfers not only the knowledge of the class probabilities but also that of the feature map using the adversarial training framework. We train multiple networks simultaneously by employing discriminators to distinguish the feature map distributions of different networks. Each network has its corresponding discriminator which discriminates the feature map from its own as fake while classifying that of the other network as real. By training a network to fool the corresponding discriminator, it can learn the other network’s feature map distribution. We show that our method performs better than the conventional direct alignment method such as L1 and is more suitable for online distillation. Also, we propose a novel cyclic learning scheme for training more than two networks together. We have applied our method to various network architectures on the classification task and discovered a significant improvement of performance especially in the case of training a pair of a small network and a large one.}
}
Endnote
%0 Conference Paper
%T Feature-map-level Online Adversarial Knowledge Distillation
%A Inseop Chung
%A Seonguk Park
%A Jangho Kim
%A Nojun Kwak
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-chung20a
%I PMLR
%P 2006--2015
%U http://proceedings.mlr.press/v119/chung20a.html
%V 119
%X Feature maps contain rich information about image intensity and spatial correlation. However, previous online knowledge distillation methods only utilize the class probabilities. Thus in this paper, we propose an online knowledge distillation method that transfers not only the knowledge of the class probabilities but also that of the feature map using the adversarial training framework. We train multiple networks simultaneously by employing discriminators to distinguish the feature map distributions of different networks. Each network has its corresponding discriminator which discriminates the feature map from its own as fake while classifying that of the other network as real. By training a network to fool the corresponding discriminator, it can learn the other network’s feature map distribution. We show that our method performs better than the conventional direct alignment method such as L1 and is more suitable for online distillation. Also, we propose a novel cyclic learning scheme for training more than two networks together. We have applied our method to various network architectures on the classification task and discovered a significant improvement of performance especially in the case of training a pair of a small network and a large one.
APA
Chung, I., Park, S., Kim, J., & Kwak, N. (2020). Feature-map-level Online Adversarial Knowledge Distillation. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:2006-2015. Available from http://proceedings.mlr.press/v119/chung20a.html.