Efficient Model for Image Classification With Regularization Tricks

Taehyeon Kim, Jonghyup Kim, Seyoung Yun
Proceedings of the NeurIPS 2019 Competition and Demonstration Track, PMLR 123:13-26, 2020.

Abstract

In the MicroNet Challenge 2019, competitors attempted to design neural network architectures under tight resource budgets, e.g., the number of parameters and FLOPs. In this study, we describe the approaches of team KAIST, with which the team won second and third place, respectively, in the CIFAR-100 classification task of the contest. We solve the task in four steps. First, we design a novel baseline network appropriate for the CIFAR-100 dataset. Second, we train this network with our novel structural regularization methods, which penalize deviations of the weights from orthogonality and replace the ground-truth label of each example with a soft label vector that carries class-wise similarity information, derived from the representative feature vectors of each class in the course of training. Third, we search for the most potent data-augmentation methods, which bring significant improvements in accuracy. Finally, we perform sparse training via a pruning technique. Our final score is 0.0054, a 370x improvement over the baseline for the CIFAR-100 dataset. Ours is the only entry that finished in the top 10 percent for both parameter storage and computation on the CIFAR-100 classification task. The source code is available at https://github.com/Kthyeon/micronet_neurips_challenge.
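As a rough illustration of the orthogonality regularizer described above, the following PyTorch-style sketch penalizes the deviation of each weight matrix from orthogonality via the Frobenius norm of W W^T - I, with convolution kernels flattened to 2-D. The function name, the coefficient, and the parameter-selection rule are assumptions for illustration; the paper's exact formulation may differ.

import torch

def orthogonality_penalty(model, coeff=1e-4):
    # Sum ||W W^T - I||_F^2 over every weight matrix; conv kernels are
    # flattened to 2-D so each output channel becomes one row of W.
    penalty = 0.0
    for name, param in model.named_parameters():
        if param.dim() < 2 or "weight" not in name:
            continue
        w = param.reshape(param.size(0), -1)
        gram = w @ w.t()
        eye = torch.eye(gram.size(0), device=w.device, dtype=w.dtype)
        penalty = penalty + ((gram - eye) ** 2).sum()
    return coeff * penalty

# Added to the task loss during training, e.g.:
#   loss = criterion(model(inputs), targets) + orthogonality_penalty(model)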
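The label-replacement regularizer can be pictured along these lines: compute a representative (mean) feature vector per class, turn the pairwise similarities between these representatives into a row-normalized matrix, and mix it into the one-hot targets. The mixing weight alpha, the cosine similarity, and the softmax temperature below are illustrative choices, not the paper's exact recipe.

import torch
import torch.nn.functional as F

def class_similarity_targets(features, labels, num_classes, alpha=0.1, temp=1.0):
    # Representative feature vector per class: the mean of its features.
    centroids = torch.zeros(num_classes, features.size(1), device=features.device)
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            centroids[c] = features[mask].mean(dim=0)
    centroids = F.normalize(centroids, dim=1)
    sim = centroids @ centroids.t()          # class-wise cosine similarity
    soft = F.softmax(sim / temp, dim=1)      # row-normalized similarity
    onehot = F.one_hot(labels, num_classes).float()
    return (1 - alpha) * onehot + alpha * soft[labels]

# Trained with a soft-target cross-entropy, e.g.:
#   targets = class_similarity_targets(feats, labels, 100)
#   loss = -(targets * F.log_softmax(logits, dim=1)).sum(dim=1).mean()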
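The abstract does not name the winning augmentations, so as one concrete example of the kind of method searched over, here is a minimal Cutout implementation; both the patch size and the choice of Cutout itself are assumptions.

import torch

def cutout(images, size=8):
    # Zero out one random square patch per image; images: (N, C, H, W).
    n, _, h, w = images.shape
    for i in range(n):
        cy = torch.randint(0, h, (1,)).item()
        cx = torch.randint(0, w, (1,)).item()
        y1, y2 = max(0, cy - size // 2), min(h, cy + size // 2)
        x1, x2 = max(0, cx - size // 2), min(w, cx + size // 2)
        images[i, :, y1:y2, x1:x2] = 0.0
    return images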
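Finally, the sparse-training step can be sketched with PyTorch's built-in pruning utilities. Global magnitude pruning is shown here as a plausible stand-in; the entry's actual pruning schedule and sparsity target are not specified in the abstract.

import torch.nn as nn
import torch.nn.utils.prune as prune

def magnitude_prune(model, amount=0.5):
    # Remove the `amount` fraction of smallest-magnitude weights across
    # all conv and linear layers at once (global unstructured pruning).
    params = [(m, "weight") for m in model.modules()
              if isinstance(m, (nn.Conv2d, nn.Linear))]
    prune.global_unstructured(params,
                              pruning_method=prune.L1Unstructured,
                              amount=amount)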

Cite this Paper


BibTeX
@InProceedings{pmlr-v123-kim20a,
  title     = {Efficient Model for Image Classification With Regularization Tricks},
  author    = {Kim, Taehyeon and Kim, Jonghyup and Yun, Seyoung},
  booktitle = {Proceedings of the NeurIPS 2019 Competition and Demonstration Track},
  pages     = {13--26},
  year      = {2020},
  editor    = {Escalante, Hugo Jair and Hadsell, Raia},
  volume    = {123},
  series    = {Proceedings of Machine Learning Research},
  month     = {08--14 Dec},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v123/kim20a/kim20a.pdf},
  url       = {https://proceedings.mlr.press/v123/kim20a.html}
}
Endnote
%0 Conference Paper
%T Efficient Model for Image Classification With Regularization Tricks
%A Taehyeon Kim
%A Jonghyup Kim
%A Seyoung Yun
%B Proceedings of the NeurIPS 2019 Competition and Demonstration Track
%C Proceedings of Machine Learning Research
%D 2020
%E Hugo Jair Escalante
%E Raia Hadsell
%F pmlr-v123-kim20a
%I PMLR
%P 13--26
%U https://proceedings.mlr.press/v123/kim20a.html
%V 123
APA
Kim, T., Kim, J. & Yun, S. (2020). Efficient Model for Image Classification With Regularization Tricks. Proceedings of the NeurIPS 2019 Competition and Demonstration Track, in Proceedings of Machine Learning Research 123:13-26. Available from https://proceedings.mlr.press/v123/kim20a.html.
