Self-Paced Multi-Label Learning with Diversity

Seyed Amjad Seyedi, S. Siamak Ghodsi, Fardin Akhlaghian, Mahdi Jalili, Parham Moradi
Proceedings of The Eleventh Asian Conference on Machine Learning, PMLR 101:790-805, 2019.

Abstract

The major challenge of learning from multi-label data arises from the overwhelming size of the label space, which makes the problem NP-hard. This difficulty can be alleviated by gradually introducing labels into the learning process, from easy to hard. In addition, a diversity-maintenance approach prevents overfitting to a subset of easy labels. In this paper, we propose Self-Paced Multi-Label Learning with Diversity (SPMLD), which aims to cover diverse labels as its learning pace progresses. The proposed framework is applied to an efficient correlation-based multi-label method, and the resulting non-convex objective function is optimized by an extension of the block coordinate descent algorithm. Empirical evaluations on real-world datasets with varying feature and label dimensionalities demonstrate the effectiveness of the proposed predictive model.
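The self-paced idea summarized above — admitting easy labels first while a diversity term keeps the selection from concentrating on one easy subset — can be illustrated with a generic self-paced selection rule. The sketch below is not the paper's SPMLD formulation; it uses the standard hard self-paced threshold and the closed-form per-group diversity rule of Jiang et al.'s SPLD, on hypothetical toy losses and groups.

```python
import numpy as np

def spl_weights(losses, lam):
    """Classic hard self-paced weights: admit a sample iff its loss is below the pace lam."""
    return (losses < lam).astype(float)

def spld_weights(losses, groups, lam, gamma):
    """Self-paced weights with diversity (closed form of Jiang et al.'s SPLD):
    within each group, the i-th easiest sample is admitted iff
    loss < lam + gamma / (sqrt(i) + sqrt(i - 1)),
    so the first pick in every group gets a generous threshold,
    spreading the selection across groups."""
    v = np.zeros_like(losses)
    for g in np.unique(groups):
        idx = np.where(groups == g)[0]
        order = idx[np.argsort(losses[idx])]  # easiest first within the group
        for rank, j in enumerate(order, start=1):
            if losses[j] < lam + gamma / (np.sqrt(rank) + np.sqrt(rank - 1)):
                v[j] = 1.0
    return v

losses = np.array([0.2, 0.9, 0.5, 1.4])   # hypothetical per-sample losses
groups = np.array([0, 0, 1, 1])           # hypothetical label clusters

print(spl_weights(losses, 0.3))                 # [1. 0. 0. 0.] pace alone picks only one group
print(spld_weights(losses, groups, 0.3, 0.4))   # [1. 0. 1. 0.] diversity admits one pick per group
```

As the pace parameters grow across iterations, harder samples pass the threshold, which is the "easy to hard" curriculum the abstract describes; the diversity term is what prevents the early iterations from drawing only from the easiest group.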

Cite this Paper


BibTeX
@InProceedings{pmlr-v101-seyedi19a,
  title     = {Self-Paced Multi-Label Learning with Diversity},
  author    = {Seyedi, Seyed Amjad and Ghodsi, S. Siamak and Akhlaghian, Fardin and Jalili, Mahdi and Moradi, Parham},
  booktitle = {Proceedings of The Eleventh Asian Conference on Machine Learning},
  pages     = {790--805},
  year      = {2019},
  editor    = {Lee, Wee Sun and Suzuki, Taiji},
  volume    = {101},
  series    = {Proceedings of Machine Learning Research},
  month     = {17--19 Nov},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v101/seyedi19a/seyedi19a.pdf},
  url       = {https://proceedings.mlr.press/v101/seyedi19a.html},
  abstract  = {The major challenge of learning from multi-label data has arisen from the overwhelming size of label space which makes this problem NP-hard. This problem can be alleviated by gradually involving easy to hard tags into the learning process. Besides, the utilization of a diversity maintenance approach avoids overfitting on a subset of easy labels. In this paper, we propose a self-paced multi-label learning with diversity (SPMLD) which aims to cover diverse labels with respect to its learning pace. In addition, the proposed framework is applied to an efficient correlation-based multi-label method. The non-convex objective function is optimized by an extension of the block coordinate descent algorithm. Empirical evaluations on real-world datasets with different dimensions of features and labels imply the effectiveness of the proposed predictive model.}
}
Endnote
%0 Conference Paper
%T Self-Paced Multi-Label Learning with Diversity
%A Seyed Amjad Seyedi
%A S. Siamak Ghodsi
%A Fardin Akhlaghian
%A Mahdi Jalili
%A Parham Moradi
%B Proceedings of The Eleventh Asian Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Wee Sun Lee
%E Taiji Suzuki
%F pmlr-v101-seyedi19a
%I PMLR
%P 790--805
%U https://proceedings.mlr.press/v101/seyedi19a.html
%V 101
%X The major challenge of learning from multi-label data has arisen from the overwhelming size of label space which makes this problem NP-hard. This problem can be alleviated by gradually involving easy to hard tags into the learning process. Besides, the utilization of a diversity maintenance approach avoids overfitting on a subset of easy labels. In this paper, we propose a self-paced multi-label learning with diversity (SPMLD) which aims to cover diverse labels with respect to its learning pace. In addition, the proposed framework is applied to an efficient correlation-based multi-label method. The non-convex objective function is optimized by an extension of the block coordinate descent algorithm. Empirical evaluations on real-world datasets with different dimensions of features and labels imply the effectiveness of the proposed predictive model.
APA
Seyedi, S.A., Ghodsi, S.S., Akhlaghian, F., Jalili, M. & Moradi, P. (2019). Self-Paced Multi-Label Learning with Diversity. Proceedings of The Eleventh Asian Conference on Machine Learning, in Proceedings of Machine Learning Research 101:790-805. Available from https://proceedings.mlr.press/v101/seyedi19a.html.

Related Material