Maximum Margin Learning with Incomplete Data: Learning Networks instead of Tables

Sandor Szedmak, Yizhao Ni, Steve R. Gunn
Proceedings of the First Workshop on Applications of Pattern Analysis, PMLR 11:96-102, 2010.

Abstract

In this paper we address the problem of making predictions when the available data is incomplete. We show that replacing the generally accepted table-wise view of the sample items with a graph-representable one allows us to solve these kinds of problems in a very concise way, using the well-known convex optimisation framework based on one-class classification. Using the one-class formulation in both the learning and prediction phases makes the entire procedure highly consistent. The graph representation can express the complex interdependencies among the data sources. The underlying optimisation problem can be transformed into an online algorithm, e.g. a perceptron-type one, which can therefore handle data sets of millions of items. This framework encompasses supervised, semi-supervised and some unsupervised learning problems. Furthermore, the data sources need not be simple binary variables or vectors; they can be text documents, images, or even graphs with complex internal structures.
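The perceptron-type online update mentioned in the abstract can be illustrated with a minimal one-class margin sketch. This is illustrative only: the function name, the fixed margin value, and the plain-vector feature representation are assumptions for the sketch, not the paper's formulation (which works over graph-represented, possibly incomplete data sources).

```python
import numpy as np

def one_class_perceptron(X, margin=1.0, epochs=10):
    """Online perceptron-type update for a one-class margin problem:
    seek a unit vector w with <w, x_i> >= margin for all items x_i."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for x in X:
            if w @ x < margin:        # margin violated: move w toward x
                w = w + x
                norm = np.linalg.norm(w)
                if norm > 0:
                    w = w / norm      # keep w on the unit sphere
    return w
```

Each violated item pulls the weight vector toward itself, and the renormalisation keeps the problem a margin (direction) problem rather than a norm problem; this is the sense in which the convex one-class optimisation admits a cheap online counterpart.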

Cite this Paper


BibTeX
@InProceedings{pmlr-v11-szedmak10a,
  title     = {Maximum Margin Learning with Incomplete Data: Learning Networks instead of Tables},
  author    = {Szedmak, Sandor and Ni, Yizhao and Gunn, Steve R.},
  booktitle = {Proceedings of the First Workshop on Applications of Pattern Analysis},
  pages     = {96--102},
  year      = {2010},
  editor    = {Diethe, Tom and Cristianini, Nello and Shawe-Taylor, John},
  volume    = {11},
  series    = {Proceedings of Machine Learning Research},
  address   = {Cumberland Lodge, Windsor, UK},
  month     = {01--03 Sep},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v11/szedmak10a/szedmak10a.pdf},
  url       = {https://proceedings.mlr.press/v11/szedmak10a.html},
  abstract  = {In this paper we address the problem of making predictions when the available data is incomplete. We show that replacing the generally accepted table-wise view of the sample items with a graph-representable one allows us to solve these kinds of problems in a very concise way, using the well-known convex optimisation framework based on one-class classification. Using the one-class formulation in both the learning and prediction phases makes the entire procedure highly consistent. The graph representation can express the complex interdependencies among the data sources. The underlying optimisation problem can be transformed into an online algorithm, e.g. a perceptron-type one, which can therefore handle data sets of millions of items. This framework encompasses supervised, semi-supervised and some unsupervised learning problems. Furthermore, the data sources need not be simple binary variables or vectors; they can be text documents, images, or even graphs with complex internal structures.}
}
Endnote
%0 Conference Paper
%T Maximum Margin Learning with Incomplete Data: Learning Networks instead of Tables
%A Sandor Szedmak
%A Yizhao Ni
%A Steve R. Gunn
%B Proceedings of the First Workshop on Applications of Pattern Analysis
%C Proceedings of Machine Learning Research
%D 2010
%E Tom Diethe
%E Nello Cristianini
%E John Shawe-Taylor
%F pmlr-v11-szedmak10a
%I PMLR
%P 96--102
%U https://proceedings.mlr.press/v11/szedmak10a.html
%V 11
%X In this paper we address the problem of making predictions when the available data is incomplete. We show that replacing the generally accepted table-wise view of the sample items with a graph-representable one allows us to solve these kinds of problems in a very concise way, using the well-known convex optimisation framework based on one-class classification. Using the one-class formulation in both the learning and prediction phases makes the entire procedure highly consistent. The graph representation can express the complex interdependencies among the data sources. The underlying optimisation problem can be transformed into an online algorithm, e.g. a perceptron-type one, which can therefore handle data sets of millions of items. This framework encompasses supervised, semi-supervised and some unsupervised learning problems. Furthermore, the data sources need not be simple binary variables or vectors; they can be text documents, images, or even graphs with complex internal structures.
RIS
TY  - CPAPER
TI  - Maximum Margin Learning with Incomplete Data: Learning Networks instead of Tables
AU  - Sandor Szedmak
AU  - Yizhao Ni
AU  - Steve R. Gunn
BT  - Proceedings of the First Workshop on Applications of Pattern Analysis
DA  - 2010/09/30
ED  - Tom Diethe
ED  - Nello Cristianini
ED  - John Shawe-Taylor
ID  - pmlr-v11-szedmak10a
PB  - PMLR
DP  - Proceedings of Machine Learning Research
VL  - 11
SP  - 96
EP  - 102
L1  - http://proceedings.mlr.press/v11/szedmak10a/szedmak10a.pdf
UR  - https://proceedings.mlr.press/v11/szedmak10a.html
AB  - In this paper we address the problem of making predictions when the available data is incomplete. We show that replacing the generally accepted table-wise view of the sample items with a graph-representable one allows us to solve these kinds of problems in a very concise way, using the well-known convex optimisation framework based on one-class classification. Using the one-class formulation in both the learning and prediction phases makes the entire procedure highly consistent. The graph representation can express the complex interdependencies among the data sources. The underlying optimisation problem can be transformed into an online algorithm, e.g. a perceptron-type one, which can therefore handle data sets of millions of items. This framework encompasses supervised, semi-supervised and some unsupervised learning problems. Furthermore, the data sources need not be simple binary variables or vectors; they can be text documents, images, or even graphs with complex internal structures.
ER  -
APA
Szedmak, S., Ni, Y. &amp; Gunn, S. R. (2010). Maximum Margin Learning with Incomplete Data: Learning Networks instead of Tables. Proceedings of the First Workshop on Applications of Pattern Analysis, in Proceedings of Machine Learning Research 11:96-102. Available from https://proceedings.mlr.press/v11/szedmak10a.html.