Context-Aware Zero-Shot Learning for Object Recognition

Eloi Zablocki, Patrick Bordes, Laure Soulier, Benjamin Piwowarski, Patrick Gallinari
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:7292-7303, 2019.

Abstract

Zero-Shot Learning (ZSL) aims at classifying unlabeled objects by leveraging auxiliary knowledge, such as semantic representations. A limitation of previous approaches is that only intrinsic properties of objects, e.g. their visual appearance, are taken into account while their context, e.g. the surrounding objects in the image, is ignored. Following the intuitive principle that objects tend to be found in certain contexts but not others, we propose a new and challenging approach, context-aware ZSL, that leverages semantic representations in a new way to model the conditional likelihood of an object to appear in a given context. Finally, through extensive experiments conducted on Visual Genome, we show that contextual information can substantially improve the standard ZSL approach and is robust to unbalanced classes.
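The approach outlined in the abstract combines two sources of evidence expressed in a shared semantic space: the intrinsic (visual) compatibility of an object with a candidate class, and the plausibility of that class given the surrounding objects in the image. The snippet below is a minimal, hypothetical sketch of that scoring principle only; it is not the authors' model, and the names (word_vectors, alpha) and the cosine-based scores are assumptions made for illustration.

# Illustrative sketch only (not the authors' implementation): score a
# candidate class for an image region by mixing a standard ZSL visual
# term with a context term over the surrounding objects' class embeddings.
import numpy as np

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-8))

def visual_score(region_embedding, class_embedding):
    # Intrinsic term: compatibility of the region's visual features
    # (assumed already projected into the semantic space) with the class.
    return cosine(region_embedding, class_embedding)

def context_score(class_embedding, context_embeddings):
    # Context term: plausibility of the class given the surrounding objects,
    # approximated here by average semantic similarity to their embeddings.
    if len(context_embeddings) == 0:
        return 0.0
    return float(np.mean([cosine(class_embedding, c) for c in context_embeddings]))

def rank_candidates(region_embedding, context_embeddings, candidates,
                    word_vectors, alpha=0.5):
    # alpha is a hypothetical mixing weight, not a value from the paper.
    scores = {c: (1 - alpha) * visual_score(region_embedding, word_vectors[c])
                 + alpha * context_score(word_vectors[c], context_embeddings)
              for c in candidates}
    return sorted(candidates, key=scores.get, reverse=True)

In the paper itself, the conditional likelihood of an object given its context is learned from semantic representations rather than fixed to a cosine similarity; the sketch only conveys how contextual evidence can be combined with the standard ZSL term.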

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-zablocki19a,
  title     = {Context-Aware Zero-Shot Learning for Object Recognition},
  author    = {Zablocki, Eloi and Bordes, Patrick and Soulier, Laure and Piwowarski, Benjamin and Gallinari, Patrick},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {7292--7303},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/zablocki19a/zablocki19a.pdf},
  url       = {https://proceedings.mlr.press/v97/zablocki19a.html},
  abstract  = {Zero-Shot Learning (ZSL) aims at classifying unlabeled objects by leveraging auxiliary knowledge, such as semantic representations. A limitation of previous approaches is that only intrinsic properties of objects, e.g. their visual appearance, are taken into account while their context, e.g. the surrounding objects in the image, is ignored. Following the intuitive principle that objects tend to be found in certain contexts but not others, we propose a new and challenging approach, context-aware ZSL, that leverages semantic representations in a new way to model the conditional likelihood of an object to appear in a given context. Finally, through extensive experiments conducted on Visual Genome, we show that contextual information can substantially improve the standard ZSL approach and is robust to unbalanced classes.}
}
Endnote
%0 Conference Paper
%T Context-Aware Zero-Shot Learning for Object Recognition
%A Eloi Zablocki
%A Patrick Bordes
%A Laure Soulier
%A Benjamin Piwowarski
%A Patrick Gallinari
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-zablocki19a
%I PMLR
%P 7292--7303
%U https://proceedings.mlr.press/v97/zablocki19a.html
%V 97
%X Zero-Shot Learning (ZSL) aims at classifying unlabeled objects by leveraging auxiliary knowledge, such as semantic representations. A limitation of previous approaches is that only intrinsic properties of objects, e.g. their visual appearance, are taken into account while their context, e.g. the surrounding objects in the image, is ignored. Following the intuitive principle that objects tend to be found in certain contexts but not others, we propose a new and challenging approach, context-aware ZSL, that leverages semantic representations in a new way to model the conditional likelihood of an object to appear in a given context. Finally, through extensive experiments conducted on Visual Genome, we show that contextual information can substantially improve the standard ZSL approach and is robust to unbalanced classes.
APA
Zablocki, E., Bordes, P., Soulier, L., Piwowarski, B. & Gallinari, P. (2019). Context-Aware Zero-Shot Learning for Object Recognition. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:7292-7303. Available from https://proceedings.mlr.press/v97/zablocki19a.html.
