Information Retrieval by Inferring Implicit Queries from Eye Movements

David R. Hardoon, John Shawe-Taylor, Antti Ajanki, Kai Puolamäki, Samuel Kaski
Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, PMLR 2:179-186, 2007.

Abstract

We introduce a new search strategy, in which the information retrieval (IR) query is inferred from eye movements measured while the user is reading text during an IR task. In the training phase, we know the users’ interests, that is, the relevance of the training documents. We learn a predictor that produces a “query” given the eye movements; the target of learning is an “optimal” query that is computed based on the known relevance of the training documents. Assuming the predictor is universal with respect to the users’ interests, it can also be applied to infer the implicit query when we have no prior knowledge of the users’ interests. The result of an empirical study is that it is possible to learn the implicit query from a small set of read documents, such that relevance predictions for a large set of unseen documents are ranked significantly better than by random guessing.
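The setup described in the abstract can be illustrated with a toy sketch: learn a linear map from eye-movement feature vectors to target query vectors (the "optimal" queries derived from known training relevance), then use the predicted query to rank unseen documents. This is a minimal illustration with synthetic data and a plain ridge regression, not the paper's actual method or features; all dimensions and names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions, chosen only for illustration.
n_train, n_feat, n_terms, n_docs = 40, 12, 8, 100

# Eye-movement feature vectors for the read training documents.
X = rng.normal(size=(n_train, n_feat))

# "Optimal" target queries computed from known training relevance;
# here simulated as a noisy linear function of the eye features.
W_true = rng.normal(size=(n_feat, n_terms))
Q = X @ W_true + 0.1 * rng.normal(size=(n_train, n_terms))

# Training phase: ridge regression from eye movements to queries.
lam = 1e-2
W = np.linalg.solve(X.T @ X + lam * np.eye(n_feat), X.T @ Q)

# Application phase: infer an implicit query from new eye movements
# and rank unseen documents (term vectors) by inner-product score.
x_new = rng.normal(size=n_feat)
q_hat = x_new @ W
docs = rng.normal(size=(n_docs, n_terms))
ranking = np.argsort(-(docs @ q_hat))  # best-matching document first
```

With a universal predictor of this kind, no explicit query or prior knowledge of the new user's interest is needed at application time; the ranking comes entirely from the inferred query.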

Cite this Paper


BibTeX
@InProceedings{pmlr-v2-hardoon07a,
  title = {Information Retrieval by Inferring Implicit Queries from Eye Movements},
  author = {David R. Hardoon and John Shawe-Taylor and Antti Ajanki and Kai Puolamäki and Samuel Kaski},
  booktitle = {Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics},
  pages = {179--186},
  year = {2007},
  editor = {Marina Meila and Xiaotong Shen},
  volume = {2},
  series = {Proceedings of Machine Learning Research},
  address = {San Juan, Puerto Rico},
  month = {21--24 Mar},
  publisher = {PMLR},
  pdf = {http://proceedings.mlr.press/v2/hardoon07a/hardoon07a.pdf},
  url = {http://proceedings.mlr.press/v2/hardoon07a.html},
  abstract = {We introduce a new search strategy, in which the information retrieval (IR) query is inferred from eye movements measured when the user is reading text during an IR task. In training phase, we know the users’ interest, that is, the relevance of training documents. We learn a predictor that produces a “query” given the eye movements; the target of learning is an “optimal” query that is computed based on the known relevance of the training documents. Assuming the predictor is universal with respect to the users’ interests, it can also be applied to infer the implicit query when we have no prior knowledge of the users’ interests. The result of an empirical study is that it is possible to learn the implicit query from a small set of read documents, such that relevance predictions for a large set of unseen documents are ranked significantly better than by random guessing.}
}
Endnote
%0 Conference Paper
%T Information Retrieval by Inferring Implicit Queries from Eye Movements
%A David R. Hardoon
%A John Shawe-Taylor
%A Antti Ajanki
%A Kai Puolamäki
%A Samuel Kaski
%B Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2007
%E Marina Meila
%E Xiaotong Shen
%F pmlr-v2-hardoon07a
%I PMLR
%J Proceedings of Machine Learning Research
%P 179--186
%U http://proceedings.mlr.press
%V 2
%W PMLR
%X We introduce a new search strategy, in which the information retrieval (IR) query is inferred from eye movements measured when the user is reading text during an IR task. In training phase, we know the users’ interest, that is, the relevance of training documents. We learn a predictor that produces a “query” given the eye movements; the target of learning is an “optimal” query that is computed based on the known relevance of the training documents. Assuming the predictor is universal with respect to the users’ interests, it can also be applied to infer the implicit query when we have no prior knowledge of the users’ interests. The result of an empirical study is that it is possible to learn the implicit query from a small set of read documents, such that relevance predictions for a large set of unseen documents are ranked significantly better than by random guessing.
RIS
TY  - CPAPER
TI  - Information Retrieval by Inferring Implicit Queries from Eye Movements
AU  - David R. Hardoon
AU  - John Shawe-Taylor
AU  - Antti Ajanki
AU  - Kai Puolamäki
AU  - Samuel Kaski
BT  - Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics
PY  - 2007/03/11
DA  - 2007/03/11
ED  - Marina Meila
ED  - Xiaotong Shen
ID  - pmlr-v2-hardoon07a
PB  - PMLR
SP  - 179
DP  - PMLR
EP  - 186
L1  - http://proceedings.mlr.press/v2/hardoon07a/hardoon07a.pdf
UR  - http://proceedings.mlr.press/v2/hardoon07a.html
AB  - We introduce a new search strategy, in which the information retrieval (IR) query is inferred from eye movements measured when the user is reading text during an IR task. In training phase, we know the users’ interest, that is, the relevance of training documents. We learn a predictor that produces a “query” given the eye movements; the target of learning is an “optimal” query that is computed based on the known relevance of the training documents. Assuming the predictor is universal with respect to the users’ interests, it can also be applied to infer the implicit query when we have no prior knowledge of the users’ interests. The result of an empirical study is that it is possible to learn the implicit query from a small set of read documents, such that relevance predictions for a large set of unseen documents are ranked significantly better than by random guessing.
ER  -
APA
Hardoon, D.R., Shawe-Taylor, J., Ajanki, A., Puolamäki, K. & Kaski, S. (2007). Information Retrieval by Inferring Implicit Queries from Eye Movements. Proceedings of the Eleventh International Conference on Artificial Intelligence and Statistics, in PMLR 2:179-186.