Optimal Sequential Maximization: One Interview is Enough!

Moein Falahatgar, Alon Orlitsky, Venkatadheeraj Pichapati
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:2975-2984, 2020.

Abstract

Maximum selection under probabilistic queries \emph{(probabilistic maximization)} is a fundamental algorithmic problem arising in numerous theoretical and practical contexts. We derive the first query-optimal sequential algorithm for probabilistic maximization. Departing from previous assumptions, the algorithm and performance guarantees apply even for infinitely many items, hence in particular do not require a priori knowledge of the number of items. The algorithm has linear query complexity, and is optimal also in the streaming setting. To derive these results we consider a probabilistic setting where several candidates for a position are asked multiple questions with the goal of finding who has the highest probability of answering interview questions correctly. Previous work minimized the total number of questions asked by alternating back and forth between the best-performing candidates, in a sense inviting them to multiple interviews. We show that the same order-wise selection accuracy can be achieved by querying the candidates sequentially, never returning to a previously queried candidate. Hence one interview is enough!
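To make the sequential ("one interview") idea concrete, here is a minimal illustrative sketch, not the paper's actual algorithm or its query budget: a single pass keeps a running champion, each new item challenges it with a fixed number of noisy comparisons, and no item is ever revisited. The comparison model (item i beats item j with probability 1/2 + (p_i - p_j)/2) and the fixed per-item budget are assumptions made for the demo.

```python
import random

def noisy_compare(p_i, p_j, rng):
    # Simulated probabilistic query: item i beats item j with
    # probability 1/2 + (p_i - p_j)/2 (an assumed noisy-comparison model).
    return rng.random() < 0.5 + (p_i - p_j) / 2

def sequential_max(probs, queries_per_item=201, seed=0):
    """One-pass maximum selection: each item is 'interviewed' once.

    Maintains a running champion; each new item challenges it with a
    fixed budget of noisy comparisons and replaces it on a majority of
    wins. Total queries grow linearly in the number of items, and a
    previously seen item is never queried again (the streaming property
    the abstract highlights). Returns the index of the selected item.
    """
    rng = random.Random(seed)
    champion = 0
    for challenger in range(1, len(probs)):
        wins = sum(
            noisy_compare(probs[challenger], probs[champion], rng)
            for _ in range(queries_per_item)
        )
        if wins > queries_per_item // 2:
            champion = challenger
    return champion

# With a clear gap between the best item and the rest, the majority
# vote identifies the maximum with high probability.
print(sequential_max([0.3, 0.9, 0.5, 0.4]))
```

The paper's contribution is showing that such a forward-only strategy can match the order-wise accuracy of adaptive schemes that revisit earlier candidates; this sketch only illustrates the single-pass structure, not the optimal query allocation.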

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-falahatgar20a,
  title     = {Optimal Sequential Maximization: One Interview is Enough!},
  author    = {Falahatgar, Moein and Orlitsky, Alon and Pichapati, Venkatadheeraj},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {2975--2984},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/falahatgar20a/falahatgar20a.pdf},
  url       = {https://proceedings.mlr.press/v119/falahatgar20a.html},
  abstract  = {Maximum selection under probabilistic queries \emph{(probabilistic maximization)} is a fundamental algorithmic problem arising in numerous theoretical and practical contexts. We derive the first query-optimal sequential algorithm for probabilistic-maximization. Departing from previous assumptions, the algorithm and performance guarantees apply even for infinitely many items, hence in particular do not require a-priori knowledge of the number of items. The algorithm has linear query complexity, and is optimal also in the streaming setting. To derive these results we consider a probabilistic setting where several candidates for a position are asked multiple questions with the goal of finding who has the highest probability of answering interview questions correctly. Previous work minimized the total number of questions asked by alternating back and forth between the best performing candidates, in a sense, inviting them to multiple interviews. We show that the same order-wise selection accuracy can be achieved by querying the candidates sequentially, never returning to a previously queried candidate. Hence one interview is enough!}
}
Endnote
%0 Conference Paper
%T Optimal Sequential Maximization: One Interview is Enough!
%A Moein Falahatgar
%A Alon Orlitsky
%A Venkatadheeraj Pichapati
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-falahatgar20a
%I PMLR
%P 2975--2984
%U https://proceedings.mlr.press/v119/falahatgar20a.html
%V 119
%X Maximum selection under probabilistic queries \emph{(probabilistic maximization)} is a fundamental algorithmic problem arising in numerous theoretical and practical contexts. We derive the first query-optimal sequential algorithm for probabilistic-maximization. Departing from previous assumptions, the algorithm and performance guarantees apply even for infinitely many items, hence in particular do not require a-priori knowledge of the number of items. The algorithm has linear query complexity, and is optimal also in the streaming setting. To derive these results we consider a probabilistic setting where several candidates for a position are asked multiple questions with the goal of finding who has the highest probability of answering interview questions correctly. Previous work minimized the total number of questions asked by alternating back and forth between the best performing candidates, in a sense, inviting them to multiple interviews. We show that the same order-wise selection accuracy can be achieved by querying the candidates sequentially, never returning to a previously queried candidate. Hence one interview is enough!
APA
Falahatgar, M., Orlitsky, A. & Pichapati, V. (2020). Optimal Sequential Maximization: One Interview is Enough! Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:2975-2984. Available from https://proceedings.mlr.press/v119/falahatgar20a.html.