Shape Constraints for Set Functions

Andrew Cotter, Maya Gupta, Heinrich Jiang, Erez Louidor, James Muller, Taman Narayan, Serena Wang, Tao Zhu
Proceedings of the 36th International Conference on Machine Learning, PMLR 97:1388-1396, 2019.

Abstract

Set functions predict a label from a permutation-invariant variable-size collection of feature vectors. We propose making set functions more understandable and regularized by capturing domain knowledge through shape constraints. We show how prior work on monotonic constraints can be adapted to set functions, and then propose two new shape constraints designed to generalize the conditioning role of weights in a weighted mean. We show how one can train standard functions and set functions that satisfy these shape constraints with a deep lattice network. We propose a nonlinear estimation strategy we call the semantic feature engine that uses set functions with the proposed shape constraints to estimate labels for compound sparse categorical features. Experiments on real-world data show that the achieved accuracy is similar to that of deep sets or deep neural networks, while the shape constraints provide guarantees on model behavior that make the models easier to explain and debug.
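
The abstract's core objects can be illustrated concretely. Below is a minimal NumPy sketch, not the paper's deep lattice network, of a Deep-Sets-style set function f(S) = rho(mean_i phi(x_i)): it is permutation-invariant because the pooling is symmetric, and monotone in every input feature because all weights are constrained non-negative and the activations are increasing. All names and dimensions here are illustrative assumptions, not from the paper.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (illustrative, not from the paper).
D_IN, D_HID = 3, 8

def softplus(z):
    # Maps free parameters to non-negative weights, which (with an
    # increasing activation) makes each layer monotone in its inputs.
    return np.log1p(np.exp(z))

W_phi = softplus(rng.normal(size=(D_IN, D_HID)))   # >= 0 elementwise
b_phi = rng.normal(size=D_HID)
w_rho = softplus(rng.normal(size=D_HID))           # >= 0 elementwise

def set_fn(S):
    """Deep-Sets-style set function f(S) = rho(mean_i phi(x_i)).

    Permutation invariance comes from the symmetric mean pooling;
    monotonicity in every feature comes from the non-negative weights
    and the increasing tanh activation.
    """
    S = np.asarray(S, dtype=float)        # shape (n_items, D_IN)
    h = np.tanh(S @ W_phi + b_phi)        # per-item embedding phi
    pooled = h.mean(axis=0)               # symmetric aggregation over the set
    return float(pooled @ w_rho)          # monotone read-out rho

S = rng.normal(size=(5, D_IN))
assert np.isclose(set_fn(S), set_fn(S[::-1]))      # permutation-invariant
S_up = S.copy(); S_up[:, 0] += 1.0
assert set_fn(S_up) >= set_fn(S)                   # monotone in feature 0

This toy enforces monotonicity architecturally, in the same spirit as the paper's lattice-based approach; the deep lattice networks used in the paper additionally support the two new weighted-mean-style shape constraints, which this sketch does not implement.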

Cite this Paper


BibTeX
@InProceedings{pmlr-v97-cotter19a,
  title     = {Shape Constraints for Set Functions},
  author    = {Cotter, Andrew and Gupta, Maya and Jiang, Heinrich and Louidor, Erez and Muller, James and Narayan, Taman and Wang, Serena and Zhu, Tao},
  booktitle = {Proceedings of the 36th International Conference on Machine Learning},
  pages     = {1388--1396},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Salakhutdinov, Ruslan},
  volume    = {97},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--15 Jun},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v97/cotter19a/cotter19a.pdf},
  url       = {https://proceedings.mlr.press/v97/cotter19a.html}
}
Endnote
%0 Conference Paper
%T Shape Constraints for Set Functions
%A Andrew Cotter
%A Maya Gupta
%A Heinrich Jiang
%A Erez Louidor
%A James Muller
%A Taman Narayan
%A Serena Wang
%A Tao Zhu
%B Proceedings of the 36th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Ruslan Salakhutdinov
%F pmlr-v97-cotter19a
%I PMLR
%P 1388--1396
%U https://proceedings.mlr.press/v97/cotter19a.html
%V 97
APA
Cotter, A., Gupta, M., Jiang, H., Louidor, E., Muller, J., Narayan, T., Wang, S. & Zhu, T. (2019). Shape Constraints for Set Functions. Proceedings of the 36th International Conference on Machine Learning, in Proceedings of Machine Learning Research 97:1388-1396. Available from https://proceedings.mlr.press/v97/cotter19a.html.
