Multidimensional Shape Constraints

Maya Gupta, Erez Louidor, Oleksandr Mangylov, Nobu Morioka, Taman Narayan, Sen Zhao
Proceedings of the 37th International Conference on Machine Learning, PMLR 119:3918-3928, 2020.

Abstract

We propose new multi-input shape constraints across four intuitive categories: complements, diminishers, dominance, and unimodality constraints. We show these shape constraints can be checked and even enforced when training machine-learned models for linear models, generalized additive models, and the nonlinear function class of multi-layer lattice models. Real-world experiments illustrate how the different shape constraints can be used to increase explainability and improve regularization, especially for non-IID train-test distribution shift.
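To make the abstract's claim concrete for the simplest model class, here is a small hedged sketch (not the paper's implementation; function names are illustrative) of how a dominance constraint can be both checked and enforced for a linear model f(x) = w·x: with both features monotonically increasing, "feature d dominates feature e" reduces to the weight inequality w[d] >= w[e] >= 0, which can be enforced by a Euclidean projection of the weights.

```python
import numpy as np

def satisfies_dominance(w, d, e):
    """Check the dominance constraint for a linear model f(x) = w @ x:
    with features d and e both monotonically increasing, d dominates e
    iff w[d] >= w[e] >= 0."""
    return w[d] >= w[e] >= 0

def project_dominance(w, d, e):
    """Return the nearest (Euclidean) weight vector satisfying
    w[d] >= w[e] >= 0. Two-coordinate isotonic projection: pool the
    pair if out of order, then clip at zero."""
    w = np.asarray(w, dtype=float).copy()
    lo, hi = w[e], w[d]
    if hi < lo:                      # pool adjacent violators
        lo = hi = (lo + hi) / 2.0
    w[e] = max(lo, 0.0)              # clip to the nonnegative orthant
    w[d] = max(hi, 0.0)
    return w

# Example: weights [0.5, 1.5] violate "feature 0 dominates feature 1";
# the projection pools them to [1.0, 1.0], which satisfies the constraint.
w = project_dominance([0.5, 1.5], d=0, e=1)
print(w, satisfies_dominance(w, 0, 1))
```

Projecting after each gradient step is one standard way to enforce such linear constraints during training; the paper's lattice-model constructions generalize this idea to nonlinear function classes.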

Cite this Paper


BibTeX
@InProceedings{pmlr-v119-gupta20b,
  title     = {Multidimensional Shape Constraints},
  author    = {Gupta, Maya and Louidor, Erez and Mangylov, Oleksandr and Morioka, Nobu and Narayan, Taman and Zhao, Sen},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {3918--3928},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/gupta20b/gupta20b.pdf},
  url       = {https://proceedings.mlr.press/v119/gupta20b.html},
  abstract  = {We propose new multi-input shape constraints across four intuitive categories: complements, diminishers, dominance, and unimodality constraints. We show these shape constraints can be checked and even enforced when training machine-learned models for linear models, generalized additive models, and the nonlinear function class of multi-layer lattice models. Real-world experiments illustrate how the different shape constraints can be used to increase explainability and improve regularization, especially for non-IID train-test distribution shift.}
}
Endnote
%0 Conference Paper
%T Multidimensional Shape Constraints
%A Maya Gupta
%A Erez Louidor
%A Oleksandr Mangylov
%A Nobu Morioka
%A Taman Narayan
%A Sen Zhao
%B Proceedings of the 37th International Conference on Machine Learning
%C Proceedings of Machine Learning Research
%D 2020
%E Hal Daumé III
%E Aarti Singh
%F pmlr-v119-gupta20b
%I PMLR
%P 3918--3928
%U https://proceedings.mlr.press/v119/gupta20b.html
%V 119
%X We propose new multi-input shape constraints across four intuitive categories: complements, diminishers, dominance, and unimodality constraints. We show these shape constraints can be checked and even enforced when training machine-learned models for linear models, generalized additive models, and the nonlinear function class of multi-layer lattice models. Real-world experiments illustrate how the different shape constraints can be used to increase explainability and improve regularization, especially for non-IID train-test distribution shift.
APA
Gupta, M., Louidor, E., Mangylov, O., Morioka, N., Narayan, T., & Zhao, S. (2020). Multidimensional Shape Constraints. Proceedings of the 37th International Conference on Machine Learning, in Proceedings of Machine Learning Research 119:3918-3928. Available from https://proceedings.mlr.press/v119/gupta20b.html.