Fixing Mini-batch Sequences with Hierarchical Robust Partitioning

Shengjie Wang, Wenruo Bai, Chandrashekhar Lavania, Jeff Bilmes
Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, PMLR 89:3352-3361, 2019.

Abstract

We propose a general and efficient hierarchical robust partitioning framework to generate a deterministic sequence of mini-batches, one that offers assurances of being high quality, unlike a randomly drawn sequence. We compare our deterministically generated mini-batch sequences to randomly generated sequences; we show that, on a variety of deep learning tasks, the deterministic sequences significantly beat the mean and worst-case performance of the random sequences, and often outperform the best of the random sequences. Our theoretical contributions include a new algorithm for the robust submodular partition problem subject to cardinality constraints (which is used to construct mini-batch sequences); we show that, in general, the algorithm is fast and has good theoretical guarantees, and we also give a more efficient hierarchical variant with similar guarantees under mild assumptions.
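The robust submodular partition problem mentioned above asks for a partition of the ground set into blocks of bounded size that maximizes the minimum block value min_i f(A_i). As a purely illustrative sketch (this is a generic "feed the weakest block" greedy heuristic, not the paper's algorithm, and the concave-over-modular coverage objective is an assumption chosen for simplicity), one might write:

```python
import math
import random

def coverage(block, feats):
    """Monotone submodular 'feature coverage': sum_j sqrt(sum_{i in block} feats[i][j])."""
    d = len(feats[0])
    return sum(math.sqrt(sum(feats[i][j] for i in block)) for j in range(d))

def robust_partition(items, feats, num_blocks, cap):
    """Greedy heuristic for max-min submodular partitioning with block size <= cap:
    repeatedly grow the currently weakest (lowest-value) non-full block with the
    item of largest marginal gain. Requires num_blocks * cap >= len(items)."""
    blocks = [set() for _ in range(num_blocks)]
    remaining = set(items)
    while remaining:
        open_blocks = [b for b in blocks if len(b) < cap]
        weakest = min(open_blocks, key=lambda b: coverage(b, feats))
        base = coverage(weakest, feats)
        best = max(remaining, key=lambda i: coverage(weakest | {i}, feats) - base)
        weakest.add(best)
        remaining.remove(best)
    return blocks

# toy demo: 12 items with 5 random features each, split into 3 mini-batches of size 4
random.seed(0)
feats = [[random.random() for _ in range(5)] for _ in range(12)]
blocks = robust_partition(range(12), feats, num_blocks=3, cap=4)
```

Here the blocks play the role of mini-batches: each is forced to be "representative" under f, rather than left to chance as with random shuffling. The paper's actual algorithms and their guarantees differ; this sketch only conveys the max-min partitioning objective.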

Cite this Paper


BibTeX
@InProceedings{pmlr-v89-wang19e,
  title     = {Fixing Mini-batch Sequences with Hierarchical Robust Partitioning},
  author    = {Wang, Shengjie and Bai, Wenruo and Lavania, Chandrashekhar and Bilmes, Jeff},
  booktitle = {Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics},
  pages     = {3352--3361},
  year      = {2019},
  editor    = {Chaudhuri, Kamalika and Sugiyama, Masashi},
  volume    = {89},
  series    = {Proceedings of Machine Learning Research},
  month     = {16--18 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v89/wang19e/wang19e.pdf},
  url       = {https://proceedings.mlr.press/v89/wang19e.html},
  abstract  = {We propose a general and efficient hierarchical robust partitioning framework to generate a deterministic sequence of mini-batches, one that offers assurances of being high quality, unlike a randomly drawn sequence. We compare our deterministically generated mini-batch sequences to randomly generated sequences; we show that, on a variety of deep learning tasks, the deterministic sequences significantly beat the mean and worst-case performance of the random sequences, and often outperform the best of the random sequences. Our theoretical contributions include a new algorithm for the robust submodular partition problem subject to cardinality constraints (which is used to construct mini-batch sequences); we show that, in general, the algorithm is fast and has good theoretical guarantees, and we also give a more efficient hierarchical variant with similar guarantees under mild assumptions.}
}
Endnote
%0 Conference Paper
%T Fixing Mini-batch Sequences with Hierarchical Robust Partitioning
%A Shengjie Wang
%A Wenruo Bai
%A Chandrashekhar Lavania
%A Jeff Bilmes
%B Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2019
%E Kamalika Chaudhuri
%E Masashi Sugiyama
%F pmlr-v89-wang19e
%I PMLR
%P 3352--3361
%U https://proceedings.mlr.press/v89/wang19e.html
%V 89
%X We propose a general and efficient hierarchical robust partitioning framework to generate a deterministic sequence of mini-batches, one that offers assurances of being high quality, unlike a randomly drawn sequence. We compare our deterministically generated mini-batch sequences to randomly generated sequences; we show that, on a variety of deep learning tasks, the deterministic sequences significantly beat the mean and worst-case performance of the random sequences, and often outperform the best of the random sequences. Our theoretical contributions include a new algorithm for the robust submodular partition problem subject to cardinality constraints (which is used to construct mini-batch sequences); we show that, in general, the algorithm is fast and has good theoretical guarantees, and we also give a more efficient hierarchical variant with similar guarantees under mild assumptions.
APA
Wang, S., Bai, W., Lavania, C. & Bilmes, J. (2019). Fixing Mini-batch Sequences with Hierarchical Robust Partitioning. Proceedings of the Twenty-Second International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 89:3352-3361. Available from https://proceedings.mlr.press/v89/wang19e.html.