Flexible and Scalable Deep Learning with MMLSpark

Mark Hamilton, Sudarshan Raghunathan, Akshaya Annavajhala, Danil Kirsanov, Eduardo Leon, Eli Barzilay, Ilya Matiach, Joe Davison, Maureen Busch, Miruna Oprescu, Ratan Sur, Roope Astala, Tong Wen, ChangYoung Park
Proceedings of The 4th International Conference on Predictive Applications and APIs, PMLR 82:11-22, 2018.

Abstract

In this work we detail a novel open-source library, MMLSpark, that combines the flexible deep learning library Cognitive Toolkit with the distributed computing framework Apache Spark. To achieve this, we have contributed Java language bindings to the Cognitive Toolkit and added several new components to the Spark ecosystem. We also integrate the popular image processing library OpenCV with Spark, present a tool for automatically generating PySpark wrappers from any SparkML estimator, and use this tool to expose all of this work to the PySpark ecosystem. Finally, we provide a large library of tools for working and developing within the Spark ecosystem. We apply this work to the automated classification of snow leopards from camera-trap images and provide an end-to-end solution for the Snow Leopard Trust, a non-profit conservation organization.
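
To make the workflow described in the abstract concrete, the sketch below shows how its pieces (OpenCV-based preprocessing, distributed CNTK scoring, and PySpark wrappers around SparkML stages) could fit together in a standard SparkML pipeline. This is a minimal, hypothetical sketch: the mmlspark module layout, the ImageTransformer and CNTKModel class names, their setter methods, and all paths are assumptions based on the paper's description, not a verbatim API reference.

# Hypothetical sketch of the MMLSpark workflow described above: read camera-trap
# images, preprocess them with the OpenCV-backed transformer, and score them with
# a pretrained Cognitive Toolkit (CNTK) model inside a standard SparkML pipeline.
# Class names, setter names, and paths are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from mmlspark import ImageTransformer, CNTKModel  # assumed module layout

spark = SparkSession.builder.appName("SnowLeopardScoring").getOrCreate()

# Load images into a Spark DataFrame (path is illustrative).
images = spark.read.format("image").load("wasb:///camera-traps/")

# Distributed OpenCV preprocessing: resize every image to the network's input size.
preprocess = (ImageTransformer()
              .setInputCol("image")
              .setOutputCol("processed")
              .resize(height=224, width=224))

# Distributed scoring with a pretrained CNTK model via the Java bindings.
score = (CNTKModel()
         .setInputCol("processed")
         .setOutputCol("prediction")
         .setModelLocation("wasb:///models/snow_leopard.model"))

# The generated PySpark wrappers behave like any other SparkML stage,
# so they compose directly into a Pipeline.
scored = Pipeline(stages=[preprocess, score]).fit(images).transform(images)
scored.select("image.origin", "prediction").show()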

Cite this Paper


BibTeX
@InProceedings{pmlr-v82-hamilton18a,
  title     = {Flexible and Scalable Deep Learning with MMLSpark},
  author    = {Hamilton, Mark and Raghunathan, Sudarshan and Annavajhala, Akshaya and Kirsanov, Danil and Leon, Eduardo and Barzilay, Eli and Matiach, Ilya and Davison, Joe and Busch, Maureen and Oprescu, Miruna and Sur, Ratan and Astala, Roope and Wen, Tong and Park, ChangYoung},
  booktitle = {Proceedings of The 4th International Conference on Predictive Applications and APIs},
  pages     = {11--22},
  year      = {2018},
  editor    = {Hardgrove, Claire and Dorard, Louis and Thompson, Keiran},
  volume    = {82},
  series    = {Proceedings of Machine Learning Research},
  month     = {24--25 Oct},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v82/hamilton18a/hamilton18a.pdf},
  url       = {https://proceedings.mlr.press/v82/hamilton18a.html},
  abstract  = {In this work we detail a novel open source library, called MMLSpark, that combines the flexible deep learning library Cognitive Toolkit, with the distributed computing framework Apache Spark. To achieve this, we have contributed Java Language bindings to the Cognitive Toolkit, and added several new components to the Spark ecosystem. In addition, we also integrate the popular image processing library OpenCV with Spark, and present a tool for the automated generation of PySpark wrappers from any SparkML estimator and use this tool to expose all work to the PySpark ecosystem. Finally, we provide a large library of tools for working and developing within the Spark ecosystem. We apply this work to the automated classification of Snow Leopards from camera trap images, and provide an end to end solution for the non-profit conservation organization, the Snow Leopard Trust.}
}
Endnote
%0 Conference Paper
%T Flexible and Scalable Deep Learning with MMLSpark
%A Mark Hamilton
%A Sudarshan Raghunathan
%A Akshaya Annavajhala
%A Danil Kirsanov
%A Eduardo Leon
%A Eli Barzilay
%A Ilya Matiach
%A Joe Davison
%A Maureen Busch
%A Miruna Oprescu
%A Ratan Sur
%A Roope Astala
%A Tong Wen
%A ChangYoung Park
%B Proceedings of The 4th International Conference on Predictive Applications and APIs
%C Proceedings of Machine Learning Research
%D 2018
%E Claire Hardgrove
%E Louis Dorard
%E Keiran Thompson
%F pmlr-v82-hamilton18a
%I PMLR
%P 11--22
%U https://proceedings.mlr.press/v82/hamilton18a.html
%V 82
%X In this work we detail a novel open source library, called MMLSpark, that combines the flexible deep learning library Cognitive Toolkit, with the distributed computing framework Apache Spark. To achieve this, we have contributed Java Language bindings to the Cognitive Toolkit, and added several new components to the Spark ecosystem. In addition, we also integrate the popular image processing library OpenCV with Spark, and present a tool for the automated generation of PySpark wrappers from any SparkML estimator and use this tool to expose all work to the PySpark ecosystem. Finally, we provide a large library of tools for working and developing within the Spark ecosystem. We apply this work to the automated classification of Snow Leopards from camera trap images, and provide an end to end solution for the non-profit conservation organization, the Snow Leopard Trust.
APA
Hamilton, M., Raghunathan, S., Annavajhala, A., Kirsanov, D., Leon, E., Barzilay, E., Matiach, I., Davison, J., Busch, M., Oprescu, M., Sur, R., Astala, R., Wen, T. & Park, C. (2018). Flexible and Scalable Deep Learning with MMLSpark. Proceedings of The 4th International Conference on Predictive Applications and APIs, in Proceedings of Machine Learning Research 82:11-22. Available from https://proceedings.mlr.press/v82/hamilton18a.html.