Flexible and Scalable Deep Learning with MMLSpark

Bibliographic Details
Main Authors: Hamilton, Mark; Raghunathan, Sudarshan; Annavajhala, Akshaya; Kirsanov, Danil; de Leon, Eduardo; Barzilay, Eli; Matiach, Ilya; Davison, Joe; Busch, Maureen; Oprescu, Miruna; Sur, Ratan; Astala, Roope; Wen, Tong; Park, ChangYoung
Format: Journal Article
Language: English
Published: 11.04.2018

More Information
Summary: Proceedings of Machine Learning Research 82 (2017) 11-22, 4th International Conference on Predictive Applications and APIs. In this work we detail a novel open source library, called MMLSpark, that combines the flexible deep learning library Cognitive Toolkit with the distributed computing framework Apache Spark. To achieve this, we have contributed Java language bindings to the Cognitive Toolkit and added several new components to the Spark ecosystem. In addition, we integrate the popular image processing library OpenCV with Spark, present a tool for the automated generation of PySpark wrappers from any SparkML estimator, and use this tool to expose all of this work to the PySpark ecosystem. Finally, we provide a large library of tools for working and developing within the Spark ecosystem. We apply this work to the automated classification of snow leopards from camera trap images and provide an end-to-end solution for the non-profit conservation organization the Snow Leopard Trust.
DOI: 10.48550/arxiv.1804.04031
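
Illustrative sketch (not part of the record above): the summary describes exposing CNTK model evaluation and OpenCV image preprocessing as SparkML stages usable from PySpark. The fragment below shows roughly how such a scoring pipeline could be assembled; the import path, class names (CNTKModel, ImageTransformer), parameter names, file paths, and the image-loading step are assumptions based on MMLSpark's documentation of that era and may differ across library and Spark versions.

    from pyspark.sql import SparkSession
    from pyspark.ml import Pipeline
    from mmlspark import CNTKModel, ImageTransformer  # assumed import path

    spark = SparkSession.builder.appName("SnowLeopardScoring").getOrCreate()

    # Load camera-trap images as a DataFrame (built-in image source, Spark 2.4+;
    # the path is a placeholder).
    images = spark.read.format("image").load("/data/camera-traps/")

    # OpenCV-backed preprocessing stage: resize images to the network input size.
    resize = (ImageTransformer()
              .setInputCol("image")
              .setOutputCol("processed")
              .resize(height=224, width=224))

    # Wrap a serialized CNTK model as a SparkML transformer so scoring is
    # distributed across the cluster (setModelLocation's signature has varied
    # between releases).
    model = (CNTKModel()
             .setInputCol("processed")
             .setOutputCol("scores")
             .setModelLocation("/models/snow_leopard.model"))

    # Compose the stages like any other SparkML pipeline and score in parallel.
    scored = Pipeline(stages=[resize, model]).fit(images).transform(images)
    scored.select("scores").show()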