
Distributed Deep Learning on Spark

Distributed Deep Learning on Spark (using Yahoo’s Caffe-on-Spark)

Read the article Large Scale Distributed Deep Learning on Hadoop Clusters to learn about Distributed Deep Learning using Caffe-on-Spark:

To enable deep learning on these enhanced Hadoop clusters, we developed a comprehensive distributed solution based upon open source software libraries, Apache Spark and Caffe. One can now submit deep learning jobs onto a (Hadoop YARN) cluster of GPU nodes (using spark-submit).
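As a rough illustration of what such a job submission looks like, here is a sketch of a spark-submit command for a CaffeOnSpark training job on YARN. The class name, jar path, prototxt files, and CaffeOnSpark-specific options (-train, -conf, -devices, -model, -output) are assumptions based on the project's published examples and may differ between versions, so verify them against the CaffeOnSpark documentation before use.

```
# Illustrative only: submit a CaffeOnSpark training job to a Hadoop YARN
# cluster of GPU nodes. The install path, prototxt files, and CaffeOnSpark
# flags below are assumptions drawn from the project's published examples;
# check yahoo/CaffeOnSpark on GitHub for the exact options of your version.
export CAFFE_ON_SPARK=/path/to/CaffeOnSpark   # assumed install location

spark-submit --master yarn --deploy-mode cluster \
  --num-executors 4 \
  --files ${CAFFE_ON_SPARK}/data/lenet_memory_solver.prototxt,${CAFFE_ON_SPARK}/data/lenet_memory_train_test.prototxt \
  --conf spark.executorEnv.LD_LIBRARY_PATH="${LD_LIBRARY_PATH}" \
  --class com.yahoo.ml.caffe.CaffeOnSpark \
  ${CAFFE_ON_SPARK}/caffe-grid/target/caffe-grid-0.1-SNAPSHOT-jar-with-dependencies.jar \
    -train \
    -conf lenet_memory_solver.prototxt \
    -devices 1 \
    -connection ethernet \
    -model hdfs:///tmp/lenet.model \
    -output hdfs:///tmp/lenet_output
```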

Caffe-on-Spark is a result of Yahoo’s early steps in bringing the Apache Hadoop ecosystem and deep learning together on the same heterogeneous (GPU+CPU) cluster; it may be open sourced depending on interest from the community.

In the comments on the article, some people announced plans to use it with AWS GPU clusters.
