Distributed programming is increasingly common: high availability requires multiple machines and often multiple data centers, and machine learning and AI models are trained as parallel tasks on clusters to reduce training time. But distributed programming has always been hard, at least until now. This report shows you an easier way.
Dean Wampler from Anyscale introduces you to Ray, an open source project that provides a concise and intuitive Python API for defining distributed tasks. Built by researchers at UC Berkeley, Ray does most of the tedious work of running workloads at massive scale. This guide shows you how Ray provides a flexible, efficient, and intuitive way to get work done for the majority of distributed workloads.