Spark

Apache Spark is a fast, general-purpose cluster computing platform for big data processing. It extends the popular MapReduce model to support a wider range of computations with better performance. Spark provides efficient implementations of a number of transformations and actions that can be composed to perform data processing and analysis; it distributes these operations across a cluster while abstracting away many of the underlying implementation details, with a focus on scalability and efficiency.
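To make the transformation/action model concrete, here is a minimal pure-Python sketch (not Spark itself, and not Spark's actual API): transformations such as `map` and `filter` are recorded lazily and composed into a pipeline, and only an action such as `collect` or `count` triggers evaluation. The `Dataset` class and its methods are illustrative assumptions, not part of any Spark release.

```python
class Dataset:
    """Toy stand-in for a distributed dataset, illustrating lazy
    transformations composed into a pipeline and eager actions."""

    def __init__(self, data, ops=()):
        self._data = data   # underlying records
        self._ops = ops     # composed, not-yet-evaluated transformations

    # --- transformations: return a new Dataset; nothing is computed ---
    def map(self, f):
        return Dataset(self._data, self._ops + (("map", f),))

    def filter(self, pred):
        return Dataset(self._data, self._ops + (("filter", pred),))

    # --- actions: walk the composed pipeline and produce a result ---
    def collect(self):
        out = self._data
        for kind, f in self._ops:
            if kind == "map":
                out = [f(x) for x in out]
            else:  # "filter"
                out = [x for x in out if f(x)]
        return list(out)

    def count(self):
        return len(self.collect())


nums = Dataset(range(10))
# Two composed transformations; still nothing has run.
evens_squared = nums.filter(lambda x: x % 2 == 0).map(lambda x: x * x)
# The action finally evaluates the whole pipeline.
print(evens_squared.collect())  # → [0, 4, 16, 36, 64]
```

Real Spark follows the same shape: transformations build up a lineage of operations, and the work is only scheduled across the cluster when an action is called.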
