Python, Java or Scala? What to Use for Your Next Spark Project?


Apache Spark is a leading general-purpose data processing platform that, by its own benchmarks, runs programs up to 100 times faster in memory and 10 times faster on disk than Hadoop MapReduce, the traditional choice for Big Data applications.

Spark is an Apache project billed as a “lightning fast cluster computing” platform. A common dilemma among developers and users of the platform is which programming language is best for building Apache Spark solutions. Apache Spark officially supports three languages: Java, Python, and Scala.

Choosing among the three is a subjective matter that depends on factors such as the programmer’s comfort and skill set, the project’s requirements, and so on.

Why Leave out Java?

While Java has been programmers’ favorite language for decades, it lags behind Scala and Python in the value it delivers for Spark development.

First, it is verbose compared to Python and Scala. Second, while recent versions of Java introduced lambda expressions and the Streams API, these still don’t match the expressiveness that Scala offers.
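To make the verbosity gap concrete, here is a word count written as a chain of functional transformations in plain Python. No Spark installation is assumed; the sample lines and variable names are illustrative, and the style simply mirrors the flatMap → map → reduceByKey pipeline a PySpark word count would use. The pre-lambda Java equivalent required a separate anonymous inner class for each of these steps.

```python
from collections import Counter

# Illustrative input; in Spark this would come from an RDD or DataFrame.
lines = ["spark makes big data simple", "big data needs big tools"]

# flatMap: split every line into words.
words = (word for line in lines for word in line.split())

# map + reduceByKey: count occurrences of each word.
counts = Counter(words)

print(counts["big"])  # -> 3
```

The same one-expression brevity carries over to Scala (`lines.flatMap(_.split(" ")).groupBy(identity).mapValues(_.size)`), which is the point the article is making.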

Java also does not ship a REPL, the Read-Eval-Print Loop interactive shell that is crucial for developers working on Big Data analytics and data science. (Spark provides such shells for Scala and Python out of the box.)

Conclusively, any new features in Apache Spark …

Read More on Datafloq
