Chapter 6. Notebooks and Dataflows with Spark and Hadoop

There are many tools available for interactive analytics and visualization on Spark and Hadoop platforms. Some of the more important ones are the IPython Notebook (Jupyter), Spark Notebook, ISpark, Hue, Spark Kernel, Jove Notebook, Beaker Notebook, and Databricks Cloud. All of these notebooks are open source, except Databricks Cloud. This chapter introduces some of the important notebook-based interactive analytics tools, along with a dataflow engine called NiFi. The chapter is divided into the following subtopics:

  • Introducing web-based notebooks
  • Introducing Jupyter
  • Introducing Apache Zeppelin
  • Using the Livy REST job server and Hue Notebooks
  • Introducing Apache NiFi for dataflows

Introducing web-based notebooks

We have worked with the Spark shell and applications in previous chapters. The shell provides great features, such as trying out code quickly and checking results interactively. However, as the code grows larger, it becomes difficult to edit a few lines and re-execute everything. This is where applications are useful: the entire script is saved in a file and submitted as a job. However, in doing so, you lose the powerful read-eval-print loop (REPL) features of the shell. Notebooks solve this problem by providing the features of both the shell and the application in a web browser.
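For example, the following short PySpark word count (a minimal sketch; the file name sample.txt and the script name word_count.py are hypothetical) can either be typed line by line in the shell or saved to a file and submitted with spark-submit; a notebook lets you combine both ways of working in the browser:

    # word_count.py -- illustrative sketch only; the input path is an assumption
    from pyspark import SparkContext

    sc = SparkContext(appName="WordCount")           # the shell and notebooks create sc for you
    counts = (sc.textFile("sample.txt")              # read the input file
                .flatMap(lambda line: line.split())  # split lines into words
                .map(lambda word: (word, 1))         # pair each word with a count of 1
                .reduceByKey(lambda a, b: a + b))    # sum the counts per word
    print(counts.take(10))                           # inspect the first ten results
    sc.stop()

In the shell or a notebook cell, you would drop the SparkContext creation because sc is already available, and you could re-run and tweak individual lines; as an application, the same script is submitted in one go with spark-submit word_count.py.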

Web-based notebooks are files that contain the input code and output, such as results and graphs, from an interactive session. They can also contain additional information, such as documentation, mathematical expressions, and media related to the session. Notebooks are stored in JSON format and can be shared with anybody inside or outside the organization. Existing notebooks can easily be viewed on the web using nbviewer or the ZeppelinHub Viewer. Notebooks are extremely useful for later execution and re-evaluation, or for sharing with others without their having to install, set up, and run the code.
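As a quick illustration of that JSON layout, the following sketch (assuming a Jupyter notebook saved as analysis.ipynb in the nbformat 4 layout; the file name is hypothetical) loads the file with Python's standard json module and prints the source of each code cell:

    import json

    # Load a saved Jupyter notebook; the file name is an assumption for illustration
    with open("analysis.ipynb") as f:
        notebook = json.load(f)

    # In the nbformat 4 layout, cells live under the top-level "cells" key
    for cell in notebook["cells"]:
        if cell["cell_type"] == "code":
            print("".join(cell["source"]))   # the source is stored as a list of lines
            print("-" * 40)

Because the notebook is plain JSON, tools such as nbviewer can render it as a read-only page without running any code, which is what makes sharing so easy.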

Notebooks are essentially executable documents, which enhances developer productivity and reduces complexity.

Note

All programs in this chapter are executed on the HDP 2.4 VM, except the Hue notebook with Livy program, which is executed on the CDH 5.8 VM. In other environments, file paths might differ; however, the concepts remain the same.
