Observing the running and completed Spark jobs

To observe running and completed Spark jobs, open http://spark_driver_host:4040 in a web browser, replacing spark_driver_host with the IP address or hostname of the machine running your Spark driver.
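
The web UI is only populated while an application is running, so you need an active application to browse it. The following is a minimal sketch of a local application that triggers one job and then pauses so you have time to open the UI; the application name, local master, and sleep duration are assumptions for illustration:

```scala
import org.apache.spark.sql.SparkSession

object UiDemo {
  def main(args: Array[String]): Unit = {
    // Start a local Spark application; its web UI binds to port 4040.
    val spark = SparkSession.builder()
      .appName("UiDemo")
      .master("local[*]")
      .getOrCreate()

    // Trigger an action so a job shows up under Active/Completed Jobs.
    val count = spark.range(1, 1000000).count()
    println(s"Counted $count rows -- browse http://localhost:4040")

    // Keep the application (and therefore the UI) alive for a minute.
    Thread.sleep(60000)
    spark.stop()
  }
}
```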

Note that if multiple SparkContexts are running on the same host, they bind to successive ports beginning with 4040 (4041, 4042, and so on). By default, this information is available only for the duration of your Spark application; once the application finishes, the UI is no longer accessible at that address.
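
If you want the UI data to survive the application's exit, Spark can write its events to a log that a separately started History Server replays (by default on port 18080). Here is a minimal sketch of enabling that; the log directory is an assumption and must already exist on your machine:

```scala
import org.apache.spark.sql.SparkSession

// A minimal sketch: persist UI event data beyond the application's
// lifetime so the History Server can serve it later. The directory
// below is an assumption -- adjust it for your environment.
val spark = SparkSession.builder()
  .appName("PersistentUiDemo")
  .master("local[*]")
  .config("spark.eventLog.enabled", "true")                 // write UI events to disk
  .config("spark.eventLog.dir", "file:///tmp/spark-events") // where events are stored
  .getOrCreate()
```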

To inspect the jobs that are still executing, click the Active Jobs link and you will see the details of each of those jobs. Similarly, to check the status of finished jobs, click Completed Jobs; the information is presented in the DAG style discussed in the preceding section.

Figure 16: Observing the running and completed Spark jobs

You can reach this view for an individual job by clicking its description link under Active Jobs or Completed Jobs.
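
The same job information shown in the UI is also exposed programmatically through Spark's monitoring REST API under /api/v1 on the UI port. The sketch below assumes the driver is local and bound to port 4040:

```scala
import scala.io.Source

// A hedged sketch: query Spark's monitoring REST API for the list of
// applications served by this UI. localhost:4040 is an assumption --
// use your driver's host and the port it actually bound to.
val apps = Source.fromURL("http://localhost:4040/api/v1/applications").mkString
println(apps) // JSON array; each entry carries an "id" field

// With an application id in hand, you can list its jobs and statuses:
// Source.fromURL(s"http://localhost:4040/api/v1/applications/$appId/jobs")
```

This is handy when you want to script checks on job status rather than clicking through the UI.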
