Client mode

In this mode, the Spark driver is launched directly on the client machine, which then waits for the computed results (the driver output). For the driver to interact properly with Mesos, however, some application-specific settings must be specified in SPARK_HOME/conf/spark-env.sh. To do this, copy the spark-env.sh.template file at $SPARK_HOME/conf to spark-env.sh and, before using client mode, set the following environment variables in it:

$ export MESOS_NATIVE_JAVA_LIBRARY=<path to libmesos.so>

On Ubuntu, this path is typically /usr/local/lib/libmesos.so. On macOS, the same library is called libmesos.dylib instead of libmesos.so. In addition, point SPARK_EXECUTOR_URI at a Spark distribution that the Mesos agents can download:

$ export SPARK_EXECUTOR_URI=<URL of spark-2.1.0.tar.gz uploaded above>
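Put together, a minimal spark-env.sh for client mode might look like the following sketch; the library path and the distribution URL are placeholders that must be adapted to your own setup:

```shell
# $SPARK_HOME/conf/spark-env.sh -- settings needed for Mesos client mode.

# Path to the Mesos native library (use libmesos.dylib on macOS);
# adjust to wherever Mesos is installed on your machine.
export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so

# Location from which the Mesos agents can fetch the Spark distribution;
# the HDFS URL below is only an example.
export SPARK_EXECUTOR_URI=hdfs://namenode:9000/spark/spark-2.1.0.tar.gz
```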

Now, when submitting a Spark application to be executed on the cluster, you will have to pass mesos://HOST:PORT as the master URL. This is usually done when creating the SparkContext in your Spark application, as follows:

val conf = new SparkConf()
  .setMaster("mesos://HOST:5050")
  .setAppName("My app")
  .set("spark.executor.uri", "<path to spark-2.1.0.tar.gz uploaded above>")
val sc = new SparkContext(conf)

The second option is to use the spark-submit script and configure spark.executor.uri in the SPARK_HOME/conf/spark-defaults.conf file. When running a shell, the spark.executor.uri parameter is inherited from SPARK_EXECUTOR_URI, so it does not need to be redundantly passed in as a system property. Just use the following command to access the client mode from your Spark shell:
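For example, the corresponding entry in spark-defaults.conf could read as follows; the URI is illustrative and must match wherever you actually uploaded the Spark distribution:

```
# $SPARK_HOME/conf/spark-defaults.conf
spark.executor.uri    hdfs://namenode:9000/spark/spark-2.1.0.tar.gz
```

With this in place, spark-submit picks up the executor URI automatically, so only the master URL needs to be supplied on the command line.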

$ SPARK_HOME/bin/spark-shell --master mesos://host:5050