Preparation of the environment

In this chapter, instead of using the Spark shell, we will build two standalone Spark applications with the Scala API: one for model preparation and a second one for model deployment. A Spark application is a regular Scala application with a main method that serves as the entry point for execution. For example, here is a skeleton of the application for model training:

import org.apache.spark.SparkContext
import org.apache.spark.sql.{SQLContext, SparkSession}

object Chapter8 extends App {

  val spark = SparkSession.builder()
    .master("local[*]")
    .appName("Chapter8")
    .getOrCreate()

  val sc = spark.sparkContext
  sc.setLogLevel("WARN")

  script(spark, sc, spark.sqlContext)

  def script(spark: SparkSession, sc: SparkContext, sqlContext: SQLContext): Unit = {
    // ...code of the application
  }
}
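Once the application is packaged into a JAR (for example, with sbt package), it can be launched with spark-submit. The JAR path below is only an illustration and depends on your build configuration and Scala version:

spark-submit \
  --class Chapter8 \
  --master local[*] \
  target/scala-2.11/chapter8_2.11-1.0.jar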

Moreover, we will extract the parts that can be shared between both applications into a library. This allows us to follow the DRY (don't repeat yourself) principle:

object Chapter8Library {
  // ...code of the library
}