How it works...

The basic workflow for using a DataFrame with SQL is to first populate the DataFrame, either from internal Scala data structures or from external data sources, and then use the createOrReplaceTempView() call to register the DataFrame as a SQL-queryable view.
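
A minimal sketch of this workflow in Scala follows; the view name people, the sample data, and the application name are hypothetical and chosen only for illustration:

import org.apache.spark.sql.SparkSession

object DataFrameSqlExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("DataFrameSqlExample")   // hypothetical application name
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Populate a DataFrame from an internal Scala collection
    val df = Seq(("Alice", 34), ("Bob", 45)).toDF("name", "age")

    // Register the DataFrame as a temporary SQL view
    df.createOrReplaceTempView("people")

    // Query the registered view with Spark SQL
    spark.sql("SELECT name FROM people WHERE age > 40").show()

    spark.stop()
  }
}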

When you use DataFrames, you gain the advantage of the additional metadata that Spark stores about your data (whether you take the API or the SQL approach), which helps you during both coding and execution.
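
For example, continuing the previous sketch (it assumes the same spark session, df DataFrame, and people view), you can inspect the schema metadata Spark keeps and the execution plan it derives for a SQL query:

// Print the schema metadata Spark stores for the DataFrame
df.printSchema()

// Show the parsed, analyzed, optimized, and physical plans for a query
spark.sql("SELECT name FROM people WHERE age > 40").explain(true)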

While RDDs are still the workhorses of core Spark, the trend is toward the DataFrame approach, which has already proven its capabilities in languages such as Python (pandas) and R.
