collect

collect() simply gathers all the elements of the RDD and sends them to the Driver.

The following example shows what the collect function essentially does: when you call collect on an RDD, the Driver pulls all the elements of the RDD, from every partition, into its own memory.

Calling collect on large RDDs can therefore cause out-of-memory errors on the Driver, since the entire dataset must fit in the Driver's memory at once.
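When the RDD may be too large for the Driver, Spark offers alternatives that limit how much data is pulled back at once. The following is a minimal sketch, assuming Spark is on the classpath and run in local mode purely for illustration; the object and application names are hypothetical:

```scala
import org.apache.spark.sql.SparkSession

object CollectAlternatives {
  def main(args: Array[String]): Unit = {
    // Local-mode session for illustration only
    val spark = SparkSession.builder()
      .appName("collect-alternatives")
      .master("local[2]")
      .getOrCreate()
    val sc = spark.sparkContext

    val rdd = sc.parallelize(1 to 100)

    // take(n) brings only the first n elements to the Driver,
    // instead of the whole dataset
    val firstTen = rdd.take(10)
    println(firstTen.mkString(", "))

    // toLocalIterator streams the RDD one partition at a time,
    // so the Driver holds at most one partition in memory
    val count = rdd.toLocalIterator.size
    println(count)

    spark.stop()
  }
}
```

take and toLocalIterator trade convenience for safety: take bounds the result size up front, while toLocalIterator still iterates the full dataset but never materializes more than a single partition on the Driver.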

Shown below is the code to collect the content of the RDD and display it:

scala> rdd_two.collect
res25: Array[String] = Array(Apache Spark provides programmers with an application programming interface centered on a data structure called the resilient distributed dataset (RDD), a read-only multiset of data items distributed over a cluster of machines, that is maintained in a fault-tolerant way., It was developed in response to limitations in the MapReduce cluster computing paradigm, which forces a particular linear dataflow structure on distributed programs., "MapReduce programs read input data from disk, map a function across the data, reduce the results of the map, and store reduction results on disk. ", Spark's RDDs function as a working set for distributed programs that offers a (deliberately) restricted form of distributed shared memory., The availability of RDDs facilitates t...

The following is an illustration of collect(): using collect, the Driver pulls all the elements of the RDD from all partitions.
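The per-partition behavior can also be observed directly. The sketch below (again assuming a local-mode session for illustration; the object name is hypothetical) uses glom() to expose each partition as an Array, then shows that collect() concatenates the partitions' elements in partition order:

```scala
import org.apache.spark.sql.SparkSession

object CollectFromPartitions {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("collect-partitions")
      .master("local[4]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Four elements spread across four partitions
    val rdd = sc.parallelize(Seq("a", "b", "c", "d"), numSlices = 4)

    // glom() turns each partition into an Array, letting us inspect the layout
    val perPartition = rdd.glom().collect()
    perPartition.zipWithIndex.foreach { case (part, i) =>
      println(s"partition $i: ${part.mkString(",")}")
    }

    // collect() pulls every partition's elements to the Driver,
    // concatenated in partition order
    val all = rdd.collect()
    println(all.mkString(", "))

    spark.stop()
  }
}
```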
