Shared variables

Spark breaks a job down into its smallest units of execution, tasks, which run on different nodes; each task's closure receives its own copy of every variable it references from the driver program. Changes made to these copies are never propagated back to the driver. To overcome this limitation, Spark provides two kinds of shared variables: broadcast variables and accumulators.
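The copy-per-task behavior can be sketched in plain Python, without Spark itself. This is a minimal simulation, assuming `simulate_task` stands in for a Spark task and an explicit deep copy stands in for closure serialization; the final merge mirrors what an accumulator does, not the real Spark API.

```python
import copy

# Driver-side variable captured by the task closure.
counter = {"n": 0}

def simulate_task(x):
    # Spark serializes the closure and ships a *copy* of its
    # variables to each task, so this mutation stays task-local.
    local = copy.deepcopy(counter)
    local["n"] += x
    return local["n"]

partials = [simulate_task(x) for x in [1, 2, 3]]

print(counter["n"])   # 0: the driver's copy is untouched
print(partials)       # [1, 2, 3]: each task mutated only its own copy

# Accumulator-style fix: tasks emit partial results and the driver
# merges them, which is conceptually how a Spark accumulator works.
counter["n"] = sum(partials)
print(counter["n"])   # 6
```

A broadcast variable addresses the mirror-image problem: instead of merging task-side updates back, it ships one read-only copy of a large value to each node so the closure does not re-serialize it with every task.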
