Logging

Finally, logging can be configured through the log4j.properties file under your Spark application tree, as discussed in the preceding section. Spark uses log4j for logging, and log4j supports several logging levels, which are as follows:

Log Level  Usage
OFF        The most specific level; no logging at all
FATAL      Severe errors that will likely abort the application, with minimal detail
ERROR      General errors only
WARN       Potential problems that are recommended, but not mandatory, to fix
INFO       Informational messages about the progress of your Spark job
DEBUG      Fine-grained messages that are useful while debugging
TRACE      Very fine-grained detail, with a lot of data
ALL        The least specific level; all messages are logged
Table 1: Log levels with log4j and Spark
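These levels act as thresholds: a logger set to WARN drops DEBUG and INFO records but keeps WARN, ERROR, and FATAL. Python's standard logging module follows the same threshold model as log4j, which makes for a quick runnable illustration (the logger name here is purely illustrative):

```python
import logging

# A logger set to WARN suppresses anything less severe (DEBUG, INFO)
# and lets through anything at WARN or above (WARN, ERROR, CRITICAL).
log = logging.getLogger("org.apache.spark")  # illustrative name only
log.setLevel(logging.WARN)

print(log.isEnabledFor(logging.INFO))   # False: below the threshold
print(log.isEnabledFor(logging.ERROR))  # True: at or above it
```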

You can set up the default logging for the Spark shell in conf/log4j.properties. For standalone Spark applications, or for a Spark shell session, use conf/log4j.properties.template as a starting point: copy it to conf/log4j.properties and edit it. In an earlier section of this chapter, we suggested putting the log4j.properties file under your project directory when working in an IDE-based environment such as Eclipse. To disable logging completely, set the log4j.logger.org flag to OFF in that copied file, as follows:

log4j.logger.org=OFF
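A minimal conf/log4j.properties along these lines might look as follows; this is a sketch based on Spark's shipped template, and the package name under log4j.logger is illustrative, so adjust it to the sources you want to silence:

```properties
# Log everything at INFO and above to the console (the template's default)
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Silence everything under the org package (Spark, Hadoop, Jetty, and so on)
log4j.logger.org=OFF
```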

In the next section, we will discuss some common mistakes that developers make while developing and submitting Spark jobs.
