Treat logs as event streams

Enterprise applications traditionally write logs to log files on disk. Some engineers argue that these logs are among the most important insights into the application. The software project usually includes configuration of the contents and format of these log files. However, a log file is first of all just an output format, usually containing a single log event per line.

This principle of 12-factor applications argues that logging should be treated as a stream of log events emitted by the application. Applications should not concern themselves with routing log events or storing them in specific output formats. Instead, they log to the process's standard output, which is captured and processed by the runtime environment.
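As a minimal sketch of this idea, an application can simply emit one log event per line to standard output and leave capturing and routing to the runtime environment. The class and helper method names here are hypothetical, chosen only for illustration:

```java
// Minimal sketch: the application writes one log event per line to
// standard out; the runtime environment captures the stream.
public class StdoutLogging {

    // Hypothetical helper: formats a single event as one line.
    static String formatEvent(String level, String message) {
        return java.time.Instant.now() + " " + level + " " + message;
    }

    static void logEvent(String level, String message) {
        // Routing and storage are not the application's concern;
        // it only emits the event to standard output.
        System.out.println(formatEvent(level, message));
    }

    public static void main(String[] args) {
        logEvent("INFO", "order accepted");
        logEvent("WARN", "inventory low");
    }
}
```

Whether the environment writes these lines to files, forwards them to a log aggregator, or discards them is decided outside the application.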

This approach is unfamiliar to most enterprise developers, given all the logging frameworks, output formats, and tools that exist. However, environments in which many services run in parallel need to capture and process log events externally anyway. Solutions such as Elasticsearch, Logstash, and Kibana have proven themselves in processing and comprehending complex situations with log events from several sources. Storing log events in log files does not necessarily support these approaches.

Logging to the application's standard output not only simplifies development, since routing and storing are no longer responsibilities of the application; it also reduces the need for external dependencies, such as logging frameworks. Zero-dependency applications support this approach. The environment, such as a container orchestration framework, takes care of capturing and routing the event stream. In Chapter 9, Monitoring, Performance, and Logging, we will cover the topic of logging, its necessity, and its shortcomings.
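If some structured logging is still desired without pulling in an external framework, the JDK's own java.util.logging can be redirected to standard output. The sketch below is one possible configuration, not a prescribed setup: it removes the default handlers from the root logger and attaches a handler that writes to System.out instead.

```java
import java.util.logging.Handler;
import java.util.logging.Logger;
import java.util.logging.SimpleFormatter;
import java.util.logging.StreamHandler;

public class ZeroDependencyLogging {

    public static void main(String[] args) {
        Logger root = Logger.getLogger("");

        // Remove the default handlers (the JDK's ConsoleHandler
        // writes to standard error, not standard out).
        for (Handler handler : root.getHandlers()) {
            root.removeHandler(handler);
        }

        // Route all log events to standard output instead.
        StreamHandler stdout = new StreamHandler(System.out, new SimpleFormatter());
        root.addHandler(stdout);

        Logger.getLogger(ZeroDependencyLogging.class.getName())
              .info("application started");

        stdout.flush();
    }
}
```

With this configuration the application still uses a logging API, but emits its event stream to standard out using only the JDK, so the surrounding environment can capture it like any other process output.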
