Kafka in the purview of an SCV use case

The usage of Apache Kafka in the purview of SCV can be summarized by the following figure:

Figure 18: Kafka technology usage in SCV use case

In this chapter, we looked at the publishing, message brokering, and consuming aspects of information. In the previous chapter, we used Kafka both as a channel and as a sink. When used as a channel, Kafka acted as both a producer and a consumer; when used as a sink, it essentially performed a producer function.
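
To make the channel-versus-sink distinction concrete, the following Flume agent sketch wires a spooling directory source to Kafka in both roles. This is a minimal illustration, not the book's own configuration: the agent name a1, topic names, paths, and broker address are placeholders, and the property names assume Flume 1.7 or later.

    # Flume agent sketch (names are illustrative; properties assume Flume 1.7+)
    a1.sources = r1
    a1.channels = c1
    a1.sinks = k1

    # Spooling directory source: picks up structured/unstructured spool files
    a1.sources.r1.type = spooldir
    a1.sources.r1.spoolDir = /data/scv/spool
    a1.sources.r1.channels = c1

    # Kafka as a channel: Flume both produces into and consumes from this topic
    a1.channels.c1.type = org.apache.flume.channel.kafka.KafkaChannel
    a1.channels.c1.kafka.bootstrap.servers = localhost:9092
    a1.channels.c1.kafka.topic = scv-channel

    # Kafka as a sink: Flume only produces into the target topic
    a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
    a1.sinks.k1.channel = c1
    a1.sinks.k1.kafka.bootstrap.servers = localhost:9092
    a1.sinks.k1.kafka.topic = scv-ingest

In practice, only one of the two roles would typically be chosen per flow; the sketch shows both so that the producer-and-consumer behaviour of the channel can be contrasted with the producer-only behaviour of the sink.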

What this essentially means is that we intend to use Kafka primarily as a message broker and as a channel, so that we can define acquisition and ingestion interfaces around it from a single customer view perspective. This information can be a mix of structured and unstructured data, exchanged via messages in standard data formats such as XML/JSON. In the previous chapter, we saw how to acquire both structured and unstructured data into Kafka as a spool file. Additional or custom interfaces can be built with custom serializers/sinks, making Kafka the central broker for messages/events disseminated into target systems via an ingestion layer.
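
As one illustration of such a custom interface, the minimal sketch below publishes a customer record to Kafka as a JSON message through a hand-written serializer. It assumes the standard Kafka Java producer client; the Customer type, the topic name scv-customer, and the broker address are illustrative and are not taken from the book's own code.

    import java.nio.charset.StandardCharsets;
    import java.util.Map;
    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.Serializer;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class CustomerEventPublisher {

        // Hypothetical customer record acquired from a source system
        public static class Customer {
            public final String id;
            public final String name;
            public Customer(String id, String name) {
                this.id = id;
                this.name = name;
            }
        }

        // Custom serializer that turns the record into a JSON message
        public static class CustomerJsonSerializer implements Serializer<Customer> {
            @Override
            public void configure(Map<String, ?> configs, boolean isKey) { }

            @Override
            public byte[] serialize(String topic, Customer c) {
                if (c == null) {
                    return null;
                }
                // Hand-rolled JSON for brevity; a real interface would use a JSON library
                String json = "{\"id\":\"" + c.id + "\",\"name\":\"" + c.name + "\"}";
                return json.getBytes(StandardCharsets.UTF_8);
            }

            @Override
            public void close() { }
        }

        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, CustomerJsonSerializer.class.getName());

            // Publish one customer event onto the acquisition topic
            try (KafkaProducer<String, Customer> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("scv-customer", "C001",
                        new Customer("C001", "Jane Doe")));
            }
        }
    }

In a production interface, a JSON library or an Avro serializer would replace the hand-rolled string building shown here, but the structure stays the same: the custom serializer plugs into the producer configuration, and Kafka remains the central broker between acquisition and ingestion.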
