How it works...

This recipe combines ideas from the previous recipes in this chapter. As before, we set up a Kafka consumer and a Kafka producer. This recipe uses the synchronous producer from the Using Kafka with Sarama recipe, but could also have used the asynchronous one. Once a message is received, we enqueue it on an in channel, just as we did in the Goflow for dataflow programming recipe. We modify the components from that recipe to convert the incoming string to uppercase rather than Base64-encoding it. We reuse the print components, and the resulting network configuration is similar.
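The following is a minimal sketch of that pipeline shape, using plain channels and goroutines in place of the goflow library so the example stays self-contained; the component and channel names (Upper, Printer, in, out) are illustrative rather than the recipe's own:

package main

import (
	"fmt"
	"strings"
)

// Upper reads strings from in, uppercases them, and forwards
// them to out — a stand-in for the modified flow component.
func Upper(in <-chan string, out chan<- string) {
	for msg := range in {
		out <- strings.ToUpper(msg)
	}
	close(out)
}

// Printer consumes the transformed messages, mirroring the
// reused print component.
func Printer(out <-chan string, done chan<- struct{}) {
	for msg := range out {
		fmt.Println(msg)
	}
	close(done)
}

func main() {
	in := make(chan string)
	out := make(chan string)
	done := make(chan struct{})

	go Upper(in, out)
	go Printer(out, done)

	// In the recipe, the Kafka consumer enqueues messages here instead.
	for _, msg := range []string{"hello", "kafka"} {
		in <- msg
	}
	close(in)
	<-done
}

Because each stage only sees channels, swapping this hard-coded feed for the Kafka consumer is a change at the send site alone; the transform and print stages are untouched.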

The end result is that all messages received by the Kafka consumer are transported into our flow-based pipeline to be operated on. This keeps the pipeline components modular and reusable, and lets us use the same component multiple times in different network configurations. Similarly, we receive traffic from every producer that writes to the Kafka topic, so we can multiplex many producers into a single data stream.
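As a sketch of that consumer side, the loop below forwards each Kafka message into the pipeline's in channel using Sarama's partition consumer, as in the earlier recipe; the broker address (localhost:9092), topic name (example), and partition 0 are assumptions matching a local single-broker setup:

package main

import (
	"log"

	"github.com/Shopify/sarama"
)

// consume forwards every message on the topic into the pipeline's
// in channel, multiplexing all producers into one stream.
func consume(in chan<- string) {
	// Broker address and topic are assumptions for a local setup.
	consumer, err := sarama.NewConsumer([]string{"localhost:9092"}, nil)
	if err != nil {
		log.Fatal(err)
	}
	defer consumer.Close()

	partition, err := consumer.ConsumePartition("example", 0, sarama.OffsetNewest)
	if err != nil {
		log.Fatal(err)
	}
	defer partition.Close()

	// Each consumed message is enqueued into the flow network.
	for msg := range partition.Messages() {
		in <- string(msg.Value)
	}
}

Any producer writing to the example topic, synchronous or asynchronous, feeds this same in channel, which is what makes the multiplexing transparent to the rest of the pipeline.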
