Example 1 – The batching or aggregator pattern in Logic Apps

Batch processing is a critical requirement for most organizations. With event-based patterns and cloud consumption-based pricing, working with batches of messages is cost-effective and gives end users better insight into their business data. Logic Apps has built-in connectors for batch-processing use cases: the batch connector groups related messages and events into a collection until specific release criteria are met.

To understand this more clearly, let's take the example of a social media website. When we post an update on a social media site, we may receive comments on it. To analyze those comments, it is useful to batch them up and pass them to a central repository, such as a data lake for analytics, or to Cognitive Services for sentiment analysis.

In this example, we use a Cosmos DB graph database to trigger an Azure Function whenever a comment is added to a specific post.
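
The change feed subscription on the Functions side can be declared through a binding. The following function.json is a minimal sketch; the database, collection, and connection-setting names (social, comments, CosmosDBConnection) are assumptions for illustration, and the function body (not shown) would forward each new comment to the batch sender's HTTP endpoint:

```json
{
  "bindings": [
    {
      // Fires for every document added or changed in the monitored collection.
      "type": "cosmosDBTrigger",
      "name": "comments",
      "direction": "in",
      // Assumed names -- replace with your own app setting, database, and collection.
      "connectionStringSetting": "CosmosDBConnection",
      "databaseName": "social",
      "collectionName": "comments",
      // The change feed needs a lease collection to track its progress.
      "leaseCollectionName": "leases",
      "createLeaseCollectionIfNotExists": true
    }
  ]
}
```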

In the last chapter, we covered how to trigger Logic Apps from the Cosmos DB change feed. We will now continue and build two Logic Apps workflows. One is a sender, which will listen for HTTP POST requests; the other is a receiver, which will control batch processing and send the batched result to Azure Data Lake through the Logic Apps Data Lake connector:

  1. The first step is to define the batch receiver Logic App. To do this, create a new Logic Apps workflow in your resource group.
  2. In the Logic Apps workflow designer, search for the batch trigger and populate its required properties, such as the batch mode, the batch name, and the release criteria (message count, batch size, and schedule).
  3. Set the batch mode to Inline to define the release criteria directly in the trigger. If you want to maintain multiple batch configurations, set the batch mode to Integration Account; Logic Apps will then resolve the batch configuration from the integration account at runtime.
  4. In the batch name property, provide a valid batch name and populate the required release criteria. In this case, we have set multiple release criteria based on the message count, the batch size, and the schedule.
  5. Add a Data Lake connector action to the workflow. This action will upload the batched document to Azure Data Lake, as in the sketch after this step:
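
A minimal sketch of the receiver's underlying workflow definition, assuming a batch named CommentsBatch and illustrative release-criteria values; the Data Lake action is shown in its generic ApiConnection form, and its path and file name are placeholders rather than designer-generated values:

```json
{
  "triggers": {
    "Batch_messages": {
      "type": "Batch",
      "inputs": {
        "mode": "Inline",
        "configurations": {
          "CommentsBatch": {
            // Release the batch when ANY of these criteria is met.
            "releaseCriteria": {
              "messageCount": 10,
              "batchSize": 102400,
              "recurrence": { "frequency": "Minute", "interval": 15 }
            }
          }
        }
      }
    }
  },
  "actions": {
    "Upload_batch_to_Data_Lake": {
      "type": "ApiConnection",
      "runAfter": {},
      "inputs": {
        "host": {
          "connection": {
            "name": "@parameters('$connections')['azuredatalake']['connectionId']"
          }
        },
        "method": "post",
        // Assumed target folder and file name in the Data Lake store.
        "path": "/webhdfs/v1/@{encodeURIComponent('comments/batch.json')}",
        // The released batch (all collected messages) is uploaded as the file body.
        "body": "@triggerBody()"
      }
    }
  }
}
```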

  6. The next step is to define the batch sender Logic App workflow. In the sender, use the HTTP Request trigger along with the batch connector action that targets the batch receiver.
  7. In the batch sender action configuration, you need to set the message content, the batch name (the receiver Logic App's batch name), the partition name, and the message ID, along with the receiver's trigger name, as in the sketch after this step:
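
A minimal sketch of the sender's definition, assuming the receiver workflow is named BatchReceiver and exposes the Batch_messages trigger defined above; the partition name and message ID expressions assume hypothetical postId and commentId fields in the incoming payload, and the workflow resource ID is abbreviated:

```json
{
  "triggers": {
    "manual": {
      // HTTP Request trigger: the Azure Function posts each comment here.
      "type": "Request",
      "kind": "Http",
      "inputs": { "schema": {} }
    }
  },
  "actions": {
    "Send_to_batch": {
      "type": "SendToBatch",
      "inputs": {
        // Must match the batch name configured in the receiver.
        "batchName": "CommentsBatch",
        "content": "@triggerBody()",
        // Assumed payload fields, used for partitioning and message identity.
        "partitionName": "@{triggerBody()?['postId']}",
        "messageId": "@{triggerBody()?['commentId']}",
        "host": {
          "triggerName": "Batch_messages",
          "workflow": {
            // Abbreviated resource ID of the receiver workflow.
            "id": "/subscriptions/.../workflows/BatchReceiver"
          }
        }
      }
    }
  }
}
```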

  8. The next step is to test the Logic Apps batch process. To do this, add documents to the Cosmos DB graph database so that the change feed picks them up and initiates batch processing; a sample payload is sketched after this step:
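
The exact shape of a comment depends on your graph model; the following payload is purely illustrative of what the Azure Function might forward to the sender's HTTP endpoint for each new comment:

```json
{
  // Illustrative comment document; field names match the sender sketch above.
  "commentId": "c-1042",
  "postId": "p-17",
  "author": "jane.doe",
  "text": "Great update, thanks for sharing!",
  "createdAt": "2019-04-02T10:15:00Z"
}
```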

  9. Based on the receiver's release configuration, Logic Apps will keep collecting the messages produced by the change feed runs until any one of the release conditions (such as the batch size, the message count, or the schedule) is satisfied.
  10. Once the batch is released, the batched messages will be posted to Azure Data Lake through the standard Data Lake connector, as follows:
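
Assuming the batch trigger delivers the collected messages as an items array (as in the Logic Apps batch documentation), the body uploaded to Data Lake would look roughly like this sketch; all values are illustrative:

```json
{
  // Illustrative released batch; each collected message appears under items.
  "batchName": "CommentsBatch",
  "partitionName": "p-17",
  "items": [
    {
      "messageId": "c-1042",
      "content": { "commentId": "c-1042", "postId": "p-17", "text": "Great update, thanks for sharing!" }
    },
    {
      "messageId": "c-1043",
      "content": { "commentId": "c-1043", "postId": "p-17", "text": "Congratulations!" }
    }
  ]
}
```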
