Network Congestion and Latency

In our monolithic application, when a request comes in, it's processed by some instance of the application and a response is sent. Libraries are deployed with the application, and calls into them are fast, in-process function calls. The application might make a small number of requests to a database or other services.

In a microservices architecture, a single request to the application can result in multiple requests to a number of services. Each of those services could in turn be making requests to its own databases, caches, and other services. This I/O explosion can have a negative impact on the performance of an application, so optimizing network communication becomes important. As shown in Figure 1.5, a call to the monolithic application on the left lands on a service instance, which makes one or more calls to a database to handle the response. In the microservices architecture on the right side of the diagram, however, calls to the various components are now made across the network.
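One way to limit the latency impact of this fan-out is to issue the internal requests concurrently rather than one after another. The sketch below illustrates the idea with hypothetical downstream services (`fetch_profile`, `fetch_orders`, `fetch_recommendations` are stand-ins invented for this example; in a real system each would be a network call over HTTP or gRPC):

```python
import asyncio

# Hypothetical stubs standing in for downstream services; in a real
# deployment each call would cross the network with its own latency.
async def fetch_profile(user_id):
    await asyncio.sleep(0.01)  # simulated network latency
    return {"user_id": user_id, "name": "Ada"}

async def fetch_orders(user_id):
    await asyncio.sleep(0.01)
    return [{"order_id": 1}, {"order_id": 2}]

async def fetch_recommendations(user_id):
    await asyncio.sleep(0.01)
    return ["book", "lamp"]

async def handle_request(user_id):
    # One external request fans out into three internal requests.
    # Issuing them concurrently keeps the response time close to the
    # slowest individual call rather than the sum of all of them.
    profile, orders, recs = await asyncio.gather(
        fetch_profile(user_id),
        fetch_orders(user_id),
        fetch_recommendations(user_id),
    )
    return {"profile": profile, "orders": orders, "recommendations": recs}

response = asyncio.run(handle_request(42))
```

Concurrency does not remove the extra I/O, but it stops the per-call latencies from accumulating serially.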


FIGURE 1.5: An external request results in many more internal requests when compared to a monolith

Most of the overhead in internal service calls is often in data serialization and deserialization. In addition to caching and replicating data to reduce the number of requests, efficient serialization formats can be used. Sometimes the same data is passed from one service to the next, being deserialized and serialized multiple times along the way. Using a common serialization format across the services can eliminate some of these steps by enabling one service to pass data along to another without deserializing and reserializing it.
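The pass-through idea can be sketched as follows. This is a minimal illustration using JSON, assuming the services share that format; the function names and payload shape are invented for the example:

```python
import json

# An upstream service serializes its response once.
payload_bytes = json.dumps({"items": [1, 2, 3], "total": 6}).encode("utf-8")

def enrich_naive(raw: bytes) -> bytes:
    # Anti-pattern: a full decode/encode round trip just to add one field,
    # paying serialization cost proportional to the whole payload.
    obj = json.loads(raw)
    obj["checked"] = True
    return json.dumps(obj).encode("utf-8")

def enrich_passthrough(raw: bytes) -> bytes:
    # With a shared format, the intermediate service can embed the
    # upstream bytes untouched instead of re-serializing the document.
    return b'{"checked": true, "payload": ' + raw + b"}"

forwarded = enrich_passthrough(payload_bytes)
```

Binary formats with similar pass-through properties (for example, length-prefixed fields that can be copied verbatim) extend the same saving to larger payloads.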
