Dealing with Concurrency

As we saw in the previous chapter, when working on any large-scale enterprise application, we deal with a lot of data. This data is often processed synchronously, and a response is sent only once the processing for a particular request is complete. This model works fine as long as individual requests involve only a small amount of data. But consider a situation where a lot of data needs to be processed before a response can be generated. What happens then? The answer is slow application response times.

We need a better solution: one that allows us to process data in parallel, resulting in faster application responses. But how do we achieve this? The answer is concurrency.

This chapter introduces techniques for employing concurrency in your applications and shows how it can be leveraged to reduce response times.

Over the course of the chapter, we will see how to spawn multiple processes to deal with heavy workloads, and how to use threading to hand off blocking, unrelated tasks so that the actual results can be returned early.
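As a quick preview of both ideas, here is a minimal sketch using Python's standard multiprocessing and threading modules. The heavy_computation and send_notification functions are purely illustrative stand-ins for a CPU-bound workload and a blocking, unrelated task; they are not part of any library discussed later.

    from multiprocessing import Pool
    from threading import Thread
    import time

    def heavy_computation(n):
        # CPU-bound work: run in a separate process so it executes in parallel
        return sum(i * i for i in range(n))

    def send_notification(message):
        # Blocking, unrelated task: handed off to a thread so the caller
        # does not have to wait for it before returning the results
        time.sleep(1)  # simulates a slow I/O call
        print("notification sent:", message)

    if __name__ == "__main__":
        # Spread the heavy workload across a pool of worker processes
        with Pool(processes=4) as pool:
            results = pool.map(heavy_computation, [10**6] * 4)

        # Fire off the blocking task in the background and continue immediately
        worker = Thread(target=send_notification, args=("processing finished",))
        worker.start()

        print("results ready:", results)
        worker.join()  # tidy up before the program exits

The details of how process pools and threads behave, and when to prefer one over the other, are covered step by step in the sections that follow.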

In this chapter, you will learn about the following topics:

  • Launching and working with multiple processes
  • Establishing communication between multiple processes
  • Taking a multithreaded approach in an application
  • The limitations of a multithreaded approach
  • Avoiding common problems with concurrency
