Concurrency

In concurrent computing, a program is decomposed into multiple threads of control, each with a distinct responsibility. These threads may run either simultaneously or in turn, depending on the underlying system infrastructure. By design, there are two main concerns:

  • How are the threads scheduled for execution?
  • How do the threads coordinate and synchronize to produce the result?

Let's take a simple example. A given business problem statement is converted into a software program. The program's instructions are then decomposed into logically independent units of work, each called a thread.

Therefore, the program/problem 'p' is split into 't' threads (t1, t2, ... tN), all targeting execution on a single CPU processor.
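
The following is a minimal Java sketch of that decomposition. The concrete program 'p' (here, summing an array), the thread count, and all class and variable names are illustrative assumptions rather than anything prescribed by the text; each thread owns one slice of the work.

    // Sketch: split one task ('p') across 't' worker threads (t1..tN).
    public class SplitIntoThreads {
        public static void main(String[] args) throws InterruptedException {
            int[] data = new int[1_000_000];
            for (int i = 0; i < data.length; i++) data[i] = 1;

            int t = 4;                                  // number of threads t1..tN
            long[] partial = new long[t];               // one result slot per thread
            Thread[] workers = new Thread[t];
            int chunk = data.length / t;

            for (int i = 0; i < t; i++) {
                final int id = i;
                final int from = i * chunk;
                final int to = (i == t - 1) ? data.length : from + chunk;
                workers[i] = new Thread(() -> {         // each thread handles one slice
                    long sum = 0;
                    for (int j = from; j < to; j++) sum += data[j];
                    partial[id] = sum;
                });
                workers[i].start();
            }

            long total = 0;
            for (int i = 0; i < t; i++) {
                workers[i].join();                      // wait for every worker to finish
                total += partial[i];
            }
            System.out.println("total = " + total);     // expected: 1000000
        }
    }

Whether these threads actually overlap in time or merely take turns on the single processor is decided by the scheduler, as described next.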

The scheduler component determines the execution order of the independent instruction streams associated with the threads. The coordination process is interesting in that multiple threads of control must cooperate to complete the task and produce the end result. Thus, concurrency is virtual parallelism achieved through sharing mechanisms such as time slicing.
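
As a hedged sketch of that coordination, the example below has several threads of control cooperating on one shared result, with a synchronized method acting as the coordination point; the class and field names are illustrative assumptions.

    // Sketch: threads coordinate on shared state to produce one end result.
    public class CoordinatedCounter {
        private long count = 0;

        private synchronized void increment() {   // only one thread updates at a time
            count++;
        }

        public static void main(String[] args) throws InterruptedException {
            CoordinatedCounter c = new CoordinatedCounter();
            Thread[] workers = new Thread[4];
            for (int i = 0; i < workers.length; i++) {
                workers[i] = new Thread(() -> {
                    for (int j = 0; j < 100_000; j++) c.increment();
                });
                workers[i].start();
            }
            for (Thread w : workers) w.join();     // gather the cooperating threads
            System.out.println(c.count);           // always 400000 thanks to the lock
        }
    }

Without the synchronized keyword, the interleaving chosen by the scheduler could make updates overlap and the final count unpredictable, which leads directly to the issues discussed next.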

As we know, a concurrent system has several processes executing at the same time, and these processes can interact with one another while they run. On the flip side, concurrency may introduce indeterminacy in the resulting outcome, leading to issues such as deadlock and starvation. These design challenges must be addressed properly during the implementation phase.
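
As an illustrative sketch of the deadlock risk just mentioned (the lock names and sleep delay are assumptions for demonstration only), the program below lets two threads acquire the same pair of locks in opposite order; run as written, it will usually hang. Acquiring locks in one agreed-upon order removes the cycle.

    // Sketch: two threads take two locks in opposite order and may deadlock.
    public class DeadlockSketch {
        private static final Object lockA = new Object();
        private static final Object lockB = new Object();

        public static void main(String[] args) {
            Thread t1 = new Thread(() -> {
                synchronized (lockA) {
                    pause();                           // widen the race window
                    synchronized (lockB) {
                        System.out.println("t1 holds A then B");
                    }
                }
            });
            Thread t2 = new Thread(() -> {
                synchronized (lockB) {                 // opposite order: deadlock risk
                    pause();
                    synchronized (lockA) {
                        System.out.println("t2 holds B then A");
                    }
                }
            });
            t1.start();
            t2.start();
        }

        private static void pause() {
            try { Thread.sleep(100); } catch (InterruptedException ignored) {}
        }
    }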

As a result, concurrent programming is implemented by splitting an existing task into multiple parts to be executed. Concisely, concurrency achieves the logical execution of multiple tasks at once.
