5.7. Coordination Data Structures and Threading Enhancements

In .NET 4.0, the thread pool has been enhanced, and a number of new synchronization classes have been introduced.

5.7.1. Thread Pool Enhancements

Creating many threads to perform small amounts of work can actually end up taking longer than performing the work on a single thread. This is due to time slicing and the overhead involved in locking and in adding items to and removing items from the thread pool's queue.

Previously, the queue of work in the thread pool was held in a linked list structure and protected by a monitor lock. Microsoft improved this by changing to a lock-free data structure that also creates less work for the garbage collector (GC). Microsoft says that this new structure is very similar to ConcurrentQueue (discussed shortly).

The great news is that if your existing applications use the thread pool, simply upgrading them to .NET 4.0 should improve their performance with no changes to your code required.

5.7.2. Thread.Yield()

Calling the new Thread.Yield() method causes the calling thread to give the remainder of its time with the processor (its time slice) to another thread. It is up to the operating system to select the thread that receives the additional time, and the yielding thread is then rescheduled to run in the future. Note that the yield is restricted to the processor/core on which the yielding thread is running.
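
A minimal sketch of where this can be useful (the flag and timings here are invented for illustration); the waiting thread gives up the rest of its time slice on each iteration rather than burning it in an empty loop:

using System;
using System.Threading;

class YieldExample
{
   static volatile bool dataReady = false;

   static void Main()
   {
      new Thread(() =>
      {
         Thread.Sleep(100);   //Simulate work on another thread
         dataReady = true;
      }).Start();

      while (!dataReady)
      {
         //Hand the remainder of our time slice to another ready thread
         Thread.Yield();
      }

      Console.WriteLine("Data is ready");
   }
}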

5.7.3. Monitor.Enter()

The Monitor.Enter() method has a new overload that takes a Boolean parameter by reference and sets it to true if the monitor call is successful. For example:

bool gotLock = false;
object lockObject = new object();

try
{
    Monitor.Enter(lockObject, ref gotLock);
    //Do stuff
}
finally
{
    if (gotLock)
    {
        Monitor.Exit(lockObject);
    }
}
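
The reason for this overload is that the flag is set atomically as part of acquiring the lock, so even if an exception interrupts the call to Monitor.Enter(), the finally block can reliably tell whether the lock was actually taken and needs to be released.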

5.7.4. Concurrent Collections

The concurrent collection classes are thread-safe versions of many of the existing collection classes that should be used for multithreaded or parallelized applications. They can all be found lurking in the System.Collections.Concurrent namespace.

Using the concurrent classes reduces the amount of locking code you need to write in most common situations (e.g., locking for adding and removing items is taken care of for you). The MSDN documentation states that these classes also offer superior performance to ArrayList and the generic list classes when accessed from multiple threads.

5.7.4.1. ConcurrentStack

This is the thread-safe version of Stack (a LIFO collection).

5.7.4.2. ConcurrentQueue

This is the thread-safe version of Queue (FIFO collection).
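
As a brief sketch (the item values here are arbitrary), items are added with Enqueue() and removed with TryDequeue(), which returns false instead of throwing when the queue is empty:

using System;
using System.Collections.Concurrent;

class ConcurrentQueueExample
{
   static void Main()
   {
      ConcurrentQueue<string> queue = new ConcurrentQueue<string>();
      queue.Enqueue("first");
      queue.Enqueue("second");

      string item;
      while (queue.TryDequeue(out item))
      {
         Console.WriteLine(item);   //first, then second
      }
   }
}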

5.7.4.3. ConcurrentDictionary

This is the thread-safe version of the Dictionary class.
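
As a brief sketch (the key names and counts here are invented for illustration), ConcurrentDictionary exposes atomic methods such as TryAdd(), GetOrAdd(), and AddOrUpdate(), which save you from writing your own locking around check-then-update logic:

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ConcurrentDictionaryExample
{
   static void Main()
   {
      ConcurrentDictionary<string, int> hits = new ConcurrentDictionary<string, int>();

      //Many iterations updating the same keys in parallel, with no explicit locks
      Parallel.For(0, 1000, i =>
      {
         string key = (i % 2 == 0) ? "even" : "odd";
         //Insert 1 if the key is new; otherwise atomically apply the update function
         hits.AddOrUpdate(key, 1, (k, current) => current + 1);
      });

      Console.WriteLine("even: " + hits["even"]);   //500
      Console.WriteLine("odd: " + hits["odd"]);     //500
   }
}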

5.7.4.4. ConcurrentBag

ConcurrentBag is a thread-safe, unordered, high-performance collection of items, contained in System.dll. It is used when maintaining the order of items in the collection is not important, and it also allows the insertion of duplicates.

ConcurrentBags can be very useful in multithreaded environments because each thread that accesses the bag has its own deque (double-ended queue) of items. When a thread's deque is empty, it will steal items from the bottom of another thread's deque, reducing the chance of contention. Note that this same work-stealing technique is used within the thread pool to provide load balancing.
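
A minimal sketch of the basic API (the values here are arbitrary); Add() inserts an item and TryTake() removes one in no particular order, returning false when the bag is empty:

using System;
using System.Collections.Concurrent;
using System.Threading.Tasks;

class ConcurrentBagExample
{
   static void Main()
   {
      ConcurrentBag<int> bag = new ConcurrentBag<int>();

      //Several threads adding items concurrently; no ordering is guaranteed
      Parallel.For(0, 10, i => bag.Add(i));

      int item;
      while (bag.TryTake(out item))
      {
         Console.WriteLine("Took: " + item);
      }
   }
}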

5.7.4.5. BlockingCollection

BlockingCollection is a collection that enforces an upper bound (a maximum number of items) in a thread-safe manner. If you attempt to add an item when the upper bound has been reached, the operation will block and execution will pause until space becomes available. If, on the other hand, you attempt to take an item when the BlockingCollection is empty, that operation will also block until an item is added.

This is useful for a number of scenarios, such as the following:

  • Increasing performance by allowing one thread to add data while another retrieves it. For example, one thread could read items from disk or the network while another thread processes them.

  • Preventing additions to a collection until the existing items are processed.

The following example creates two threads: one that will read from the blocking collection and another to add items to it. Note that we can enumerate through the collection and add to it at the same time, which is not possible with previous collection types.

It is important to note that the enumeration will continue indefinitely until the CompleteAdding() method is called.


using System;
using System.Threading;
using System.Collections.Concurrent;

namespace ConsoleApplication7
{
   class Program
   {
      //A blocking collection with a bounded capacity of five items
      public static BlockingCollection<string> blockingCol =
         new BlockingCollection<string>(5);

      static void Main(string[] args)
      {
         //Create a thread that reads from the collection; note that it starts
         //enumerating before any items have been added
         ThreadPool.QueueUserWorkItem(new WaitCallback(ReadItems));
         Console.WriteLine("Created thread to read items");

         //Create a thread that adds items to the collection
         ThreadPool.QueueUserWorkItem(new WaitCallback(AddItems));
         Console.WriteLine("Created thread that will add items");

         //Stop app from closing
         Console.ReadKey();
      }

      public static void AddItems(object StateInfo)
      {
         int i = 0;
         while (i < 200)
         {
            //Add() blocks if the collection is at its bounded capacity of five items
            blockingCol.Add(i++.ToString());
            Thread.Sleep(10);
         }
      }

      public static void ReadItems(object StateInfo)
      {
         //Warning: this will run forever unless blockingCol.CompleteAdding() is called
         foreach (string item in blockingCol.GetConsumingEnumerable())
         {
            Console.WriteLine("Read item: " + item);
         }
      }

   }
}

5.7.5. Synchronization Primitives

.NET 4.0 introduces a number of synchronization classes (discussed in the following sections).

5.7.5.1. Barrier

The Barrier class allows you to synchronize threads at a specific point. The MSDN documentation has a good analogy: the Barrier class works a bit like a few friends driving from different cities and agreeing to meet up at a gas station (the barrier) before continuing their journey.

The following example creates two threads: one thread will take twice as long as the other to complete its work. When both threads have completed their work, execution will continue after the call to SignalAndWait() has been made by both threads.

using System;
using System.Threading;

class Program
{
   static Barrier MyBarrier;
   static void Main(string[] args)
   {
      //There will be two participants in this barrier
      MyBarrier = new Barrier(2);

      Thread shortTask = new Thread(new ThreadStart(DoSomethingShort));
      shortTask.Start();

      Thread longTask = new Thread(new ThreadStart(DoSomethingLong));
      longTask.Start();

      Console.ReadKey();
   }

   static void DoSomethingShort()
   {
      Console.WriteLine("Doing a short task for 5 seconds");
      Thread.Sleep(5000);
      Console.WriteLine("Completed short task");
      MyBarrier.SignalAndWait();

      Console.WriteLine("Off we go from short task!");
   }

   static void DoSomethingLong()
   {
      Console.WriteLine("Doing a long task for 10 seconds");
      Thread.Sleep(10000);
      Console.WriteLine("Completed a long task");
      MyBarrier.SignalAndWait();
      Console.WriteLine("Off we go from long task!");
   }

}

The Barrier class also allows you to add or remove participants at runtime through the AddParticipant() and RemoveParticipant() methods, as the short sketch below shows.
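
A quick sketch of these methods; the participant counts shown in the comments assume a barrier created with two participants, as in the previous listing.

using System;
using System.Threading;

class BarrierParticipantsDemo
{
   static void Main()
   {
      Barrier barrier = new Barrier(2);
      Console.WriteLine(barrier.ParticipantCount);   //2

      //A third thread must now call SignalAndWait() before anyone proceeds
      barrier.AddParticipant();
      Console.WriteLine(barrier.ParticipantCount);   //3

      //Back to two participants
      barrier.RemoveParticipant();
      Console.WriteLine(barrier.ParticipantCount);   //2
   }
}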

5.7.6. Cancellation Tokens

Cancellation tokens are structs that provide a consistent means of cancellation. You might want to use a cancellation token to cancel a function or task that is taking too long or using too much of a machine's resources. Support is provided in many of the Task and PLINQ methods for the use of cancellation tokens.

To use cancellation tokens, you first need to create a CancellationTokenSource. You then pass a cancellation token into the target method by using the source's Token property.

Within your method, you can then check the token's IsCancellationRequested property and throw an OperationCanceledException if you find it to be true (i.e., if cancellation has been requested).

When you want to perform a cancellation, you simply call the Cancel() method on the cancellation source, which then sets the token's IsCancellationRequested property to true. This sounds more complex than it actually is; the following example demonstrates the process:

static CancellationTokenSource cts = new CancellationTokenSource();

static void Main(string[] args)
{
   Task t = Task.Factory.StartNew(() => DoSomething(), cts.Token);
   System.Threading.Thread.Sleep(2000);
   cts.Cancel();
   Console.ReadKey();
}

public static void DoSomething()
{
   try
   {
      while (true)
      {
         Console.WriteLine("doing stuff");
         if (cts.Token.IsCancellationRequested == true)
         {
            Console.WriteLine("cancelled");
            throw new OperationCanceledException(cts.Token);
         }
      }
   }
   catch (OperationCanceledException ex)
   {
      //operation cancelled; do any clean up here
      Console.WriteLine("Cancellation occurred");
   }
}

5.7.6.1. CountdownEvent

The new CountdownEvent class is initialized with an integer value and can block code until the count reaches 0 (the count is decremented by calling the Signal() method).

CountdownEvent is particularly useful for keeping track of scenarios in which many threads have been forked and you need to wait for them all to finish. The following example blocks until the count has been decremented twice:

using System;
using System.Threading;

namespace Chapter5
{
   class Program
   {
      static CountdownEvent CountDown = new CountdownEvent(2);

      static void Main(string[] args)
      {
         ThreadPool.QueueUserWorkItem(new WaitCallback(CountDownDeduct));
         ThreadPool.QueueUserWorkItem(new WaitCallback(CountDownDeduct));

         //Wait until the count has been decremented twice by CountDownDeduct
         CountDown.Wait();
         Console.WriteLine("Completed");
         Console.ReadKey();
      }

      static void CountDownDeduct(object StateInfo)
      {
         Thread.Sleep(5000);
         Console.WriteLine("Deducting 1 from countdown");
         CountDown.Signal();
      }
   }
}

5.7.6.2. ManualResetEventSlim and SemaphoreSlim

ManualResetEventSlim and SemaphoreSlim are lightweight versions of the existing ManualResetEvent and Semaphore classes. Unlike their predecessors, the new classes avoid resource-expensive kernel objects, falling back to kernel-based waiting only when they have to.
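
As a minimal sketch (the timings here are arbitrary), ManualResetEventSlim follows the same Set()/Wait() pattern as ManualResetEvent:

using System;
using System.Threading;

class ResetEventExample
{
   static ManualResetEventSlim ready = new ManualResetEventSlim(false);

   static void Main()
   {
      ThreadPool.QueueUserWorkItem(state =>
      {
         Thread.Sleep(2000);   //Simulate some work
         ready.Set();          //Signal the waiting thread
      });

      ready.Wait();            //Blocks (spinning briefly first) until Set() is called
      Console.WriteLine("Signaled; continuing");
   }
}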

5.7.6.3. SpinLock

SpinLock forces a program to loop (spin) until it can obtain a lock on a particular resource. It should only be used when locks are expected to be held for a very short time. Although looping (rather than handing control over to another thread) sounds like a wasteful thing to do, it can be much more efficient than blocking because it avoids a context switch (a resource-intensive process in which the current CPU state is stored and a new state is loaded).

private static SpinLock MySpinLock = new SpinLock();

static void Main(string[] args)
{
   bool Locked = false;
   try
   {
      MySpinLock.Enter(ref Locked);
      //Work that requires lock would be done here
   }
   finally
   {
      if (Locked)
      {
         MySpinLock.Exit();
      }
   }
}
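
Note that SpinLock is a struct, so take care not to copy it accidentally (for example, by assigning it to another variable or storing it in a readonly field); locking a copy does not protect the shared resource.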

5.7.6.4. ThreadLocal<T>

ThreadLocal<T> provides a lazily initialized variable whose value is created separately for each thread that accesses it (see Chapter 4 for more info about lazy-initialized variables).
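
A minimal sketch (the counter here is invented for illustration); each thread that touches Counter.Value gets its own lazily created copy, so no locking is needed for the per-thread increments:

using System;
using System.Threading;
using System.Threading.Tasks;

class ThreadLocalExample
{
   //The factory delegate runs once per thread, the first time that thread reads Value
   static ThreadLocal<int> Counter = new ThreadLocal<int>(() => 0);

   static void Main()
   {
      Parallel.For(0, 100, i =>
      {
         Counter.Value = Counter.Value + 1;   //Increments this thread's copy only
      });

      Console.WriteLine("This thread's count: " + Counter.Value);
   }
}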
