
Developing Multithreaded Applications for the .NET Compact Framework

.NET Compact Framework 1.0
 

Maarten Struys
PTS Software bv

June 2005

Applies to:
   .NET Compact Framework
   Windows CE

Summary: This article explores the multithreading capabilities of the .NET Compact Framework. You will learn about underlying Windows CE concepts like processes and threads. You will also learn when it is necessary to create multithreaded applications. The managed classes available for multithreaded applications will be explored in great detail. In addition, you will learn about different methods to synchronize multiple threads. The article is filled with code examples that are taken from the downloadable sample code, which you should have available when reading the article. All sample code will run in the emulators that ship with Visual Studio .NET 2003. Reading this article will help you create well-performing, safe multithreaded applications. (31 printed pages)


Download CSMultithreading.msi from the Microsoft Download Center.


Download VBMultithreading.msi from the Microsoft Download Center.

Contents

Windows CE Processes and Threads
.NET Compact Framework Threads
Alternative Approach - ThreadPool
Periodic Execution - Timer Class
User Interface Controls Inside Worker Threads
Thread Synchronization
Thread Safe Classes
Conclusion

Windows CE Processes and Threads

The multithreading capabilities of the .NET Compact Framework are built on top of the functionality that the Windows CE operating system makes available. Understanding the difference between processes and threads in Windows CE will lead to better understanding of multithreaded applications.

Windows CE Processes

In Windows CE, as in any other Windows operating system, a process is simply a placeholder for an application. A process can be considered a running application, but it does not execute any code. Instead, it provides an application with a 32-megabyte virtual address space, a default heap, and possibly other resources, such as files and dynamic-link libraries. A process also contains at least one thread, the primary thread. In Windows CE, at most 32 processes can run simultaneously.

Windows CE Threads

A thread in any Windows operating system is a unit of execution. In other words, threads execute code. When a process starts running, it contains only one thread, but it has the capability of starting other threads. Only memory limits the number of threads in a process.

Windows CE is a preemptive multithreading operating system that allows multiple threads to share the processor. When the operating system switches a thread out, the thread's entire state must be saved so that the thread can continue exactly where it left off the next time it is scheduled. To hold that state, each thread owns at least its own stack and a copy of the processor registers.

Because the operating system switches frequently between threads, the threads appear to run simultaneously to the user. An important concept to keep in mind is that the operating system schedules threads, not processes. The scheduler, the part of the operating system that is responsible for scheduling threads, is priority based. Threads that have higher priorities always run before or preempt threads that have lower priorities. Multiple threads with the same priority run in a round-robin fashion with each thread receiving a quantum, or slice, of execution time.

Threads can voluntarily give up processor time by waiting for a particular signal or synchronization object. Developers commonly use synchronization objects in multithreaded applications to wait efficiently until a thread can continue to run. If you target battery-powered devices, it is very important to do as little processing as possible because using the processor drains the battery. Because waiting on a synchronization object uses hardly any processor time, it is the preferred way to wait for events to happen. You should always avoid using polling to wait for (external) events.
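As a sketch of the preferred approach, the following console example (not part of the downloadable sample; the names dataReady and Produce are purely illustrative) blocks on an AutoResetEvent instead of polling a flag:

```csharp
using System;
using System.Threading;

class WaitDemo
{
    // Event that another thread sets when its work is done.
    static AutoResetEvent dataReady = new AutoResetEvent(false);

    static void Main()
    {
        // A producer thread signals the event after some work.
        Thread producer = new Thread(new ThreadStart(Produce));
        producer.Start();

        // Efficient: the waiting thread blocks inside the kernel until
        // the event is set and consumes no processor time meanwhile.
        dataReady.WaitOne();
        Console.WriteLine("Event received");

        // Inefficient alternative (avoid): a polling loop such as
        //   while (!done) { Thread.Sleep(10); }
        // wakes the processor repeatedly and drains the battery.
    }

    static void Produce()
    {
        Thread.Sleep(500);   // Simulate some work
        dataReady.Set();     // Wake the waiting thread
    }
}
```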

Multiple Threads Within a Single Application

You might wonder why having multiple threads in a single application is useful. A simple example can help illustrate the reason. In Figure 1, you see an application consisting of a single form with two buttons and a label.

Figure 1. A simple multithreaded application.

When you click Start processing, the label on the form is updated, a time-consuming operation is started, and you seemingly have the ability to stop the operation by clicking Stop processing. The following code shows the event handlers for the Start processing and Stop processing buttons in addition to the function that simulates the actual operation that needs to be performed.

  
    private void button1_Click(object sender, System.EventArgs e)
    {
        label1.Text = "Worker Function started";
        button1.Enabled = false;
        button2.Enabled = true;
        workerFunctionDone = false;
        MyWorkerFunction();
    }
 
    private void button2_Click(object sender, System.EventArgs e)
    {
        workerFunctionDone = true;
        label1.Text = "Worker Function terminated";
        button2.Enabled = false;
        button1.Enabled = true;
    }
  
    private void MyWorkerFunction()
    {
        while (! workerFunctionDone)
        {
            // Simulate some processing
            Thread.Sleep(1000);
        }
    }

The preceding code assumes that you are using the "System.Threading" namespace to get easier access to the Thread class. The most important action in the button1_Click event handler is to call MyWorkerFunction, the function that simulates processing. The most important action of the button2_Click event handler is setting the Boolean class-level instance variable workerFunctionDone to true, allowing MyWorkerFunction to terminate.

The button event handlers are called in response to user actions, but they are all executed on the main thread of the application. Because all code in this sample application runs inside the same thread, a problem occurs. While code in MyWorkerFunction is executing, the button2_Click event handler cannot be called. The application thus stops responding to button clicks by the user, resulting in the workerFunctionDone variable remaining false. In other words, the application is stuck in an endless loop and will not close properly.

There are numerous other reasons why you might want to use multiple threads in an application, including:

  • To service relatively slow hardware devices while keeping the user interface responsive.
  • To guarantee timely responses to devices that expect responses within a certain period.

.NET Compact Framework Threads

To change the application shown earlier in Figure 1 into a correctly working application, you can create a worker thread in which MyWorkerFunction will be executed. A background worker thread then executes the time-consuming function while, at the same time, the user remains in control of the application. For that purpose, you need to change only the button1_Click event handler, where you create and start a worker thread. To better match the desired functionality, you should rename the MyWorkerFunction function to MyWorkerThread. The function itself remains the same.

Thread Creation

To create a thread in the button1_Click event handler, you need to instantiate a new object of type Thread. The Thread class is available in the "System.Threading" namespace, so use the using keyword to make the types easily accessible. The Thread class takes one parameter, a ThreadStart delegate. This delegate references the method to be invoked when this thread begins executing. Passing MyWorkerThread as a delegate means that your worker thread executes this method after it is started.

A ThreadStart delegate callback method does not take any parameters. To be able to pass information to the worker thread, you can, for example, set particular class instance variables and use their values inside the worker thread.
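For example, a common pattern is to set an instance variable just before starting the thread; the worker reads it after it starts. The following sketch uses assumed names (workerUrl, StartWorker) that are not part of the downloadable sample:

```csharp
using System.Threading;

class Downloader
{
    // Instance variable used to hand information to the worker thread.
    private string workerUrl;

    public void StartWorker(string url)
    {
        workerUrl = url;                 // Set the data first...
        Thread worker = new Thread(new ThreadStart(MyWorkerThread));
        worker.Start();                  // ...then start the thread
    }

    private void MyWorkerThread()
    {
        // The worker reads the value that was set before Start was called.
        string url = workerUrl;
        // ... process url ...
    }
}
```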

Note that only instantiating a new Thread class does not automatically start the worker thread. You have to start it explicitly by calling its Start method. The revised code for the button1_Click event handler is as follows.

private void button1_Click(object sender, System.EventArgs e)
{
    label1.Text = "Worker Thread started";
    button1.Enabled = false;
    button2.Enabled = true;
    workerThreadDone = false;
    Thread myThread = new Thread(new ThreadStart(MyWorkerThread));
    myThread.Start();
}

When you run the modified application, you now are able to terminate the worker thread by clicking the Stop Processing button. However, even though the sample code seems to work properly now, there is still one more problem.

Proper Termination of Threads

In the .NET Compact Framework version 1.0, all threads created by instantiating new Thread objects are foreground threads. This is an important limitation compared to the full .NET Framework. In the full .NET Framework, both foreground and background threads are available. The common language runtime terminates background threads by calling the Abort method on each of them as soon as the last foreground thread belonging to the same process ends. Foreground threads, on the other hand, keep running until they terminate themselves, and they keep the process alive while they do. Therefore, in the .NET Compact Framework 1.0, the developer is responsible for properly terminating all threads in an application.

Because there is no Abort method available in the .NET Compact Framework 1.0, you must take another approach to terminate threads. This approach remains useful even when Abort is available, as it is in the .NET Compact Framework version 2.0, because there are situations in which the Abort method cannot terminate the thread. In particular, when you use the platform invoke facility to call into native code and the thread is blocked on a native synchronization object in a call to WaitForSingleObject or WaitForMultipleObjects, calling Abort has no effect.

Properly terminating all worker threads is very important because a managed application will terminate only when all of its threads have been terminated. If you run the sample code (ThreadDemo1 in the downloadable source code) from within Visual Studio .NET 2003, you can easily verify this fact. Simply deploy and run a debug build of the application in the emulator by pressing F5. Start the worker thread by clicking the appropriate button, and then click OK in the upper-right corner of the application. Even though the form is closed, Visual Studio .NET 2003 still appears to be actively debugging the application. When you click Break All on the Debug menu of Visual Studio .NET 2003, you see that you are breaking on a statement somewhere inside the while loop of MyWorkerThread.

To prevent this behavior, the application needs to be extended somewhat. You can add a Form_Closing event handler to the application by selecting the Form properties and double-clicking the Closing event. Visual Studio .NET 2003 then adds the Form1_Closing event handler to your source file.

There are two different ways to implement this event handler. First, if you want the worker thread to terminate automatically when the form is closed, you simply set the Boolean variable workerThreadDone to true in the Form1_Closing event handler. The event handler then looks like the following.

private void Form1_Closing(object sender,
    System.ComponentModel.CancelEventArgs e)
{
    workerThreadDone = true;
}

When you add a Form_Closing event handler as shown in the preceding code and run the application again, you will see that it will be properly terminated — although perhaps after a short delay, because the worker thread might take up to 1,000 milliseconds to wake up.

The other possibility is to let the user explicitly terminate the worker thread by clicking the Terminate Thread button. In this case, when the user requests to exit the application with the worker thread still running, you need to tell the user to terminate the worker thread before he or she can exit the application. To give the user this information, you can again use the Form_Closing event handler. In this case, you cancel the Form_Closing event, resulting in the application continuing to run. The following code shows an example of canceling the Form_Closing event.

private void Form1_Closing(object sender,
        System.ComponentModel.CancelEventArgs e)
{
    if (! workerThreadDone)
    {
        MessageBox.Show("Stop the worker thread first");
        e.Cancel = true;
    }
}

Of course, the message to the user should be friendlier, but when you run the application now, you can see that it will not terminate until the worker thread has been terminated.

Generic Structure of a Worker Thread

As you saw in the first code example in this article, a worker thread that executes a repetitive task is often implemented by means of a while loop. The while loop checks for a Boolean variable to allow termination of the thread. Using this Boolean variable inside a while loop as a stop condition eliminates the need for an Abort method in the Thread class. However, you have to make sure that the variable is initialized properly prior to starting the worker thread. Typically, this variable is implemented as an instance variable of the class that also implements the worker thread. Because multiple threads will access the same Boolean variable, you have to make sure that each thread always retrieves the most up-to-date value. Also, the value of the variable must be written immediately on assignment. In the full .NET Framework, you can use the volatile keyword to achieve this. In .NET Compact Framework 1.0, the volatile keyword is not available. However, in this version of the .NET Compact Framework, all variables are treated as being volatile by default.

In its most basic form, a worker thread waits for an external event to happen. When the thread wakes up, it first checks whether it still is allowed to execute. If so, it will perform its task. If the thread needs to terminate, it will break out of the while loop so that it can terminate properly. The following code example shows a generic structure of a worker thread that repeatedly executes some code.

private void WorkerThread()
{
    while (! workerThreadDone)
    {
        // Wait for an external event or a timeout value. wakeUpEvent is an
        // assumed AutoResetEvent instance variable; 1000 ms is an example timeout.
        wakeUpEvent.WaitOne(1000, false);
        // Make sure that the worker thread can continue to run
        if (! workerThreadDone)
        {
            // Functionality that needs to be executed inside the thread
        }
    }
}

Alternative Approach — ThreadPool

The .NET Compact Framework offers a number of different ways to create multithreaded applications. Probably the most efficient way to implement multiple threads is to use the threads that exist in the ThreadPool class. Figure 2 compares the performance of the ThreadPool class to that of the Thread class.

Figure 2. Performance of Thread versus ThreadPool.

In this example, running inside a generic Windows CE .NET 4.2 emulator, you create 200 different worker threads that all have a relatively short lifetime. In fact, the only functionality of the worker threads is simulating some processing time and checking whether a particular thread is the two hundredth. In that case, an event is set to inform the main thread that the test run is completed, so the main thread can update the user interface with timing information. The following code shows the difference between using Thread objects (see the Button1_Click method) and using the ThreadPool (see the Button2_Click method).

private void Button1_Click(object sender, System.EventArgs e)
{
    threadCounter = 0;
    doneEvent = new AutoResetEvent(false);
    TextBox1.Text = "";
    int elapsedTime = Environment.TickCount;
    for (int i = 0; i < 200; i++)
    {
        Thread workerThread = new Thread(
            new ThreadStart(MyWorkerThread));
        workerThread.Start();
    }
    doneEvent.WaitOne();
    elapsedTime = Environment.TickCount - elapsedTime;
    TextBox1.Text = "Creating threads: " + 
        elapsedTime.ToString() + " msec";
}
private void MyWorkerThread()
{
    threadCounter++;
    if (threadCounter == 200) 
    {
        doneEvent.Set();
    }
}
private void Button2_Click(object sender, System.EventArgs e)
{
    threadPoolCounter = 0;
    doneEvent = new AutoResetEvent(false);
    TextBox2.Text = "";
    int elapsedTime = Environment.TickCount;
    for (int i = 0; i < 200; i++)
    {
        ThreadPool.QueueUserWorkItem(
            new WaitCallback(MyWaitCallBack));
    }
    doneEvent.WaitOne();
    elapsedTime = Environment.TickCount - elapsedTime;
    TextBox2.Text = "Creating threads: " + 
        elapsedTime.ToString() + " msec";
}
private void MyWaitCallBack(object stateInfo)
{
    threadPoolCounter++;
    if (threadPoolCounter == 200)
        doneEvent.Set();
}

As you can see in Figure 2, using ThreadPool threads takes about 40 percent of the execution time of regular worker threads. The reason is that threads in the thread pool are reused, thus eliminating the overhead of creating a new thread for each work item. Because creating a worker thread is a relatively expensive operation, the difference in performance is significant. ThreadPool threads are created when needed, up to a maximum of 200 (in .NET Compact Framework 1.0). When the callback function (the actual thread that executes in the ThreadPool) is terminated, the ThreadPool thread stays in the pool for reuse for around 60 seconds. If you run the same application immediately for a second time, the performance improves because the ThreadPool in that situation contains enough threads to execute all of the requested callback functions without the need to create new threads.

ThreadPool threads are shared resources, so you should not run long-lasting threads inside the thread pool. If no idle thread is available in the thread pool when the QueueUserWorkItem method is called, either a new thread is created in the ThreadPool or, when the maximum number of ThreadPool threads is already in use, the specified delegate executes only after a thread pool thread becomes available. Also, because a ThreadPool thread is a ready-to-run real thread, you need to properly terminate it before the application closes (as described in "Proper Termination of Threads" earlier in this article). The downloadable samples for this document contain a demonstration of how to use ThreadPool.
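Unlike a ThreadStart delegate, a WaitCallback does take a single object parameter, so you can pass per-item state without resorting to instance variables. A minimal console sketch (the names done, Work, and the "item 42" string are assumptions for illustration):

```csharp
using System;
using System.Threading;

class PoolStateDemo
{
    static AutoResetEvent done = new AutoResetEvent(false);

    static void Main()
    {
        // The second argument of QueueUserWorkItem is handed to the
        // callback as its object parameter.
        ThreadPool.QueueUserWorkItem(new WaitCallback(Work), "item 42");
        done.WaitOne();
    }

    static void Work(object state)
    {
        string item = (string)state;   // Unpack the per-item state
        Console.WriteLine("Processing " + item);
        done.Set();
    }
}
```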

Periodic Execution — Timer Class

In cases where a thread needs to run at a particular timer interval, you can use the Timer class to queue a method for periodic execution. The Timer class in the "System.Threading" namespace uses a ThreadPool thread to run a TimerDelegate every time the timer expires. Because a System.Threading.Timer object executes in a separate thread, you need to properly terminate the timer before the application closes (as described in "Proper Termination of Threads" earlier in this article).

There is another Timer class available in the .NET Compact Framework. The "System.Windows.Forms" namespace contains a Timer class that has different methods than the Timer class that exists in "System.Threading". The Timer class in "System.Windows.Forms" simply generates WM_TIMER messages every time the timer expires. It does not use a separate thread for execution, but it uses underlying Windows messages. This timer is simple to use and is effective for periodic UI updates. However, it is inaccurate. If you need a high degree of accuracy, you should use the "System.Threading" timer.
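To sketch the System.Threading.Timer usage described above (the names appTimer and OnTimer are assumptions, not part of the downloadable sample): the constructor takes a TimerCallback, a state object, a due time, and a period, and the timer must be disposed before the application closes.

```csharp
using System;
using System.Threading;

class TimerDemo
{
    static Timer appTimer;

    static void Main()
    {
        // Fire the callback after 100 ms, then every 100 ms.
        appTimer = new Timer(new TimerCallback(OnTimer), null, 100, 100);

        Thread.Sleep(1000);      // Let the timer run for a while

        // Dispose the timer so its ThreadPool thread does not keep
        // running after the application is supposed to close.
        appTimer.Dispose();
    }

    static void OnTimer(object state)
    {
        // Runs on a ThreadPool thread every time the timer expires.
        Console.WriteLine("Tick at " + Environment.TickCount);
    }
}
```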

The downloadable samples for this document contain a demonstration of the behavior of the different Timer classes.

The Using Timers example, shown in Figure 3, creates instances of both Timer classes and sets the interval for both timers to 100 milliseconds.

Figure 3. Accuracy in timers.

The "Using Timers" application measures the average timer interval for both timers. The application generates extra processing activity when you click the Processing on Form button. When you click that button, the difference in behavior of the different timers becomes clear. The Windows.Forms.Timer object calls its delegate only when a WM_TIMER message is read from the Windows message queue. At any time, there can be no more than one WM_TIMER message pending in the message queue. When the application is busy processing, WM_TIMER messages may be combined, thus missing timer ticks. That is exactly the behavior shown in Figure 3. Because the Threading.Timer object runs on a separate thread, it runs whenever the timer interval expires. In the downloadable sample, the Threading.Timer object maintains accuracy because the priority of the main thread is lowered temporarily when the user clicks the Processing on Form button.

User Interface Controls Inside Worker Threads

A common mistake that many developers make is trying to update or access user interface controls directly from within worker threads. This action results in unexpected behavior; frequently, the application stops responding.

The following code is a slightly modified version of the first code example in this article. The user interface of this example is identical to that shown earlier in Figure 1; therefore, you should look only at the button click event handlers and the code of the worker thread.

    private void button1_Click(object sender, System.EventArgs e)
    {
      button1.Enabled = false;
      button2.Enabled = true;
      workerThreadDone = false;
      Thread myThread = new Thread(new ThreadStart(MyWorkerThread));
      myThread.Start();
    }

    private void button2_Click(object sender, System.EventArgs e)
    {
      workerThreadDone = true;
      button2.Enabled = false;
      button1.Enabled = true;
    }

    private void MyWorkerThread()
    {
      label1.Text = "Worker Thread started";
      while (! workerThreadDone)
      {
        // simulate some processing
        Thread.Sleep(1000);
      }
      label1.Text = "Worker Thread terminated";
    }

You can find the complete sample in the ThreadDemo2 folder after you install the downloadable samples. When you run ThreadDemo2 and create/terminate a worker thread several times, the application stops responding after a seemingly random number of threads have been created and terminated. The buttons don't give visual feedback anymore, and you aren't able to close the application. This is a simple example of one of the most frequent errors in the design of multithreaded applications. To avoid the error, remember the following rule: Only the thread that creates a UI control can safely update that control.

If you need to update a control inside a worker thread, you should always use the Control.Invoke method. This method executes a specified delegate on the thread that owns the control's underlying window handle; in other words, the thread that created the control. The .NET Compact Framework 1.0 supports only synchronous updates of user interface controls from within worker threads. As another limitation, Control.Invoke expects a delegate that is an instance of EventHandler.

To make sure that the ThreadDemo2 application runs without problems, you need to modify its worker thread to use Control.Invoke instead of updating the UI control directly. Calling the Control.Invoke method transfers control to the thread that created the control and executes a delegate on that thread. To modify the previous code example, you first create an additional method that will update the label by setting its Text property:

    private void UpdateLabel(object sender, EventArgs e)
    {
      label1.Text = textToShow;
    }

As you can see, this method has exactly the same signature as an event handler (compare UpdateLabel, for instance, with button1_Click in the downloadable sample code).

Because you can't pass parameters, the code example uses an instance variable called textToShow. Before calling UpdateLabel, the caller sets textToShow. For this sample, this action is safe to do, because only one worker thread is available, and only that worker thread updates textToShow.

    private void MyWorkerThread()
    {
      textToShow = "Worker Thread started";
      this.Invoke(new EventHandler(UpdateLabel));
      while (! workerThreadDone)
      {
        // simulate some processing
        Thread.Sleep(1000);
      }
      textToShow = "Worker Thread terminated";
      this.Invoke(new EventHandler(UpdateLabel));
    }

In this modified version of MyWorkerThread, you set the variable and then call Invoke on the form. Because the form manages the label on it and both were created on the same thread, this action results in executing the UpdateLabel method in that particular thread. You simply pass a new EventHandler delegate to Invoke, in this case the UpdateLabel method. The complete source code for this sample is available in ThreadDemo3.

To simplify the use of Control.Invoke and to allow passing of parameters to a delegate that needs to update UI controls, you can use a QuickStart sample.

You need to use Control.Invoke not only to update UI controls inside worker threads you created yourself, but also when you use ThreadPool threads — thus inside the WaitCallback delegate you pass to ThreadPool.QueueUserWorkItem. Because System.Threading.Timer timers also run in a ThreadPool thread, this is true for all UI updates you want to make in the TimerCallback delegate that you pass to the constructor of a System.Threading.Timer object.
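The rule applies inside a TimerCallback exactly as in a hand-created worker thread. The following fragment of a form class is a sketch (statusTimer, statusText, and OnStatusTimer are illustrative names; it assumes a label1 control and follows the UpdateLabel pattern shown earlier):

```csharp
// Inside the form class; statusTimer and statusText are assumed fields.
private System.Threading.Timer statusTimer;
private string statusText;

private void StartStatusTimer()
{
    statusTimer = new System.Threading.Timer(
        new TimerCallback(OnStatusTimer), null, 0, 1000);
}

private void OnStatusTimer(object state)
{
    // This code runs on a ThreadPool thread, so the label must not
    // be touched directly; marshal the update to the UI thread.
    statusText = "Tick at " + Environment.TickCount;
    this.Invoke(new EventHandler(UpdateLabel));
}

private void UpdateLabel(object sender, EventArgs e)
{
    label1.Text = statusText;   // Safe: runs on the UI thread
}
```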

Possible Deadlock Situations

It is important to realize that Control.Invoke works synchronously. Suppose that you want to change the code of the preceding code example to get an exact indication of thread termination. One method is to create an AutoResetEvent event, set this event as the last statement in the worker thread, and wait in your main thread until the event has been set. After the event has been set, you know for sure that your worker thread has terminated.

However, this common way in the .NET Compact Framework to wait until a thread has terminated might lead to unexpected problems when you are using Control.Invoke. Remember that Control.Invoke executes a delegate on the thread that created the control (in the downloadable sample, the main thread). Now, look at the following code. The worker thread has not been changed much, but as a last operation, you set an event.

    private void MyWorkerThread()
    {
      textToShow = "Worker Thread started";
      this.Invoke(new EventHandler(UpdateLabel));
      while (! workerThreadDone)
      {
        // Simulate some processing
        Thread.Sleep(1000);
      }
      textToShow = "Worker Thread terminated";
      this.Invoke(new EventHandler(UpdateLabel));
      workerThreadTerminated.Set();
    }

You terminate the thread by setting workerThreadDone to true in the button2_Click event handler. Because you want to know exactly when the worker thread terminates, inside the button2_Click event handler, you will wait until the termination event has been set. The code looks like the following.

    private void button2_Click(object sender, System.EventArgs e)
    {
      workerThreadDone = true;
      button2.Enabled = false;
      workerThreadTerminated.WaitOne();
      button1.Enabled = true;
    }

Simply adding the AutoResetEvent event and waiting for it in the button2_Click event handler now results in a deadlock situation. The reason is that the worker thread is most likely in a sleeping state when the user clicks the Terminate Thread button. At that time, workerThreadDone is set to true, and the main thread blocks inside the workerThreadTerminated.WaitOne() method until the worker thread sets the corresponding workerThreadTerminated event. Before the worker thread can set the event, it attempts to execute a Control.Invoke() method. Remember that Control.Invoke executes on the main thread. However, the main thread is blocked because it is waiting for the worker thread to set an event, so Control.Invoke cannot execute. Because Control.Invoke is a synchronous operation, it returns only when the delegate passed to Control.Invoke has finished executing. Therefore, there is no way for the worker thread to continue and set the event that indicates to the main thread that it has terminated.

This is a typical deadlock situation. In this case, the solution is simple, as you can see for yourself if you modify the sample code in ThreadDemo4. Simply remove the Control.Invoke statement in the worker thread under the while loop. If you still want a visual indication that the worker thread has finished, you can simply set the Text property of the label in the button2_Click event handler, immediately below the workerThreadTerminated.WaitOne() method, because you know for sure that the worker thread is finished after you return from that method.
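With the second Control.Invoke removed from the worker thread, the deadlock-free handler can update the label itself after the wait returns, along the lines of this sketch:

```csharp
private void button2_Click(object sender, System.EventArgs e)
{
    workerThreadDone = true;
    button2.Enabled = false;
    // Safe to block here only because the worker thread no longer
    // calls Control.Invoke after its while loop.
    workerThreadTerminated.WaitOne();
    // The worker thread is guaranteed to be finished at this point,
    // and this handler already runs on the UI thread.
    label1.Text = "Worker Thread terminated";
    button1.Enabled = true;
}
```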

Thread Synchronization

All threads in a multithreaded application run autonomously unless you pay special attention to thread synchronization. Especially in cases where multiple threads access shared data, you must synchronize access of that data.

The following example demonstrates the need for synchronization between threads. (You can find the complete sample in the ThreadDemo5 folder after you install the downloadable samples.) The application starts two different worker threads. Both worker threads access the same variable. One thread simply increments the variable, whereas the other thread decrements it. Each thread loops 10 times. When both worker threads finish executing, the value of the variable should be zero. The end result is shown in a label. As Figure 4 shows, the user interface of the application is, again, as simple as possible.

Figure 4. Thread synchronization in action.

When you click the Start Demo button, you create and start two worker threads. When both worker threads finish executing, the value of the counter appears. The following code is the source code for both worker threads. Note that both worker threads indicate that they are finished by means of a delegate that the application calls by using Control.Invoke.

    private void Thread1Function()
    {
      for (int i = 0; i < 10; i++)
      {
        counter++;
      }
      thread1Running = false;
      this.Invoke(new EventHandler(ThreadsFinished));
    }

    private void Thread2Function()
    {
      for (int i = 0; i < 10; i++)
      {
        counter--;
      }
      thread2Running = false;
      this.Invoke(new EventHandler(ThreadsFinished));
    }

    private void ThreadsFinished(object sender, System.EventArgs e)
    {
      if (! thread1Running && ! thread2Running)
      {
        button1.Enabled = true;
        label1.Text = "The value of counter = " + counter.ToString();
      }
    }

Listing 11 - Multiple threads accessing the same data

Running the application several times always results in the display of the correct value of the counter. However, suppose that you change the application in such a way that each worker thread loops 1 million times instead of 10 times. If you run the application now, the end value of the counter will sometimes still be zero, but often it will seem to be a random number. The reason is that both threads run at the same priority: they get equal amounts of processor time and execute in a round-robin fashion.

Even though accessing the counter in the worker threads seems to be a single statement (counter++ or counter--), if you look in the generated Microsoft intermediate language (MSIL) in Figure 5, you will see that this single C# statement consists of several lines of MSIL code.

Figure 5. Intermediate language for Thread1Function.

The highlighted lines of MSIL code help explain the problem. Both threads of the ThreadDemo5 sample access the same variable: counter. The Windows CE operating system is responsible for giving both threads equal amounts of time to execute. Now imagine that one thread is running in the highlighted code of Figure 5, and the operating system schedules that thread out of the processor just after it has read the current value of counter (that is, just after executing line IL_0006 in Figure 5). The other thread is then scheduled into the processor, reads the counter as well, modifies it, and stores the updated value back into memory. When the first worker thread resumes running, it continues exactly where it stopped; it does not read the counter again. It continues at the next line of code, IL_000b, where it increments its local copy of the counter variable and then stores that value back into memory. With that, all the changes that the other worker thread made to counter are destroyed, which results in unexpected values of the counter after both worker threads terminate.

Accessing Data from Multiple Threads

To prevent the previous situation, you should protect the data that both worker threads are accessing. For simple increment, decrement, and compare operations, you can protect the data by using the Interlocked class. This class has methods to increment and decrement integers, and it guarantees that these operations are atomic and therefore cannot be interrupted partway through a C# increment or decrement operator. To use Interlocked, you simply change the code of both worker threads to the following.

    private void Thread1Function()
    {
      for (int i = 0; i < 1000000; i++)
      {
        Interlocked.Increment(ref counter);
      }
      thread1Running = false;
      this.Invoke(new EventHandler(ThreadsFinished));
    }

    private void Thread2Function()
    {
      for (int i = 0; i < 1000000; i++)
      {
        Interlocked.Decrement(ref counter);
      }
      thread2Running = false;
      this.Invoke(new EventHandler(ThreadsFinished));
    }

The preceding code generates the MSIL shown in Figure 6. This MSIL now uses a call to a method of the Interlocked class to do the incrementing, and the Interlocked class itself guarantees that the operation will be atomic. You might wonder, though, why two statements are still involved (see the highlighted lines). Because you pass a reference to the original variable (counter, in this case), problems do not occur. A reference simply points to the location of the original variable instead of taking a local copy of it, so even if there were a thread switch between the two highlighted statements, there would still be no chance of accidentally updating the variable with an incorrect value.


Figure 6. Intermediate language when using Interlocked.

The complete source code for this sample is available in ThreadDemo6.
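Beyond Increment and Decrement, the Interlocked class also offers an atomic Exchange method. The following standalone console sketch (not part of the ThreadDemo samples) illustrates the behavior of these methods; it assumes the overloads taking a ref int, which are available in the .NET Framework.

```csharp
using System;
using System.Threading;

class InterlockedDemo
{
    static int counter = 0;

    static void Main()
    {
        Interlocked.Increment(ref counter);   // counter: 0 -> 1
        Interlocked.Increment(ref counter);   // counter: 1 -> 2
        Interlocked.Decrement(ref counter);   // counter: 2 -> 1

        // Exchange atomically stores a new value and returns the old one.
        int previous = Interlocked.Exchange(ref counter, 10);

        Console.WriteLine(previous);  // 1
        Console.WriteLine(counter);   // 10
    }
}
```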

Monitor and Mutex Objects

Imagine the following scenario. An application creates two worker threads, and both worker threads use methods of another class, called Processing. Those methods are updating instance data of the class. Both worker threads have access to all methods in the Processing class. The Processing class has two methods that each update a counter value in a loop. Both functions loop the same number of times. The following two methods are in the Processing class.

    public void Function1()
    {
      for (int i = 0; i < nrLoops; i++)
      {
        counter1++;
      }
    }

    public void Function2()
    {
      for (int i = 0; i < nrLoops; i++)
      {
        counter2++;
      }
    }

The worker threads are completely free to update data at times that they determine to be appropriate. They both call Processing.Function1 and Processing.Function2, but in opposite order. After both threads are finished, you ask the Processing class for the result (simply returning counter1 - counter2). A correct return value would be zero, but if you don't think about synchronization, unexpected results might be returned when the nrLoops value is very large. (See the previous example for an explanation of this behavior.) Because you want to locally protect the data in the Processing class (in other words, make it thread safe), you have to use synchronization objects.

When you want to protect more than simple integer updates (although, for the sake of simplicity, integers are all these examples touch), you have the Monitor and Mutex classes available. Both Monitor and Mutex protect regions of code that can be safely executed by only one thread at a time. Using these synchronization objects adds some responsibility for you as a developer. Before accessing the region of code that needs to be protected, you call either Mutex.WaitOne or Monitor.Enter, depending on which synchronization object you are using. These methods give you immediate access to the protected region of code when no other thread is executing that particular code. However, when another thread is already executing that code, Mutex.WaitOne and Monitor.Enter block the requesting thread until the thread currently executing the protected region calls Mutex.ReleaseMutex or Monitor.Exit, respectively.

Omitting the release of a previously obtained Monitor or Mutex might lead to unexpected results, including deadlocks. Therefore, it is extremely important to use these objects properly. In cases where protected regions of code might catch exceptions, be sure to release the synchronization object in a finally block. Even if you don't expect exceptions to be thrown, it is still good practice to surround the actual functionality and the release of the synchronization object with a try/finally construction. The following code shows the same methods of the Processing class, protected by Mutex objects in this example:

    private Mutex function1Mutex;
    private Mutex function2Mutex;

    public Processing()
    {
      function1Mutex = new Mutex();
      function2Mutex = new Mutex();
    }

    public void Function1()
    {
      function1Mutex.WaitOne();
      try
      {
        for (int i = 0; i < nrLoops; i++)
        {
          counter1++;
        }
      }
      finally
      {
        function1Mutex.ReleaseMutex();
      }
    }

    public void Function2()
    {
      function2Mutex.WaitOne();
      try
      {
        for (int i = 0; i < nrLoops; i++)
        {
          counter2++;
        }
      }
      finally
      {
        function2Mutex.ReleaseMutex();
      }
    }

The following code shows the same methods protected by Monitor. In this example, the Monitor lock protects the entire Processing instance.

    public void Function1()
    {
      Monitor.Enter(this);
      try
      {
        for (int i = 0; i < nrLoops; i++)
        {
          counter1++;
        }
      }
      finally
      {
        Monitor.Exit(this);
      }
    }

    public void Function2()
    {
      Monitor.Enter(this);
      try
      {
        for (int i = 0; i < nrLoops; i++)
        {
          counter2++;
        }
      }
      finally
      {
        Monitor.Exit(this);
      }
    }

In the code example of ThreadDemo7, the entire Processing class instance is protected by a Monitor object, and the individual methods of the Processing class are protected by separate Mutex objects. It is very important to limit the scope of the protected area. To see this in action, you need to execute ThreadDemo7 and set the values in the application according to Figure 7.


Figure 7. Sample use of Mutex and Monitor.

Click Start Demo, and you will see that the underlying function takes approximately eight seconds to finish. Selecting Use Mutex executes the underlying function twice as fast.

Because you introduced a sleeping time in each method of Processing, the worker thread calling into Processing starts sleeping, allowing another thread in the system to run. However, when the entire Processing class is protected with Monitor, the other worker thread cannot run during that sleeping time, even though it theoretically could, because it also needs access to the Monitor object.

When you protect both functions of Processing by using separate Mutex objects, both threads can execute seemingly simultaneously.

By looking at ThreadDemo7, you may get the idea that the Monitor class is inferior to the Mutex class; this is absolutely not true. It is possible to limit the scope of the protected area when using Monitor as well. In that case, you need separate synchronization objects, one per method of Processing, on which to put a Monitor lock. The behavior of Mutex and Monitor is then nearly the same, although their internal implementations in the .NET Compact Framework are different. In the full .NET Framework, Mutex can be used for data protection across different processes; however, that requires a named Mutex, which is not implemented in the .NET Compact Framework. The following code shows how to limit the scope of the protected area when using Monitor.

    private Object function1Protector = new object();
    private Object function2Protector = new object();

    public void Function1()
    {
      Monitor.Enter(function1Protector);
      try
      {
        for (int i = 0; i < nrLoops; i++)
        {
          counter1++;
        }
      }
      finally
      {
        Monitor.Exit(function1Protector);
      }
    }

    public void Function2()
    {
      Monitor.Enter(function2Protector);
      try
      {
        for (int i = 0; i < nrLoops; i++)
        {
          counter2++;
        }
      }
      finally
      {
        Monitor.Exit(function2Protector);
      }
    }
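As an aside (not shown in the ThreadDemo7 sample), C# offers the lock statement as shorthand for exactly this pattern: the compiler expands lock(obj) into Monitor.Enter(obj) followed by a try/finally that calls Monitor.Exit(obj). The following is a minimal, self-contained sketch; the Counter1 property and the small nrLoops value are added here only to make the example runnable.

```csharp
using System;
using System.Threading;

class Processing
{
    private readonly object function1Protector = new object();
    private int counter1;
    private int nrLoops = 1000;

    public void Function1()
    {
        // Equivalent to Monitor.Enter/Monitor.Exit in a try/finally.
        lock (function1Protector)
        {
            for (int i = 0; i < nrLoops; i++)
            {
                counter1++;
            }
        }
    }

    public int Counter1
    {
        // Reads take the same lock so they never observe a torn update.
        get { lock (function1Protector) { return counter1; } }
    }

    static void Main()
    {
        Processing p = new Processing();
        p.Function1();
        Console.WriteLine(p.Counter1);  // 1000
    }
}
```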

Events

In a managed multithreaded environment, an event is an object that can be used to signal a thread that something has happened. In the context of this article, events are synchronization objects — not to be confused with UI events that are associated with event handlers. UI events are sent to an application as a result of a user action. For example, when a user clicks a button, a Click event will be sent that can be handled in a button_Click event handler. These types of events are not discussed in this article. This article focuses on event objects that can be found in the System.Threading namespace and that are used to coordinate actions between different threads.

In the .NET Compact Framework, two different types of event objects are available: AutoResetEvent and ManualResetEvent. The main difference is that when an AutoResetEvent is set to the signaled state, it is immediately reset to the nonsignaled state when a thread waiting for the event is released. ManualResetEvent remains in the signaled state until it is explicitly reset to the nonsignaled state.
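This difference can be sketched with a few WaitOne calls using a zero timeout. The sketch below is a standalone console program; it assumes the WaitOne(int, bool) overload of the full .NET Framework (the .NET Compact Framework 1.0 supports only the parameterless WaitOne).

```csharp
using System;
using System.Threading;

class EventSemantics
{
    static void Main()
    {
        // A ManualResetEvent stays signaled until it is reset explicitly,
        // so repeated waits all succeed.
        ManualResetEvent manual = new ManualResetEvent(true);
        Console.WriteLine(manual.WaitOne(0, false));  // True
        Console.WriteLine(manual.WaitOne(0, false));  // True

        // An AutoResetEvent reverts to nonsignaled as soon as it releases
        // one waiter, so the second wait times out.
        AutoResetEvent auto = new AutoResetEvent(true);
        Console.WriteLine(auto.WaitOne(0, false));    // True
        Console.WriteLine(auto.WaitOne(0, false));    // False
    }
}
```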

To see the difference between the two different types of events in action, you can look at the sample code of ThreadDemo8. In this application, one worker thread wakes up when an event is set. You set the event from the user interface by clicking a button. In the main form of the application, you can choose between AutoResetEvent and a ManualResetEvent. Each time you set the event, the number of times the worker thread executed appears in a label. Figure 8 shows the user interface of ThreadDemo8.

Note   Always update user interface controls from within the UI thread. Directly updating UI controls from a worker thread leads to unexpected results, unless you use Control.Invoke as previously discussed.


Figure 8. AutoResetEvent and ManualResetEvent in action.

The downloadable sample code has some logic to differentiate between AutoResetEvent and ManualResetEvent. It also has some functionality to enable and disable relevant buttons. The explanation concentrates on the worker thread and the behavior of both event classes. The following code example shows the worker thread.

    private void MyWorkerThread()
    {
      WaitHandle nextEvent = useAutoResetEvent ?
          (WaitHandle)runOnceEvent : (WaitHandle)runManyEvent;

      while (! workerThreadDone)
      {
        nextEvent.WaitOne();
        if (! workerThreadDone)
        {
          counter++;
          Thread.Sleep(10);
        }
      }
      // Simulate long termination time
      Thread.Sleep(1000);
      workerThreadTerminated.Set();
    }

The first statement of the worker thread assigns either the runOnceEvent event (an instance of AutoResetEvent) or the runManyEvent event (an instance of ManualResetEvent) to a local WaitHandle object. AutoResetEvent and ManualResetEvent are both derived from WaitHandle. Making use of this fact, you can simply wait in your worker thread for a WaitHandle object to be set. If it were possible to pass parameters to a worker thread (the .NET Compact Framework 1.0 does not support this), you would pass it a WaitHandle object, making the worker thread completely unaware of the type of event it is waiting for. To mimic that, you use a local WaitHandle object and assign the correct instance variable to it.

The thread itself is extremely simple. Every time it returns from the WaitOne method (which happens when an event is signaled), a counter is incremented and the thread sleeps for 10 milliseconds. If you select AutoResetEvent on the user interface, it becomes clear that the worker thread runs one time for each click on the Set Event button. If you change to ManualResetEvent by clearing the check box and then start the worker thread again, the behavior will be different. After you click Set Event, the worker thread starts running and continues to do so until you click the Reset Event button.

A typical scenario in which events are used is the situation where one thread has to inform another thread that something has happened, after which it is safe to continue processing for the waiting thread. For instance, suppose you want to receive data through a serial port for further processing in the application. Using AutoResetEvent, you might have one thread reading raw data from the serial port (data-receiving thread). It stores the raw data until a special character has been received; for instance, one that indicates the end of a message. At that time, the stored data is ready for further processing by another thread, the data-processing thread. The data-receiving thread sets AutoResetEvent, which the data-processing thread is waiting for. The data-processing thread can now empty the buffer.

It is a design decision to either set another event after the data-processing thread is ready, allowing the data-receiving thread to use the buffer again, or protect the buffer by using Monitor or Mutex. It is also possible to use a double buffering technique, which allows the raw data collection thread to continue processing.

In a similar scenario, you can use ManualResetEvent when access to a buffer can be granted for a longer time — for instance, if the data-processing thread reads buffered data one character at a time and processes them on an individual basis. As long as the data-receiving thread is not using the buffer, it can keep ManualResetEvent set. After the data-receiving thread needs to update the buffer, it resets ManualResetEvent, thus blocking the data-processing thread.
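The serial-port scenario above can be sketched as follows. This is a hypothetical illustration, not code from the downloadable samples; the MessagePump class and its member names are invented. A receiving thread appends characters to a buffer and sets an AutoResetEvent when an end-of-message character arrives, releasing the processing thread exactly once per message; a lock protects the shared buffer during the handoff.

```csharp
using System;
using System.Text;
using System.Threading;

class MessagePump
{
    private readonly AutoResetEvent messageReady = new AutoResetEvent(false);
    private readonly object bufferLock = new object();
    private StringBuilder buffer = new StringBuilder();
    private string lastMessage;

    // Called by the data-receiving thread for each incoming character.
    public void OnCharReceived(char c)
    {
        if (c == '\n')
        {
            lock (bufferLock)
            {
                lastMessage = buffer.ToString();
                buffer.Length = 0;      // empty the buffer for reuse
            }
            messageReady.Set();         // release the processing thread once
        }
        else
        {
            lock (bufferLock)
            {
                buffer.Append(c);
            }
        }
    }

    // Called by the data-processing thread; blocks until a message is ready.
    public string WaitForMessage()
    {
        messageReady.WaitOne();
        lock (bufferLock)
        {
            return lastMessage;
        }
    }

    static void Main()
    {
        MessagePump pump = new MessagePump();
        // Simulate the receiving side on this thread for brevity.
        string incoming = "HELLO\n";
        for (int i = 0; i < incoming.Length; i++)
        {
            pump.OnCharReceived(incoming[i]);
        }
        Console.WriteLine(pump.WaitForMessage());  // HELLO
    }
}
```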

The last statement of the worker thread in the preceding code shows another use for AutoResetEvent. The last action the worker thread performs is setting an AutoResetEvent; in that way, it informs the application that it has terminated.

Note   In the full .NET Framework, and in the .NET Compact Framework 2.0, you can use the Thread.Join method to know when a worker thread is terminated.

Using an AutoResetEvent is a common way in the .NET Compact Framework 1.0 to wait until a thread has terminated. In the following code example, you first have to make sure that the while loop of the worker thread ends by setting the Boolean variable workerThreadDone to true. Because the worker thread waits indefinitely inside the while loop for the event to be set (see the previous code example), you must set the event one more time to force the worker thread to run and check the value of workerThreadDone. This is also why you must check workerThreadDone explicitly inside the while loop after the event has been set. If you do not set the AutoResetEvent one more time, the worker thread stays stuck in the WaitOne method. The code responsible for properly terminating the worker thread is in a button click event handler and looks like the following.

    private void button2_Click(object sender, System.EventArgs e)
    {
      label1.Text = "";
      workerThreadDone = true;

      if (useAutoResetEvent)
      {
        runOnceEvent.Set();
      } 
      else
      {
        runManyEvent.Set();
      }

      workerThreadTerminated.WaitOne();

      button3.Enabled = false;
      button2.Enabled = false;
      button1.Enabled = true;
      checkBox1.Enabled = true;
    }

When the workerThreadTerminated event is set, you are certain that your worker thread has terminated, and you can take further action. However, there is a theoretical possibility that the worker thread is not completely finished. It might need to do some termination processing after the event is set. For example, the highlighted statements in the generated MSIL in Figure 9 indicate that there is still some code to run after setting the event. There might also be some finalizing code running to terminate the thread, although the generated MSIL does not indicate it (this all depends on the inner workings of the Thread class).


Figure 9. Intermediate language events example.

Thread Safe Classes

When you are designing a multithreaded application, you always have to be aware of the fact that multiple threads can access the same resources at seemingly random times. In cases where those resources should be accessed only in a controlled way, you need to use appropriate synchronization mechanisms. This is an important design issue, but it would be bad practice to make all classes completely thread safe. Accessing synchronization objects is time consuming, and because synchronization objects can block other threads that attempt to access the same resource, overall application performance can be influenced dramatically. Therefore, you should make only certain classes thread safe: those that can be accessed by multiple threads and that need protection against that, exactly as is done in the .NET Compact Framework itself.
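The .NET class libraries illustrate this principle: collection classes are not thread safe by default, and callers opt in to synchronization only where they need it. The sketch below uses the full .NET Framework's Queue.Synchronized wrapper and SyncRoot property (this is an illustration of the design principle; availability of these members on the .NET Compact Framework should be verified against its documentation).

```csharp
using System;
using System.Collections;

class SelectiveSafety
{
    static void Main()
    {
        // A plain Queue carries no locking overhead; use it when only
        // one thread touches the instance.
        Queue plain = new Queue();
        plain.Enqueue("work item");

        // Opt in to thread safety only where it is actually needed.
        Queue shared = Queue.Synchronized(plain);
        Console.WriteLine(shared.IsSynchronized);  // True

        // Enumeration is still not atomic; lock explicitly around it.
        lock (shared.SyncRoot)
        {
            foreach (object item in shared)
            {
                Console.WriteLine(item);  // work item
            }
        }
    }
}
```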

Conclusion

A multithreaded application is more difficult to create than a single-threaded application. You need to take special care when multiple threads attempt to access the same data. Another area of attention is the way user interface controls must be updated from within worker threads. However, a multithreaded application is often more user friendly, because the application remains responsive while performing background computations.

The following table lists the samples that this article refers to.

Table 1. Overview of the Downloadable Samples

ThreadDemo0 Demonstration of the need for multiple threads
ThreadDemo1 A simple multithreaded application
ThreadDemo2 The wrong way to update UI controls from within worker threads
ThreadDemo3 Use of Control.Invoke to update UI controls from within worker threads
ThreadDemo4 Potential deadlock because of Control.Invoke
ThreadDemo5 Simultaneous data updates by multiple threads
ThreadDemo6 Protecting data by means of Interlocked
ThreadDemo7 Use of Monitor and Mutex for thread synchronization
ThreadDemo8 Use of events for thread synchronization
Timers Demonstration of the use of System.Windows.Forms.Timer and System.Threading.Timer
ThreadPool Demonstration of the use of ThreadPool