Windows Thread Synchronization 2






Process and Thread Revisited


Processes are used to separate the different applications that are executing at a given time on a single computer. The operating system does not execute processes; it executes threads. A thread is a unit of execution: the operating system allocates processor time to a thread so that it can carry out its tasks. A single process can contain multiple threads of execution. Each thread maintains its own exception handlers, scheduling priority, and a set of structures that the operating system uses to save the thread's context if the thread cannot complete its execution within the time it was assigned to the processor. The context is held until the next time the thread receives processor time, and it includes all the information the thread requires to seamlessly continue its execution, such as the thread's set of processor registers and its call stack inside the address space of the host process.




In programming, an atomic action is one that effectively happens all at once. An atomic action cannot stop in the middle: it either happens completely, or it does not happen at all. No side effects of an atomic action are visible until the action is complete. We have already seen that an increment expression, such as i++, does not describe an atomic action. Even very simple expressions can define complex actions that can be decomposed into other actions. However, there are actions that you can count on to be atomic:


  1. Reads and writes are atomic for reference variables and for most primitive variables (all types except long and double).
  2. Reads and writes are atomic for all variables declared volatile (including long and double variables).


Atomic actions cannot be interleaved, so they can be used without fear of thread interference. However, this does not eliminate all need to synchronize atomic actions, because memory consistency errors are still possible. Using volatile variables reduces the risk of memory consistency errors, because any write to a volatile variable establishes a happens-before relationship with subsequent reads of that same variable. This means that changes to a volatile variable are always visible to other threads. What's more, it also means that when a thread reads a volatile variable, it sees not just the latest change to the volatile, but also the side effects of the code that led up to the change.

The volatile keyword is a type qualifier used to declare that an object can be modified in the program by something such as the operating system, the hardware, or a concurrently executing thread. Specific to the Microsoft compiler, objects declared as volatile are excluded from certain optimizations because their values can change at any time. The system always reads the current value of a volatile object at the point it is requested, even if a previous instruction asked for a value from the same object. Likewise, the value of the object is written to memory immediately on assignment.

Also, when optimizing, the compiler must maintain ordering among references to volatile objects as well as references to other global objects. In particular:


  1. A write to a volatile object (volatile write) has Release semantics; a reference to a global or static object that occurs before a write to a volatile object in the instruction sequence will occur before that volatile write in the compiled binary.
  2. A read of a volatile object (volatile read) has Acquire semantics; a reference to a global or static object that occurs after a read of volatile memory in the instruction sequence will occur after that volatile read in the compiled binary.


This allows volatile objects to be used for memory locks and releases in multithreaded applications. Using simple atomic variable access is more efficient than accessing these variables through synchronized code, but requires more care by the programmer to avoid memory consistency errors.


Windows Thread States


While a process must have at least one thread of execution, it can create other threads to execute tasks in parallel. Threads share the process environment, so multiple threads within the same process use fewer resources than the same number of separate processes. According to the Win32 (unmanaged) documentation, a thread can be in one of the following states:


  1. Initialized - it is recognized by the microkernel.
  2. Ready - it is prepared to run on the next available processor.
  3. Running - it is executing.
  4. Standby - it is about to run; only one thread may be in this state at a time.
  5. Terminated - it is finished executing.
  6. Waiting - it is not ready for the processor; when ready, it will be rescheduled.
  7. Transition - it is waiting for resources other than the processor.
  8. Unknown - the thread state is unknown.


The following Figure shows the Windows thread states.


[Figure: Windows thread states]


The current operating condition (execution state) of the thread can be one of the following:


  1. Unknown
  2. Other
  3. Ready
  4. Running
  5. Blocked
  6. Suspended Blocked
  7. Suspended Ready


From the managed (.NET) documentation, a thread is always in at least one of the possible states in the ThreadState enumeration, and can be in multiple states at the same time. The ThreadState enumeration members are:


Member name - Description


Running - The thread has been started, it is not blocked, and there is no pending ThreadAbortException.


StopRequested - The thread is being requested to stop. This is for internal use only.


SuspendRequested - The thread is being requested to suspend.


Background - The thread is being executed as a background thread, as opposed to a foreground thread. This state is controlled by setting the Thread.IsBackground property.


Unstarted - The Thread.Start method has not been invoked on the thread.


Stopped - The thread has stopped.


WaitSleepJoin - The thread is blocked. This could be the result of calling Thread.Sleep or Thread.Join, of requesting a lock - for example, by calling Monitor.Enter or Monitor.Wait - or of waiting on a thread synchronization object such as ManualResetEvent.


Suspended - The thread has been suspended.


AbortRequested - The Thread.Abort method has been invoked on the thread, but the thread has not yet received the pending System.Threading.ThreadAbortException that will attempt to terminate it.


Aborted - The thread state includes AbortRequested and the thread is now dead, but its state has not yet changed to Stopped.


The ThreadState enumeration defines the set of all possible execution states for threads. Once a thread is created, it is in at least one of these states until it terminates. Threads created within the common language runtime (CLR) are initially in the Unstarted state, while external threads that enter the runtime are already in the Running state. An Unstarted thread is transitioned into the Running state by calling Start(). Not all combinations of ThreadState values are valid; for example, a thread cannot be in both the Aborted and Unstarted states. The following table shows examples of actions that cause a change of state.




Action - Resulting ThreadState


A thread is created within the common language runtime. - Unstarted


A thread calls Start() on the new thread. - Unstarted


The thread starts running. - Running


The thread calls Sleep(). - WaitSleepJoin


The thread calls Wait() on another object. - WaitSleepJoin


The thread calls Join() on another thread. - WaitSleepJoin


Another thread calls Interrupt(). - Running


Another thread calls Suspend(). - SuspendRequested


The thread responds to a Suspend() request. - Suspended


Another thread calls Resume(). - Running


Another thread calls Abort(). - AbortRequested


The thread responds to an Abort() request. - Stopped


A thread is terminated. - Stopped



In addition to the states noted above, there is also the Background state, which indicates whether the thread is running in the background or foreground. A thread can be in more than one state at a given time. For example, if a thread is blocked on a call to Wait(), and another thread calls Abort() on the blocked thread, the blocked thread will be in both the WaitSleepJoin and AbortRequested states at the same time. In this case, as soon as the thread returns from the call to Wait() or is interrupted, it will receive the ThreadAbortException and begin aborting. In this tutorial we concentrate on the running state. When there is more than one thread, we need mechanisms to synchronize the threads so that all of them are served 'equally' by the processor(s) and all of them have fair access to shared, limited resources.




