Contents tagged with Threading

  • Getting Current Native Thread

The native OS threads running in the current process are exposed through the Threads property of the Process class. Please note that these are not the same as managed threads; they are the actual native threads running on the operating system.
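As a minimal sketch, the native threads of the current process can be enumerated like this (the output is illustrative):

```csharp
using System;
using System.Diagnostics;

class NativeThreadsDemo
{
    static void Main()
    {
        // Process.Threads exposes native OS threads, not managed threads
        Process current = Process.GetCurrentProcess();

        foreach (ProcessThread thread in current.Threads)
        {
            // ProcessThread.Id is the OS thread id; it is unrelated to
            // the ManagedThreadId of any managed thread
            Console.WriteLine("Native thread {0}, state: {1}",
                thread.Id, thread.ThreadState);
        }
    }
}
```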


  • Thread Synchronization in .NET

    This comes after my last post Multi-threading in .NET. There are several synchronization mechanisms you can use in your .NET applications, so choosing among them must be done wisely.

    First you must decide one thing:

    Do you need to synchronize threads in different processes on the same machine?

    If so, the available possibilities are:

• System.Threading.Semaphore: the number of concurrent entries is specified at construction time; all entries are treated equally; for cross-process use, the name of the synchronization object must also be specified when the semaphore is created;
• System.Threading.Mutex: equivalent to a semaphore with the maximum number of concurrent entries set to 1; the name of the synchronization object must be specified when the mutex is created;
• System.Threading.EventWaitHandle (or its better-known subclasses, System.Threading.ManualResetEvent and System.Threading.AutoResetEvent): the name of the synchronization object must be specified when the instance is created, as well as the reset policy; if set to auto, whenever the EventWaitHandle is signaled, only one thread waiting on it will be released.
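As a sketch of the cross-process case, two processes can serialize access to a shared resource through a named mutex (the name "Global\MyAppMutex" is an assumption for illustration):

```csharp
using System;
using System.Threading;

class NamedMutexDemo
{
    static void Main()
    {
        // The name makes the mutex visible to other processes on the machine;
        // a second process creating a Mutex with the same name gets the same
        // underlying OS object
        using (Mutex mutex = new Mutex(false, @"Global\MyAppMutex"))
        {
            mutex.WaitOne();          // blocks until no other process holds it
            try
            {
                Console.WriteLine("Inside the cross-process critical section");
            }
            finally
            {
                mutex.ReleaseMutex(); // always release, even on exceptions
            }
        }
    }
}
```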

    If you only want to synchronize threads inside a single process, you have more options, but first you must decide on this:

    Do you want all of your threads to be equally treated?

    If so, try these:

    • All of the above classes, but you can skip setting the name of the synchronization object;
• System.Threading.Monitor (best known through the lock keyword): only one thread at a time can hold the lock;
• The System.Runtime.CompilerServices.MethodImplAttribute attribute: when used with the System.Runtime.CompilerServices.MethodImplOptions.Synchronized option, it marks a method as synchronized, so only one thread at a time can run it.
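A minimal sketch of the attribute-based option (the class and method names are made up for illustration):

```csharp
using System;
using System.Runtime.CompilerServices;

class Counter
{
    private int count;

    // The runtime takes a lock around every call to this method,
    // so only one thread at a time can execute it
    [MethodImpl(MethodImplOptions.Synchronized)]
    public void Increment()
    {
        count = count + 1;
    }

    public int Count
    {
        get { return count; }
    }
}
```

Note that for instance methods this is equivalent to locking on the instance itself, so it shares the caveats of lock (this).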

Otherwise, the BCL has support for a classical synchronization scenario: reader-writer. In this case, not all accesses are equal: some threads require write access and some only require read access. There are two classes available for implementing this:

    • System.Threading.ReaderWriterLock: allows at any time either: 1) a single writer or 2) any number of readers;
    • System.Threading.ReaderWriterLockSlim: a more recent and optimized version of ReaderWriterLock; you should use this instead.

It is worth noting that, when reads greatly outnumber writes, reader-writer scenarios typically offer better performance than ones where every thread is treated the same and waits on the same exclusive lock.
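A minimal ReaderWriterLockSlim sketch (the shared dictionary and the class name are assumptions for illustration):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

class SharedCache
{
    private readonly ReaderWriterLockSlim rwLock = new ReaderWriterLockSlim();
    private readonly Dictionary<String, String> cache =
        new Dictionary<String, String>();

    // Any number of readers may hold the read lock at the same time
    public String Get(String key)
    {
        rwLock.EnterReadLock();
        try
        {
            String value;
            cache.TryGetValue(key, out value);
            return value;
        }
        finally
        {
            rwLock.ExitReadLock();
        }
    }

    // A writer gets exclusive access: no readers and no other writers
    public void Set(String key, String value)
    {
        rwLock.EnterWriteLock();
        try
        {
            cache[key] = value;
        }
        finally
        {
            rwLock.ExitWriteLock();
        }
    }
}
```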

    Happy multi-threading!




  • Multi-threading in .NET

    In .NET 2.0+, if you want to have something run on another thread, you have a number of choices:

    • The classical System.Threading.Thread class
    • The not-so-known System.Threading.ThreadPool
    • The System.ComponentModel.BackgroundWorker
    • Delegates invoked asynchronously

Let's see how we would do it in each case, and what the benefits are. Let's suppose our thread method looks like this:

    private void ThreadMethod(Object state)
    {
        //do something with the state
    }



    Using System.Threading.Thread

You create a new System.Threading.Thread instance and pass it a pointer to the thread method, wrapped either in a System.Threading.ParameterizedThreadStart delegate (for an optional state parameter) or in a System.Threading.ThreadStart delegate. Then you start it by calling Start:

    Thread thread = new Thread(state => ThreadMethod(state));
    thread.Start(state);


    As you can see, I am using the new .NET 3.5 lambda syntax for delegates.

If you want to wait for the thread to exit, call Join:

    thread.Join();

    Using System.Threading.ThreadPool

    Assuming your thread pool is properly configured (maximum number of threads, minimum number of threads before creating another one, etc), all you have to do is enqueue a method for execution, whenever a thread from the pool is free:

    Boolean result = ThreadPool.QueueUserWorkItem(ThreadMethod, state);

Or, if you must wait for the work item to complete, you can have the queued method signal an event handle when it is done:

    ManualResetEvent handle = new ManualResetEvent(false);
    ThreadPool.QueueUserWorkItem(s => { ThreadMethod(s); handle.Set(); }, state);
    handle.WaitOne();


By the way, QueueUserWorkItem always returns true when it succeeds; if the work item cannot be queued, an exception is thrown rather than false being returned.


    Using System.ComponentModel.BackgroundWorker

    You create an instance of the System.ComponentModel.BackgroundWorker class, add an event handler to its DoWork event, and start the processing by calling RunWorkerAsync, with an optional state parameter:

    BackgroundWorker worker = new BackgroundWorker();
    worker.DoWork += delegate(Object sender, DoWorkEventArgs args) { ThreadMethod(args.Argument); };
    worker.RunWorkerAsync(state);


Waiting for the worker to complete is accomplished through its RunWorkerCompleted event:

    ManualResetEvent handle = new ManualResetEvent(false);
    worker.RunWorkerCompleted += (sender, args) => handle.Set();
    handle.WaitOne();


    The System.ComponentModel.BackgroundWorker class is available from the Windows Forms toolbox, in design view, so you can drag it into your form, and change its properties or register events through the property inspector.


    Using Delegates

You declare a delegate that points to your method and call its BeginInvoke method, passing it the state parameter and, optionally, a callback method that gets called when the method terminates, plus a state argument for that callback:

    Action<Object> action = ThreadMethod;
    action.BeginInvoke(state, null, null);

And waiting is done by calling EndInvoke, which blocks until the method finishes:

    Action<Object> action = ThreadMethod;
    IAsyncResult result = action.BeginInvoke(state, null, null);
    action.EndInvoke(result);



    So, what is the difference between all these methods? Let's see:

• You would use the System.Threading.Thread class when you want your task to run right now; it is also the option to choose if you may need to suspend or abort it;
• System.Threading.ThreadPool helps preserve resources: no new threads are created; you just grab one from the pool, if one is available, or wait for one, so the job may not start immediately; you also don't have control over the actual thread that does the job;
• If you want feedback from a thread, use System.ComponentModel.BackgroundWorker. Its events allow you to report progress and completion status and to be notified when the task finishes. Typically you use it in a Windows Forms application. Since it internally uses the thread pool, jobs may not start immediately;
• Delegates are a quick way of launching a thread from a method; they also use the thread pool, so a task may take some time to actually start.

    In all cases, starting from .NET 2.0, the spawned threads retain the same principal (System.Threading.Thread.CurrentPrincipal property), which is great for access control.
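As a sketch of the principal flowing to a spawned thread (the identity name and role are assumptions for illustration):

```csharp
using System;
using System.Security.Principal;
using System.Threading;

class PrincipalFlowDemo
{
    static void Main()
    {
        // Set a principal on the main thread (the name "alice" is made up)
        Thread.CurrentPrincipal = new GenericPrincipal(
            new GenericIdentity("alice"), new[] { "Users" });

        Thread thread = new Thread(() =>
        {
            // The spawned thread sees the same principal as the thread
            // that started it
            Console.WriteLine("Worker principal: {0}",
                Thread.CurrentPrincipal.Identity.Name);
        });
        thread.Start();
        thread.Join();
    }
}
```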



  • The lock Statement

Experienced multithreading developers may have wondered what the relation is between the lock keyword and the System.Threading.Monitor class. Well, as it happens, lock is just compiler-generated syntax over Monitor:


    lock (instance)
    {
        //do something
    }

is exactly equivalent to:

    Monitor.Enter(instance);
    try
    {
        //do something
    }
    finally
    {
        Monitor.Exit(instance);
    }

Bear in mind that the finally clause is necessary so that the lock is always released, even if an exception is thrown.

Also, make sure you never lock on a publicly accessible instance (or on this), but on a private inner field instead, because other code, including the .NET Framework itself, may also lock on the instance, which may lead to a deadlock.
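A minimal sketch of locking on a private field instead of the instance (the class and field names are made up for illustration):

```csharp
using System;

class Account
{
    // Private object used only for locking; external code cannot lock on it
    private readonly Object syncRoot = new Object();

    private decimal balance;

    public void Deposit(decimal amount)
    {
        lock (syncRoot)   // never lock (this) in publicly visible classes
        {
            balance += amount;
        }
    }
}
```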