I was writing an application the other day that required serializing objects to a set of files on a network share as a user is entering data.  The writing is done “live” (instead of waiting until the user closes the application) because other applications on other computers sync to the changes by listening for file changes on the network share.  Normally this would be no big deal: I could create a dedicated thread using a BackgroundWorker or a new Thread, but all of that thread management seemed like overkill for a call to File.AppendAllText just because it needed to occur in the background.  I’d rather just queue up a delegate using the ThreadPool, but unfortunately the ThreadPool doesn’t guarantee the order in which it executes its work items, and the order of the calls to File.AppendAllText really mattered in my case.  It also doesn’t guarantee that it won’t execute work items in parallel, which would obviously be a problem with multiple threads trying to write to the same file at the same time.

So, I created a tiny class called ThreadQueue, which looks just like the ThreadPool, except that it executes its work items in the same order they were enqueued, and also guarantees that it won’t start executing one work item until the previous one has finished — so work items that can’t safely run in parallel never do.

The QueueUserWorkItem method simply places each callback and its state in a Queue:

// Pending work items, in the order they were queued.
private readonly Queue<KeyValuePair<WaitCallback, object>> workItems = new Queue<KeyValuePair<WaitCallback, object>>();

public bool QueueUserWorkItem(WaitCallback callback, object state)
{
    lock (this.workItems)
    {
        // An empty queue means no runner is active, so start one.  The lock
        // ensures the runner can't peek before this item is enqueued.
        if (this.workItems.Count == 0)
            if (!ThreadPool.QueueUserWorkItem(RunWorkItems))
                return false;
        this.workItems.Enqueue(new KeyValuePair<WaitCallback, object>(callback, state));
        return true;
    }
}

If the queue had been empty when the work item was enqueued, the ThreadQueue grabs a single thread (using the real ThreadPool.QueueUserWorkItem) that will process the work items in order:

private void RunWorkItems(object ignored)
{
    KeyValuePair<WaitCallback, object> action;
    lock (this.workItems)
        action = this.workItems.Peek();
    while (true)
    {
        // Run the work item outside the lock so enqueuers aren't blocked.
        action.Key(action.Value);
        lock (this.workItems)
        {
            // Dequeue only after the item has run, so the Count == 0 check in
            // QueueUserWorkItem can't start a second runner while we're busy.
            this.workItems.Dequeue();
            if (this.workItems.Count == 0)
                break;
            action = this.workItems.Peek();
        }
    }
}

RunWorkItems simply runs in a loop, calling all of the work items in the queue until none remain, and then returns.  ThreadQueue is frugal with threads: it doesn’t claim one until there is work to do, and that thread is released as soon as the work is finished.  This means you can create as many ThreadQueues in your application as you want without idle ones tying up threads.
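Putting the two methods together, here is a minimal self-contained version of the class (the attached full code also sets Thread.Name and Thread.IsBackground, which this sketch omits), followed by a small demo of my own showing the ordering guarantee:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

class ThreadQueue
{
    // Pending work items, in the order they were queued.
    private readonly Queue<KeyValuePair<WaitCallback, object>> workItems =
        new Queue<KeyValuePair<WaitCallback, object>>();

    public bool QueueUserWorkItem(WaitCallback callback, object state)
    {
        lock (this.workItems)
        {
            // An empty queue means no runner is active, so start one.
            if (this.workItems.Count == 0)
                if (!ThreadPool.QueueUserWorkItem(RunWorkItems))
                    return false;
            this.workItems.Enqueue(new KeyValuePair<WaitCallback, object>(callback, state));
            return true;
        }
    }

    private void RunWorkItems(object ignored)
    {
        KeyValuePair<WaitCallback, object> action;
        lock (this.workItems)
            action = this.workItems.Peek();
        while (true)
        {
            action.Key(action.Value);  // run outside the lock
            lock (this.workItems)
            {
                // Dequeue only after running, so QueueUserWorkItem's
                // Count == 0 check can't start a second runner.
                this.workItems.Dequeue();
                if (this.workItems.Count == 0)
                    break;
                action = this.workItems.Peek();
            }
        }
    }
}

class Demo
{
    static void Main()
    {
        ThreadQueue queue = new ThreadQueue();
        List<int> order = new List<int>();
        using (ManualResetEvent done = new ManualResetEvent(false))
        {
            for (int i = 0; i < 100; i++)
            {
                int n = i;  // capture a fresh copy for the closure
                queue.QueueUserWorkItem(delegate { lock (order) order.Add(n); }, null);
            }
            // This item runs only after all 100 adds have finished.
            queue.QueueUserWorkItem(delegate { done.Set(); }, null);
            done.WaitOne();
        }
        // Items ran one at a time, in the order they were queued.
        for (int i = 0; i < 100; i++)
            if (order[i] != i) throw new Exception("out of order");
        Console.WriteLine("all 100 items ran in order");
    }
}
```

Because the sentinel item is queued last and items run strictly one at a time in FIFO order, waiting on it is enough to know every earlier item has completed.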

That’s really all there is to it.  I’ve attached the full code to this post, which also lets you set the Thread.Name and Thread.IsBackground properties of the background thread.  Enjoy!