I decided to play around with the idea of a multi-threaded Select extension. This could be useful, for example, if you're creating a system that farms out work to various web services and you have only a limited number of connections to use.

Attached are the results of my excursion.

While I've created multi-threaded applications before, I hadn't yet created one that continuously tried to keep the maximum possible number of threads busy while also ensuring that the sequence of results matched the sequence of inputs. The basic algorithm I came up with is fairly straightforward:

  1. Fetch the enumerator of the input.
  2. Set lastMoveNextWasTrue to true, as an initial value.
  3. While lastMoveNextWasTrue:
    1. While there are spare threads and buffer space:
      1. Set lastMoveNextWasTrue to enumerator.MoveNext()
      2. If lastMoveNextWasTrue, then create a new BackgroundWorker and results object, and store them off.  Build the arguments, and run the worker.  Else break.
    2. Wait for at least one worker to complete.
    3. While true:
      1. Remove completed workers.
      2. While the head of the buffer contains results from completed workers, yield them.
    3. If lastMoveNextWasTrue is false (no more items to consume) and there are still workers running, continue at (3). Else break.
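The steps above can be sketched in Python (the attached code is C#, so treat this as an illustration, not the implementation; the name `parallel_select` is mine, and blocking on the head of the buffer is a simplification of "wait for at least one worker to complete"):

```python
from collections import deque
from concurrent.futures import ThreadPoolExecutor

def parallel_select(source, selector, threads=5, max_pending=3):
    """Map selector over source on a thread pool, yielding results
    in input order while keeping at most max_pending items in flight."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        pending = deque()        # the ordered buffer of in-flight work
        it = iter(source)        # step 1: fetch the enumerator
        more = True              # step 2: lastMoveNextWasTrue
        while more or pending:   # step 3, plus draining leftover workers
            # step 3.1: while there is spare buffer space, start work
            while more and len(pending) < max_pending:
                try:
                    item = next(it)          # MoveNext()
                except StopIteration:
                    more = False
                    break
                pending.append(pool.submit(selector, item))
            # steps 3.2-3.3: wait for the head worker, yield in order
            if pending:
                yield pending.popleft().result()
```

Blocking on the head future means a slow first item delays later, already-finished results from being yielded, but the output order always matches the input order, which is the property the algorithm is after.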

Similar logic works for the ForEach case (which takes an Action&lt;T&gt;).
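The ForEach variant is simpler because nothing needs to be yielded back in order; it only needs to bound the in-flight work and surface any worker exceptions. A hedged Python sketch (again, names are mine):

```python
from collections import deque
from concurrent.futures import ThreadPoolExecutor

def parallel_for_each(source, action, threads=5, max_pending=3):
    """Run action on every item of source using a bounded thread pool."""
    with ThreadPoolExecutor(max_workers=threads) as pool:
        pending = deque()
        for item in source:
            if len(pending) == max_pending:
                pending.popleft().result()   # bound the in-flight work
            pending.append(pool.submit(action, item))
        while pending:
            pending.popleft().result()       # drain; re-raises worker errors
```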

The results show that this sort of approach is worthwhile when the processing is IO-bound (simulated here with Thread.Sleep(1)): with 5 threads and a maximum of 3 pending results, I get roughly a 2x speedup. If instead I'm calculating factorials -- CPU-bound work -- the threading overhead makes this version just a little slower.
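The IO-bound case is easy to reproduce with a stock thread pool (this uses Python's `ThreadPoolExecutor.map` rather than the attached extension, and models Thread.Sleep(1) with `time.sleep(0.001)`; exact numbers will vary by machine):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def io_bound(x):
    time.sleep(0.001)   # stand-in for a blocking web-service call
    return x

items = list(range(100))

start = time.perf_counter()
sequential = [io_bound(x) for x in items]
seq_time = time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=5) as pool:
    parallel = list(pool.map(io_bound, items))   # map preserves input order
par_time = time.perf_counter() - start

assert parallel == sequential   # same ordered results either way
```

With 5 workers the parallel pass overlaps the sleeps, so it finishes well ahead of the sequential pass; swap `io_bound` for a pure CPU loop and the gap disappears (or reverses), matching the factorial observation above.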