My Response to Nat's "Threads Considered Harmful" Post

I don't normally do this, but Nat's post on Professor Edward A. Lee's piece, "The Problem With Threads," drew a response from me straight away.

The comment I made on the post was never approved, so I thought it was worth sharing here.

I Wrote...

From the summary of Professor Edward A. Lee's paper: "he observes that threads remove determinism and open the door to subtle yet deadly bugs, and that while the problems were to some extent manageable on single core systems, threads on multicore systems will magnify the problems out of control. He suggests the only solution is to stop bolting parallelism onto languages and components--instead design new deterministically-composable components and languages." Benjamin then takes this comparison to the biological world.

It irks me when people need to feel so in control all the time.

Without wanting to enter into a philosophical debate, I think we should be cautious about jumping to conclusions about the dangers of parallelism.

The irony is that there is a perception in our society that women can multitask and men cannot. Since men have been the dominant force behind inventing computer languages, it is no surprise there is an intrinsic fear of parallelism. People can only easily memorise 7±2 things (or groups of things), so debugging and tracking multiple threads is no mean feat for an inexperienced (or, in some cases, experienced) programmer.

I spent many years building threaded systems. Many were overly complicated, and the bugs that were occasionally introduced were difficult for others to track, test, and fix.

From there I moved to building workflow-driven applications that operated as state machines. The state machines could adapt to dynamic rules and were much easier to visualise, log, and debug.
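A workflow driven by a state machine of the kind described above can be sketched as follows. This is a hypothetical order-processing example in Python; the state names, events, and transition rules are purely illustrative and not taken from any real system:

```python
import logging
from enum import Enum, auto

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("workflow")

# Hypothetical workflow states for an order-processing example.
class State(Enum):
    RECEIVED = auto()
    VALIDATED = auto()
    SHIPPED = auto()
    CLOSED = auto()

# Allowed transitions: each (state, event) pair maps to exactly one
# next state, so behaviour is deterministic and every step can be
# logged and audited.
TRANSITIONS = {
    (State.RECEIVED, "validate"): State.VALIDATED,
    (State.VALIDATED, "ship"): State.SHIPPED,
    (State.SHIPPED, "close"): State.CLOSED,
}

class Workflow:
    def __init__(self):
        self.state = State.RECEIVED

    def fire(self, event):
        key = (self.state, event)
        if key not in TRANSITIONS:
            # Illegal transitions fail loudly instead of corrupting state.
            raise ValueError(f"event {event!r} not allowed in {self.state.name}")
        nxt = TRANSITIONS[key]
        log.info("%s --%s--> %s", self.state.name, event, nxt.name)
        self.state = nxt

wf = Workflow()
for event in ("validate", "ship", "close"):
    wf.fire(event)
print(wf.state.name)  # CLOSED
```

Because every transition is an explicit table entry, the whole history of an instance is visible in the log, which is what makes this style so much easier to debug than interleaved threads.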

I guess you could argue that a state machine is a "deterministically-composable component", but that argument weakens once situations evolve and layer upon layer of complexity is added: sometimes to an individual running instance (a special case), sometimes to the workflow for a period of time, and sometimes permanently, driven by changing demands. Working with systems like this (as I'm sure many of you do), you become acutely aware of the similarities between the programming models we use and the natural world in all its beauty and complexity.

If you believe in determinism even at the macro level, it should be theoretically possible to predict the lotto numbers each week from the kinetic and physical forces involved. Or maybe some things are just random, and we should feel comfortable treating them that way.

So what if the result was a little unpredictable, even with a computer performing the task? Doesn't the wisdom of crowds sort this one out for us over time anyway? Think about a computer farm doing complex parallel processing at 80% or 90% accuracy. Surely you could discount the difference as an "incorrect response", or better yet, learn from the ambiguity.

In fact (if you believe in free will), maybe the subtle yet deadly bugs that Professor Edward Lee is talking about are the spark that will create human-like flaws in our inventions going forward... meet Pleo, anyone?

What are your thoughts?

Tags: languages, parallel programming, threads

  • Threads do indeed complicate things, sometimes by a little and sometimes by a lot, but saying "absolutely do not use threads" is wrong.

    There is only one reason I've seen for that comment: in response to a question about how to use threads, experienced programmers say it because they don't want a newbie starting down the very easy path of over-using threads when they aren't needed.

    The random bugs he talks of happen because the programmer planned poorly. Piling things on as you go with no clear plan is always poor practice, but with threads it hurts you more.

    A well-coded threaded application is almost never _hard_ to make, and becomes second nature as you develop more experience with it.  It's just another thing to learn, like everything else.
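    To illustrate the disciplined style this comment is describing, here is a minimal producer/consumer sketch in Python (the worker function and workload are hypothetical). The plan is made up front: all cross-thread communication goes through thread-safe queues, so there is no shared mutable state to reason about:

```python
import threading
import queue

task_queue = queue.Queue()
results = queue.Queue()

def worker():
    while True:
        item = task_queue.get()
        if item is None:           # sentinel: shut this worker down cleanly
            task_queue.task_done()
            break
        results.put(item * item)   # stand-in for real work
        task_queue.task_done()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()

for n in range(10):
    task_queue.put(n)
for _ in threads:
    task_queue.put(None)           # one sentinel per worker

task_queue.join()
for t in threads:
    t.join()

total = sum(results.get() for _ in range(results.qsize()))
print(total)  # 285, the sum of squares 0..9
```

    Because the queues own all synchronisation, adding or removing workers never changes correctness, only throughput; that is the kind of up-front plan that keeps threaded code from accumulating the random bugs described above.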

  • Every networked system, everywhere, is inherently part of a multithreaded system.  Yet, somehow, we manage to create sophisticated, working systems.

    There's a really nasty anti-MS diatribe stemming from the WPF documentation, which allegedly states that using threads is too hard so you should never do it. The word "idiots" is bandied about freely. Aside from the lack of professionalism shown on the part of the poster, the fact remains that the advice is unhelpful and lacking in insight.

    Threading came about and remains because it's useful -- if not necessary.  If someone doesn't think people know how to do it well, they should teach them how.

    Head-in-the-sand is no way to teach innovation.

  • I'd be interested in a reference to the WPF documentation that advises against using threads.

    I haven't seen this; do you have a reference?

    You may be looking at a cached page - your comment was approved on Tim's post.

    Posted by: Nigel Parker at January 23, 2007 05:38 PM

  • Thanks James!

    Looks like Nat has approved the comment for me today. Probably just a backlog at his end.

  • nparker:  

    The complaint was issued here:  http://www.users.on.net/~notzed/ (the post of 29 Sep)

    I searched around, and I think the reference he read is this:  http://msdn2.microsoft.com/en-us/library/ms741870.aspx

    I'm perfectly willing to believe he's reading too much into it (note the "m$-fanboy" epithet, among others), but even so, it seems that the documentation failed by leaving itself open to such an interpretation.
