Being Cellfish

Stuff I wished I've found in some blog (and sometimes did)

Change of Address
This blog has moved to blog.cellfish.se.

Why slow unit tests are a bad smell:

Earlier I promised to elaborate on why slow unit tests are a sign of problems (or a smell if you like). So here it goes.

The first thing I would like to look at is when the complete test suite takes too long to run to be part of a tight TDD loop. If each single test is super fast and it is just the number of tests that makes the difference, you must have a very large number of tests. Probably several thousand of them. This can only mean one of two things. The less bad one is that you have a very large application that really needs all these tests. But you could probably split things up into different modules and test each module by itself, reducing the number of tests that need to be run when you're working on a single module. Any dependencies between the modules can be faked in your day-to-day work.

A much worse problem is that you're over-testing your code. That means testing the same thing in several similar (but not identical) ways. While over-testing in itself should not be a problem, it hurts TDD in two ways. First of all, writing more tests takes more time, so you get the feeling that TDD is much slower than "developing the old way". It also makes the feedback slower, since the tests take longer to run, and that means you lose focus.
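To make the over-testing point concrete, here is a minimal, made-up sketch (the post shows no code; the function and test names are invented for illustration). Three tests exercise the exact same code path with near-identical data, so two of them add run time without adding confidence:

```python
def discounted_price(price, percent):
    """Apply a percentage discount to a price."""
    return price * (100 - percent) / 100

# All three tests verify the same behavior through the same code path -
# only the first one earns its keep in a tight TDD loop.
def test_discount_10_percent():
    assert discounted_price(100, 10) == 90

def test_discount_10_percent_other_price():
    assert discounted_price(200, 10) == 180

def test_discount_10_percent_yet_another_price():
    assert discounted_price(300, 10) == 270
```

Pruning redundant tests like the last two is usually the cheapest way to speed up a suite without losing coverage.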

The second reason for slow tests is when a single test takes too long to execute just by itself. This is a sign of trouble since it probably means you have badly written tests or, even worse, badly designed production code. The test is typically slow because it takes time to set up the circumstances for the test, and that in turn typically points to a bad design where it is difficult or impossible to fake/stub/mock dependencies away in your test code. If you're having trouble faking dependencies, you probably have a design that is too tightly coupled, which most people I know would consider a bad design.
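The point about faking dependencies can be sketched like this (a minimal example in Python; the class and method names are assumptions for illustration, not anything from the post). A class that is handed its data source instead of constructing a database connection internally can be given a fake in tests, so setup is instant:

```python
class ReportGenerator:
    # The data source is injected rather than constructed inside the
    # class, so a test can pass in a fake instead of a real database.
    def __init__(self, data_source):
        self.data_source = data_source

    def total_sales(self):
        return sum(row["amount"] for row in self.data_source.fetch_sales())

# In a test, a fake data source replaces the slow dependency entirely.
class FakeDataSource:
    def fetch_sales(self):
        return [{"amount": 10}, {"amount": 32}]

def test_total_sales():
    report = ReportGenerator(FakeDataSource())
    assert report.total_sales() == 42
```

If `ReportGenerator` opened its own connection in the constructor, faking would be impossible without changing the design, which is exactly the coupling smell the slow test reveals.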

So whenever you feel the complete test suite takes too long to run, don't start looking at what is wrong with the tests - start looking at what is wrong with your production code...

  • There are some good exceptions, like tests targeting slow resources such as databases.

For example, I work on a free LINQ to SQL alternative, designed to support several databases (the software is named DbLinq), and each test needs to connect to a database to check the SQL code generation by comparing the results. So we have two bottlenecks: SQL generation from the LINQ expression (which sometimes takes up to 1s) and then the connection to the database, the SQL request and the result mapping.

We have more than a thousand tests, running across 7 different databases, and it takes something like 10 minutes to complete.

So yes, a slow unit test is a pain, but no, it does not always reflect a bad organization :)

  • Well I must apologize for making the same mistake as many others, but the other way around. Many people mistake TDD for unit testing. In this case I wrote "unit tests" when I really should have written "your TDD tests". And in TDD a quick feedback loop is important to keep the momentum. I hope you can agree with me on that.

    Having a lot of tests for a single module (in the case where the number of tests is the only problem) is generally a bad smell. Either your module is too big or you are testing too much. By too much I mean "you have more tests than you benefit from".

    Having slow tests is almost always a bad smell, since it most often means you're testing too much and not mocking/stubbing stuff away. In your example I see the possibility of four different tests: one for SQL generation, one for the DB connection, one for SQL requests and one for result mapping. They do not need to be in the same test, and everything around each one can be faked in some way, speeding up each single test. If you want to test the whole chain, it actually sounds like some kind of integration test. And integration tests that are slow are typically run in another context than the TDD-loop tests.

    Having a 1s SQL generation time is however something that is hard to "fix" in the test. Personally I would look at it from a performance view because it sounds very slow to me.

    But each case here is different and there are always exceptions to the rule in some sense. Sometimes you will have several thousand tests for a single module and no way to break it up. Sometimes you really want to test something that takes several seconds to test just because you have to. But that doesn't mean that slow tests aren't a smell. They're still a bad smell and you should try to get rid of them, not just accept them as is. And on some occasions you just have to live with the smell, but don't accept it without questioning it!
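The split suggested in the reply above can be sketched roughly like this (Python with invented names; DbLinq's actual API does not appear in the thread, so every identifier here is an assumption). Each stage of the chain sits behind its own small function, so the TDD-loop tests for SQL generation and result mapping never touch a database; only a separate integration test would exercise the full chain:

```python
# Hypothetical stages of a LINQ-to-SQL style pipeline, each a separate
# function so every stage can be tested in isolation.
def generate_sql(condition):
    # Stand-in for real SQL generation from a query expression.
    return f"SELECT * FROM users WHERE {condition}"

def map_results(rows):
    # Stand-in for mapping raw database rows onto objects.
    return [dict(zip(("id", "name"), row)) for row in rows]

# Fast TDD-loop tests: no database connection anywhere.
def test_sql_generation():
    assert generate_sql("age > 18") == "SELECT * FROM users WHERE age > 18"

def test_result_mapping():
    assert map_results([(1, "Ada")]) == [{"id": 1, "name": "Ada"}]
```

The connection and request stages would get their own tests against a real database, but those belong in the slower integration suite rather than the loop you run on every change.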
