Jamie Thomson recently suggested that you should run your packages using DTEXEC when doing performance testing, as it (typically) performs much better than when the same package is run in BIDS. If you haven’t already read it, go do that now. It has a pretty picture in it and everything. I’ll wait.
The purpose of this post is to 1) acknowledge the truthiness of Jamie’s post, 2) call attention to some of the great comments on it, and 3) shed some light on why this is.
Why is BIDS slower than DTEXEC?
There are a number of factors, most of which are purely theoretical for me – I haven’t done deep analysis to determine the actual cost of each of these items, but they should give you the general idea….
Is BIDS always slower than DTEXEC?
No. For smaller packages (single data flow, source –> destination), there might not be a difference. In some cases, BIDS might even be a little faster – which I assume is because the package object is reused, and doesn’t need to be loaded fresh from disk. Generally, larger packages will perform better with DTEXEC, because they give BIDS more work to do (more objects to draw, more events to filter, etc.). Memory (RAM) can become a factor with large packages as well. Since Visual Studio is a 32-bit process, it has a 2 GB memory limit. You can easily hit the point where BIDS will start swapping to disk if you have multiple large packages open, are using a number of IDE extensions, or have multiple project types loaded.
If something performs slowly in BIDS, chances are it will perform the same or faster with DTEXEC. Perf testing different designs in BIDS, like seeing if using the OLE DB Command transform is slower than doing things in a batch, is just fine to do in BIDS. If you’re trying to get an idea of whether your package execution time fits into your ETL batch window, make sure you measure your results using DTEXEC.
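As a concrete sketch of that last point, here’s one way to time a package run from the command line. The package path is hypothetical, `/F` is the standard dtexec option for loading a package from the file system, and `Measure-Command` is the built-in PowerShell cmdlet for timing a script block:

```shell
# Run a package via DTEXEC and capture the wall-clock time (PowerShell).
# "C:\ETL\LoadCustomers.dtsx" is a made-up path - substitute your own package.
Measure-Command { dtexec /F "C:\ETL\LoadCustomers.dtsx" }
```

Run it a few times and throw out the first result (which pays one-time caching costs) to get a number you can compare against your batch window.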
There’s a pretty good justification there for a 64-bit BIDS/Visual Studio. I wonder if we lowly SSIS folk can make Scott Guthrie alter his stance on that!