This really belongs in the comments field of my last blog post, but it turned out I had more thoughts to vent and this would make a monster of a comment, so here goes:

What more could there be to dynamic analysis than code coverage and profiling?  The debugger probably falls into that category too. I don't know what the future holds.  But if the tester inside me is allowed to dream big here, I think one big open door is the operating environment.  Here's why:

James Whittaker of software testing fame has talked himself hoarse explaining how the operating environment is such a key to understanding software defects.  Dynamic code analysis tells us a lot about the code under test (and admittedly, sometimes more noise than you want about the dependency code you're living on top of).  But let's just come right out and say it: current dynamic code analysis doesn't yet give us total information awareness into how the operating environment itself is influencing our code.  For example, the essay "What Is Software Testing? And Why Is It So Hard?" ( http://www.computer.org/software/so2000/pdf/s1070.pdf ) lays out the kinds of challenges testers deal with - challenges that dynamic analysis tools could help with.

On a marginally related side note, today at work we are testing for Section 508 compliance.  One time in college I participated in an 'awareness' day where you would walk in someone else's shoes, so to speak, for a day.  You would sign up, draw a slip of paper out of a hat, and get your challenge.  Some participants got wheelchairs, some got crutches; I lucked out with a sling for my left arm (I'm right-handed).  But one really nasty one was "tunnel vision" - special glasses that only let you see out of a pinhole.  To bring this back on topic: dynamic code analysis tools give us a view into our code, but maybe it's still tunnel vision.  Better than a blindfold, but with room to add more peripheral information.  As a developer, I want total information awareness into my code and what's going on in the OS.  I guess what it boils down to is that while code coverage and profiling are glorious and wonderful, and represent a huge amount of effort and agonizing work to get in place, there is a lot more peripheral information that could be added.

Having dynamic analysis tools that help you grok not just your code but its simultaneous interactions with the operating environment is critical to testing and development in ways that I feel fall right in line with key principles of the Agile Manifesto ( http://agilemanifesto.org/principles.html ).  For example: "Working software is the primary measure of progress" - running dynamic analysis tools puts the emphasis on actually having working software first, because if your code won't compile or you can't walk through the scenario, there's no information to collect dynamically anyway.  "Continuous attention to technical excellence and good design enhances agility" - because good dynamic code analysis helps you see the forest for the trees: spot abstractions that are too leaky (when to fish or cut bait), figure out how to get the most out of the operating environment, and so on.  Technical sloppiness, for me, is when you know just enough to be dangerous but don't have insight into what is really going on.  More dynamic code analysis tools will hopefully tip the balance toward excellence by raising the overall level of understanding of what's happening.  "Simplicity--the art of maximizing the amount of work not done--is essential" is one I really like, because the last thing you want to do is spend your day optimizing the compute performance of an algorithm when network latency is to blame.
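That last point is worth making concrete.  Here's a back-of-the-envelope sketch in Python (with a made-up `simulated_network_call` standing in for a real round trip): comparing wall-clock time against CPU time is often enough to tell whether your code is compute-bound or just sitting around waiting on the environment.

```python
# Minimal sketch: is the bottleneck the algorithm, or the environment?
# Wall-clock time (perf_counter) counts everything; CPU time (process_time)
# counts only time the process actually spent computing.
import time

def simulated_network_call():
    # Hypothetical stand-in for a network round trip: the process sleeps,
    # so wall-clock time passes but almost no CPU time is consumed.
    time.sleep(0.2)

def busy_work(n=200_000):
    # Stand-in for an actual compute-bound algorithm.
    return sum(i * i for i in range(n))

wall_start = time.perf_counter()
cpu_start = time.process_time()

busy_work()
simulated_network_call()

wall_elapsed = time.perf_counter() - wall_start
cpu_elapsed = time.process_time() - cpu_start

# If wall time dwarfs CPU time, the slowdown is the environment, not the
# algorithm - optimizing the compute path would be wasted effort.
print(f"wall: {wall_elapsed:.2f}s  cpu: {cpu_elapsed:.2f}s")
```

It's a toy, but it's exactly the kind of peripheral measurement a good dynamic analysis tool would hand you automatically instead of making you instrument it by hand.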

How hard would an auto mechanic's job be if they were never allowed to turn on the car and listen for the weird sound it was making, or plug in the diagnostics check while the car was running?  That's dynamic analysis.  I've seen and used some security tools that will snapshot your system, then let you run your app; while it runs, they dynamically inspect and monitor your system resources, gathering information about data flow and resources in use and systematically analyzing and reporting potential elevation-of-privilege paths and other such things.  Tools like this are clunky today - often home-brewed, cobbled together from various sources - but they systematically provide information you might not get any other way, and the more insight and knowledge you have about the context of the operating system environment and what is happening, the more capable you are of pinpointing quality issues in the code.   Another dynamic analysis tool could be Sanctum's AppScan product; the idea is that you have some automated toolset helping you get lots of little details checked out, then reporting the results back in a way that returns relevant information. I consider AppScan a dynamic analysis tool because it analyzes a live website.  It's not looking at the source code itself; it's dynamically putting the site through its paces, and to some degree it even has a human running through a web page scenario to help guide it along.
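A toy version of that snapshot-then-diff idea is easy to sketch.  This Python fragment only watches one sliver of the environment (the files in a scratch directory), and `run_app_under_test` is a hypothetical stand-in for the real application, but the shape - snapshot, exercise the code, compare - is the same one those security tools use:

```python
# Toy snapshot-run-compare dynamic analysis: record a slice of the operating
# environment before and after exercising the code under test, then report
# what changed.  Real tools watch far more (handles, registry keys,
# privileges), but the overall shape is the same.
import os
import tempfile

def snapshot(path):
    # Capture the current state of one sliver of the environment.
    return set(os.listdir(path))

def run_app_under_test(workdir):
    # Hypothetical stand-in for the real application: it drops a file
    # as a side effect, which our diff should catch.
    with open(os.path.join(workdir, "dropped.log"), "w") as f:
        f.write("side effect")

workdir = tempfile.mkdtemp()
before = snapshot(workdir)
run_app_under_test(workdir)
after = snapshot(workdir)

created = after - before   # resources the app left behind
deleted = before - after   # resources the app removed
print("created:", sorted(created))
print("deleted:", sorted(deleted))
```

The value isn't in any one check - it's that an automated harness can run hundreds of these little environment diffs and only surface the ones that matter.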

For me, dynamic analysis tools are all about helping humans understand the system in the holistic sense, while it is running.  One of the great lessons of 20th century engineering here in Washington state was the Galloping Gertie bridge.  It proved that mankind is capable of building systems that are beyond our ability to analyze or model.  Without accounting for the external environment (wind) and its effect on the bridge (which acted like an airfoil), in conjunction with harmonic resonance, what we really ended up building was the largest (in weight, at least) man-made underwater reef. http://www.gigharbormuseum.org/nbonlinexhibit.html   Plenty of software has been reefed as well.

Tools like this can automate investigations and accelerate the traditional heuristic method of real-time learning that makes software development so fun (with heuristic defined in the educational sense: "in which learning takes place through discoveries that result from investigations made by the student" - http://dictionary.reference.com/search?q=heuristic ).  As developers see the actual software running on the system and know what's going on with the environment, they learn what's happening and make it better.  I think the end result is higher quality software, which is something we're all rooting for.