
TDD and Debugging

Just a quick follow up to my post on “Test Small Before Testing Big.”


It’s relatively easy to sell the long term benefit of TDD, namely that it makes the code much easier and safer to modify, but in the short term the cost of writing all of those automated unit tests can look inefficient. Looking at the bigger picture of the project, Test Driven Development does not take longer, because the extra time spent writing unit tests is more than offset by the reduction in time spent in the debugger and fixing bugs late in the project. This point is often overstated in inflated claims, but I’ll repeat it anyway because I believe it and it’s consistent with my experience: using Test Driven Development can significantly reduce the amount of time spent debugging. As a feedback cycle, TDD style unit tests are a shorter, more efficient way to discover and remove defects from your code than long debugging sessions.


Notice that I said “can” and not “will” reduce debugging time. The dramatic reduction in debugging time only happens when you’re writing small unit tests against isolated code. If you are trying to use TDD, excessive use of the debugger is your code telling you that you need more granularity in your unit tests. Stop and reflect on how you’re coding the solution. Is there a way to test in smaller chunks? Is there a class, or classes, getting too big and taking on more responsibilities than they should? Do you have a good separation of concerns in the code, or are some concerns getting a bit too intimate with each other? Could you use a mock object to write smaller tests?
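To make the “smaller tests with a mock object” idea concrete, here is a minimal sketch. The post’s tooling is .NET, but the technique is language-agnostic, so this uses Python’s `unittest.mock`; the `OrderProcessor` and payment gateway names are hypothetical, invented for illustration and not taken from the post.

```python
import unittest
from unittest.mock import Mock

class OrderProcessor:
    """Hypothetical class under test; it depends on some payment gateway."""
    def __init__(self, gateway):
        self.gateway = gateway

    def checkout(self, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        return self.gateway.charge(amount)

class OrderProcessorTests(unittest.TestCase):
    def test_checkout_charges_gateway(self):
        # Mocking the gateway means this test exercises only OrderProcessor,
        # so a failure points at this one class rather than the whole system.
        gateway = Mock()
        gateway.charge.return_value = "receipt-1"
        processor = OrderProcessor(gateway)
        self.assertEqual(processor.checkout(100), "receipt-1")
        gateway.charge.assert_called_once_with(100)

    def test_checkout_rejects_nonpositive_amount(self):
        # The validation rule gets its own tiny, fast test.
        with self.assertRaises(ValueError):
            OrderProcessor(Mock()).checkout(0)
```

When a test this small fails, there are only a handful of lines that could be at fault, which is exactly the granularity the paragraph above is asking for.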


It isn’t just that TDD makes debugging unnecessary most of the time — it also makes debugging easier.  To defend that statement, let’s first think about why debugging can be difficult (not counting goofy third party components or web services outside your codebase).  The main difficulty of debugging in my mind is the number of variables at any given time, and by this I mean how many different places in the code could be the problem.  The number of variables will shoot up when you try to write large tests, especially when the smaller code units within the larger code structure have not been tested.  In this case the problem could be anywhere because nothing has been previously verified.  Going a long time without a test makes the problem worse because you could be going off in the wrong direction and not know it until you try to write a test.


The way out of this situation is to apply the Third Law religiously by writing granular unit tests. In a failing granular unit test there should be so little code to examine that you can usually spot the problem through visual inspection, or at least promptly when you debug into the unit test (unit tests are a great entry point for debugging; I simply won’t code without TestDriven.Net installed). For that matter, bigger tests are often much more difficult to understand, especially with a lot of mock object setup. It’s bad enough when the code isn’t working; a difficult to comprehend unit test on top of that certainly exacerbates the problem. Maintaining a pace of writing smaller, granular unit tests should also make integration testing easier. When you do move on to putting the little pieces together in bigger integration tests, you have the confidence of knowing that the little pieces are already unit tested and that any problems are most likely related to the new integration code (some of that will depend on how closely the interaction testing in the unit tests matches reality, but that’s a big topic for another day).
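The payoff described above, that integration failures point at the wiring rather than the pieces, can be sketched as follows. As before this is an illustrative Python example with hypothetical names (the classes are repeated here so the sketch stands alone), not code from the post: the collaborator is a small real implementation assumed to be covered by its own unit tests.

```python
class InMemoryGateway:
    """A real (if tiny) collaborator, assumed already covered by its own
    small unit tests."""
    def __init__(self):
        self.charges = []

    def charge(self, amount):
        self.charges.append(amount)
        return f"receipt-{len(self.charges)}"

class OrderProcessor:
    """Hypothetical class, also assumed already unit tested in isolation."""
    def __init__(self, gateway):
        self.gateway = gateway

    def checkout(self, amount):
        if amount <= 0:
            raise ValueError("amount must be positive")
        return self.gateway.charge(amount)

def test_checkout_records_charge():
    # Because both classes already have granular unit tests, a failure
    # here most likely points at the wiring between them, not their internals.
    processor = OrderProcessor(InMemoryGateway())
    receipt = processor.checkout(25)
    assert receipt == "receipt-1"
    assert processor.gateway.charges == [25]
```

The integration test stays small precisely because it doesn’t have to re-verify what the unit tests already proved.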


Another factor that can make code harder to debug, and this shouldn’t be underestimated, is simply the length of time between writing the code and troubleshooting the code.  With Test Driven Development the verification of code is virtually simultaneous with the writing of the code. 

About Jeremy Miller

Jeremy is the Chief Software Architect at Dovetail Software, the coolest ISV in Austin. Jeremy began his IT career writing "Shadow IT" applications to automate his engineering documentation, then wandered into software development because it looked like more fun. Jeremy is the author of the open source StructureMap tool for Dependency Injection with .Net, StoryTeller for supercharged acceptance testing in .Net, and one of the principal developers behind FubuMVC. Jeremy's thoughts on all things software can be found at The Shade Tree Developer at http://codebetter.com/jeremymiller.
This entry was posted in Test Driven Development.
  • http://www.jeffperrin.com Jeff Perrin

    Really? The last time I used TestDriven, it was basically just a context menu that let you kick off unit tests, and a crappy console output showing the results in 6pt font. Has it gotten better?

  • jmiller

    Jeff,

We’ve used ReSharper 2.0 for a while. I still prefer TestDriven.Net myself.

    Jeremy

  • http://www.jeffperrin.com Jeff Perrin

    “I simply won’t code without TestDriven.Net installed”

    FYI, Resharper 2.0 has a unit test window integrated into VS that kicks ass. TestDriven is obsolete.