NDepend version 5 turned RTM today! Big milestone! Last week I talked about the UI facelift we did, but v5 also comes with several flagship features. One of these new features is about measuring and visualizing development trends. Basically, NDepend can log a variety of application-wide code metrics at analysis time. The tool can then plot historic metric values on charts, and trends suddenly appear:
We were pretty impatient and curious to apply this trending feature to the NDepend code base itself! But the feature was developed just a few months ago, while the trend chart above shows trends for roughly the past 7 years. To fetch the past values, we wrote a short program based on NDepend.API that:
- Checks out past versions of our code base
- Compiles them
- Runs the unit tests and gathers code coverage
- Analyzes the result with the NDepend analysis
- Logs trend metrics (thanks to types in the namespace NDepend.Trend)
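The steps above can be sketched roughly as follows. This is a hedged sketch, not our actual program: `NDependServicesProvider`, `ProjectManager`, `LoadProject` and `RunAnalysis` are real NDepend.API members, but `PastVersions`, `CheckOutVersion`, `CompileAndRunTests` and `LogTrendMetrics` are hypothetical helpers standing in for the version-control, build and trend-logging plumbing:

```csharp
using NDepend;
using NDepend.Analysis;
using NDepend.Path;
using NDepend.Project;

class TrendBackfill {
   static void Main() {
      var projectManager = new NDependServicesProvider().ProjectManager;
      IProject project = projectManager.LoadProject(
         @"C:\Dev\NDepend\NDepend.ndproj".ToAbsoluteFilePath());

      foreach (var version in PastVersions()) {   // hypothetical: enumerate historic versions
         CheckOutVersion(version);                // hypothetical: fetch the sources from VCS
         CompileAndRunTests(version);             // hypothetical: build, run tests, gather coverage

         // Run an NDepend analysis on the freshly built assemblies.
         IAnalysisResult result = project.RunAnalysis();

         // Log the trend metric values at the version's historic date,
         // through the types of the NDepend.Trend namespace (call sketched).
         LogTrendMetrics(result, version.Date);   // hypothetical wrapper
      }
   }
   // ... hypothetical helpers omitted for brevity
}
```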
Several remarks can be inferred from this trend chart:
- The Lines of Code progression is pretty linear in the long term. This suggests that, in the long run, Lines of Code is a reliable way to measure progression and estimate future schedules.
- This also suggests that our development shop doesn’t suffer from the diseconomy of scale syndrome well described by Steve McConnell in his legendary book Software Estimation: Demystifying the Black Art: “People naturally assume that a system that is 10 times as large as another system will require something like 10 times as much effort to build. But the effort for a 1,000,000 LOC system is more than 10 times as large as the effort for a 100,000 LOC system.”
- As a side note, I also quote this Steve McConnell idea: “My personal conclusion about using lines of code for software estimation is similar to Winston Churchill’s conclusion about democracy: The LOC measure is a terrible way to measure software size, except that all the other ways to measure size are worse.”
- High code coverage ratios are becoming prevalent in the NDepend codebase. With v5 we reached almost 80% overall code coverage by tests. The chart shows a large effort to jump from 50% to 80% coverage within the last months, and we will continue this effort. What this chart doesn’t show is that code contracts are also massively used, since I consider automatic tests and code contracts to be two sides of the same practice. Once a product reaches a certain critical size, it is not humanly possible to sustain constant productivity unless you have a good automatic testing and code quality plan, coupled with a solid code structure! My hope is that with everything achieved so far, productivity will increase. This will be an interesting trend to measure!
- The ratio # Lines of Code / # Lines of Comments seems pretty constant. Nothing surprising, but it is good to visualize it.
- The number of lines of code that fall into the NotMyCode category didn’t increase much. This can be explained by the fact that the good old CQL implementation is mostly generated code, and since then we haven’t used much generated code.
That’s a lot of remarks for just 4 logged trend metrics. But there are almost 50 default trend metrics defined through Code Query LINQ queries, covering code size, code coverage by tests, max and average values, and third-party usage (see them listed in the Trend Metrics group). Some extra out-of-the-box trend metrics, related to the number of code rules violated and the number of code rule violations, are also available.
Defining a trend metric can be as simple as:
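For instance, a sketch of a simple trend metric written as a CQLinq query, counting the application’s lines of code (the `codeBase.NbLinesOfCode` property is part of the NDepend code model; the exact declaration header may differ in your NDepend version):

```csharp
// <TrendMetric Name="# Lines of Code" Unit="Loc" />
codeBase.NbLinesOfCode
```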
Or more sophisticated like:
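For example, a sketch of a richer query that trends the average method size, filtering out abstract and compiler-generated methods (property and method names are from the NDepend code model, but treat the exact filtering as illustrative):

```csharp
// <TrendMetric Name="Average # Lines of Code for Methods" Unit="Loc" />
Methods
 .Where(m => !m.IsAbstract && !m.IsGeneratedByCompiler)
 .Average(m => (double)m.NbLinesOfCode)
```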
Creating your own trend metric is easy if you know a bit of LINQ. For example, to count the number of types implementing IDisposable, one can just write:
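A sketch of such a custom trend metric (the `Implement()` predicate is part of CQLinq, though the exact overload accepting a type name string should be checked against your NDepend version):

```csharp
// <TrendMetric Name="# Types implementing IDisposable" Unit="types" />
Types.Count(t => t.Implement("System.IDisposable"))
```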
With the richness of the code model API and the flexibility of LINQ, the possibilities are pretty wide.
I hope you’ll enjoy this new feature as much as I do!