
"Velocity" is numerical voodoo

This isn’t a post about coding or design, but there’s other stuff we’ve got to pay attention to in order to code in peace, so…


“Project Velocity” is not much more than a way to figure out how much work you can do in an iteration.  Ideally, a team should really be increasing its velocity over the initial iterations as the team becomes more knowledgeable about the problem domain and the “this is the way we’re doing it” quality of the project comes into focus.  The velocity might not really seem to increase, though, because the measurement is based on the team’s own estimates of the work.  As we get farther into the project we’re typically able to accomplish the same amount of business value in less time.  The story estimates then get smaller for the same amount of observed work as the development team becomes more confident.  Velocity appears to stay flat, but the actual amount of work accomplished may be increasing.  It’s not very scientific, and should never be taken at face value.
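The arithmetic behind all of this is deliberately simple. As a rough sketch (all the point values and the recent-window idea here are invented for illustration, not anything prescribed), velocity is just completed story points per iteration:

```python
def velocity(completed_points_by_iteration):
    """Average story points completed per iteration -- the 'velocity'."""
    if not completed_points_by_iteration:
        return 0.0
    return sum(completed_points_by_iteration) / len(completed_points_by_iteration)

# Completed points for the first four iterations (made-up numbers):
history = [18, 22, 21, 24]

print(velocity(history))       # average over the whole project so far
print(velocity(history[-3:]))  # or weight recent iterations more heavily
```

The choice of window is exactly the kind of judgment call the post is describing: averaging the whole history smooths out the noisy early iterations, while using only recent iterations tracks the team's improving estimates.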


We do try to be good about velocity and estimating in iteration and release planning, but my boss and I still quote Caddyshack quite a bit — “Just pick it up” or “We’ll just let velocity take care of that” — along with Kramer’s line from Seinfeld: “Just write it off.  All the big companies do it.”


You could try to measure actual productivity in terms of something like function points (my colleague almost threw something at me this morning when I joked about this) to even out the sizing, but that’s nothing but numerology to me.  And besides, you know damn well that some tasks are simply more difficult than others.  Plus there’s that little issue of the developers being widely different in experience and skill with different technologies and problem domains.  How in the world do you correct project velocity when a task is estimated by a senior developer and actually done by a developer with much less familiarity with the problem domain?  Should you really care, or just let the numbers come out in the wash?  I did an XP project with a team that easily had an order of magnitude difference in competence and experience between our developers.  I think our project velocity measurement was nothing but an imaginary number for the first several iterations.


Senior management, for some strange reason, wants some predictability in our delivery so they can size project roadmaps, and team velocity is probably the best tool we do have.  Since project velocity is a little bit dicey in terms of real value outside the context of a single project, we have to be adaptive in our planning across the board.  This bothers a lot of people, but any nontrivial software development project has so many variables that you almost have to admit you just can’t predict things too far out.  A great deal of Agile project management is just admitting that there is a lot of uncertainty in software development and maximizing the ability to recognize and accommodate change through feedback loops.  In other words, you gotta be thinking all the time (and that can get tiring).  A lot of people really want the comfort zone of working against a predictive schedule, but it just isn’t going to happen in software.
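One honest way to give management the predictability they want without pretending the number is exact is to forecast from a *range* of observed velocities rather than a single figure. A minimal sketch, with an invented backlog size and invented velocity bounds:

```python
import math

def iterations_remaining(backlog_points, velocity):
    """How many full iterations the backlog needs at a given velocity."""
    return math.ceil(backlog_points / velocity)

backlog = 120        # remaining story points (made up)
low, high = 18, 24   # pessimistic and optimistic observed velocities (made up)

print(iterations_remaining(backlog, high))  # best case
print(iterations_remaining(backlog, low))   # worst case
```

Handing the roadmap folks “somewhere between five and seven more iterations” is less satisfying than a single date, but it’s a far more truthful statement of what the team actually knows.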


What’s the answer?  I don’t know, what’s the question?  The one thing I do know is that statistics never tell the whole story about the success or progress of a team.  That might not help senior management very much, but that’s why they get paid the big bucks to figure this stuff out.  It’s our job to get them the best information we can, then constantly educate them on the reliability of the information.


Martin Fowler has a bunch of posts on project velocity.  I’d start here.


BTW, I certainly don’t have a point here, I’m just rambling.  I’d certainly be interested in hearing how you might be doing velocity in your projects.  No two projects in the world do it exactly the same way.

About Jeremy Miller

Jeremy is the Chief Software Architect at Dovetail Software, the coolest ISV in Austin. Jeremy began his IT career writing "Shadow IT" applications to automate his engineering documentation, then wandered into software development because it looked like more fun. Jeremy is the author of the open source StructureMap tool for Dependency Injection with .Net, StoryTeller for supercharged acceptance testing in .Net, and one of the principal developers behind FubuMVC. Jeremy's thoughts on all things software can be found at The Shade Tree Developer at http://codebetter.com/jeremymiller.
  • Jeremy D. Miller

    One thing you can do is actually pay attention a little bit to actual hours versus predicted hours. Not so much so the PM can “keep some heat” on the developers, but so we the developers can make better estimates the next time around.

    Anybody who’s worked with me knows how much that kind of bookkeeping bugs me, but maybe things like the new VSTS or Trac that integrate auditing with code check-ins can make the mechanics go more smoothly.
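    The calibration the comment above describes can be done with almost no bookkeeping at all. A hypothetical sketch (the task hours below are invented): track the ratio of actual to predicted hours, and use the average ratio to sanity-check the next round of estimates.

    ```python
    def estimate_ratio(predicted_hours, actual_hours):
        """Ratio of actual to predicted time; > 1 means we underestimated."""
        return actual_hours / predicted_hours

    # (predicted, actual) hours for a few finished tasks -- made-up numbers:
    tasks = [(8, 12), (4, 5), (16, 20)]

    ratios = [estimate_ratio(p, a) for p, a in tasks]
    print(sum(ratios) / len(ratios))  # average correction factor for next time
    ```

    The point isn’t to grade anybody; it’s that a team that consistently sees a ratio well above 1 knows to pad its next iteration plan before committing to it.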