Before you flame me, I’m not talking about the canonical book by Steve McConnell. What I mean is that this statement is a lie: “we’re code complete on feature xyz.” The phrase is a lie, or at least misleading, because you really aren’t complete with the feature. Code complete doesn’t tell you anything definitive about the quality of the code or, most importantly, the readiness of the code for actual deployment. It just means the developers have reached a point where they’re ready to turn the code over for testing. You say “code complete” to mark off a gate on a schedule or to claim earned credit for coding work on a project plan. Using “code complete” to claim earned value is a bald-faced lie because it doesn’t translate into business value – yet. The code could still have lots of bugs and issues waiting to be uncovered by the testers, and if it hasn’t gone through user acceptance testing, it might not even be the right functionality.
It’s a nearly arbitrary status anyway. Oftentimes a team will call something code complete just because the schedule says it’s supposed to be code complete on that date. Just call it code complete to satisfy the schedule; all software has bugs anyway, right? Code complete is one of many common evils on waterfall projects where team members have somewhat divergent goals. Developers may be judged by meeting the code complete date, and the testers by the quality of the system – i.e., the lack of bugs. These conflicting goals can easily have a negative impact on the project.
One of my favorite aspects of XP development has been the emphasis on creating working software instead of obsessing over intermediate progress gates and deliverables. In direct contrast to “Code Complete,” XP teams use the phrase “Done, done, done” to describe a feature as complete. “Done, done, done” means the feature is 100% ready to deploy to production.
There’s quite a bit of variance from project to project, but the “story wall” statuses within an iteration I learned on my first XP project were:
- Not started
- Business Analyst Review (the developers called it “BA Volleyball,” which I think was a process smell)
- Customer Review
- Done, done, done
The other story wall columns besides “done, done, done” are just intermediate stages that help the team coordinate activities and handoffs between team members. We use a Scrum-style burndown chart to track intermediate progress in an iteration, but it’s nothing better than an educated guess of status. The burndown chart informs management on the state of the iteration and helps spot problems and changes in the iteration plan, but the authoritative progress meter is the number of stories crossing the line into the “done, done, done” column. The goal of the XP team in any given iteration is to get every story in play into the “done, done, done” column before the end of the iteration. No credit or progress is earned on an XP project until a user story has been coded, fully tested, and approved by the business customer. It’s a draconian measure for a team, but it’s an accurate indication of how much business value an XP team has really delivered. XP nirvana is being able to consistently push all iteration stories into the “done, done, done” column and produce a potentially deployable release at the end of every iteration. It’s a lofty goal that most XP teams probably don’t meet, but that doesn’t mean we should stop trying. Like so many agile teams, we struggle with leaky iterations because we aren’t fully driving stories to production-ready completion within an iteration, but that’s the topic for my next post…
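That measurement rule is simple enough to sketch in a few lines of code. The story names and wall layout below are hypothetical examples, not from any real tracking tool; only the rule itself comes from the post: intermediate columns earn nothing, and a story counts only once it crosses into “done, done, done.”

```python
# A minimal sketch of the XP progress meter: only fully finished stories count.
# Statuses mirror the story wall columns above; story names are made up.
DONE = "Done, done, done"

stories = {
    "Checkout page":  "Done, done, done",
    "Order history":  "Customer Review",
    "Email receipts": "Business Analyst Review",
    "Gift cards":     "Not started",
}

def earned_value(stories):
    """Return (done, total). Stories in intermediate columns earn no credit."""
    done = sum(1 for status in stories.values() if status == DONE)
    return done, len(stories)

done, total = earned_value(stories)
print(f"{done} of {total} stories delivered")  # prints "1 of 4 stories delivered"
```

Note that a story in “Customer Review” scores exactly the same as one that’s “Not started” – zero. That’s the draconian part, and also why the number is trustworthy.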
A friend of mine who’s still mired in waterfall land challenged me one time on how an XP team can know its status without a detailed plan. My response was that our status was measured by what new features were ready to deploy. If you think about it, that’s a far more useful measurement than saying x number of features are “code complete” with unknown quality attributes.
A very important attribute of agile methods in general is that the entire team shares a common goal: produce working software. My experience has been that developer/tester interaction is far smoother on agile projects (certainly not perfect, of course), in no small part because the developer’s purview moves from getting code into test to getting code to a potentially deployable state. As an agile developer it’s in your best interest to do everything possible to make testing go smoothly – helping with test automation, build automation, unit testing, etc. Testers are your allies now, not sparring partners.