The TMD Sprint Metrics
If you decided to sail from San Diego, California to Honolulu, Hawaii, and all you did was point your sailboat towards Hawaii, you would miss the island of Oahu by thousands of miles.
To successfully navigate the Pacific Ocean you would need to take regular readings of where you currently are on Earth.
You would need to make the necessary navigational changes to ensure that you successfully reach your destination.
This seems like common sense when applied to the sailing scenario, yet so many software development projects use no maps or measurements to ensure success.
Designs are not adequately developed, and methodologies for measuring daily activities against their success criteria are never defined and implemented.
Is It Any Wonder that Over 70% of all Software Projects
… Do Not Meet the Expectations of the Business Stakeholders
…… And as Much as 40% of the Projects Fail Completely.
The Sprint process has these primary metrics to track the success of an iteration:
Sprint Velocity – The total effort a team is capable of in a sprint. The number is derived by evaluating the work, in story points, completed from the last sprint’s backlog items. The collection of historical velocity data is a guideline for assisting the team in understanding how much work they can do in a future sprint.
Sprint Burn-up – The burn-up chart’s mechanics are basically the same as the Sprint Burn-down’s. The only difference is that instead of tracking how much work is left to be done, we track how much work has been completed, so the curve goes up, not down.
Sprint Burn-down – A simple graph showing the amount of work on the vertical axis and the timeline on the horizontal axis. As time progresses the team tracks how much work is still not done. The goal is to hit the ground. The steepness of the curve can help approximate when that is going to happen or, in other words, when we are going to be done with all of the assigned work.
Increment – The increment is the sum of all the Product Backlog items completed during a sprint. At the end of a sprint, the Increment must be done according to the Scrum Team’s criteria called Definition of Done (DoD). The increment must be in usable condition regardless of whether the Product Owner decides to actually release it.
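The metrics above can be sketched with a few small helpers. This is a minimal illustration, assuming a sprint is recorded simply as the story points completed each day; the function and variable names are illustrative, not part of any standard Scrum tooling:

```python
# Minimal sketch of the Sprint metrics described above.
# Assumes a sprint is recorded as a list of story points completed per day;
# all names and numbers are illustrative.

def burn_down(total_scope, completed_per_day):
    """Remaining work after each day -- the curve that should 'hit the ground'."""
    remaining = total_scope
    series = []
    for done in completed_per_day:
        remaining -= done
        series.append(remaining)
    return series

def burn_up(completed_per_day):
    """Cumulative completed work after each day -- the curve goes up, not down."""
    total = 0
    series = []
    for done in completed_per_day:
        total += done
        series.append(total)
    return series

def velocity(completed_per_day):
    """Story points completed in the sprint: one data point of historical velocity."""
    return sum(completed_per_day)

daily = [3, 5, 0, 4, 8]          # points completed each day of a 5-day sprint
print(burn_down(20, daily))      # [17, 12, 12, 8, 0]
print(burn_up(daily))            # [3, 8, 8, 12, 20]
print(velocity(daily))           # 20
```

Note that the burn-down and burn-up series carry the same information here; the difference only matters once the scope changes mid-sprint, as discussed next.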
To Burn Up or to Burn Down, that is the Question
The Sprint burn-up and burn-down charts are, for the most part, identical.
The real difference becomes visible when the scope changes because previously unknown tasks turn out to be required.
When the reality of development rears its untimely head and adds more work to a task than was estimated, the burn-down spikes up:
Unfortunately, it can also look different if we happen to be lucky, or unlucky, enough to complete some other task work at the same time we learn about the additional work.
The burn-down chart blurs the reality of accomplished work because new and completed task requirements average out:
It becomes even trickier when changes cause the scope to decrease:
Have tasks actually been completed? Has the Business Team cancelled a Sprint feature that is no longer required? It also becomes more difficult to approximate the completed work.
The Burn-up chart makes it all perfectly clear as progress is tracked independently of changes in the scope during Sprint iterations:
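The difference can be sketched numerically. In this illustrative scenario (the numbers are assumptions, not from any real sprint), day 3 completes 4 points of work but also discovers 6 points of new scope. The burn-down nets the two together and the line rises, while the burn-up keeps scope and progress as two independent curves:

```python
# Sketch of why burn-up separates scope changes from progress.
# days: list of (points_completed, scope_change) per day; all numbers illustrative.

def burn_down_series(initial_scope, days):
    """Burn-down nets completed work against scope changes into one line."""
    remaining = initial_scope
    out = []
    for completed, scope_change in days:
        remaining += scope_change - completed
        out.append(remaining)
    return out

def burn_up_series(initial_scope, days):
    """Burn-up keeps two independent curves: total scope and completed work."""
    scope, done = initial_scope, 0
    out = []
    for completed, scope_change in days:
        scope += scope_change
        done += completed
        out.append((scope, done))
    return out

days = [(5, 0), (3, 0), (4, 6), (5, 0), (8, 0)]   # day 3 discovers 6 new points
print(burn_down_series(25, days))  # [20, 17, 19, 14, 6] -- spikes up on day 3
print(burn_up_series(25, days))    # [(25, 5), (25, 8), (31, 12), (31, 17), (31, 25)]
```

In the burn-down output, day 3's rise from 17 to 19 could mean anything; in the burn-up output, the scope line jumping from 25 to 31 while the completed line still climbs from 8 to 12 tells the full story.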
In theory, the scope of development should not change during a Sprint iteration but “Reality Always Trumps Theory”.
With the Burn-Up Process
… You see Sprint Iteration Scope changes in Both Directions
…… As Well as Real Task Development Progress
This reality alone is a compelling reason
to choose Burn-up over Burn-down.
The Agile Engagement Metric: Sprint Velocity
The Sprint Burn metric tracks Sprint Tasks while Sprint Velocity tracks Business User Story Points completed.
The Burn metric measures the assigned tasks and the development resources and hours spent completing them.
The velocity metric tracks Sprint efforts as defined within the Sprint Planning process. This metric is valuable for future Sprint Story Point Estimations as it represents the real world results of past estimations.
This metric is the real reporting metric for the Business Team as their development world is based on the Business User Stories and how they have been represented by the Architect and Development teams.
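One common way to put historical velocity to work in Sprint Planning is to average the most recent sprints as a capacity guideline. The rolling-average window and the numbers below are illustrative assumptions, not a prescribed Scrum formula:

```python
# Using historical velocity to size the next sprint.
# The three-sprint window and the sample numbers are illustrative assumptions.

def forecast_capacity(past_velocities, window=3):
    """Average the most recent sprints' velocities as a planning guideline."""
    recent = past_velocities[-window:]
    return sum(recent) / len(recent)

history = [18, 22, 20, 24]           # story points completed in past sprints
print(forecast_capacity(history))    # (22 + 20 + 24) / 3 = 22.0
```

A team with this history might plan roughly 22 story points for the next sprint, then feed the actual result back into the history.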
The Sprint Velocity metric
reports User Story Success
while the Burn metric
reports Development Task Success
The Agile Engagement Metric: The Increment
The Business Team defines the changes to the Business Domain through a collection of Business User Stories.
These User Stories are massaged and placed into the Product Backlog for future consideration for Sprint iteration development.
Not all User Stories make it to a Sprint.
The Sprint Metric that tracks all of the Backlog Items that are completed during the engagement is the “Increment” Metric.
When a Backlog item is certified complete according to the Scrum Team’s “Definition of Done” (DoD) at the end of a Sprint iteration, it is added to the Agile Increment count.
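The Increment count can be sketched as a filter over the backlog: only items whose completed checks satisfy every DoD criterion count. The specific criteria and item structure below are illustrative assumptions; each team defines its own DoD:

```python
# Sketch of the Increment metric: only backlog items certified against the
# Definition of Done at sprint end count. The DoD criteria are illustrative.

DEFINITION_OF_DONE = {"code_reviewed", "tests_passing", "qa_certified"}

def increment(backlog_items):
    """Count items whose passed checks satisfy every DoD criterion."""
    return sum(1 for item in backlog_items
               if DEFINITION_OF_DONE <= item["checks_passed"])

items = [
    {"id": "US-1", "checks_passed": {"code_reviewed", "tests_passing", "qa_certified"}},
    {"id": "US-2", "checks_passed": {"code_reviewed", "tests_passing"}},  # QA pending
]
print(increment(items))  # 1 -- US-2 does not count until QA certifies it
```

Note that release to Production is deliberately absent from the check set: as stated above, an item counts toward the Increment regardless of whether the Product Owner decides to release it.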
A Backlog item qualifies as a DoD item
when it is certified as meeting the QA Objectives
for the item regardless of its release to Production
The metrics in a Sprint iteration are valuable not only in managing the iteration itself, but also in understanding what is working with the engagement’s Agile Teams and what areas need process improvement.
Velocity helps Improve the Story Point Estimation Process
… While the Burn helps Understand and Manage
…… Unexpected Scope Changes
These Agile Metrics will help management learn that deviations from the Agile methodology’s proven success criteria can be costly.
Hopefully the wisdom gained from the Sprint Metrics will lead to more effective Sprint iterations as the engagement matures.
Wisdom Pearl # 103 – Managing Technology Results
If You Can’t Measure It
… You Can’t Manage It
…… Always Create Ways to Quantify Your Results
I am a Principal Architect at Liquid Hub in the Philadelphia area specializing in Agile Practices as a Certified Scrum Master (CSM). I use Test Driven Development (TDD) and Acceptance Test Driven Development (ATDD) with Behavior Driven Development (BDD) as my bridge to Agile User Story Acceptance Criteria in a Domain Driven Design (DDD) implementing true RESTful services.