Measuring Impact with Agile Metrics
At Excella, we have partnered with numerous federal agencies to help them on their Agile journey. We follow an adaptive
approach, continuously seeking to improve by listening to our business partners, running experiments, and making data-
driven decisions based on the results of those experiments. Knowing what to measure, and what those measurements
can and cannot tell us, is an Excella core competency. Our expertise is based on our own experience as practitioners,
along with careful application of metrics such as those included in the State of DevOps Report.
Collaboration Metrics

• Number of Cross-Team Dependencies Opened and Closed: Count the number of cross-team dependencies opened, where each new dependency significantly increases the risk that work will not be finished when expected; and count the number of cross-team dependencies closed, where each elimination of a dependency significantly decreases the risk that work will not be finished when expected. (Focused Objective: Dependency Types and Impact)
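As a minimal sketch of how such counts might be collected, the Python fragment below tallies dependencies opened and closed within a reporting window. The Dependency record and its field names are hypothetical stand-ins for whatever your tracking tool actually provides.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class Dependency:
    # Hypothetical record for one cross-team dependency; the field names
    # are illustrative, not taken from any particular tracking tool.
    item_id: str
    depends_on_team: str
    opened: date
    closed: Optional[date] = None  # None while the dependency is unresolved

def dependency_counts(deps: list[Dependency], start: date, end: date) -> dict:
    """Count cross-team dependencies opened and closed in a reporting window."""
    return {
        "opened": sum(1 for d in deps if start <= d.opened <= end),
        "closed": sum(1 for d in deps
                      if d.closed is not None and start <= d.closed <= end),
    }

# Example: two dependencies opened during the Sprint, one resolved before it ended.
deps = [
    Dependency("STORY-101", "Platform Team", date(2020, 3, 2), date(2020, 3, 10)),
    Dependency("STORY-117", "Data Team", date(2020, 3, 5)),
]
print(dependency_counts(deps, date(2020, 3, 1), date(2020, 3, 14)))
# {'opened': 2, 'closed': 1}
```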
Delivery Metrics

• Deployment Frequency: Count how often changes are deployed to users. (CircleCI: How Often Does Your Team Deploy?)
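As an illustration of the arithmetic behind this metric, the sketch below averages deployments per day over a trailing window. The function name and the 30-day window are our own illustrative choices, not taken from the CircleCI article.

```python
from datetime import datetime, timedelta

def deployment_frequency(deploy_times: list[datetime],
                         window_days: int = 30) -> float:
    """Average deployments per day over a trailing window ending at the
    most recent deployment. Returns 0.0 when there is no deployment data."""
    if not deploy_times:
        return 0.0
    cutoff = max(deploy_times) - timedelta(days=window_days)
    recent = [t for t in deploy_times if t > cutoff]
    return len(recent) / window_days

# Example: nine deployments over the trailing 30 days -> 0.3 per day.
deploys = [datetime(2020, 3, d) for d in (2, 4, 9, 11, 16, 18, 23, 25, 30)]
print(deployment_frequency(deploys))  # 0.3
```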
Questions that Delivery Metrics help answer:

• How many running, tested units of work can we reliably deliver during a Sprint?
• How stable and reliable are code and other artifacts that we deploy to Production?
• How stable and reliable is our deployment pipeline?
Do: Leverage system availability data to identify areas for improvement.
Don't: Use a deployment or system failure as a reason to single out an individual or team for blame.
Quality Metrics

Quality Metrics provide visibility into the extent to which the products that we deliver work as they are intended to work. Some quality metrics, such as unit test coverage, are leading indicators. For instance, if we take it as a given that unit tests are testing something meaningful, execution of unit tests can prevent problems from surfacing later. Other quality metrics, such as Defect Escape Rate, are lagging indicators, because they tell us after the fact that we have a gap in our testing approach.

Examples of Quality Metrics:

Defects
• Defect Density (Number of Defects by Module): Count the number of defects that exist in a particular software module, often counted as the number of defects per 1,000 lines of code (KLOC). (SeaLights: Defect Density: Context is King)
• Defect Escape Rate: Count the number of defects that are opened against a particular component after the end of the Sprint during which the component was created or modified. (LeadingAgile: Escaped Defects)
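Both defect metrics above reduce to simple arithmetic once defect counts and open dates can be exported from your tracker. A minimal sketch, with illustrative function names of our own choosing:

```python
from datetime import date

def defect_density(defect_count: int, lines_of_code: int) -> float:
    # Defect Density: defects per 1,000 lines of code (KLOC).
    return defect_count / (lines_of_code / 1_000)

def escaped_defect_count(defect_open_dates: list[date], sprint_end: date) -> int:
    # Defect Escape Rate input: defects opened against a component after
    # the end of the Sprint in which it was created or modified.
    return sum(1 for opened in defect_open_dates if opened > sprint_end)

# Example: 12 defects in a 30,000-line module is 0.4 defects per KLOC.
print(defect_density(12, 30_000))  # 0.4

# Example: the Sprint ended March 13; two of three defects arrived afterwards.
opened = [date(2020, 3, 12), date(2020, 3, 16), date(2020, 3, 20)]
print(escaped_defect_count(opened, date(2020, 3, 13)))  # 2
```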
• Amount of Time Set Aside for Code Refactoring (per User Story): Capture data on the amount of time spent on code refactoring, during initial development and/or during peer code review. (Mountain Goat Software: The Economic Benefit of Refactoring | Tushar Sharma: How to Track Code Smells Effectively)

• Number of User Stories or Defects Created to Address Technical Debt: Count how many user stories or defects are opened to address shortcuts that had to be taken due to time constraints; related metrics that can help uncover technical debt via code analysis tools include cyclomatic complexity, code coverage, SQALE rating, and rule violations. (Excella: The Technical Debt Management Plan | Christiaan Verwijs: How to Deal with Technical Debt in Scrum)
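Code analysis tools such as SonarQube report cyclomatic complexity, SQALE ratings, and rule violations out of the box, and should be preferred in practice. Purely as an illustration of what one such signal measures, here is a rough approximation of McCabe cyclomatic complexity per function, built on Python's standard ast module:

```python
import ast

# Node types that each add one independent path through a function. This
# is a rough approximation of McCabe cyclomatic complexity, not a
# substitute for a real analysis tool.
_BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                 ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> dict:
    """Approximate cyclomatic complexity for each function in a source file."""
    results = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # Start at 1 (the straight-line path), add 1 per branch point.
            results[node.name] = 1 + sum(
                isinstance(child, _BRANCH_NODES) for child in ast.walk(node))
    return results

sample = """
def triage(defect):
    if defect.severity == "high" and defect.escaped:
        return "fix-now"
    for tag in defect.tags:
        if tag == "tech-debt":
            return "backlog"
    return "review"
"""
print(cyclomatic_complexity(sample))  # {'triage': 5}
```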
Test Coverage

• Test Automation Coverage Level: Collect data on the share of test coverage that is achieved through automation, by dividing the test automation coverage by the total test coverage. (LogiGear: 5 Useful KPIs for Test Automation)

• Unit Test Coverage Level: Collect data on the percentage of the code base that is covered by unit tests. (SeaLights: Code Coverage vs Test Coverage)
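As we read the LogiGear formulation, the Test Automation Coverage Level is a ratio of two coverage figures. A minimal sketch, assuming both figures are available as percentages of the code base:

```python
def automation_coverage_level(automated_coverage_pct: float,
                              total_coverage_pct: float) -> float:
    # Share of overall test coverage achieved by automated tests:
    # automation coverage divided by total coverage.
    if total_coverage_pct == 0:
        return 0.0
    return automated_coverage_pct / total_coverage_pct

# Example: automated tests cover 60% of the code base, while tests of all
# kinds cover 80%, so the automation coverage level is 0.75 (75%).
print(automation_coverage_level(60.0, 80.0))  # 0.75
```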
Questions that Quality Metrics help answer:

• To what extent do we deliver solutions that customers consider usable?
• To what extent are customers satisfied with the solutions that we deliver?
• To what extent are internal stakeholders satisfied with the solutions that we deliver?

Do: Make time available for code refactoring, which greatly reduces the likelihood that technical debt will be created.
Don't: Place blame on teams when it becomes difficult to improve or maintain existing code, due to the existence of technical debt.

Do: Collect data on how much work (and by extension, how much value) each team finishes during a particular time frame.
Don't: Compare how much work one team completed with how much work another team completed.
Conclusion

By taking a balanced perspective, being sure to include Collaboration Metrics (which often get little if any attention) alongside Delivery, Quality, and Value Metrics, we have a strong foundation on which to build, a point the authors reinforce on page 9 of the 2019 Accelerate State of DevOps Report.
Excella is an Agile technology firm helping leading organizations realize their future through the power of
technology. We work collaboratively to solve our clients’ biggest challenges and evolve their thinking to help
them prepare for tomorrow. Together we transform bold ideas into elegant technology solutions to create real
progress. Learn more at www.excella.com.