

From Performance to Value: Measuring in Agile

Date: June 23, 2015

Author(s): The Agile Research Network
Peggy Gregory and Katie Taylor (University of Central Lancashire, UK)
Helen Sharp, Leonor Barroca and Dina Salah (The Open University, UK)

1 Summary

Faster delivery of business value is often cited as a reason for adopting agile. How-
ever, measuring the value achieved through IT development is challenging. For those
relatively new to agile, measurement is not straightforward, as traditional project and
portfolio metrics are often hardwired into management report requirements and are
hard to change. During agile adoption it makes sense to start by measuring team
performance, as this can aid learning. From small beginnings, an agile measurement
approach can iteratively be developed to meet the requirements of the business.
This white paper presents the case of a department within a multinational organiza-
tion in the insurance sector that adopted Dynamic Systems Development Method
(DSDM) in the first quarter of 2013. One of the issues they faced was how to shift
from a traditional approach to measuring IT development to one that is compatible
with agile development. We present the challenges they asked us to investigate
along with suggestions from the published literature about how to overcome them,
and a summary of proposed mitigation strategies. Their primary challenge was
‘understanding and measuring value in agile projects’. In order to do this we were asked
to investigate the three levels of performance measurement that the department
already used, and explore how they may be adapted to be more suitable for agile
working. The three levels we looked at were:
1 Personal performance: how to gauge individual contribution to the success of a
project;
2 Project performance: how to identify, track and report on project progress and deliv-
ery in a meaningful way in order to demonstrate strong delivery of benefits, along
with improvements compared to more traditional projects; and
3 Department performance: how to use the information in existing KPIs at departmental
level.
The department already had a well-established measurement process at all three lev-
els for their waterfall projects. As they started adopting agile processes they wanted
to shift their measurement practices in order to ensure that they were capturing the
right management information as well as driving the right behaviour. Three types of
recommendation for how to approach these challenge areas are identified from the
literature:
1 Guidelines and frameworks for measuring business value and agile processes in agile
projects
2 Specific measurement techniques
3 Approaches for dealing with individual, team and portfolio measurement in agile
environments.

2 Introduction

The measurement of IT products and development processes is a well-established
practice, and is a core part of the traditional project manager’s job. As software
development is a complex process, some sort of measurement is necessary in order
to understand what is going on and to improve. The agile approach to measurement
is to focus on improvement and to ensure that value is achieved. Metrics can be
devised to assess different aspects of an agile project, for example aspects of team
learning, process improvement, product quality, and stakeholder satisfaction. The
main point is that in agile, measurement is a tool and not an end in itself.
In this case study we (the Agile Research Network1) worked with a department
that was undergoing a transformation from waterfall to agile IT development. They
wanted to know how best to measure the success of that transformation to ensure
that the new approaches they were adopting were delivering value to the company.
Value in the context of Agile IT projects often refers to the delivery of ‘business
value’, which includes any benefits that maintain the general health of a company. It is
closely linked to economic value, and ultimately may be assessed by measures such as
return on investment (ROI). In order to justify an IT investment, a calculation may be
required to show that the income generated or the cost savings made will be greater
than the cost of producing the software. However, economic value is also embedded
in many features of a business. Information technology does not exist in a vacuum; it
is a complementary system that works with other factors in the business [1]. Value
can accrue, for example, through knowledge production, brand awareness, loyalty,
customer satisfaction, and trust. Some IT developments may only yield economic
benefits over time and may not show immediate gains. For some projects, such as
infrastructure projects, it may be very difficult to measure their precise economic
value to the business. A long-term perspective may be required for some develop-
ments, such as those aimed at gaining competitive advantage, improving supplier
relationships or achieving strategic alignment.
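To make the kind of calculation mentioned above concrete, the sketch below compares projected income and cost savings against development cost. It is a minimal illustration: the figures and the simple ROI formula are assumptions for this example, not data from the case.

    def roi(total_benefit: float, cost: float) -> float:
        """Simple return on investment as a fraction of cost (0.25 means 25%)."""
        return (total_benefit - cost) / cost

    # Hypothetical figures for an IT investment, in GBP.
    income_generated = 180_000   # extra income attributed to the new system
    cost_savings = 70_000        # operational savings over the same period
    development_cost = 200_000   # cost of producing and deploying the software

    total_benefit = income_generated + cost_savings
    print(f"Net benefit: {total_benefit - development_cost:,} GBP")
    print(f"ROI: {roi(total_benefit, development_cost):.0%}")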
In the rest of this paper we report our investigation, which starts by looking at the
story of agile adoption in the department. We explore when, how and why they
started using agile, their existing performance measures, and how these relate to
project and personnel management. We identify the specific challenges they face
in the current context, and we make suggestions about performance measurement
from the literature. Finally we come back to the issue of value, and we propose an
approach they can adopt for identifying and measuring value.

3 The Company and the Context

The company is a UK section of a large multi-national organization working in the
insurance sector. Some parts of the UK operation were already using agile, but the
section we collaborated with had only recently started to use an agile approach.
They began to adopt agile approaches for their projects in March 2013, with the
support of a DSDM consultant. They chose DSDM because they regarded it as
a corporate-strength framework that would work in their regulated environment,
could be adapted for different projects, and could accommodate a spectrum of soft-
ware development processes from very agile to almost waterfall. The desire to have
a common framework for all projects was an important part of this decision. They
did not want to use one project framework for waterfall projects, and a different one
for agile projects. At the same time they expected that some of their projects would
remain waterfall-like, so whatever they chose had to be adaptable. Another rationale
for choosing DSDM was that they wanted to adopt an approach that gave full life-cycle
coverage, so they could use a consistent process framework across all projects.

1 The Agile Research Network (www.agileresearchnetwork.org) is funded by the DSDM Consortium
Board. It comprises a group of academics from the Open University and the University of Central
Lancashire who research agile methods in industrial settings. The model operated by the network is
that DSDM members propose a challenge they’d like to investigate, and then work closely with the
research team to understand the causes and consequences of the challenge and to identify alternative
ways of working from published research and other literature.

The move to DSDM started with Project R (see Box 1), and was soon followed by
two others, B and Q (see Table 1). DSDM projects had been running for nearly a
year when we started our collaboration. Agile teams are all based in a large open-
plan office space. Members of each project team work in a semi-circular workspace
divided from other groups by desks and low-level room dividers. They each have
their own whiteboard on which they keep progress information, and which they use
as the focus for stand-up meetings. All the teams have some members who are not
geographically collocated, as the business functions are not on the same site as the
development teams. Contact with the business is made primarily by phone or email
and occasionally by face-to-face visits.
Table 1: Summary of First Three DSDM Projects

Project R
• Domain: new business function development
• Team size: 10 (core)
• Team make-up: dedicated resource; minimal dependencies
• Team distribution: dev team and business team separated; tester based in India
• Anticipated length: first 2 increments within 9 months
• How far through: first 2 increments delivered within 9 months; now in delivery of the 7th increment (24 months in)

Project B
• Domain: Management Information
• Team size: 8 (core)
• Team make-up: MI project with more than 60% Musts; high profile
• Team distribution: wide business stakeholder group
• Anticipated length: 12 months
• How far through: delivery of the 5th increment (20 months in)

Project Q
• Domain: rework of an existing business function
• Team size: 3 (core)
• Team make-up: small team plus outsourced development
• Team distribution: dev team outsourced
• Anticipated length: 5 months
• How far through: complete within 12 months; pilot started within 7 months

The first agile projects were selected from upcoming work. They were chosen
because they were not the most important projects in the portfolio and were assessed
as suitable for agile development. The first was the R project, which was chosen
because it was a stand-alone project with an appropriate scope and a screen-based
interface. The team for this project was handpicked, selecting staff who were believed
to have the right behaviours to adopt the method. Managers were aware that
adoption involved a big learning curve for everyone. The move from a command-
and-control project management style was found to be particularly challenging. The
department already had a well-established measurement process at individual and
project levels for their waterfall projects. However, waterfall governance was not
imposed on the agile projects. As they started adopting agile processes they wanted
to adapt their measurement practices to the new agile way of working in order to
ensure that they were capturing the right management information as well as driving
the right behaviour. These first agile projects were successful and the decision was
made to commit to rolling out agile more widely over time.
Box 1: Example from Practice - The First DSDM Project

The R project was the first DSDM project in the company. In
this project each increment consists of three three-week
timeboxes. High-level planning is done at the start of each
increment and, in each timebox, stories are checked for the
next timebox. Increments end with paperwork production and
sign-off.
The team run stand-ups every day. There is no stand-up leader,
but control passes around the team members. There is a
whiteboard on which the team captures the current state of
stories being worked on. This is the focus for the daily
stand-up. Retrospectives are run a few days after the end of a
timebox. In the early days they were very action-based, but
over time the team relaxed and started making more
suggestions. Retrospective comments are circulated to the
team after the meeting.
The team have noticed a number of improvements since
adopting DSDM. Work is completed more effectively because
the project is broken down into manageable chunks and
working software is delivered frequently. The project has had
no reported defects. The team likes making decisions about
process improvement. Communication between the
developers and the business is easier. The Business Visionary
takes away blockers and the Business Ambassadors are
decision makers.
Some issues remain. As most of the organisation is waterfall,
documentation production and releases have to fit into a
waterfall schedule. Some developers find the tight time scale
stressful. Business Ambassadors and Business Advisors are
not collocated with the development team. The developers
felt that the BAs sometimes make impossible demands, for
example ‘they want everything as a Must’ or ‘they throw 200
stories at you’. Communication with the tester who is based in
India is not as easy as it is with the rest of the team because
they are not collocated and are therefore not able to take part in
the informal discussions that aid decision-making during
development and testing.


4 The Challenges

We worked with the Head of Business Change and the Capability and Delivery Manager
to identify a primary challenge that we would investigate. As relatively new adopters,
they had many areas they wanted to explore, and choosing one was not easy.
We discussed a range of challenges they were interested in, including knowledge
sharing within the organisation, re-organising the workspace, business engagement,
and evidence of agile success.
After discussions we agreed that the primary focus of our study would be to inves-
tigate how the department could show that their newly adopted agile processes
were delivering real benefit for the organisation. This was summarised from the
organisation’s point of view as:
Understanding and measuring value from agile projects
However, they were not yet in a position to start measuring value from agile projects.
First, as new adopters they were still in a transition period. Their teams were finding
their feet and exploring ways of adapting to agile within their context. Second,
they had a well-established set of measures and processes already. These were
designed for use with waterfall projects, focussed on performance and deliverables
rather than value, and were assessed at three levels: personal, project and department.
It therefore made sense to start by exploring how to introduce measures of agile
performance so the department could transition their measures to accommodate the
new agile way of working.
The managers identified three areas in the primary focus:
• Personal performance – how to gauge individual contribution to the success of a
project
• Project performance – identifying, tracking and reporting on project progress and
delivery in a meaningful way so that we can demonstrate strong delivery of benefits,
along with improvements compared to more traditional projects
• Department performance – identifying how we use the information in our key perfor-
mance indicators at department level
We investigated these focus areas by visiting the department numerous times during
the summer of 2014 to carry out observations and interviews with members of the three
agile development teams, business ambassadors and a range of managers. Below
we summarise the current practices of the organisation, and identify the particular
challenges for moving to agile that were identified through our investigations and
discussions with staff.

4.1 Personal Performance

Within the department capability managers and change managers (project man-
agers) work closely to assess personal performance of all employees. Capability man-
agers write role descriptions and undertake personal performance reviews within
their capability area. Change managers collect and assess information about the
performance of members of their project teams, which feeds into personal perfor-
mance reviews. Capability areas are based on functions. Examples include business
analysis, development, environment, and testing. Each job role has a profile that
describes what tasks and objectives the role is accountable for and how they are
measured. These role profiles are reviewed and adapted each year so they keep
pace with changes. The capability review procedure involves a half-year and
end-of-year performance review in which the generic set of objectives is developed
into a person-specific profile, which is assessed, measured and used to plan training,
development and support. In the IT department there are also interim monthly
performance reviews in addition to the twice-yearly reviews. The primary issue for
moving to agile is that current role profiles do not take account of the increased team
working that happens in agile projects.
The organisation has already learnt from experience that if they measure perfor-
mance outcomes they may drive the wrong behaviour. For instance, if they measure
timely project completion, they may receive project plans that under-estimate activi-
ties and achievements in order to guarantee successful completion to target. This has
the unintended consequence of slowing down delivery of features to the business.
Hence they are more interested in measuring activities, roles and accountability
within projects.
The challenge areas for personal performance monitoring in agile projects that were
identified during our discussions with staff are:
1 Gauging individual contributions in agile projects is more difficult than in traditional
projects. Individuals often undertake a wider set of tasks, and task allocation is no
longer done by the project manager but happens less formally as part of working in a
self-managing team.
2 DSDM introduces new roles and changes existing roles. Roles that change include
those of the Project Manager and Business Analyst. New roles include the Business
Ambassador, Business Advisor and Technical Co-ordinator, amongst others.
3 Focussing on personal performance is somewhat anti-agile as it draws attention away
from the self-managing team.

4.2 Project/Team Performance

Existing project performance measures were devised for waterfall projects, and
many are not appropriate for agile projects.
The existing measurement process focuses on collecting data for Key Performance
Indicators (KPIs). Using electronic data collection, project managers create a project
schedule including phase duration estimates and checkpoints. Actual data for task
completion and task duration are input during the project by members of the team
and the project manager. Measurement is driven by KPIs and includes:

• Valuable (= the business benefit minus the total project cost);
• Days of Effort by project life cycle stage (feasibility, initiation, solution definition, delivery, closedown);
• Timely (= elapsed days per stage, % of time spent in each stage, and elapsed time from request to deployment);
• Controlled (RAG (red-amber-green) values measured against schedule checkpoints); and
• OTOBOS: On Time, On Budget, On Scope (RAG values measured at project end).
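As an illustration only (the paper defines these KPIs no further than the list above), the sketch below shows how the Valuable, Timely and request-to-deployment figures might be derived from a single project record. All field names and numbers are hypothetical, not the department's actual data.

    from datetime import date

    # Hypothetical project record; stage names follow the life cycle listed above.
    project = {
        "business_benefit": 250_000,
        "total_project_cost": 180_000,
        "request_date": date(2014, 1, 6),
        "deployment_date": date(2014, 9, 1),
        "elapsed_days_per_stage": {
            "feasibility": 15, "initiation": 20, "solution definition": 35,
            "delivery": 90, "closedown": 10,
        },
    }

    valuable = project["business_benefit"] - project["total_project_cost"]
    total_days = sum(project["elapsed_days_per_stage"].values())
    share_per_stage = {s: d / total_days for s, d in project["elapsed_days_per_stage"].items()}
    request_to_deployment = (project["deployment_date"] - project["request_date"]).days

    print("Valuable:", valuable)
    print("% of time per stage:", {s: f"{p:.0%}" for s, p in share_per_stage.items()})
    print("Elapsed days from request to deployment:", request_to_deployment)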


Some agile-specific measures are currently being collected, but these sit outside the
standardised system. They are:

• Effort: how many hours are being put into projects
• Outcomes: Must-haves, Should-haves and Could-haves completed in timeboxes

Managers and project managers are aware that measures need to be adapted to
the agile approach whilst still being focussed on delivering better outcomes for
the business. New agile measures being considered include: velocity, cycle time,
boomerangs (defects in delivered software), business value, risk, quality, customer
involvement, and customer satisfaction.
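To make two of these candidate measures concrete, here is a minimal sketch of how velocity and cycle time could be derived from timebox data. The data structure and numbers are invented for illustration and are not the department's.

    from statistics import mean

    # Hypothetical timebox records: story points completed, and days from start to done per story.
    timeboxes = [
        {"points_completed": 21, "story_cycle_days": [4, 6, 9, 5]},
        {"points_completed": 18, "story_cycle_days": [3, 7, 8]},
        {"points_completed": 24, "story_cycle_days": [5, 5, 6, 10, 4]},
    ]

    velocity = mean(tb["points_completed"] for tb in timeboxes)               # points per timebox
    cycle_time = mean(d for tb in timeboxes for d in tb["story_cycle_days"])  # days per story

    print(f"Average velocity:   {velocity:.1f} points per timebox")
    print(f"Average cycle time: {cycle_time:.1f} days per story")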
The challenges for measuring agile project performance that were identified in our
discussions with staff are:
1 There are currently no agreed project measures for agile teams
2 Many current measures are based on waterfall checkpoints which will be gradually
phased out as agile is more widely adopted
3 New agile project measures must not drive the wrong behaviours

4.3 Department Performance

The department covers two areas: IT and Operations. Measurement at the depart-
ment level focuses on the KPIs described in the section above. Project data is collated
into a department-level report for each KPI. At the moment there are no specific
department-level KPIs. Generally about 50 projects run at a time, but data is not
collected for infrastructure projects or projects that involve fewer than 40 days of
effort.
The IT Quality Manager would like to develop measures and KPIs that will drive
beneficial change in the department. One of the main issues discussed is that
not all the data being produced is being used; this is a waste of effort. A new
measurement approach is needed, which will produce useful outputs. As well as
measuring performance and business value, metrics have been suggested that assess
the success of the transition to agile. Current proposals include collecting data on
the number of users of agile tools and methods and the number of people trained or
coached in agile methods, but other measures are needed.
The challenges identified during discussions for measuring department performance
when more projects are agile are:
1 Ensuring that data is only collected if it is useful and that reports are only produced
if they are read.
2 Developing appropriate departmental-level metrics that help with decision making
3 Developing metrics that measure the uptake of agile methods through the depart-
ment


5 Mitigating the Challenges

Having spent time in the company understanding their existing practice and the detail
of the challenges they faced, we turned to the literature to find guidance on how to
address the challenges in their context. A key lesson that emerged from the literature
is that agile project measurement needs to be tailored to the business context.
Our findings break down into two main areas. We look first at measuring perfor-
mance at the three levels identified. Second, we focus on guidelines for developing
agile metrics and give examples of some agile metrics used in practice.

5.1 Measuring Personal, Project and Department Performance

The ultimate value of adopting agile for software development projects is the effect it
has on the business as a whole. In the context of our collaborators, this translated into
an initial focus on performance measurement. Below we compile some guidelines on
performance measurement at the three levels identified in the challenges.

5.1.1 Agile Personal Performance

Measuring personal performance is regarded as somewhat ‘anti-agile’ as it defeats
the spirit of collaboration in agile teams. However, as most organisations insist on
assessing the performance of individuals, some practitioners have proposed met-
rics that minimise the threat of personal performance measurement to the agile
ethos. Gautam [2] suggests an appraisal backlog for individuals with prioritised goals
and acceptance criteria for each goal. Sutherland [3] suggests the replacement of
performance appraisals with a self-evaluation followed by a conversation between
appraiser and appraisee, while Coens and Jenkins [4] go further in their challenge
of personal appraisal and suggest a complete rethink of the assumptions underlying
appraisal, and of the roles of managers and employees. In the context of our collab-
orators, personal performance is still linked to waterfall performance objectives as
role profiles assume that individuals will only work in a very tightly-specified job role.

5.1.2 Agile Project/Team Performance

The literature on agile teams and their performance is extensive. There are some
peer-reviewed publications on agile performance measurement [5-7], but much of
what can be found is anecdotal, appearing on websites and blogs. Hartmann
& Dymond [5] looked at sources for agile metrics and compiled a checklist to help
measure a team’s performance.
• Identify a clear question
• Clearly state what is being measured
• Identify assumptions
• Indicate intended target audience
• Capture actual against expected outcomes
• Review and adapt
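One lightweight way to apply this checklist is to record each proposed metric against its items before any data is collected. The template below is our own sketch, not part of Hartmann & Dymond's paper, and the 'boomerangs' example it is filled in with is hypothetical.

    from dataclasses import dataclass, field

    @dataclass
    class MetricDefinition:
        """A team metric recorded against the checklist items above."""
        question: str                      # the clear question the metric answers
        what_is_measured: str
        assumptions: list = field(default_factory=list)
        audience: str = ""                 # intended target audience
        expected_outcome: str = ""
        actual_outcome: str = ""           # filled in when the metric is reviewed and adapted

    boomerangs = MetricDefinition(
        question="Are we delivering increments the business can use without rework?",
        what_is_measured="Defects reported in delivered software, per increment",
        assumptions=["Defects are logged consistently across increments"],
        audience="Development team and Business Visionary",
        expected_outcome="The trend stays flat or falls as the team matures",
    )
    print(boomerangs)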


As our collaborators are undergoing an agile adoption process, we considered that
team performance is the most appropriate level at which to start measuring agile
performance in the department. These metrics can be developed iteratively, and each
team can experiment with what works for them. The guidelines discussed in section
5.2 are a good starting point for teams to decide on the metrics that best fit their
context.

5.1.3 Agile Department Performance

As the department moves towards more comprehensive agile adoption, we suggested
a series of guidelines for developing department measures, based on Thomas & Baker
[8]. Adopting this approach will aid a smooth transition from measuring traditional
projects to measuring agile projects, and will help in developing new department
performance measures:
• Introduce change gradually
• Move from ‘control through data’ to ‘enable and ensure’
• Introduce fact-based measures that give insights but don’t force a comprehensive
suite of metrics on all projects
• Maintain a light touch – don’t collect data for the sake of it
• Prioritise and manage projects in progress across the whole department
The department we worked with in this company is cross-functional, and the move
to agile will require buy-in from all stakeholders. This will take time.

5.2 Guidelines for agile metrics

In order to maintain a light touch, it is generally accepted that agile teams should
design and use their own metrics in response to identified needs, rather than using
pre-defined metrics. Some of the literature therefore provides suggestions for de-
signing good agile metrics.
Agarwal and Majumdar [7] suggest that optimal metrics should be:
• Simple, precisely definable – so that it is clear how the metric can be evaluated;
• Objective, to the greatest extent possible;
• Easily obtainable, (i.e., at a reasonable cost);
• Valid – the metric should measure what it is intended to measure; and
• Robust – relatively insensitive to insignificant changes in the process or product
Hartmann and Dymond [5] characterise a good agile metric as one which:
• Affirms and reinforces agile and lean principles
• Measures outcome, not output
• Follows trends, not numbers
• Answers a particular question for a real person
• Belongs to a small set of metrics and diagnostics
• Reveals, rather than conceals, its context and significant variables
• Provides fuel for meaningful conversation
• Provides feedback on a frequent and regular basis
• May measure value or process
• Encourages good-enough quality


With these two sets of criteria and taking into account the context, we suggest that
metrics should:
• Be at the right level – just enough;
• Answer a question for a real person;
• Link to high level goals; and
• Be used for a specific purpose.

5.3 Agile metrics in practice

The academic literature and agile blogs provide numerous examples of metrics that
have been used by companies in different contexts. Table 2 shows some of these
metrics.
Table 2: Agile Measurement Practices

• Software size [6]. Aim: estimation of size/effort. Based on: user stories. Beneficiary: Project Manager. When: at start of each iteration.
• Velocity [6]. Aim: productivity of the team. Based on: user stories. Beneficiary: Project Manager. When: at end of each iteration.
• Burndown [6]. Aim: progress monitoring. Based on: user stories. Beneficiary: team. When: at end of each iteration.
• Cumulative flow [6]. Aim: observation of lead time and WIP queue depth. Based on: work in progress. Beneficiary: top managers/customers. When: at end of each iteration.
• Responding to change [6]. Aim: ability of the team to hand over quality. Based on: defect fixing cost. Beneficiary: Project Manager. When: at end of each iteration/project.
• Earned business value [6]. Aim: monitoring business value delivered to the customer. Based on: business value. Beneficiary: top managers/customers. When: as each feature is delivered.
• Total estimation effort [6]. Aim: planning and budgeting. Based on: user stories and reworks. Beneficiary: top managers/customers. When: at beginning of project.
• Story estimation. Aim: time spent on story estimation. Based on: user stories. Beneficiary: team. When: beginning of iteration.
• Requirements ambiguity. Aim: misinterpreted requirements. Based on: user stories. Beneficiary: team. When: end of iteration.
• Unfinished stories. Aim: identify problems with stories. Based on: user stories. Beneficiary: team. When: end of iteration.
• Number of impediments team faced. Aim: identify impediments. Based on: impediments identified in retrospectives. Beneficiary: team. When: during retrospectives.
• Work-in-progress. Aim: items being actively worked on. Based on: user stories on the kanban board. Beneficiary: team. When: during iteration.
• Unplanned changes. Aim: changes that can be processed in an increment. Based on: changes to user stories. Beneficiary: team. When: during increment.
• Employee satisfaction. Aim: individual satisfaction. Based on: questionnaires/surveys. Beneficiary: individuals. When: end of iteration.
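As a concrete example of one practice from the table, the sketch below computes a simple burndown, comparing remaining story points against an ideal straight-line trend over a timebox. The daily figures are invented for illustration.

    # Hypothetical daily snapshots of remaining story points across a 15-day timebox.
    remaining = [40, 38, 35, 35, 31, 28, 27, 24, 20, 18, 15, 12, 9, 6, 2]
    days = len(remaining)

    # Ideal burndown: a straight line from the initial scope down to zero on the last day.
    ideal = [remaining[0] * (1 - d / (days - 1)) for d in range(days)]

    for day, (actual, target) in enumerate(zip(remaining, ideal), start=1):
        status = "behind" if actual > target else "on track"
        print(f"Day {day:2d}: {actual:3d} points remaining (ideal {target:5.1f}) - {status}")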

6 The next steps

Our proposals for moving forward are to start introducing agile measurement at
project team level. At this level, the focus can be on performance improvement and
value.
First, teams can adopt their own measures for improving performance by identifying
and analysing problems they encounter. The teams already run retrospectives. It
would only be a small step for each team to discuss appropriate metrics in retrospec-
tives, using the approach suggested in section 5.1.2.
Second, project teams can adopt measures to ensure they are delivering value
through their project. Jeff Patton outlines the business-goal approach [9] through
which this can be achieved. In this approach project stakeholders meet at an early
stage in the project and:
• Brainstorm on business goals/values
• Merge goals (cluster and categorise all goals suggested)
• Distill clustered goals
• Vote on priority
• Identify metrics
• Create metrics
Table 3: Example Business Goals and Values

• Responding to business needs: increasing customer base; providing quicker service; freeing up staff time
• Delivering a good user experience: customer satisfaction; well-designed products; easy-to-use interface
• Delivering return on investment: early ROI

Table 3 shows an example of categorised business goals and values that we suggested
may be used as a starting point for a business goals stakeholder meeting.
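To make the final two steps of the business-goal approach (vote on priority, identify metrics) concrete, here is a small sketch that tallies hypothetical dot votes over goals drawn from Table 3 and maps the top goals to candidate metrics. The votes and metric choices are invented for illustration, not taken from the case.

    from collections import Counter

    # Hypothetical dot votes from a business-goal workshop, using goals from Table 3.
    votes = Counter({
        "Providing quicker service": 7,
        "Customer satisfaction": 5,
        "Early ROI": 4,
        "Freeing up staff time": 2,
    })

    # Hypothetical mapping from prioritised goals to candidate metrics.
    candidate_metrics = {
        "Providing quicker service": "elapsed time from customer request to completion",
        "Customer satisfaction": "quarterly customer satisfaction survey score",
        "Early ROI": "cumulative earned business value per increment",
        "Freeing up staff time": "hours of manual processing saved per week",
    }

    for goal, n in votes.most_common(3):   # keep the resulting metric set small
        print(f"{goal} ({n} votes): measure {candidate_metrics[goal]}")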

References

1. Lee, C.S., Modeling the Business Value of Information Technology. Information and Management, 2001. 39(3): p. 191-210.
2. Gautam, A., Agile Performance Appraisals. 2012; Available from: https://www.scrumalliance.org/community/articles/2012/march/agile-performance-appraisals.
3. Sutherland, J., Agile Performance Reviews. 2006; Available from: http://jeffsutherland.com/2006/11/agile-performance-reviews.html.
4. Coens, T. and M. Jenkins, Abolishing Performance Appraisals: Why They Backfire and What To Do Instead. 2002, San Francisco, CA: Berrett-Koehler.
5. Hartmann, D. and R. Dymond, Appropriate Agile Measurement: Using Metrics and Diagnostics to Deliver Business Value, in Agile 2006. 2006, IEEE.
6. Javdani, T., et al., On the Current Measurement Practices in Agile Software Development. International Journal of Computer Science Issues, 2012. 9(4): p. 127-133. arXiv:1301.5964.
7. Agarwal, M. and R. Majumdar, Tracking Scrum Project Tools, Metrics and Myths About Agile. International Journal of Emerging Technology and Advanced Engineering, 2012. 2(3): p. 97-104.
8. Thomas, J.C. and S.W. Baker, Establishing an Agile Portfolio to Align IT Investments with Business Needs, in Agile Conference. 2008: Toronto, Canada. p. 252-258.
9. Patton, J., Ambiguous Business Value Harms Software Products, in IEEE Software. 2008. p. 50-51.
