A framework of the factors affecting the evolution of performance measurement systems
Mike Kennerley and Andy Neely
Centre for Business Performance, Cranfield School of Management, Cranfield, UK
Introduction
Although it has long been recognised that performance measurement has an
important role to play in the efficient and effective management of
organisations, it remains a critical and much debated issue. Significant
management time is being devoted to the questions of what and how we should
measure, while substantial research effort, by academics from a wide variety
of management disciplines, is being expended as we seek to enhance our
understanding of the topic and related issues (Neely, 1999).
Survey data suggest that between 40 and 60 per cent of companies
significantly changed their measurement systems between 1995 and 2000
(Frigo and Krumwiede, 1999). Most of these initiatives, however, appear to be
static. Although many organisations have undertaken projects to design and
implement better performance measures, little consideration appears to have
been given to the way in which measures evolve following their
implementation (Waggoner et al., 1999). It is important that performance
measurement systems be dynamic, so that performance measures remain
relevant and continue to reflect the issues of importance to the business (Lynch
and Cross, 1991).
International Journal of Operations & Production Management, Vol. 22 No. 11, 2002, pp. 1222-1245. © MCB UP Limited, 0144-3577. DOI 10.1108/01443570210450293
The authors are grateful to the Engineering and Physical Sciences Research Council (EPSRC) for the award of research grant number GR/K88637, to carry out the research reported in this paper.
In order to ensure that this relevance is maintained, organisations need a
process in place to ensure that measures and measurement systems are
reviewed and modified as the organisation's circumstances change (Dixon et
al., 1990). Yet few organisations appear to have systematic processes in place
for managing the evolution of their measurement systems and few researchers
appear to have explored the question of what shapes the evolution of an
organisation's measurement system.
The research reported in this paper seeks to address this gap in the literature
by presenting a framework that describes the forces that shape the evolution of
the measurement systems used by different organisations.
Following this introduction the paper consists of a further six sections. The
next section discusses the literature regarding the evolution of performance measurement systems.
include the need for measures to relate directly to the organisation's mission
and objectives, to reflect the company's external competitive environment,
customer requirements and internal objectives (Globerson, 1985; Wisner and
Fawcett, 1991; Maskell, 1989; Kaplan and Norton, 1993). Others make explicit
the need for strategies, action and measures to be consistent (Lynch and Cross,
1991; Dixon et al., 1990).
At the heart of the processes, frameworks and criteria discussed, as with
much that has been written on the subject of performance measurement, is the
premise that measures and measurement systems must reflect the context to
which they are applied (Neely, 1999). Indeed as Johnson (1983) observed, the
introduction of financial performance measures, such as cash flow and return
on investment, reflected the changing marketplace in which organisations
competed. At the turn of the century sole traders were giving way to owner
managers who needed to assess the return on their investment in plant and
premises.
The performance measurement revolution has prompted many
organisations to implement new performance measurement systems, often at
considerable expense. However, unlike the environment in which organisations
operate, many measurement initiatives appear to be static. Senge (1992) argues
that, in today's complex business world, organisations must be able to learn
how to cope with continuous change in order to be successful. Eccles (1991)
suggests that it will become increasingly necessary for all major businesses to
evaluate and modify their performance measures in order to adapt to the
rapidly changing and highly competitive business environment. Numerous
authors espouse the need for reflection on measures to ensure that they are
updated to reflect this continuous change (Meyer and Gupta, 1994; Ghalayini
and Noble, 1996; Dixon et al., 1990; Wisner and Fawcett, 1991). However, there
has been little evidence of the extent or effectiveness with which this takes
place. Moreover, the literature suggests that ineffective management of the
evolution of measurement systems is causing a new measurement ``crisis'', with
organisations implementing new measures to reflect new priorities but failing
to discard measures reflecting old priorities resulting in uncorrelated and
inconsistent measures (Meyer and Gupta, 1994). Furthermore, it is suggested
that organisations are drowning in the additional data that is now being
collected and reported (Neely et al., 2000). As with measurement systems
introduced at the turn of the century, there is a danger that failure to manage
effectively the way in which measurement systems change over time will cause
new measurement systems to lose their relevance, prompting a new crisis and
necessitating a further measurement revolution.
This raises a crucial question. Why do performance measurement systems
fail to change as organisations change, rendering them irrelevant? This is an
important question to answer if history is not to be repeated and organisations
are to avoid the expense of another extensive overhaul of their measurement
systems.
Wisner and Fawcett (1991) acknowledge the need for performance measures
to be reviewed and changed in the last step of their nine-step process, to
ensure that measures remain relevant. They highlight the need to ``re-evaluate the
appropriateness of the established performance measurement systems in view
of the current competitive environment''. Bititci et al. (2000) identify the need for
performance measurement systems to be dynamic to reflect changes in the
internal and external environment; review and prioritise objectives as the
environment changes; deploy changes in objectives and priorities; and ensure
gains achieved through improvement programmes are maintained.
Dixon et al. (1990) and Bititci et al. (2000) propose audit tools that enable
organisations to identify whether their existing measurement systems are
appropriate given their environment and objectives.
Bititci et al. (2000) go on to posit that a dynamic performance measurement
system should have:
. an external monitoring system, which continuously monitors
developments and changes in the external environment;
. an internal monitoring system, which continuously monitors
developments and changes in the internal environment and raises
warning and action signals when certain performance limits and
thresholds are reached;
. a review system, which uses the information provided by internal and
external monitors and the objectives and priorities set by higher level
systems, to decide internal objectives and priorities; and
. an internal deployment system to deploy the revised objectives and
priorities to critical parts of the system.
Bourne et al. (2000) suggest measurement systems should be reviewed and
revised at a number of different levels. They identify the need for review of
targets and performance against them; individual measures as circumstances
change; and the set of measures to ensure that they reflect the strategic
direction.
Although the authors discussed above propose the need to review measures
and suggest techniques for such review, there is little discussion of their
application in practice, investigation of how measures actually change or of the
factors that affect how effectively and efficiently performance measurement
systems change. With a few notable exceptions (Meyer and Gupta, 1994;
Townley and Cooper, 1998; Bourne et al., 2000), empirical investigation of the
evolution of measurement systems over time remains a considerable gap in
performance measurement research (Neely, 1999).
Meyer and Gupta (1994) observe that measures tend to lose their relevance
and ability to discriminate between good and bad performance over time as
performance objectives are achieved or as behaviour no longer reflects the
performance objectives underpinning the measures. They observe that failure
to effectively manage this change causes the introduction of new measures
``that are weakly correlated to those currently in place'' so that an organisation
will have a diverse set of measures that do not measure the same thing.
Townley and Cooper (1998) undertook a longitudinal study of performance
measurement in the Alberta government in Canada. They observed that support
for performance measurement can diminish over time: measurement initiatives
can suffer from a loss of initial enthusiasm, which is replaced by scepticism
and disillusionment. They cited a number of causes of
this including failure to manage the change appropriately, underestimating the
effort required and lack of commitment to the change. They also identified that
political issues and the involvement of employees affect success. Not only does
their study identify factors affecting the success of performance measurement
activities, but it also highlights the need for support of such activities within
the organisation.
In a case study company, Bourne et al. (2000) observed that performance
measures changed over time. They identified that changes were prompted by
existing budgetary review processes, chance, intervention of the researchers
and eventually by design; however, they provide little insight into how this
change by design took place.
Despite the limited discussion of evolution in the performance measurement
literature, it is possible to draw lessons from a variety of other streams of
literature that address issues relating to the management of change (Waggoner
et al., 1999).
Based on a review of the relevant literature, Waggoner et al. (1999)
summarise the key forces driving and demanding change as: customers,
information technology, the marketplace, legislation (public policy), new
industries, nature of the work (e.g. outsourcing) and future uncertainty.
However, many authors also identify barriers to change that have received
little attention in the performance measurement literature.
Gabris (1986) identifies four categories of such barriers:
(1) process burden, where processes such as performance measurement
take employees away from their actual responsibilities;
(2) internal capacity, where organisations lack the in-house capability to
support an initiative;
(3) credibility anxiety, where organisations suffer from an overload of
management techniques; and
(4) the ``Georgia giant syndrome'', where management techniques work only
under rigorous and closely supervised control conditions.
These factors can be considered to be the organisation's readiness for change
(Waggoner et al., 1999). Furthermore, Kotter (1996) argues that willingness or
urgency to change throughout the organisation is necessary for such change to
be effective.
Greiner (1996) categorises inhibiting factors as institutional, pragmatic,
technical and financial. Numerous authors (such as Scott, 1995 and Pettigrew
and Whipp, 1991) also highlight that the political nature of organisations
affects the management of change.
Methodology
A multiple case study approach was used to investigate the way in which
performance measures actually evolve within organisations. The research
involved semi-structured interviews with a total of 25 managers from a range
of management functions, from seven different organisations. The companies
involved in the research were from the industries shown in Table I.
Figure 1. Summary of factors affecting evolution drawn from the literature
The interview structure was designed to investigate the key themes
identified from the literature reviewed. As such the case studies sought to
answer the following questions:
. What factors encourage the introduction of new measures, modification
of existing measures and deletion of obsolete measures?
. What factors inhibit the introduction of new measures, modification of
existing measures and deletion of obsolete measures?
The companies were selected on the basis of their considerable experience in
the implementation and use of performance measures. Companies from
Table I. Case study companies and their industries
extensive use was made of existing and accepted communication tools to ensure
performance measurement had the appropriate credibility. As the human
resources manager remarked: ``Effective use of the measurement system is due
to the managing director's promotion of the need for and importance of
measurement and his use of measurement to manage and communicate''.
The managing director highlighted the need for flexible systems: ``None of
the commercial performance measurement software provided the required
support; you must have a system that satisfies your requirements''. In-house
information systems were developed to provide data collection, analysis and
reporting systems giving flexibility not provided by systems available on the
market. Addressing these issues, and integrating performance measurement
into the strategy development and review process, provided the organisation
with a measurement system that they believed would evolve with the
business's requirements.
Company 2
Although performance measurement systems had been implemented in
company 2 for a number of years, failure to actually use new performance
measures to manage the business was seen as a major barrier to their deployment
and hence evolution. Although senior management had backed the
implementation of a balanced set of measures, the continued emphasis on
financial performance measures prevented use of the balanced measurement
system being embedded throughout the organisation. As in company 1,
company 2 used experiences of ineffective measurement practices in the past to
design a measurement system with the attributes that they considered
necessary to maintain a relevant set of performance measures in the future. To
ensure that their measures remained relevant, managers in company 2 explicitly
included a review of measures in the periodic review of business processes.
The head of business process development highlighted the importance of
having the appropriate systems to facilitate measurement activity and the
evolution of measurement systems. ``New systems have been designed from
scratch to be flexible enabling measures to be changed easily. The system
being Web-based enables worldwide access to all information allowing
information sharing. This facilitates benchmarking and the transfer of best
practice. The global availability of the same reporting systems enables
commonality of approach''.
Furthermore, he highlighted that: ``reporting needs to be efficient to reduce
the resources required to administer measurement, allowing resources to be
dedicated to acting on the results.'' The system was designed to enable efficient
and effective data collection and reporting, minimising the effort of
measurement to ensure acceptance throughout the organisation.
According to the consultancy sales manager: ``Benchmarking of
performance against competitors (including those in new markets) has given a
common understanding of the need to improve and where improvement should
be focused. This has reduced any resistance to the change of performance
measures''.
Company 3
The evolution of measures was not effectively managed in company 3. ``The
culture at [company 3] is a barrier to the implementation of a consistent
approach to measurement across the whole company.'' The ad hoc approach to
performance measurement that was adopted led to inconsistency in approaches
between different business units and geographical locations. The inconsistency
in measurement practices limited the comparability of performance data,
detrimentally affecting the credibility, and hence acceptance, of performance
measures. Despite attempts to change measures to reflect changing business
circumstances, managers were reluctant to use non-financial data to manage
the business. ``The overriding factor affecting the acceptance of performance
measurement is that it become a business issue so that it occupies the minds of
managers and measures are used to manage the business'' (Manager,
Stationery Office Supplier). This reflects the need for managers to actively use
measures to manage the business. It was found that this would increase their
desire to ensure measures changed to remain appropriate, as their performance
would be assessed on them.
Inflexible IT systems were also found to be a major barrier to evolution. The
European customer care manager specifically noted that: ``it is not possible to
change the structure and content of the performance reports produced by the
mainframe IT system.''
Company 4
The use of performance measurement to manage the business was accepted in
company 4. However, the tendency to report too much data and produce too
many measurement reports acted as a significant barrier to evolution. The
service recovery manager stated: ``I spend too much time preparing reports for
my manager to take to board meetings. It prevents me from reviewing and
updating measures so that they remain current. Most of the reports are never
referred to, they are just a security blanket in case he is ever asked to produce
the data.'' In the past key individuals had stood in the way of the use of some
measures. ``This resistance was due to reluctance to provide a better
understanding of actual performance for which they were responsible. Removal
of the individuals has been the most successful way of eliminating the
problem'' (Service Recovery Manager).
The availability of people with the appropriate skills to analyse and redefine
measures was also identified as an issue. This was particularly the case when
individuals responsible for measurement left departments or the company
altogether. It was recognised that measurement practices could be developed
further by planning skills development and ensuring that the appropriate skills
were maintained in the areas where they were required.
Company 5
Extensive performance measurement implementation had been undertaken in
company 5. However, as in company 2, although senior management had
initiated the implementation of new measures, they failed to use the resultant
performance measurement data, in favour of traditional financial performance
measures. ``The previous CEO paid lip service to the scorecard but only really
focussed on the financials, hence this is where all attention was focused'' (Head
of Strategic Planning). As a result the new measures were not considered to be
important at other levels of the organisation and they were not effectively used.
Measurement reverted to financial measurement and the process of evolution
was stifled. This clearly demonstrated the need for top level support for
measurement and the need for a change in mindset of management so that
measures are used to manage the business.
Company 6
Company 6 provided the best example of managing the evolution of
measurement systems. The primary factor facilitating evolution was the
availability of resources dedicated to measurement and the management of
performance measures. ``The availability of a dedicated employee who is
responsible for the review of measures enables gaps to be identified and the
need to change existing measures as well as identifying performance
measures'' (Sales Order Manager).
The dedicated systems analyst ensured that measures were reviewed and
that action was taken to improve performance and ensure that measures were
changed to remain relevant. In addition, ``having split responsibility and
budget from operations and the IT department enables me to develop systems
that would not be justified under either department individually''. This ensured
that systems were flexible enough to change as required. The availability of a
manager dedicated to measurement, who had credibility within all areas of the
business, stimulated measurement activity and helped overcome barriers to the
acceptance and evolution of measurement, such as inflexible payroll structures
and high staff turnover.
Company 6 highlighted the need to create the appropriate environment in
which the use of performance measures is most effective. Weekly meetings to
review performance were open and honest discussions of performance,
including new issues requiring measurement and identifying new areas of
performance on which to focus improvement attention. ``It is important to
recruit and retain employees who are open to new ideas and are willing and
able to implement new performance measures.'' ``Use of neutral measures, that
Company 7
The lack of a formal review process was considered to be the main reason that
the evolution of performance measures was not managed in company 7 (``There
is no process to review measures and identify whether or not they are
appropriate. That is a major factor affecting whether measures change in line
with organisational circumstances''). Within company 7 the leadership of the
managing director was clearly the main driver of measurement activity. ``The
ability and energy of the managing director drive measures and measurement.
He prompts other board members to review measures and ensure that they are
relevant and appropriate to the business and reflect what is important.''
The availability of management time to reflect on measures was also
considered to be a major constraint. The group technical and quality director
identified that: ``In previous years we have had too many measures. We need to
focus on fewer important objectives''. He also noted that the frequency with
which measures are reviewed is dependent on the availability of management
time. ``Similarly, the availability of management skills is also a key determinant
of the ability to review and modify measures. This will affect when
inappropriate measures are identified and the ability to change measures to
make them appropriate''. He identified the need for systems that could
accommodate a hierarchy of measures, reporting the few important measures,
but enabling analysis of the many underlying measures of the drivers of
performance.
Table II summarises the key factors that facilitate and inhibit the evolution
of performance measurement systems in each of the case study companies.
Evidence from the case study companies demonstrates the need for
companies to change their performance measures as the organisation's
circumstances change. The group technical and quality director in company 7
pointed out: ``If people don't think measures are relevant they won't use them,
so they won't evolve''. This clearly demonstrates that in order for an
organisation to have performance measures that evolve over time, they must be used.
Table II. Facilitators of and barriers to the evolution of measurement systems in the case study companies

Company 1
Facilitators: senior management driving measurement activities; development of in-house IT systems; use of accepted communication media to communicate, generate feedback and involve all employees; integration of measurement with strategy development and review; consistent approach to measurement.
Barriers: off-the-shelf systems insufficiently flexible; availability of skills to effectively collect and analyse data.

Company 2
Facilitators: new Web-based system developed; in-house systems provide required flexibility; measurement included in business process review; alignment of rewards to measures; need for measures to evolve considered important; common understanding of objectives and the need to improve.
Barriers: senior management inertia; measures not used to manage the business; time consuming and costly data collection.

Company 3
Facilitators: enthusiastic champion of measurement; contact with external research bodies to keep up to date with developments in measurement practices; make measurement a business issue (manage with measures).
Barriers: management inertia; inflexible IT/finance systems; incompatibility of measures/inconsistent approach; culture of ad hoc measurement, no integrated approach or PM function.

Company 4
Facilitators: enthusiastic champion to kick off the ``measurement revolution''; the need for succession planning identified.
Barriers: individual inertia/resistance to measurement; time wasted producing reports; ability to quantify performance; measures lacking credibility.

Company 5
Facilitators: top level management support is critical; user involvement in designing measures; alignment of rewards.
Barriers: measurement not used to manage the business (need for a new mind set); accounting systems focus; inconsistent approach to measurement (due to changes in ownership and management); lack of flexible systems to collect and analyse data.
(continued)
The barriers that the case study organisations encountered, and approaches that different
organisations used to overcome them, provide significant insight into the way
that the evolution of measurement systems can be managed.
Thus the data suggest that a culture that is inappropriate to the use (and
change) of measures is the fourth key barrier identified.
This further analysis of the data identifies four key themes commonly
observed within the case study organisations. These are demonstrated through
the examples discussed. Table III presents the findings from the case studies
structured around four themes that emerge from the data collected. This
demonstrates that these themes comprehensively cover all of the case study
data as presented in Table II.
Table III. Recategorised summary of case study findings

Process
Facilitators: integration of measurement with strategy development and review (1); integration of measurement with business process review (2); PM ``function'' the focal point of measurement activity (6); forum to discuss appropriateness of measures (6); implementation of common definitions/metrics (3, 7); consistent approach to measurement across all areas of the business (1); away day to review measures (6); involvement of external bodies (3); user involvement in measurement (5).
Barriers: lack of proactive review process (7); inconsistent approach to measurement: over time (5), between locations/business units (3, 6, 7), no integrated measurement function (3); insufficient time to review measures: lack of management time (4, 7), too much data reported (4, 7); the need to trend measures limits ability to change (7); lack of data analysis (5, 6).

Culture
Facilitators: the need for evolution considered to be important (2, 6, 7); communication: use of accepted medium (1), feedback on all actions (1), engage all employees (1); measurement integrity encouraged: open and honest discussion of performance (6), no blame culture (6), discouragement of ``gaming behaviour'' (6); ongoing senior management support/champion for measurement (all companies): continued focus on measurement (1, 6), identify and remove barriers to use/change of measures (1, 6); establish common understanding of objectives (2); integration/alignment of reward systems (2); measurement not owned by finance (6); alignment of measures and rewards (2, 5, 6).
Barriers: senior management inertia (2, 3); individual inertia/resistance to measurement (4); ad hoc approach to measurement (3); lack of alignment of actions with measures (7); inappropriate use of measures/measures not used to manage the business (2, 5); rigid remuneration and union systems (6).
examples of internal triggers which prompted review of the relevance of
current measures given changes in circumstances. Other such triggers were
also identified that prompted the realisation that measures were
inappropriately designed for their purpose, that use of measures prompted
inappropriate behaviour or that circumstances, such as competitive
requirements, changed. Once the trigger has been received then the first stage
in the evolution of the measurement system is to reflect on the performance
measurement system and identify whether it remains appropriate given
changing organisational circumstances. This stage of the evolutionary process
is known as reflect, and the research identified several barriers that prevent it
from occurring in organisations, most crucially those associated with process,
people, infrastructure and culture:
Figure 2. Framework of factors affecting the evolution of performance measurement systems
enabling infrastructure). Each of these elements must be considered
during the evolution of the performance measurement system.
(3) There are four stages of evolution: use, reflect, modify and deploy.
These form a continuous cycle.
(4) Barriers exist that will prevent the evolutionary cycle from operating.
These barriers can be overcome if the evolutionary cycle is underpinned
by enabling factors broadly categorised under the headings: process,
people, infrastructure and culture. Specifically, a well designed
measurement system will be accompanied by an explicitly designed
evolutionary cycle with clear triggers and:
. process: the existence of a process for reviewing, modifying and
deploying measures;
. people: the availability of the required skills to use, reflect on,
modify and deploy measures;
. infrastructure: the availability of flexible systems that enable the
collection, analysis and reporting of appropriate data;
. culture: the existence of a measurement culture within the
organisation, ensuring that the value of measurement, and the
importance of maintaining relevant and appropriate measures, are
appreciated.
Discussion
The literature and case study data presented clearly show, first, the importance
of managing measurement systems so that they change over time and, second,
the complex range of interrelated factors that affect the evolution of
performance measurement systems. The literature highlights many of the
issues affecting the management of change within organisations. This paper
discusses many of these issues in the context of case study data relating to
performance measurement system evolution.
A considerable amount has been written about the design and
implementation of measurement systems and a number of writers have identified
the need to reflect on measures to ensure that they remain relevant as the
organisation changes. The research findings echo the themes identified in the
literature concerning the external and internal drivers of change affecting
organisations and the need for organisations to have effective processes in place
to identify these changes and when they necessitate changes to measurement
systems. However, there is little discussion in the literature of what to do once
that reflection has taken place. The data collected clearly show that the process
of managing the evolution of measurement systems consists of a number of
stages that have to date received little attention. In addition to reflection,
consideration should be given to how measures are to be modified and how
modified measures are to be deployed without embarking on a wholesale
performance measurement system redesign project.
It is also clear that for measurement systems to evolve effectively there are
key capabilities that an organisation must have in place (i.e. effective processes;
appropriate skills and human resources; appropriate culture; and flexible
systems). The research demonstrates how lessons from different strands of
literature such as the need for the appropriate resources (Greiner, 1996) and
capabilities (Gabris, 1986); the appropriate culture (Tichy, 1983); willingness to
change (Kotter, 1996); and relevant processes (Bourne et al., 2000; Bititci et al.,
2000) can be drawn together into a structured framework.
The data indicate that organisations should consider these capabilities at
each stage of the evolutionary cycle, as they are fundamental to effective
evolution. However, little consideration is given to these capabilities in the
literature concerning the design and implementation of measurement systems.
Conclusions
Although the issue of development of effective performance measures has
received considerable attention from both academic and practitioner
communities, neither has satisfactorily addressed the issue of how performance
measures should evolve over time in order to remain relevant.
The research reported in this paper provides an understanding of how
measurement systems can be managed so that a dynamic and relevant set of
performance measures can be maintained, reflecting an organisation's
changing requirements. It provides an understanding of the factors, both
internal and external to the organisation, that facilitate and inhibit the
introduction of new measures, the modification of existing measures and
deletion of obsolete measures. These factors are presented in a framework that
illustrates the process, people, infrastructure and culture capabilities that an
organisation must demonstrate in order to manage the evolution of measures.
The paper discusses many issues of relevance to the growing literature in the
field of performance measurement while providing organisations with a
practical tool to help them establish an effective performance measurement
system. Ensuring that evolution of measurement systems is effectively
managed over time is important if another measurement crisis and revolution
is to be avoided.
References
Bititci, U.S., Turner, T. and Begemann, C. (2000), ``Dynamics of performance measurement
systems'', International Journal of Operations & Production Management, Vol. 20 No. 6,
pp. 692-704.
Bourne, M., Neely, A., Mills, J. and Platts, K. (1999), ``Performance measurement system
implementation: an investigation of failures'', Proceedings of the 6th International
Conference of The European Operations Management Association, Venice, 7-8 June,
pp. 749-56.
Bourne, M., Mills, J., Wilcox, M., Neely, A. and Platts, K. (2000), ``Designing, implementing and
updating performance measurement systems'', International Journal of Operations &
Production Management, Vol. 20 No. 7, pp. 754-71.
Bruns, W. (1998), ``Profit as a performance measure: powerful concept, insufficient measure'',
Performance Measurement Theory and Practice: The First International Conference on
Performance Measurement, Cambridge, 14-17 July.
Neely, A.D. (1999), ``The performance measurement revolution: why now and what next?'',
International Journal of Operations and Production Management, Vol. 19 No. 2, pp. 205-28.
Neely, A.D., Kennerley, M.P. and Adams, C.A. (2000), The New Measurement Crisis: The
Performance Prism as a Solution, Cranfield School of Management, Cranfield.
Neely, A.D., Mills, J.F., Gregory, M.J., Richards, A.H., Platts, K.W. and Bourne, M.C.S. (1996),
Getting the Measure of Your Business, Findlay Publications, Horton Kirby.
Pettigrew, A. and Whipp, R. (1991), Managing Change for Competitive Success, Blackwell,
Oxford.
Scott, W.R. (1995), Institutions and Organizations: Theory and Research, Sage Publications,
London.
Senge, P.M. (1992), The Fifth Discipline: The Art and Practice of the Learning Organization,
Century Business Press, London.
Tichy, N.M. (1983), Managing Strategic Change: Technical, Political, and Cultural Dynamics, John
Wiley & Sons, New York, NY.
Townley, B. and Cooper, D. (1998), ``Performance measures: rationalization and resistance'',
Proceedings of Performance Measurement Theory and Practice: the First International
Conference on Performance Measurement, Cambridge, 14-17 July, pp. 238-46.
Waggoner, D.B., Neely, A.D. and Kennerley, M.P. (1999), ``The forces that shape organisational
performance measurement systems: an interdisciplinary review'', International Journal of
Production Economics, Vol. 60-61, pp. 53-60.
Wisner, J.D. and Fawcett, S.E. (1991), ``Linking firm strategy to operating decisions through
performance measurement'', Production and Inventory Management Journal, Third
Quarter, pp. 5-11.