A Model of Organizational Knowledge Management Maturity Based On People, Process, and Technology
Abstract
Organizations are increasingly investing in knowledge management (KM) initiatives to
promote the sharing, application, and creation of knowledge for competitive advantage. To
guide and assess the progress of KM initiatives in organizations, various process models have
been proposed but a consistent approach that has been empirically tested is lacking. Based on
the life cycle theory, this paper reviews, compares, and integrates existing models to propose
a General KM Maturity Model (G-KMMM). G-KMMM encompasses the initial, aware,
defined, managed, and optimizing stages, which are differentiated in terms of their
characteristics related to the people, process, and technology aspects of KM. To facilitate
empirical validation and application, an accompanying assessment tool is also developed. As
an initial validation of the proposed G-KMMM, a case study of a multi-unit information
system organization of a large public university was conducted. Findings indicate that G-KMMM can be a useful diagnostic tool for assessing and guiding KM implementation in
organizations.
1. Introduction
Organizations are realizing that knowledge management (KM) is essential for them to remain
agile in a dynamic business environment and are increasingly investing in various KM
initiatives. It is estimated that companies in the United States spent close to $85 billion on
KM in 2008, an increase of nearly 16 percent from 2007 (AMR Research, 2007). Federal
government investment on KM is also expected to increase by 35 percent from 2005 to reach
$1.3 billion by 2010 (INPUT, 2005). Recognizing that KM is a complex undertaking
involving people, process, and technology, there is an increasing need for a coherent and
comprehensible set of principles and practices to guide KM implementations (Pillai et al.,
2008; Wong and Aspinwall, 2004). To better understand the ongoing development of KM in
organizations, this study adopts the perspective of life cycle theory to describe the process
through which KM is explicitly defined, managed, controlled, and effected in knowledge-intensive organizations.
Life cycle theory adopts the metaphor of organic growth to explain the development of an
organizational entity. It suggests that change is immanent and an entity moves from a given
point of departure toward a subsequent stage that is prefigured in the present state (Van de Ven
and Poole, 1995). Life cycle theories of organizational entities have depicted development in
terms of institutional rules or programs based on logical or natural sequences. For example, in
information system (IS) research, one of the best known models by Nolan (1979) describes six
stages of growth of electronic data processing (EDP), encompassing initiation, contagion,
control, integration, data administration, and maturity. These stages are ordered by both logic
and natural order of business practices. By organizing and representing data processing and
management practices in a coherent structure, the model has contributed significantly to our
understanding of data management and has become a recognized management concept in IS
research.
The wide acceptance and application of Nolan's model demonstrate that life cycle theory is a
valuable approach for describing the development of IS. As information technology
transforms from providing basic data processing support to playing a more central role in
organizations, other life cycle models have been developed to depict the evolution of more
advanced systems such as end-user computing (Henderson and Treacy, 1986; Huff et al.,
1988) and enterprise resource planning systems (Holland and Light, 2001).
In the realm of KM, various life cycle models have also been proposed. They are commonly
known as KM maturity models (KMMM) (e.g., Gottschalk and Khandelwal, 2004; Lee and
Kim, 2001). These models generally depict a unitary (single sequence of stages), cumulative
(characteristics acquired in earlier stages are retained in later stages), and conjunctive (stages
are related in that they are derived from a common underlying structure) sequence, in line
with the characteristics of life cycle theory (Van de Ven and Poole, 1995). However, different models
adopt diverse concepts and are based on different assumptions. This makes their selection,
comparison, and application difficult for both researchers and practitioners. To develop a more
consistent and widely-accepted view of KM development, it is imperative to sift through the
various conceptualizations to identify the most central issues in KM development. To this end,
we review, compare, and integrate existing KMMMs to identify the core elements of a KM
development life cycle. A General Knowledge Management Maturity Model (G-KMMM) is
then proposed to describe the process and highlight the key aspects of KM development.
Existing KMMMs have been criticized as ad-hoc in their development (Kulkarni and St.
Louis, 2003) because their assessment tools are either proprietary or unspecified, rendering
their empirical assessment difficult. As a result, most KMMMs have not been validated
(Kulkarni and St. Louis, 2003) and there are reservations regarding their practical
applicability and the extent to which they reflect the actual state of affairs. This paper
addresses the gap by proposing an assessment tool accompanying the proposed G-KMMM. As
an initial validation of the proposed model and assessment tool, we also conducted an
exploratory case study of the KM initiative of an IS organization in a large public university.
Through this endeavor, we hope to contribute to research and practice in several ways. For
research, this study provides a systematic review and comparison of existing KMMMs, which
can potentially add to the cumulative knowledge of life cycle theory in general and KM
development in particular. The proposed G-KMMM also avoids oversimplifying the
phenomenon of KM development in organizations by adopting a multidimensional approach
encompassing people, process, and technology aspects. By synthesizing findings from
previous research and clearly defining important concepts, the proposed G-KMMM can
facilitate communication and improve understanding among researchers and practitioners.
For organizations engaging in KM initiatives, G-KMMM can be used to track the ongoing
development of KM initiatives or benchmark and compare the progress of different units.
Unlike prior work, this paper clearly defines the components of KMMM and develops an
accompanying assessment instrument, which allows the model to be independently assessed
and applied by researchers and practitioners. By highlighting the important issues in KM
development, G-KMMM can also assist managers in their planning of KM initiatives.
2. Conceptual Background
This section first defines the concepts of KM and maturity modeling. Existing KMMMs are
then reviewed and compared.
2.1 Knowledge and Knowledge Management
In the context of organizations, knowledge is defined as a justified belief that increases an
entity's capacity for effective action (Huber, 1991). This definition is deemed to be more
appropriate than a philosophical definition of knowledge because it provides a clear and
pragmatic description of knowledge underlying organizational knowledge management
(Alavi and Leidner, 2001), which is the entity of interest in this study. In a similar vein,
knowledge management refers to the process of identifying and leveraging collective
knowledge in an organization to help the organization compete (Alavi and Leidner, 2001).
Knowledge is often conceptualized as the most valuable form of content in a continuum
beginning with data, encompassing information, and ending at knowledge (Grover and
Davenport, 2001). Although information and knowledge are related, it is important to
distinguish KM, both as an area of scholarly enquiry and as a business practice, from the
concept of information management (IM). While KM presupposes IM (Klaus and Gable,
2000) and the success of KM depends on effective IM (Bukowitz and Williams, 2000), they
are different in terms of input, processing of data and information, and scope. With respect to
input, KM requires ongoing user contribution, feedback, and human input whereas IM
typically involves one-way information transfer and assumes that information capture can be
standardized and automated. In the processing of data and information, KM supports
performance improvement and innovation by adding value to data through filtering,
synthesis, and exploration, while IM supports existing operations by formatting and
presenting existing data (Bukowitz and Williams, 2000). In terms of scope, IM is usually
concerned with storing and disseminating electronic and paper-based information, while KM
deals with a far broader range of approaches to communicating, applying, and creating
knowledge and wisdom (Bukowitz and Williams, 2000).
2.2 Knowledge Management Models
Existing KM models are developed based on different theories and methods and they vary
greatly in terms of focus and scope. In general, they can be categorized as process-oriented,
social/technological enabler, contingency, and knowledge-oriented models (Alavi and
Leidner, 2001; Handzic et al., 2008). Process-oriented models examine the processes of
knowledge capturing, sharing, application, and creation to understand the mechanisms involved.
2.3 Maturity Models
In IS research, one of the best known maturity models is the stages of growth of EDP (Nolan, 1979). The model identifies
various organizational issues in IS implementation and development and highlights the
priorities requiring managerial attention at different stages of growth. It has stimulated much
interest among IS scholars (e.g., Benbasat et al., 1984; Henderson and Treacy, 1986;
Kazanjian and Drazin, 1989) and is considered a significant conceptual contribution that
promotes a more structured approach to studying IS in organizations (King and Kraemer,
1984).
In this study, we focus on modeling the maturity of KM systems and initiatives. We define
KM maturity as the extent to which KM is explicitly defined, managed, controlled, and
effected. It describes the stages of growth of KM initiatives in an organization. KMMMs will
be discussed in greater detail in the following sections.
2.4 Characteristics of an Ideal KMMM
To ensure that KMMMs adequately portray the development of KM in organizations, past
studies have identified several requirements that an ideal KMMM should fulfill (Ehms and
Langen, 2002; Paulzen and Perc, 2002). It has been suggested that KMMM should be
applicable to different objects of analysis such as the organization as a whole, traditional and
virtual organizational units, and KM systems (Ehms and Langen, 2002). This can be achieved
by focusing on processes rather than specific objects of analysis (Paulzen and Perc, 2002).
It has also been recommended that KMMMs should provide a systematic and structured
procedure to ensure the transparency and reliability of assessment (Ehms and Langen, 2002).
The maturity model should also provide both qualitative and quantitative results (Ehms and
Langen, 2002). Paulzen and Perc (2002) emphasized the importance of measurement and
echoed the suggestion that the characteristics of each maturity level should be empirically
testable (Magal, 1989). In IS research, the lack of a clearly specified assessment procedure
for Nolan's model has been identified as one of the reasons why its validation remains
inconclusive (Benbasat et al., 1984; Kazanjian and Drazin, 1989). Clearly articulating the
assessment procedure can help to avoid such a problem by allowing independent application
and validation.
In addition, it has been suggested that the underlying structure of KMMM should be
comprehensible and allow cross references to proven management concepts or models (Ehms
and Langen, 2002) to support continuous learning and improvement (Paulzen and Perc, 2002).
This can be achieved by reviewing existing literature to identify salient KM issues and
incorporating the findings into the development of the KMMM.
Other than identifying the criteria for an ideal KMMM, it is also important to consider the
criticisms of IS maturity models in general, since an ideal KMMM should also avoid these
weaknesses. Specifically, Nolans model has been criticized as being overly simplistic for
focusing on technology and overlooking development in other organizational aspects (Lucas
and Sutton, 1977). Therefore, it is important for the proposed KMMM to look beyond
technology. Indeed, it has been suggested that KM models should adopt a multifaceted and
socio-technical view of organizations by considering not just the technology but also its
people and processes (Alavi and Leidner, 2001; Apostolou et al., 2008-2009).
In reality, it can be challenging for a KMMM to satisfy all these requirements, as some may
need to be traded off against others. For example, Ehms and Langen (2002) suggested that a
KMMM should ideally be applicable to different objects of analysis. This may require a
higher level of flexibility in the model's formulation, which may in turn result in a less
systematic assessment approach. Hence, it is important to strike a balance among these
requirements.
To identify important issues in the KM development lifecycle, we review existing KMMMs
that have been proposed and refined by KM researchers and practitioners. For ease of
comparison, they are categorized into two groups, depending on whether or not they are
developed based on the Capability Maturity Model (CMM).
2.5 KMMMs based on the Capability Maturity Model (CMM)
CMM was proposed to describe and determine the software engineering and management
process maturity of an organization. Its main purpose is to guide software organizations in
progressing along an evolutionary path from ad-hoc and chaotic software process to mature
and disciplined software process (Herbsleb et al., 1997). The model has gained considerable
acceptance worldwide and has been regarded by many as the industry standard for defining
software process quality. Like many other concepts that originated from practice, empirical
assessment of CMM by researchers lagged its adoption in organizations. Nevertheless, its
widespread adoption has allowed realistic evaluations to be conducted, and many peer-reviewed
studies of CMM have provided empirical evidence of its validity in describing and
guiding the development of software organizations (e.g., Lawlis et al., 1995; McGarry et al.,
1998).
CMM defines five levels of maturity: initial, repeatable, defined, managed, and optimizing.
Each maturity level is described by a set of characteristics. For example, the initial level is
characterized as ad-hoc and chaotic, where few processes are defined and success is due to
individual effort. Except for level 1, several key process areas (KPA) are identified at each
maturity level to indicate the areas that an organization should focus on. Each KPA is further
described in terms of actionable practices.
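To make this three-tier structure concrete, it can be sketched as a simple data model: levels contain KPAs, and each KPA is described by actionable practices. This is an illustrative sketch only; the example KPA names (requirements management, peer reviews, quantitative process management, process change management) are drawn from the software CMM, while the class names are our own.

```python
from dataclasses import dataclass, field

@dataclass
class KeyProcessArea:
    name: str
    practices: list[str] = field(default_factory=list)  # actionable practices

@dataclass
class MaturityLevel:
    number: int
    name: str
    kpas: list[KeyProcessArea] = field(default_factory=list)  # level 1 has none

# The five CMM maturity levels, each with one example KPA (levels 2-5).
cmm_levels = [
    MaturityLevel(1, "initial"),
    MaturityLevel(2, "repeatable", [KeyProcessArea("requirements management")]),
    MaturityLevel(3, "defined", [KeyProcessArea("peer reviews")]),
    MaturityLevel(4, "managed", [KeyProcessArea("quantitative process management")]),
    MaturityLevel(5, "optimizing", [KeyProcessArea("process change management")]),
]

# Level 1 (initial) identifies no KPAs, consistent with the description above.
assert cmm_levels[0].kpas == []
```

The same containment structure (level → KPA → practices) carries over to the CMM-based KMMMs discussed below, with different KPA names.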
Although CMM was originally proposed to describe software processes, it has been adapted
to develop several KMMMs, based on the premise that software process management can be
considered as a specific instance of KM and the concepts proposed in CMM may therefore
also be appropriate for describing KM (Armour, 2000; Paulzen and Perc, 2002). However, several
differences between software process management and KM are worth noting. KM covers a
wider range of issues and is less structured compared to software process management. Its
activities are also less standardized and outcomes are less quantifiable. Hence, KM maturity
must be judged from multiple perspectives, including technologies, processes, and employees,
in order to achieve a holistic assessment of KM development. Consequently, KMMMs have
KPAs that are somewhat different from CMM (Kulkarni and Freeze, 2004).
In our review, four KMMMs based on CMM were identified: Siemens' KMMM, Paulzen and
Perc's Knowledge Process Quality Model (KPQM), Infosys' KMMM, and Kulkarni and
Freeze's Knowledge Management Capability Assessment Model (KMCA). All four models
are developed based on CMM and thus have similar structures. The naming of maturity levels
in the four KMMMs is compared in Table 1.
Table 1. Naming of Maturity Levels of CMM-Based KMMMs
Level | CMM        | Siemens KMMM | KPQM                   | Infosys KMMM | KMCA
0     | -          | -            | -                      | -            | Difficult / Not Possible
1     | Initial    | Initial      | Initial                | Default      | Possible
2     | Repeatable | Repeatable   | Aware                  | Reactive     | Encouraged
3     | Defined    | Defined      | Established            | Aware        | Enabled / Practiced
4     | Managed    | Managed      | Quantitatively Managed | Convinced    | Managed
5     | Optimizing | Optimizing   | Optimizing             | Sharing      | Continuously Improving
Each maturity level of these models is further described by a set of characteristics (see Table
2). However, it was observed that different KMMMs specified different characteristics.
Through careful analysis and consolidation of the characteristics in Table 2, a set of
characteristics repeatedly highlighted by different models was identified to represent the
important aspects of each KM maturity level (see Table 3).
[Tables 2 and 3 are not fully recoverable in this copy. Table 2 listed the characteristics that each CMM-based KMMM specifies for its maturity levels. Table 3 mapped the consolidated set of common characteristics to the corresponding level in each model; recoverable entries include basic KM infrastructure in place (around level 3), training for KM (around level 3), a common organizational KM strategy (level 4), and continual improvement of KM practices and tools (level 5), with cells marked "Unspecified. Probably level 3/4/5" where a model did not explicitly address a characteristic.]
Each KMMM also identified KPAs to indicate the areas that an organization should focus on
in its KM development (see Table 4). Different KMMMs have specified different KPAs, with
people, organization, process, and technology being the most common across models.
Table 4. KPAs of CMM-Based KMMMs

KMMM         | Key Process Areas                                                       | Remarks
Infosys KMMM | People; Process; Technology                                             | -
Siemens KMMM | Knowledge structures and knowledge forms; Technology and infrastructure | -
KPQM         | People; Organization; Technology                                        | -
KMCA         | Lessons-learned; Expertise; Data; Structured knowledge                  | Perceptual (behavioral) and factual (infrastructure-related) characteristics are identified for each of the 4 KPAs
[Tables 5 and 6, comparing the non-CMM-based KMMMs, are garbled in this copy. Table 5 compared the naming of their maturity levels; recoverable level names include knowledge chaotic, knowledge aware, knowledge focused, and knowledge centric; initial, knowledge discoverer, knowledge creator, knowledge manager, and knowledge renewer (Klimko's KMMM); initial, intent, initiative, and innovative (5iKM3); and up to eight numbered levels (K3M). Table 6 described the characteristics of each level; recoverable fragments mention widespread dissemination and use of end-user tools among knowledge workers; non-uniform awareness and implementation of KM across the organization, with pilot projects in some areas; organizational use of KM procedures and tools, with recognized business benefits; knowledge-enabled business processes with observed benefits and business impacts; an integrated framework of KM procedures and tools, with some technical and cultural issues still to be overcome; matured collaboration and sharing throughout business processes, resulting in collective and collaborative organizational intelligence; and IT providing knowledge workers with access to information stored in documents, with data mining techniques applied to store and combine information in data warehouses.]
We observed that there are fewer similarities across the non-CMM-based KMMMs
than across the CMM-based KMMMs. However, many of the common characteristics of
CMM-based KMMMs in Table 3 are also observed in the non-CMM-based KMMMs. For
example, all non-CMM-based KMMMs that have defined level 1 characterized it by
organizations' lack of awareness of the need to manage knowledge formally, and level 2 by
the presence of such awareness. Also, the need to have basic KM infrastructure at level 3 is
strongly implied in all non-CMM-based KMMMs.
In addition, we observed that all non-CMM-based KMMMs (except Klimko's KMMM, which
does not identify any KPA, and the Stages of Growth for KM Technology model, which
focuses on technological aspects) identify KPAs that are largely similar to those of the
CMM-based models, which include people, process, and technology (see Table 7). Based on
these comparisons, a general KMMM was proposed, as discussed next.
Table 7. KPAs of Non-CMM-Based KMMMs

KMMM                  | Key Process Areas
V-KMMM                | Culture; Infrastructure; Technology
The Knowledge Journey | People; Process and content; Technology
5iKM3                 | People; Process; Technology
K3M                   | Process and technology
Table 8. Overview of the Proposed G-KMMM
[The people column and the level 3 (defined) row are not recoverable in this copy.]

Level 1 (Initial)
- General: Little or no intention to formally manage organizational knowledge
- Process: No formal processes to capture, share, and reuse organizational knowledge
- Technology: No specific KM technology or infrastructure in place

Level 2 (Aware)
- General: Organization is aware of and has the intention to manage its organizational knowledge, but it might not know how to do so
- Process: Knowledge indispensable for performing routine tasks is documented
- Technology: Pilot KM projects are initiated (not necessarily by management)

Level 3 (Defined)
- [Not recoverable in this copy]

Level 4 (Managed)
- Process: Quantitative measurement of KM processes (i.e., use of metrics)
- Technology: Enterprise-wide KM systems are fully in place; usage of KM systems is at a reasonable level; seamless integration of technology with content architecture

Level 5 (Optimizing)
- Process: KM processes are constantly reviewed and improved upon; existing KM processes can be easily adapted to meet new business requirements; KM procedures are an integral part of the organization
- Technology: Existing KM infrastructure is continually improved upon
G-KMMM proposes that organizations should progress from one maturity level to the next
without skipping any level. In practice, organizations may beneficially employ key practices
characterizing a maturity level higher than their current one. However, being able to implement
practices from higher maturity levels does not imply that levels can be skipped, since such
practices are unlikely to attain their full potential until a proper foundation is laid.
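The no-skipping rule can be stated as a simple check on a sequence of successive maturity assessments: each step may raise the level by at most one. This is a minimal sketch of our own, not part of the G-KMMM specification.

```python
def valid_progression(levels: list[int]) -> bool:
    """True iff no assessment jumps up by more than one maturity level
    relative to the previous assessment (i.e., no level is skipped)."""
    return all(later - earlier <= 1 for earlier, later in zip(levels, levels[1:]))

print(valid_progression([1, 2, 2, 3]))  # → True
print(valid_progression([1, 3]))        # → False: level 2 was skipped
```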
(proprietary assessment tools were not accessible) and appropriate. Sources included the
Knowledge Journey's KM Framework Assessment Exercise, KPQM, KMCA, and the KM
Assessment Tool (KMAT) (American Productivity and Quality Center and Arthur Andersen,
1996; de Jager, 1999). KMAT is a diagnostic survey that helps an organization determine the
effectiveness of its KM practices. When suitable items could not be found in the existing
literature, new items (i.e., PEO3b, PEO3c, PEO3f, PRO3a, and TEC4b) were developed
based on the proposed model (see Table 8). Among the five newly developed items, three
measure the people KPA, indicating that existing assessment tools may have neglected this
aspect compared to the technology and process aspects.
Table 9. Proposed G-KMMM Assessment Instrument

KPA: People

Level 2
- PEO2a: Is organizational knowledge recognized as essential for the long-term success of the organization? [Knowledge Journey]
- PEO2b: Is KM recognized as a key organizational competence? [KMAT]
- PEO2c: Employees are ready and willing to give advice or help on request from anyone else within the company. [Knowledge Journey, KMCA]

Level 3
- PEO3a: Is there any incentive system in place to encourage knowledge sharing among employees (e.g., employees' KM contributions are taken into consideration; rewards for teamwork and knowledge sharing/reuse)? [Knowledge Journey]
- PEO3b: Are the incentive systems attractive enough to promote the use of KM in the organization? [Self-developed]
- PEO3c: Are the KM projects coordinated by the management? [Self-developed]
- PEO3d: Are there individual KM roles (e.g., CKO, knowledge officers/workers) that are defined and given an appropriate degree of authority? [Knowledge Journey; developed based on Siemens KMMM Level 3 and Infosys KMMM Level 3]
- PEO3e: Is there a formal KM strategy in place? [Developed based on Siemens KMMM Level 4]
- PEO3f: Is there a clear vision for KM? [Self-developed]
- PEO3g: Are there any KM training programs or awareness campaigns (e.g., introductory/specific workshops for contributors, users, facilitators, champions)? [Developed based on Infosys KMMM Level 3]

Level 4
- PEO4a: Are there regular knowledge sharing sessions? [Developed based on Infosys KMMM Level 4]
- PEO4b: Is KM incorporated into the overall organizational strategy? [Knowledge Journey]
- PEO4c: Is there a budget specially set aside for KM? [Knowledge Journey]
- PEO4d: Is there any form of benchmarking, measure, or assessment of the state of KM in the organization (e.g., balanced scorecard approach [Knowledge Journey]; key performance indicators [Knowledge Journey]; knowledge ROI [developed based on Infosys KMMM Level 5])? [KMAT]

Level 5
- PEO5: Have the KM initiatives resulted in a knowledge sharing culture? [Developed based on Infosys KMMM Level 5]

[Only the people-KPA items of Table 9 are recoverable in this copy; the process and technology items were lost at a page break.]
4. Research Design
To initially assess the applicability of the proposed G-KMMM, we studied the KM maturity
of a multi-unit IS organization. The case study methodology allowed us to gather rich data and
gain a better understanding of the complex interactions among people, technologies, and units
(Dubé and Paré, 2003) in KM development. Since our purpose was to study the utility of the
G-KMMM in an actual context, we adopted a descriptive positivist approach, in which data
collection and interpretation were guided by a pre-specified model (Orlikowski and Baroudi,
1991).
4.1 Case Background
The IS organization in our case, Computer Hub, provides computing and IT infrastructure
support for a large public university, which consists of over 30,000 students and more than
4,000 teaching, research, and administrative staff. The organization was a suitable context for
our study because the nature of its work was knowledge-intensive and involved specialized
expertise that must be carefully managed. It had also been exploring various KM
applications since 2002. In addition, the Computer Hub was made up of multiple units, which
is typical of many large organizations. This provided a unique opportunity for us to examine
whether the G-KMMM is flexible enough to be applied in complex organizations of this form.
During the study, we focused on ten units of the Computer Hub: the academic unit (AU),
corporate unit (CU), call center (CC), and seven faculty units which included Architecture
(ARU), Arts and Social Sciences (ASU), Business (BSU), Computing (CMU), Dentistry
(DTU), Engineering (ENU), and Scholars Program (SPU). Each of these units served a large
number of users, ranging from 150 to 6,000 people. Other faculty units were excluded
because they were very small (i.e., fewer than five employees) and their use of KM
applications was minimal at the time of the study.
The main roles of the AU and CU included university-wide IT application development and
maintenance. The AU was in charge of systems serving the student population (e.g., course
registration system), while the CU was responsible for managing systems tailored to the
corporate segment (e.g., student administration system).
In contrast, the CC provided frontline call center and walk-in technical support for the
university community. It was also in charge of campus-wide programs such as staff PC
upgrade and student laptop ownership plan. It also managed a university-wide content
management system (CMS) and electronic document management system (EDMS).
The faculty units catered to the specific IT needs of their respective faculties. Such a
distributed structure was necessary because each faculty had different IT requirements. For
example, the DTU required sophisticated imaging technology, while the SPU focused more
on providing user support on the use of administrative systems. In addition to utilizing the
infrastructure and services provided by the Computer Hub, these faculty units also hosted
their own servers and developed their own applications to cater to faculty-specific
requirements.
4.2 Data Collection and Analysis
A total of twenty interviews (two per unit) were conducted with managers and employees of
the AU, CU, CC, and faculty units over three months. An interview guide was developed
based on the assessment instrument proposed in Table 9. Each interview lasted 30 to 90
minutes. Table 10 provides the descriptive statistics of the participating units. With the
permission of the interviewees, all interviews were recorded and transcribed for further
analysis. We also requested related documents and demonstrations of various KM
systems. To improve the validity of our data, we triangulated sources of information
by verifying interviewees' accounts against one another. Secondary data was also gathered
from relevant documents and websites. The results of our analysis also agree with those of a
subsequent independent study that assessed and compared the KM capabilities of various
units in the Computer Hub in terms of people (e.g., expertise), process (e.g., knowledge
sharing, knowledge creation), and technological (e.g., investment in KM technology) factors
using a different assessment instrument. This provides some indication that the proposed
assessment instrument possesses concurrent validity.
Table 10. Descriptive Statistics of Faculty Information System Units

Faculty Unit | Students | Staff(3) | IT Professionals | Technicians/Others | Website(1) (Faculty/Unit) | Intranet(2) (Faculty/Unit)
ARU          | 1500-2000 | 100-150 | 6  | 7  | Y/Y | Y/Y
ASU          | 6000      | 500     | 8  | 3  | Y/Y | Y/Y
BSU          | 1500      | 100     | 5  | 3  | Y/N | Y/Y
CMU          | 3000      | 150-200 | 13 | 30 | Y/Y | Y/Y
DTU          | 150       | 30-40   | 4  | 2  | Y/Y | Y/N
ENU          | 5000-6000 | 300     | 15 | 8  | Y/Y | Y/Y
SPU          | 800-1000  | 50-60   | 2  | 3  | Y/N | Y/N

1. Website refers to a collection of interconnected web pages that are publicly accessible.
2. Intranet refers to a private network that is only accessible by users in the associated faculty or unit.
3. The figures include both academic and administrative staff.
Note: Y indicates that the unit has a website/intranet. N indicates otherwise.
Based on information collected during interviews and from secondary sources such as
documents and websites, the KM maturity of each unit was assessed by evaluating whether or
not a particular practice (described by an item in the assessment instrument in Table 9) was
carried out. To qualify for a maturity level in a KPA, a unit must carry out all key practices of
that level and of the levels below it. For example, a unit that carried out the practices
described in items PEO2a to PEO4a but not item PEO4b can be said to have attained maturity
level 3 in the people KPA, since it has not implemented all practices characterizing level 4.
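The scoring rule above can be expressed as a short routine: a unit attains a level in a KPA only if it carries out every key practice up to and including that level. The item groupings follow the people KPA of Table 9; the function itself and the worked example are illustrative sketches, not part of the published instrument.

```python
def kpa_maturity(responses: dict[str, bool], items_by_level: dict[int, list[str]]) -> int:
    """Return the highest maturity level for which all key practices of that
    level and of every lower level are carried out; level 1 is the floor."""
    level = 1
    for lvl in sorted(items_by_level):
        if all(responses.get(item, False) for item in items_by_level[lvl]):
            level = lvl
        else:
            break  # a missing practice caps the attainable level
    return level

# People-KPA items grouped by maturity level (from Table 9).
people_items = {
    2: ["PEO2a", "PEO2b", "PEO2c"],
    3: ["PEO3a", "PEO3b", "PEO3c", "PEO3d", "PEO3e", "PEO3f", "PEO3g"],
    4: ["PEO4a", "PEO4b", "PEO4c", "PEO4d"],
    5: ["PEO5"],
}

# The example from the text: PEO2a-PEO4a carried out, PEO4b missing.
responses = {item: True for item in people_items[2] + people_items[3] + ["PEO4a"]}
print(kpa_maturity(responses, people_items))  # → 3
```

Applied to the responses in Table 11, the overall unit maturity is consistent with taking the minimum of the people, process, and technology KPA levels, though the paper does not state this rule explicitly.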
5. Results
The findings for the AU, CU, and CC, which managed university-wide applications and
catered to staff members and students in general, are first presented. This is followed by the
results for faculty IT units, which focused on serving the needs of specific faculties. A
cumulative assessment is then conducted for the Computer Hub as a whole based on these
analyses. The resultant maturity levels of each unit are summarized in Table 11.
Table 11. KM Maturity of the Computer Hub Units

Item                  | AU/CU/CC | ARU | ASU | BSU | CMU | DTU | ENU | SPU
People Maturity       | 2 | 2 | 2 | 1 | 2 | 1 | 2 | 1
PEO2a                 | Y | Y | Y | N | Y | N | Y | N
PEO2b                 | Y | Y | Y | N | Y | Y | Y | N
PEO2c                 | Y | Y | Y | Y | Y | Y | Y | Y
PEO3a                 | N | N | N | N | N | N | N | N
PEO3b                 | N | N | N | N | N | N | N | N
PEO3c                 | Y | Y | N | N | Y | N | N | N
PEO3d                 | N | N | N | N | N | N | N | N
PEO3e                 | N | N | N | N | N | N | N | N
PEO3f                 | Y | N | N | N | Y | N | N | N
PEO3g                 | Y | N | N | N | N | N | N | N
PEO4a                 | N | N | N | N | N | N | N | N
PEO4b                 | N | N | N | N | N | N | N | N
PEO4c                 | N | N | N | N | N | N | N | N
PEO4d                 | N | N | N | N | N | N | N | N
PEO5                  | N | N | N | N | N | N | N | N
Process Maturity      | 3 | 2 | 2 | 1 | 3 | 1 | 2 | 1
PRO2                  | Y | Y | Y | N | Y | N | Y | N
PRO3a                 | Y | N | N | N | Y | N | N | N
PRO3b                 | Y | N | N | N | Y | N | N | N
PRO4a                 | N | N | N | N | N | N | N | N
PRO4b                 | N | N | N | N | N | N | N | N
PRO5                  | N | N | N | N | N | N | N | N
Technology Maturity   | 2 | 3 | 3 | 1 | 3 | 1 | 1 | 1
TEC2a                 | Y | Y | Y | N | Y | N | Y | Y
TEC2b                 | Y | Y | Y | N | Y | N | N | N
TEC3                  | N | Y | Y | N | Y | N | N | N
TEC4a                 | N | N | N | N | Y | N | N | N
TEC4b                 | N | N | N | N | N | N | N | N
TEC5                  | N | N | N | N | N | N | N | N
Overall Unit Maturity | 2 | 2 | 2 | 1 | 2 | 1 | 1 | 1

Note: Y indicates that the unit demonstrated the characteristic described by the item. N indicates otherwise. The header of the first data column was lost at a page break; from the text, it corresponds to the AU, CU, and CC, which showed identical results.
5.1 Academic (AU), Corporate (CU), and Call Center (CC) Units
In relation to the people KPA, AU, CU, and CC recognized knowledge as a critical resource
that must be competently managed. Every few months, sharing sessions were held to facilitate
knowledge transfer among staff members. Staff members were aware of the benefits of
knowledge sharing and were generally willing to advise or help their colleagues. The
management also articulated a clear KM vision and conducted KM training workshops.
However, since there was a lack of incentive systems to encourage staff members to participate
in KM activities (PEO3a), it was concluded that the units were at maturity level 2 for the
people KPA (see Table 11).
Pertaining to the process KPA, the units had some processes for capturing, sharing, and
reusing routine documents and knowledge, and specific KM technologies were used to
support these processes formally. For example, project portals were set up with Microsoft
SharePoint to facilitate collaboration and knowledge sharing among application development
project team members. A Developers' Corner was also set up to encourage knowledge
sharing among system developers. Content on the Developers' Corner included system
procedures, guidelines, links to programming websites, and articles on SAP system interfaces.
However, the use of these KM systems was still lower than expected (PRO4a) at the time of
study as users lacked motivation to adopt them. This suggested that the units were at maturity
level 3 for the process KPA.
With respect to the technology KPA, in addition to the EDMS, CMS, Microsoft SharePoint,
and the Developers' Corner, an information system for tracking the inventory of developed
software was also implemented. This system served as a basis for encouraging component reuse across projects
in different IT units and had gained acceptance quickly. At the time of data collection, it had
over 80 registered applications. Overall, although AU, CU, and CC had implemented several
KM systems to support various KM activities, these systems contained mainly technical
information and offered little support to administrative staff members and managers (TEC3).
Considering this, the units were deemed to be at maturity level 2 for the technology KPA.
In summary, the AU, CU, and CC were at maturity level 2 for the people and technology
KPAs, since they were aware of the need to formally manage their knowledge resources and
had initiated several pilot KM projects. They were at level 3 for the process KPA since
processes for content and information management were formalized. At the unit level, it was
thus concluded that they were at KM maturity level 2 since they had not achieved maturity
level 3 for the people and technology KPAs.
forums, mailing lists, and source code management to further support knowledge sharing
during application development projects.
5.3 Computer Hub
To determine the KM maturity of the Computer Hub as a whole, the distribution of individual
units' maturity ratings was analyzed (see Table 11). For the Computer Hub to achieve a
certain maturity level, all its units must achieve positive ratings for all items characterizing
that level. In other words, the maturity level of the least mature IT unit determines the
maturity level of the Computer Hub.
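Under this rule, aggregation reduces to taking a minimum: a unit's overall maturity is the lowest of its KPA ratings, and the Computer Hub's rating for each KPA is the lowest rating among its units. A minimal illustrative sketch, using a subset of the ratings reported in Table 11:

```python
# Minimal sketch of the "least mature unit" aggregation rule
# (illustrative; ratings are a subset of those reported in Table 11).

unit_ratings = {
    "AU/CU/CC": {"people": 2, "process": 3, "technology": 2},
    "CMU":      {"people": 2, "process": 3, "technology": 3},
    "BSU":      {"people": 1, "process": 1, "technology": 1},
}

def unit_overall(kpa_ratings):
    # a unit's overall maturity is capped by its weakest KPA
    return min(kpa_ratings.values())

def hub_kpa_maturity(units, kpa):
    # the least mature unit determines the organization-wide level
    return min(ratings[kpa] for ratings in units.values())

print(unit_overall(unit_ratings["AU/CU/CC"]))     # 2
print(hub_kpa_maturity(unit_ratings, "process"))  # 1
```

This conservative (worst-case) aggregation mirrors the staged logic applied within each KPA: a single lagging unit holds back the organization-wide rating.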
With regard to the people KPA, it was observed that seven out of ten units had achieved
maturity level 2. The remaining units showed less recognition of the importance of formal KM
to their long-term success and were considered to be at level 1. Hence, the Computer Hub's
people KPA was at level 1. However, even in the less mature units, staff members were
generally willing to share their expertise with one another, so the potential for these units
to improve their people KPA was fairly high. It was also observed that although none of the
units offered formal incentives to encourage participation in KM activities, some informal
incentives were in place. For example, the ENU's manager noted that better appraisal was
likely for staff members who participated in KM activities.
In terms of the process KPA, two units were still at level 1 as documentation of knowledge
critical to the performance of routine tasks was not yet guided by any formal process. Hence,
the Computer Hub's overall process KPA was still at level 1. To improve its rating, the
Computer Hub could encourage the more advanced units to share their experience with the less
advanced units to help them establish suitable KM processes that address their specific needs.
Regarding the technology KPA, four out of ten units were at maturity level 1. Therefore, the
Computer Hub's overall technology KPA was also at level 1. These four units were lacking in
infrastructure that could adequately support KM activities. Although systems such as the
CMS and EDMS were provided by the Computer Hub, these units had not exploited the
systems fully to address their KM needs. One possible reason for this was that staff members
of these units were collocated and it was therefore more convenient and natural to share
knowledge face-to-face. Another reason was that the lack of human resources prevented them
from exploring and experimenting with the potential of KM systems.
Overall, it was observed that the KM maturity of the Computer Hub was still at level 1.
However, noting that many units were already at level 2 for the three KPAs and some had
even reached level 3, it appeared that the organization was closing in on level 2.
6. Discussion and Conclusion
The G-KMMM identifies salient aspects of KM development that allow organizations to
grasp the essential elements of the phenomenon. Its applicability in assessing KM
development and indicating possible future improvements was demonstrated in an
exploratory case study. In particular, a unit found to be at one maturity level seldom
implemented practices characterizing higher maturity levels. Apart from a few exceptional
cases (e.g., ARU's people KPA was at maturity level 2 but the unit implemented practices
characterizing level 3), such occurrences were not observed in other units. This suggests that
the proposed model possesses some degree of convergent validity and seems to realistically
capture the development of KM in the case organization. It is also flexible enough to be
applied to many levels of aggregation, including units, departments, and the organization as a
whole. In addition, it is independent of the type of KM system and can be applied to
personalization as well as codification-based KM strategies.
It is important to clarify that the G-KMMM does not consider all organizational units to be
equally appropriate for assessment of KM maturity. Rather, focus should be on knowledge-
intensive units (e.g., research and development) where effective KM is critical since the
manifestations and effectiveness of KM are likely to be most clearly discerned in these units.
As shown in the case study, different units of an organization may be at different maturity
levels for different KPAs. For example, the CMU was at maturity level 2 for the people KPA
but its process and technology KPAs were found to be at level 3. Other units such as the AU,
CU, CC, ARU, ASU, and ENU also had different maturity ratings for different KPAs. This
suggests that the three KPAs are distinct and unlikely to be perfectly correlated with one
another, providing preliminary evidence for discriminant validity. It also demonstrates the
model's usefulness as a diagnostic tool that is able to pinpoint areas needing further improvement. It
also allows the assessment outcome to be reported at different levels of abstraction as the
ratings for different units can be aggregated into a single rating for the organization as a
whole.
It is also important to note that although the G-KMMM defines the fifth maturity level as
the most advanced, this does not imply that organizations at this level will cease
developing their KM competence. Rather, as KM concepts and technologies evolve, the
conditions for attaining maturity are likely to change, serving as moving targets that
encourage continuous learning and improvement rather than as a definite end in themselves.
The case study has highlighted a few areas for future investigation. An avenue for future
research will be to investigate the relative importance of practices in each KPA at different
stages of maturity. Identifying and understanding these dynamics may help organizations in
charting their KM development better. In addition to people, process, and technology aspects,
it may also be important to consider situational factors in the development of KM. For
example, in the case study, the manager of CC highlighted that a major roadblock hindering
users' adoption of documentation systems was that the local legal jurisdiction did not recognize
the legality of electronically-filed documents unless their process flow was certified by an
established accounting firm. As the certification process was tedious and costly, the
university found it more economical to stick to paper documents and use of the EDMS was
often seen as nonessential. This suggests that future refinements of the proposed model may
need to consider environmental conditions outside the control of the organizations.
To assess its generalizability, future research can apply the G-KMMM to different contexts.
More quantitative data in the form of summary statistics can also be collected from a
larger sample of organizations by developing a survey questionnaire based on the proposed
instrument and using finer measurement scales such as Likert scales. This could facilitate the
comparison of KM development patterns across organizations and allow a more thorough
assessment of the validity of the proposed model.
While the underlying objective of the proposed model is to improve KM development in
organizations and eventually enhance organizational performance, the current model focuses
on identifying the key aspects of KM development and assessing the level of KM maturity
and does not explicitly hypothesize or predict any relationship between maturity level and
organizational performance. Although studies related to CMM have provided empirical
evidence that organizations progressing along the pathway of CMM witnessed improved
performance (e.g., Herbsleb et al., 1997; Lawlis et al., 1995; Lucas and Sutton, 1977), there is
a lack of studies verifying such effects for KMMMs and IS stage models (Benbasat et al.,
1984). To assess the predictive validity of KM maturity for organizational performance, large-scale studies examining the maturity and organizational performance of organizations in
various industries are needed.
The proposed G-KMMM recognizes that KM is unlikely to be achieved in one giant leap. Its
staged structure provides a general understanding of the gradual and holistic development of
KM. It is hoped that the G-KMMM can serve as both an effective diagnostic tool for
assessing KM efforts and a coherent roadmap that guides academic and practical KM
endeavors.
References
Alavi, M. & Leidner, D. E. (2001). Review: Knowledge management and knowledge
management systems: Conceptual foundations and research issues. MIS Quarterly, 25(1),
107-136.
American Productivity and Quality Center (APQC) & Arthur Andersen (1996). The
knowledge management assessment tool (KMAT).
(http://www.kwork.org/White_Papers/KMAT_BOK_DOC.pdf)
AMR Research (2007). The knowledge management spending report 2007-2008.
(http://www.amrresearch.com/Content/View.asp?pmillid=20744)
Apostolou, D., Mentzas, G., & Abecker, A. (2008-2009). Managing knowledge at multiple
organizational levels using faceted ontologies. The Journal of Computer Information
Systems, 49(2), 32-49.
Armour, P. G. (2000). The case for a new business model: Is software a product or a medium?
Communications of the ACM, 43(8), 19-22.
Benbasat, I., Dexter, A. S., Drury, D. H., & Goldstein, R. C. (1984). A critique of the stage
hypothesis: Theory and empirical evidence. Communications of the ACM, 27(5), 476-485.
Bukowitz, W. & Williams, R. (2000). Knowledge pool. CIO Magazine, July.
de Jager, M. (1999). The KMAT: Benchmarking knowledge management. Library
Management, 20(7), 367-372.
Dubé, L. & Paré, G. (2003). Rigor in information systems positivist case research: Current
practices, trends, and recommendations. MIS Quarterly, 27(4), 597-635.
Ehms, K. & Langen, M. (2002). Holistic development of knowledge management with
KMMM.
(http://www.knowledgeboard.com/doclibrary/knowledgeboard/kmmm_article_siemens_2002.pdf)
Gottschalk, P. & Khandelwal, V. K. (2004). Stages of growth for knowledge management
technology in law firms. Journal of Computer Information Systems, 44(4), 111-124.
Grover, V. & Davenport, T.H. (2001). General perspectives on knowledge management:
Fostering a research agenda. Journal of Management Information Systems, 18(1), 5-21.
Handzic, M., Lagumdzija, A., & Celjo, A. (2008). Auditing knowledge management
practices: model and application. Knowledge Management Research & Practice, 6(1), 90-99.
Henderson, J.C. & Treacy, M. (1986). Managing end-user computing for competitive
advantage. Sloan Management Review, 27(2), 3-14.
Herbsleb, J., Zubrow, D., Goldensen, D., Hayes, W. & Paulk, M. (1997). Software quality
and the capability maturity model. Communications of the ACM, 40(6), 30-40.
Holland, C. P. & Light, B. (2001). A stage maturity model for enterprise resource planning
systems use. The DATA BASE for Advances in Information Systems, 32(Spring), 34-45.
Huber, G. P. (1991). Organizational learning: The contributing processes and the literature.
Organization Science, 2(1), 88-115.
Huff, S. L., Munro, M. C., & Martin, B. H. (1988). Growth stages of end-user computing.
Communications of the ACM, 31(5), 542-550.
INPUT (2005). Federal knowledge management market view report.
(http://www.input.com/corp/press/detail.cfm?news=1091)
Kazanjian, R. K. & Drazin, R. (1989). An empirical test of stage of growth progression
model. Management Science, 35(12), 1489-1503.
King, J. L. & Kraemer, K. L. (1984). Evolution and organizational information systems: An
assessment of Nolan's stage model. Communications of the ACM, 27(5), 466-475.
McGarry, F., Burke, S., & Decker, B. (1998). Measuring the impacts individual process
maturity attributes have on software projects. Proceedings of the 5th International
Software Metrics Symposium, 52-60.
Mohanty, S. K. & Chand, M. (2004). 5iKM3 knowledge management maturity model for
assessing and harnessing the organizational ability to manage knowledge. TATA
Consultancy Services, 2004
(http://www.tcs.com/NAndI/default1.aspx?Cat_Id=7&DocType=324&docid=419)
Nolan, R. (1979). Managing the crisis in data processing. Harvard Business Review, 57(2),
115-126.
Orlikowski, W. J. & Baroudi, J. J. (1991). Studying IT in organizations: Research approaches
and assumptions. Information Systems Research, 2(1), 1-28.
Paulzen, O. & Perc, P. (2002). A maturity model for quality improvement in knowledge
management. Proceedings of the 13th Australasian Conference on Information Systems,
243-253.
Pillai, S., Gupta, R. K., & Saxena, K. B. C. (2008). From rhetoric to reality: an enquiry into
KM initiatives in an organisation of higher learning. Journal of Information & Knowledge
Management, 7(3), 135-143.
Powers, V. J. (1999). Xerox creates a knowledge sharing culture through grassroots efforts.
Knowledge Management in Practice, APQC.
(http://www.socialeconomy.info/en/node/1342)
Van de Ven, A. H. & Poole, M. S. (1995). Explaining development and change in
organizations. Academy of Management Review, 20(3), 510-540.
Weerdmeester, R., Pocaterra, C., & Hefke, M. (2003). VISION: Next generation knowledge
management: Knowledge management maturity model. Information Societies
Technology Programme.