CDS-SPC
Prepared by
the Centre of Excellence for Evaluation
Wednesday, March 13th, 2002
Objective
The objective of the CDS is to assist in renewing and repositioning the evaluation
function within the federal government and in building the capacity of evaluation units,
both for evaluators themselves and for managers. The intention is to ensure, within the
federal government, that evaluation is employed to:
assist with the design and redesign of programs that support Results-based
Management;
support managers who are accountable for results-based management and who employ the
discipline of evaluation to ensure the design and execution of effective programs,
policies and initiatives;
inform plans for training and development for program evaluators employed at all levels
of the federal government.
As an outcome of the CDS, the CEE aims to assist the Evaluation Community to increase
its current numbers (estimated at 230) by 30 per cent, or 70 additional positions, by
2005-2006.
Given the current demographics of the federal public service, the key element of the
CDS is to assist evaluation managers to recruit, train, develop and retain professional
program evaluators to assist with the implementation of the revised Evaluation Policy.
As for managers, the CDS aims to provide them with orientation on the uses of
evaluation in the context of Results-based Management. In fulfilling this aspect of the
strategy, the CEE will contribute to an integrated management training plan to inform
managers on all aspects of modern comptrollership. In addition, the CEE will seek
opportunities to work with evaluation managers in departments and agencies to
influence program managers at all levels to employ evaluation methods and expertise
as a key element of results-based management.
Operating Principles
The CEE is guided in the implementation of the CDS, and other initiatives, by a Senior
Advisory Committee (SAC) composed of selected Heads of Evaluation, as well as a
representative from the Centre of Excellence for Internal Audit. One member of the SAC
serves as a champion for Community Development. The Champion heads a subcommittee, the Community Development Advisory Group, the mandate of which is to
provide specific and expert advice to the CEE on the CDS. In addition, the CEE consults
directly with the Heads of Evaluation of departments and agencies subject to the
Evaluation Policy, on an as-needed basis, to ensure that objectives are aligned with
community needs.
The organizational structure of the CEE includes the position of Manager, Capacity
Building. The primary purpose of the position, and of those that support it (two FTEs), is
to provide leadership and facilitation for the evaluation community in the areas of training
and development and, where appropriate, recruitment.
As a matter of course, the CEE will encourage the community to lead on specific
initiatives where it makes sense to do so. As an example, CEE will actively encourage
Heads of Evaluation to collaborate on recruitment and staffing initiatives within the
framework of delegated authorities.
The CEE will also work with policy centres within the Treasury Board Secretariat, such
as the Centre of Excellence for Internal Audit and the Comptrollership Modernization
Office. The CEE is aware that many functional communities are also involved in capacity
building. The intention is to learn from good and best practices, and to use or modify
existing programs and processes wherever feasible to meet the overall objectives of
our Community Development Strategy. We are working in partnership with the
Centre of Excellence for Internal Audit (TBS) on specific projects and will continue to
seek out similar alliances within TBS.
The CEE will work with the Evaluation Community to determine community needs and
issues from a global perspective, to establish priorities, and to ensure that these are
addressed in a timely and cost-effective manner. The intention of the CDS is to support
the goal of community renewal and repositioning as required by the Evaluation Policy. In
most cases, the CEE will not itself deliver training or development to the Evaluation
Community. Rather, the role of the CEE will be to seek out and establish practical
options to ensure that evaluators receive the appropriate training and development
opportunities. Partners, as identified above, will be a potential source for the delivery of
training and development for the Evaluation Community.
Timeframe
The CDS is designed to support ongoing Evaluation Policy implementation. As such, it
includes both short-term and longer-term objectives. The short-term objectives include
the establishment of formalized mechanisms for entry-level recruitment, implementation
of a pilot internship program and the establishment of a recruitment/retention and
training and development program for in-career evaluators. The key longer-term
objective of the CDS is to help departments and agencies increase the numbers of
program evaluators actively engaged in the implementation of the Evaluation Policy to
numbers approaching 300 by 2005-2006, and to support them with an appropriate and
sustainable training and development plan. Key elements of the current and upcoming
year are outlined here.
A study of the potential for a community-wide recruitment effort using the Post-Secondary
Recruitment Program. Learning from the examples of the FORD/IARD
program (a recruitment program for financial officers and internal auditors), the
Accelerated Economist Program and the Policy Development and Research
Program, the CEE is exploring the possibility of implementing a recruitment program
for entry-level evaluators on behalf of the community.
Review of options for training and development program for mid-career and
senior evaluators. The evaluation community requires a coherent and structured
approach to training and development to ensure that professional evaluators are
acquainted with and able to employ the appropriate methodologies and approaches
to support ongoing implementation of the Evaluation Policy.
A synopsis of the key reports completed to date, and related to these initiatives, is
available in Annex A.
Redevelopment and renewal of the CEE web presence, devoted to information and
tools to support evaluators and managers. The initiative may include formal
partnering with the Comptrollership Community extranet project. Elements of the
CEE web presence will include:
evaluation methodology.
Annex A
Treasury Board Secretariat
Centre of Excellence for Evaluation
Performance-based Management
Other Topics: While respondents suggested additional topics that would be of interest,
further study is required, as there was no overriding consensus on any one topic.
Recommendations:
TBS continue to support the offering of the current courses over the
immediate term.
CES and TBS identify courses and seminars that most closely match the
interests of the respondents, as described in the survey, for publication on the
CES web site or in CES mail out publicity to members.
TBS and CES review the communication on course offerings to ensure that
departments and agencies are fully aware of what is available.
CES and TBS develop a reliable course calendar and training strategy.
Percent of work effectively learned on the job: Almost half of respondents thought that
60% or more of the work could be learned effectively on the job, while the other half felt
that less than 60% could be.
Training and Development Needs: The Evaluation and E&R groups identified the
following priorities:
Skill/ability in:
General tools, skills and methodologies
Data analysis
RMAF
Management
Interpersonal aspect of work with clients
As can be seen, the hands-on types of needs in the skills and abilities section ranked as
more important overall than the acquisition of knowledge and understanding.
Role of Centre of Excellence for Evaluation: Respondents' clear expectation was that
the Centre of Excellence should lead, performing a centralized function in developing
the required capacity, standards and policies, and even in the provision of training and
development services. This would include educating client departments and the
functional community on their respective roles in this new enterprise. The need for TBS
to build capacity was a unanimous response.
Report Conclusions: While there are indications of recent recruitment activity in
small pockets, and some recent staffing activity, the eligibility-to-retire rates, combined
with self-reported plans to leave the government, suggest a need for strategic HRM
action now. As there is no direct university stream from which to draw talent, on-the-job
training may not fill the demographic gaps rapidly enough to maintain community
stability while also transmitting corporate memory. This may be a situation where
corporate recruitment programs along the lines of the AETP or CAP, with some
opportunity for a non-management stream, would address the demands in an
accelerated and systematic way. The development of talent pools from employees
within the government, such as through the AEXDP corporate program, might help to
cultivate and diversify the skills of the community in a more horizontal way.
Entry-level recruitment
Performance feedback
Team building
The activities of program evaluators vary according to level and department. Most,
however, are involved in the preparation of frameworks/RMAFs and in the front-end
and back-end work of evaluations. As program evaluators gain experience, they play
an active advisory role for managers and senior management.
There is an expressed need for internships and support for a mentoring approach.
Past experiences show that effective recruitment, support from senior management,
appropriate training courses, and good mentoring/support are key success factors in
personnel development strategies.
Many documents and courses have been developed by the PSC and could be
adapted for internship and mentoring purposes.
Key stakeholders will come to an agreement that a centrally coordinated, inter-departmental
recruitment drive is both effective and efficient for them, and that the
advantages of such an approach outweigh its initially onerous logistical and
administrative disadvantages.
Heads of evaluation will agree to pay into a common pool to conduct recruitment
drives when they may be competing with each other over the best candidates.
People responsible for hiring evaluation staff will agree to delegate their candidate-screening
power to staff of other departments, as 36 departmental screening agents
will not be needed on site to select the best candidates.
Key stakeholders will agree to modify their practices of classifying staff in different
occupational groups. Harmonization of entry levels for candidates is required.
Once candidates have been deemed acceptable and placed in the PSC's database,
they will turn down other job offers until they are hired by the federal government.
Findings:
Defining the Junior Evaluation Officer:
The core competencies from the competency profile need to be assessed through
the proper tools and incorporated into a generic job description.
A generic job description is required. This study outlines the key Universal
Classification Standard content of a junior evaluator job description (see pages 20-23
of the study).
Recommendation: The PSC and the CEE collaborate to make a case to Human
Resources offices to harmonize the occupational groupings for newly hired junior
evaluators and to institute an ES 2 entrance level.
How to Make Best Use of the PSR to Select Junior Level Evaluators:
Since September 2001, the PSC has implemented an electronic process in which
departments can interact with the information contained in a database to search for
the best candidates. In addition, once a department has officially encumbered its
vacant positions, its remaining, short-listed candidates can be accessed by other
departments for hiring.
Integrated Assessment Centre: Blends and integrates all instruments of Option E into
one package.
Note that all options except Option A require inter-departmental collaboration and
coordination in developing the tools, reviewing and accepting them, and administering
them to candidates, as well as a correction and appeals process.
Recommendations:
PSR campaigns be run twice a year, in late fall for early-spring hiring and in winter for
early-summer hiring, with a view to filling available posts rather than creating a roster
of available on-call candidates.
The following step-wise approach be used to implement a coordinated PSR for the
evaluation community:
2001-2002: Use basic PSR and draw candidates from the overflow of other
departments to fill the available posts.
2002-2003: Several additional tests are needed to turn the PSR into the powerful
recruitment tool it could be for the evaluation community.
One-time costs to departments vary from $12,500 to $74,250, depending on the level
of complexity and rigour of the PSR process.