Lecture 3 The HRD Cycle (Moodle)


The HRD Cycle

Lecture 3

Aishath Thashkeel | HRD&KM


Session outline
 Outline the systematic training and HRD cycle
 Identify training needs – TNA
 Explore the six stages of training design
 List the skills and knowledge required to deliver training
 Define the concept of evaluation
 Explain the differences between different evaluation models
 Appreciate the business reasons for undertaking evaluation.
Learning and Development
What is a training need?
The gap between the knowledge, skills and attitudes possessed by an individual or group and those needed to perform the required occupational role (Gold et al., 2013).
What is a training needs analysis (TNA)?
The process of identifying training needs in an organisation for the purpose of improving employee job performance.


What are some techniques of TNA?
TNA techniques

 Direct observation
 Questionnaires
 Consultation with persons in key positions, and/or with specific knowledge
 Interviews
 Focus groups
 Assessments/surveys
 Records & report studies
 Work samples
Questions to ask

 What tasks are performed?

 How frequently are they performed?

 How important is each task?

 What knowledge is needed to perform the task?

 How difficult is each task?

 What kinds of training are available?


TNA in practice
Identifying learning and development needs is about being more proactive:
 Strategic – preparing for future requirements
 Personal learning and development
 Career aspirations
The systematic training cycle
1. Identification of training and development needs
2. Design of training and development intervention
3. Delivery of training and development intervention
4. Evaluation of training and development intervention
Criticisms of the STC
 Mechanistic

 Inflexible

 Time consuming

 Reactive

 Old school!

 Too reliant on the trainer; limited input from other stakeholders
The HRD cycle
The Six Stages of Training Design
1. Agree aim and objectives
2. Determine your approach to training
3. Select evaluation methods
4. Agree content and methods
5. Choose a venue
6. Integrate learning theory into the design
1. Agree aim and objectives
 The starting point is the identification of learning or training objectives.
 It is important at this stage to decide how the intervention will be evaluated.

“Effective objectives help to design the training precisely to fit the needs of trainees and the
organisation. Written objectives indicate that conscious decisions have been made about what skills,
knowledge, and abilities to include in the training…based on the needs analysis process” (Vaughn, 2005:
66).

Learning Objectives
A good objective should contain the following components:
 a clear statement of learner behaviour (performance) at the end of
training;
 the conditions under which the performance will take place;
 the standards of performance expected.
Learning Objectives - Example

Training on typing. What could be an objective?
Learning Objectives - Example

 Performance statement: by the end of this training session participants will be able to type a letter.

 Conditions statement: by the end of this training session participants will be able to type a letter at a speed of 120 words per minute using Microsoft Word.

 Standards statement: by the end of this training session participants will be able to type a letter at a speed of 120 words per minute using Microsoft Word without making any errors.
2. Determine your approach to training
Here we’re referring to your specific approach to the delivery of the intervention – this will guide the design process. How will you deliver the training/learning?

Typical examples in the workplace include:

 E-learning
 Blended/bite-size learning
 Off-the-job (external)
 Off-the-job (internal)
 On-the-job
 Self-development – shadowing/observations/coaching/mentoring/job rotation
3. Select evaluation methods
Evaluation is usually considered the last stage of the systematic approach but often gets neglected.

 Evaluation includes measuring learner understanding and/or ability before, during, and after the delivery phase of the intervention.

 Assessment can be summative or formative.
4. Agree content and methods
Deciding the content of an intervention is relatively straightforward and is determined by the learning objectives.

 The mix of materials needs to be considered carefully to ensure the delivery of content will be at the right level and appropriate to the learning objectives (Hackett, 2003).

 Training methods are the different ways in which specific elements within an intervention can be delivered to learners.

 Examples of methods include: case study, video/DVD, team exercise, role play, group discussion, coaching, etc.
Be Creative

 DO YOUR HOMEWORK
Acquiring detailed background knowledge on the subject is key to
creativity.
 BE BRAVE
You need to escape from the conventional way of thinking and take
risks.
 HAVE ALTERNATIVES
Keep hold of your ideas, gather them and don’t dismiss any you think
may be useless.
 RECOMBINE THE OLD
Use past ideas, training courses, activities etc. and combine them to create new ones.
5. Choose a venue
This depends on the training/learning approach adopted – online? On-the-job? Off-the-job? Training centres?

The venue for training should be appropriate for:

 The learning objectives: performance, conditions, and standards.
 Assessment measures: individual or group, use and feedback.
 Training methods: group rooms, work rooms, workshop, test place.
 Media: equipment and facilities to use.
 Learners: expectations, status, travelling.
 Trainers: administrative and technical support.
 Image/culture: the organisation, training, learning event.
 Budget: costs of use, travel, subsistence.
6. Integrate learning theory into the design

 Participation in learning is important both for sustaining motivation and for enabling new learning to be integrated with existing knowledge and skills.

 The learning needs to be interesting and offer variety to sustain motivation and attention.

 There also needs to be some flexibility in a design to allow for differences in learners’ preferred ways of learning.
The Effective Trainer

Subject matter expert, an expert at delivery techniques, or both?
Delivery Knowledge and Skills

 Communicating – knowledge of verbal and non-verbal language; skills: speaking clearly, varying explanations to help learners understand, modulating voice, using eye contact and positive/encouraging non-verbal language, injecting humour and telling anecdotes and stories, avoiding jargon and complicated language.

 Giving and receiving feedback – skills: active listening and providing feedback; the ability to listen to and respond effectively to learner inputs, such as questions, anecdotes and presentations.

 Questioning – skills: asking open, closed and/or probing questions appropriately, for instance to clarify learner understanding.
 Facilitating – knowledge of different learning theories and group dynamics; skills: the ability to ‘stand back’, guiding and encouraging learners (rather than always telling them), allowing learners to discuss issues amongst themselves and to share stories with each other.

 Organising – knowledge of time management; skills: managing time effectively, providing timely summaries of key points.

 Technical – knowledge of methods, media and software; skills: using methods effectively, using technology and software (e.g. computer and data projector) effectively.
Reasons for Ineffective Interventions
 Negative feedback from learners

 High drop-out rates or poor attendance

 Low motivation from learners

 Ineffective trainer

 Mismatch between learners and intervention

 Lack of enthusiasm from both the trainer and learners

 The trainer takes little interest in the learning outcomes.

 The trainer fails to adhere to timings, resulting in some aspects of the intervention being rushed or missed out completely.
Training Evaluation: purpose

Learner
 Certification of knowledge/competence
 Evidence of achievement/results
 Basis for continuous development

Managers
 Justifies investment of resources
 Identifies future needs for T&D
 Enhances perception of HRD as a strategic tool to improve performance

HRD
 Raises strategic profile and credibility of HRD function
 Successful outcomes elicit motivation for participation
 Improves HRD design and delivery
Reasons for not Evaluating
 They are unaware of the methods of evaluation and how they can be used.

 They do not have the time, expertise or resources to analyse the learning results of any evaluation.

 People feel threatened.

 Poorly designed – no training objectives or assessment criteria have been identified.

 The organisation has no agreed policy for evaluation to take place.
Evaluation Models: Kirkpatrick

Reaction → Learning → Behaviour → Results
Evaluation Models: Kirkpatrick

Evaluation level – Questions

1. Reaction – What are participants’ reactions at the end of the training? Were they happy with the training?

2. Learning – What did the participants learn from the training? Were the learning objectives achieved?

3. Behaviour – Has job performance changed? Did the participants change their behaviour based on what was learned?

4. Results – Did the behaviour change have a positive effect on the organisation?
Level 1. Satisfaction/Reaction (happiness)
Questions: What were the reactions of the learners to the learning intervention?
Methods: happy sheets/reaction questionnaires; observations
Timing: during and immediately afterwards
Outcomes: Are the learners still motivated to learn? Has the credibility of the trainer/intervention been maintained or increased?

Level 2. Learning
Questions: What learning has been accomplished?
Methods: pre and post tests; role plays/assessment activities; observations
Timing: before, during and immediately afterwards
Outcomes: Internal validation of changes to knowledge, skills and attitudes. What has been learnt? What still needs to be learnt?

Level 3. Impact/Job performance (transfer/application)
Questions: What direct changes have there been to job performance?
Methods: self-appraisals; peer-appraisals; manager-appraisals; performance indicators/targets; interviews/discussions
Timing: after a month, 3 months or 9 months
Outcomes: What improvements to performance have been measured? What changes have been sustained?

Level 4. Business results (bottom-line value)
Questions: What was the ultimate value of the training and learning?
Methods: reporting process; performance indicators; cost/benefit analysis
Timing: after 3 months or after 9 months
Outcomes: Evaluation of learning objectives. What has been the benefit to the organisation? What still…
Phillips added a 5th Level

Reaction → Learning → Behaviour → Results → ROI

Did the training investment provide a positive return on investment?


Other Evaluation Models

 Hodges’s Components for HRD Evaluation – formative assessment and summative assessment, still using Kirkpatrick’s four levels for summative assessment.

 Holton’s HRD Evaluation Model – evaluation uses three levels: Learning, Transfer and Results.

 Kraiger’s Decision-Based Model – three areas of evaluation: Training Content and Design (sign-off), Changes in Learners (cognitive and behavioural), and Organisational Payoffs (business results, training transfer).
Questions?
The Future of Evaluation
 The literature favours a more natural and holistic view that encourages a dynamic, connected and organic approach.

 Reflection, feedback, and cycles of continuous improvement need to flow between the learner, the department, and the organisation – and back again!

 The greater the learning, the greater the effects, creating wider and higher impact.
To conclude
 The HRD cycle is a useful analytical tool and guide to HRD practice.
 Create SMART learning objectives.
 Know your audience and build on prior knowledge, experience and
skills.
 Be creative with your design.
 Communicating, facilitating, organising and technical knowledge and
skills are vital for effective training delivery.
 The level of delivery required depends on the learning intervention.
 Training evaluation, despite poor execution in practice, is vital for
raising the strategic profile and credibility of the HRD function.
