CTR Level 3 Presentation Evaluation Kirkpatrick


Capturing Elusive Level 3 Data:
The Secrets of Survey Design

Presented by:
Ken Phillips
Phillips Associates
March 22, 2018

Phillips Associates 1
Agenda

1. Examine Level 3 evaluation facts

2. Analyze survey creation errors in a sample Level 3 evaluation

3. Discover 12 tips for creating valid, scientifically sound Level 3 evaluations

Phillips Associates 2
Kirkpatrick / Phillips Evaluation Model
Level 1: Reaction
Degree to which participants find the training favorable, engaging, and relevant to their jobs

Level 2: Learning
Degree to which participants acquire the intended knowledge, skills, attitude, confidence, and commitment based on their participation in the training

Level 3: Behavior
Degree to which participants apply what they learned during training when they are back on the job

Level 4: Results
Degree to which targeted outcomes improve as a result of the training and the support and accountability package

Level 5: ROI
Degree to which monetary program benefits exceed program costs

Phillips Associates 3
Level 3 Evaluation Facts

60% of organizations evaluate some programs at Level 3

75% of live classroom programs being evaluated

33% of tech-based programs being evaluated

18% of organizations view data collected as having high or very high value

Source: ATD Research Study, “Evaluating Learning: Getting to Measurements That Matter,” 2015

Phillips Associates 4
Data Collection Methods

Source: Donald & James Kirkpatrick, “Evaluating Training Programs: The Four Levels,” 2006.

Phillips Associates 5
Possible Survey Respondents

Learners

Managers

Direct reports of learners

Peers/Colleagues

External customers

Phillips Associates 6
How to Decide

Who has first-hand knowledge of learners’ behavior?

How credible do results need to be?

Phillips Associates 7
Sample
Level 3
Participant Survey

Phillips Associates 8
Instructions

1. Form a group of 3, 4, or 5 persons.

2. Review the sample Level 3 participant survey in the handout and see how many different survey creation errors you can find. (Hint: 9 different errors are built into the survey.)

3. Be prepared to discuss your findings with the whole group.

Note: Survey respondents are the direct reports of managers/supervisors who attended an interpersonal feedback learning program.

Phillips Associates 9
Scientifically Sound Survey Design:
Content · Format · Measurement

Phillips Associates 10
Content

Phillips Associates 11
What’s Wrong With These?

8. Before providing employees with feedback about their job performance, my manager considers whether or not he or she is knowledgeable about their job.

25. When giving feedback to an employee, my manager considers whether it should be done privately or in the presence of others.

Phillips Associates 12
Tip 1: Content

Focus on observable behavior, not thoughts or motives.

Source: Palmer Morrel-Samuels, “Getting the Truth into Workplace Surveys”, Harvard Business Review, 2002.

Phillips Associates 13
What’s Wrong With These?

14. My manager gives his or her employees feedback just as soon as possible after an event has happened and avoids getting emotional or evaluative.

18. My manager provides employees with regular, ongoing feedback about their job performance and speaks in a normal, conversational tone or manner when delivering the feedback.

Phillips Associates 14
Tip 2: Content

Limit each item to a single description of behavior.

Source: Palmer Morrel-Samuels, 2002

Phillips Associates 15
Example

My manager gives his or her employees feedback just as soon as possible after an event has happened.

My manager avoids getting emotional or evaluative when giving feedback to his or her employees.

Phillips Associates 16
What’s Wrong With These?

2. My manager doesn’t get to know his or her employees as individuals before providing them with feedback about their job performance.

7. When giving employees feedback about their job performance, my manager doesn’t distinguish between patterns of behavior and random one-time events.

Phillips Associates 17
Tip 3: Content

Word about 1/3 of the survey items so that the desired answer is negative.

Source: Palmer Morrel-Samuels, 2002

Phillips Associates 18
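For illustration, reverse-worded items like these are typically recoded at scoring time so that a high score always represents the desired behavior. A minimal Python sketch, assuming a 7-point scale; the item numbers in REVERSE_ITEMS are hypothetical, not taken from the sample survey:

```python
# Recode reverse-worded survey items so a high score always means the desired behavior.
# SCALE_MAX and REVERSE_ITEMS are illustrative assumptions, not values from the sample survey.
SCALE_MAX = 7
REVERSE_ITEMS = {2, 7, 16}  # hypothetical item numbers worded so the desired answer is negative

def recode(item_number: int, raw_score: int) -> int:
    """Return the score with reverse-worded items flipped onto the positive direction."""
    if item_number in REVERSE_ITEMS:
        return (SCALE_MAX + 1) - raw_score  # on a 1-7 scale, a 6 becomes a 2
    return raw_score

print(recode(2, 6))   # reverse-worded item: prints 2
print(recode(14, 6))  # normally worded item: prints 6
```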
Format

Phillips Associates 19
What’s Wrong With These?

Building Trust
Credibility
Feedback Sign
Feedback Timing
Feedback Frequency
Message Characteristics

Phillips Associates 20
Tip 4: Format

Keep sections of the survey unlabeled.

Source: Palmer Morrel-Samuels, 2002

Phillips Associates 21
Tip 5: Format

Design sections to contain a similar number of items, and questions to contain a similar number of words.

Source: Palmer Morrel-Samuels, 2002

Phillips Associates 22
Tip 6: Format

Place questions regarding respondent demographics (e.g., name, title, department) at the end of the survey, make completion optional, and keep questions to a minimum.

Source: Palmer Morrel-Samuels, 2002

Phillips Associates 23
Measurement

Phillips Associates 24
Tip 7: Measurement

Collect data from multiple observers or from a single observer multiple times.

Source: Ken Phillips, “Capturing Elusive Level 3 Data: The Secrets of Survey Design”,
Unpublished Article, 2013.
Phillips Associates 25
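In practice, this often means averaging each survey item across all observers of the same learner, for example across all of a manager's direct reports. A minimal Python sketch with hypothetical ratings, assuming every observer used the same 1-7 scale:

```python
# Combine ratings from multiple observers for a single survey item.
# The ratings below are hypothetical; each key is one direct report of the same manager.
from statistics import mean

ratings_by_observer = {
    "direct_report_1": 6,
    "direct_report_2": 5,
    "direct_report_3": 7,
}

item_score = mean(ratings_by_observer.values())
print(round(item_score, 2))  # prints 6.0
```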
What’s Wrong With This?

Strongly Agree      Agree      Disagree      Strongly Disagree      N/A
      4               3            2                 1

Phillips Associates 26
Tip 8: Measurement

Create a response scale with numbers at regularly spaced intervals and words only at each end.

Source: Palmer Morrel-Samuels, 2002

Phillips Associates 27
Examples
This:
Not at all True   1   2   3   4   5   6   7   Completely True

Not This:
Not at all True   Rarely True   Occasionally True   Somewhat True   Mostly True   Frequently True   Completely True
        1              2               3                  4              5              6                  7

Or This:
Not at all True   Rarely True   Occasionally True   Somewhat True   Mostly True   Frequently True   Completely True
Phillips Associates 28
Tip 9: Measurement

Use only one response scale with an odd number of points (7-, 9-, and 11-point scales are best).

Source: Palmer Morrel-Samuels, 2002

Phillips Associates 29
Odd vs. Even Scale

This: [image of an odd-numbered response scale]

Not This: [image of an even-numbered response scale]

Phillips Associates 30
Tip 10: Measurement

Use a response scale that measures frequency, not agreement or effectiveness.

Source: Palmer Morrel-Samuels, 2002

Phillips Associates 31
Examples
This:
Never   1   2   3   4   5   6   7   Always

Or this:
Not at all True   1   2   3   4   5   6   7   Completely True

Phillips Associates 32
Tip 11: Measurement

Place small numbers at the left or low end of the scale and large numbers at the right or high end of the scale.

Source: Ken Phillips, 2013

Phillips Associates 33
Examples
This:
Not at all True   1   2   3   4   5   6   7   Completely True

Not This:
Completely True   7   6   5   4   3   2   1   Not at all True

Phillips Associates 34
Tip 12: Measurement

Include a “Did Not Observe” response choice and make it distinct from the scale points.

Source: Palmer Morrel-Samuels, 2002

Phillips Associates 35
Example

Not at all True   1   2   3   4   5   6   7   Completely True      Did Not Observe

Phillips Associates 36
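At scoring time, “Did Not Observe” responses are typically excluded from the item average rather than treated as a low score. A minimal Python sketch, assuming those responses are recorded as None rather than as a number on the 1-7 scale:

```python
# Exclude "Did Not Observe" responses when averaging an item.
# The response list is hypothetical; None stands for "Did Not Observe".
from statistics import mean

responses = [5, 7, None, 6, None, 4]

observed = [r for r in responses if r is not None]
item_average = mean(observed) if observed else None
print(item_average)  # prints 5.5
```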
Summary: Content

Focus on observable behavior

Limit each item to a single description of behavior

Word about 1/3 of items so the desired answer is negative (reverse scored)

Phillips Associates 37
Summary: Format

Keep survey sections unlabeled

Design sections to contain a similar number of items and questions a similar number of words

Place questions regarding respondent demographics at the end of the survey, make completion optional, and keep questions to a minimum
Phillips Associates 38
Summary: Measurement

Collect data from multiple observers or from a single observer multiple times

Create a response scale that:
• Has words only at each end
• Has an odd number of points
• Measures frequency
• Has small numbers at left and large numbers at right
• Includes a “Did Not Observe” choice that is distinct from the scale points
Phillips Associates 39
The difference between a good survey
and a bad one… quite simply, is careful
and informed design.

Source: Palmer Morrel-Samuels, 2002

Phillips Associates 40
Free Articles
Phillips, Ken, “Eight Tips on Developing Valid Level 1 Evaluation Forms”, Training Today, Fall 2007, pp. 8 & 14.

Phillips, Ken, “Developing Valid Level 2 Evaluations”, Training Today, Fall 2009, pp. 6-8.

Phillips, Ken, “Capturing Elusive Level 3 Data: The Secrets of Survey Design”, unpublished article, 2013.

Phillips, Ken, “Level 1 Evaluations: Do They Have a Role in Organizational Learning Strategy?”, unpublished article, 2013.

Phillips, Ken, “Business Results Made Visible: Designing Proof Positive Level 4 Evaluations”, unpublished article, 2013.
Phillips Associates 42
Ken Phillips

Phillips Associates
[email protected]
(847) 231-6068

www.phillipsassociates.com

34137 N. Wooded Glen Drive


Grayslake, Illinois 60030

Phillips Associates 43
