
Safety, Accident, and Human Error

Definitions

What is human error?

An inappropriate or undesirable human decision or behavior that reduces, or has the potential for reducing, effectiveness, safety, or human performance

Classification scheme for human error (Swain and Guttman, 1983)

  Omission: failure to do something
  Commission: performing an act incorrectly
  Sequence: performing an act in the wrong order
  Timing: failure to perform the act in the allotted time period
  Unintentional: accidental performance of an act (knew it was wrong)

Reason (1990)
(distilled from J. Reason, Human Error, 1990)

Preliminaries: the three stages of cognitive processing for tasks

1) Planning: a goal is identified and a sequence of actions is selected to reach the goal
2) Storage: the selected plan is stored in memory until it is appropriate to carry it out
3) Execution: the plan is implemented by carrying out the actions specified by the plan

A theory of human error

Each cognitive stage has an associated form of error

Slips (execution stage): incorrect execution of a planned action
  example: a miskeyed command
Lapses (storage stage): incorrect omission of a stored, planned action
  examples: skipping a step on a checklist, forgetting to restore normal valve settings after maintenance
Mistakes (planning stage): the plan is not suitable for achieving the desired goal
  example: TMI operators prematurely disabling the HPI pumps

Origins of error: the GEMS model

GEMS: Generic Error-Modeling System, an attempt to understand the origins of human error

GEMS identifies three levels of cognitive task processing

Skill-based: familiar, automatic procedural tasks
Rule-based: tasks approached by pattern-matching from a set of internal problem-solving rules
  "observed symptoms X mean the system is in state Y"
  "if the system state is Y, I should probably do Z to fix it"
Knowledge-based: tasks approached by reasoning from first principles, when rules and experience don't apply

GEMS and errors

Errors can occur at each level

Skill-based: slips and lapses
  usually errors of inattention or misplaced attention
Rule-based: mistakes
  usually the result of picking an inappropriate rule
  caused by a misconstrued view of the system state, over-zealous pattern matching, frequency gambling, or deficient rules
Knowledge-based: mistakes
  due to incomplete/inaccurate understanding of the system, confirmation bias, overconfidence, cognitive strain, ...

Error frequencies

In raw frequencies, SB >> RB > KB

  61% of errors occur at the skill-based level
  27% occur at the rule-based level
  11% occur at the knowledge-based level

But if we look at opportunities for error, the order reverses

  humans perform vastly more SB tasks than RB tasks, and vastly more RB than KB tasks
  so a given KB task is more likely to result in error than a given RB or SB task (illustrated in the sketch below)
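
To see why the order reverses per opportunity, here is a minimal sketch. Only the 61/27/11 split comes from the slide; the opportunity counts are assumptions chosen purely for illustration.

```python
# Illustrative only: the 61/27/11 error split is from the slide;
# the opportunity (task) counts below are assumed for the example.
errors = {"skill-based": 61, "rule-based": 27, "knowledge-based": 11}
opportunities = {"skill-based": 100_000, "rule-based": 10_000, "knowledge-based": 500}

for level, count in errors.items():
    rate = count / opportunities[level]
    print(f"{level:16s} per-opportunity error rate: {rate:.5f}")

# Prints the highest rate for knowledge-based tasks and the lowest for
# skill-based tasks, even though skill-based errors dominate raw counts.
```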

The Automation Irony

Automation is not the cure for human error

Automation addresses the easy SB/RB tasks, leaving the complex KB tasks for the human
  humans are ill-suited to KB tasks, especially under stress
Automation hinders understanding and mental modeling
  it decreases system visibility and increases complexity
  operators don't get hands-on control experience
  the rule-set for RB tasks and the models for KB tasks stay weak
Automation shifts the error source from operator errors to design errors
  design errors are harder to detect, tolerate, and fix

Dealing with Human Error

Selection: select people with the capabilities and skills required to perform the job

Training: tell people what to do
  Problems:
  1. People don't like to be told what to do
  2. People forget or revert to old habits
  3. Expensive
  4. Must be repeated periodically and for all new hires

Design strategies
  Exclusion strategies: make it impossible to commit the error (design it out) (e.g., put a finger activator on punches, ...)
  Prevention designs: make it difficult (not impossible) to commit the error (guard against it)
  Fail-safe designs: decrease the consequences of error (e.g., use steel gloves when working with cutting tools)

Accidents

How is an accident defined?

An unanticipated event that damages the system and/or the individual, or affects the accomplishment of the system mission or the individual's task; it may, but does not necessarily, result in an injury.

Analysis of accidents

  Nature of the injury (death, amputation, laceration, etc.)
  Part of the body affected
  Type of accident (struck by, caught between, etc.)
  Source of injury (tools, body movement, etc.)


Accidents (2)

Factors contributing to accidents

Trait theories (later) such as accident proneness
Age (younger workers and those over 60 have more accidents)
Immediate environment (noise, temperature, light, workspace)
Equipment (controls, displays, compatibility, visibility, guarding)
Work (pacing, physical workload, mental workload, motor skills, etc.)
Worker (skill, experience, training, etc.)
Management (policies, safety, productivity requirements, incentives)
Psychosocial (morale, climate, union, communication)

Accidents

Theories for describing/explaining accident occurrence

1. Accident proneness
2. Accident liability: some people are more likely to have accidents because of their situation
3. Job demands vs. worker capability
4. Psychosocial

Factors Contributing to Accidents

Personnel
  Trait theories (later) such as accident proneness
  Age
  Experience
  Gender

Job
  Workload (mental/physical)
  Work-rest cycles
  Shift
  Pacing
  Procedures
  Arousal, fatigue

Equipment and tools
  Controls/displays
  Electrical hazards
  Mechanical hazards
  Thermal hazards
  Pressure hazards
  Toxic substances

Physical Environment
  Illumination
  Noise
  Temperature
  Vibration
  Humidity
  Radiation
  Fire hazards
  Airborne pollutants

Social/Psychological Environment
  Management practices
  Social norms
  Morale
  Training
  Incentives

Human error and accident theory

Major systems accidents (normal accidents) start with an accumulation of latent errors

  most of those latent errors are human errors
    latent slips/lapses, particularly in maintenance (example: misconfigured valves at TMI)
    latent mistakes in system design, organization, and planning, particularly of emergency procedures (example: flowcharts that omit unforeseen paths)
  invisible latent errors change the system's reality without altering the operators' models
    seemingly-correct actions can then trigger accidents

Accidents are exacerbated by human errors made during operator response

  RB errors are made due to lack of experience with the system in failure states
    training is rarely sufficient to develop a rule base that captures system response outside of normal bounds
  KB reasoning is hindered by system complexity and cognitive strain
    system complexity prohibits mental modeling
    the stress of an emergency encourages RB approaches and diminishes KB effectiveness
  system visibility is limited by automation and defense in depth
    this results in improper rule choices and degraded KB reasoning

Active versus Latent Failures (Reason, 1990)

The accident trajectory passes through successive layers, each with its own latent or active conditions:

  Organizational Factors (latent conditions: excessive cost cutting, inadequate promotion policies)
  Unsafe Supervision (latent conditions: deficient training program, improper crew pairing)
  Preconditions for Unsafe Acts (latent conditions: poor CRM, mental fatigue)
  Unsafe Acts (active conditions: failed to scan instruments, penetrated IMC when VMC only)
  Failed or Absent Defenses (robbing the pillar)
  Accident & Injury (crashed into the side of a mountain)

Hazards

A set of conditions that, together with other conditions, will inevitably lead to an accident

  Electrical
  Mechanical
  Pressure
  Toxic substances
  Radiation (a 300 REM dose kills 50% of those exposed within 50 days)
  Fire
  Falls and trips (an 11 ft fall kills 50%, yet a flight attendant once survived a 33,000 ft fall)


Managing hazards

Hazard elimination
  Substitution (lambs rather than lions)
  Decoupling
  Elimination of specific human errors

Hazard reduction
  Design for control
  Lockouts

Hazard control
  Reduce exposure
  Isolation and containment

Damage minimization

Hazard management

Focus on severe hazards

  Criticality = Severity x Frequency (see the ranking sketch after this list)

Reduce hazards by addressing:

  Source (elimination, control, limit damage)
  Path (guards, task redesign)
  Person (warnings, training)
  Administration (rules, regulations)

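To make the prioritization concrete, here is a minimal sketch that ranks hazards by criticality. Only the Criticality = Severity x Frequency relation comes from the slide; the rating scales and the example hazards are assumptions for illustration.

```python
# Minimal sketch: rank hazards by criticality = severity x frequency.
# The 1-4 severity scale, 1-5 frequency scale, and the example hazards
# are assumed for illustration; only the formula comes from the slide.
hazards = [
    ("electrical shock", 4, 2),  # (name, severity 1-4, frequency 1-5)
    ("slip and trip",    2, 5),
    ("toxic exposure",   3, 1),
]

ranked = sorted(hazards, key=lambda h: h[1] * h[2], reverse=True)
for name, severity, frequency in ranked:
    print(f"{name:16s} criticality = {severity * frequency}")
```

The highest-criticality hazards would then get attention first, through the source/path/person/administration measures listed above.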

Warnings

Purpose

  Inform users or potential users of a hazard or danger arising from misuse of the product
  Provide information on the likelihood/severity of injury from misuse
  Provide information on how to decrease the likelihood/severity of injury from misuse
  Remind users of the danger at the time and place where the danger is encountered

Design guidelines (minimum elements; see the sketch after this list)

  Signal word (danger, hazard, caution)
  Nature of the hazard (high voltage, tightly wound part)
  Consequences (can kill, lung cancer)
  Instructions (stay away, do not dispose of in fire, keep away from flame)
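
As an illustration of how the four minimum elements combine into a warning, here is a small sketch; the element names follow the slide, while the compose_warning helper and the example wording are hypothetical.

```python
# Sketch: assemble a warning from the four minimum elements listed above.
# The helper name and the example wording are illustrative, not a standard.
def compose_warning(signal_word, hazard, consequence, instruction):
    return f"{signal_word.upper()}: {hazard}. {consequence}. {instruction}."

label = compose_warning(
    signal_word="danger",
    hazard="High voltage inside enclosure",
    consequence="Contact can kill",
    instruction="Disconnect power before servicing",
)
print(label)
# DANGER: High voltage inside enclosure. Contact can kill. Disconnect power before servicing.
```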


Warnings

Factors affecting detectability

  Must catch the user's attention (size, shape, color, symbols, contrast, placement, active attention-getting words, durability)
  Level of hazard
    Danger: immediate severe personal injury or death
    Warning: possibility of severe personal injury or death
    Caution: minor personal injury or property damage

Factors affecting understanding

  Don't use vague, ambiguous, or highly technical language (use words that can be understood at the 8th-grade level)
  Don't use long phrases
  Be careful with symbols
  Communicate consequences, but don't horrify


Risk Perception & Warnings

(figure: signal-word panels shown in decreasing order of severity: DANGER, WARNING, CAUTION, NOTICE)


Criteria for Symbols and Codes

  Recognition: people must be able to understand the information you are conveying
  Appropriate response: make sure the action to take based on the symbol is well known and understood
  Color: adhere to the standards (blue and green are information, yellow is warning, orange is hazard, red is danger)
  Figure/ground clarity
  Closure: use solid shapes with lined boundaries
  Simplicity: the KISS principle; minimize the amount of detail, but be clear
  Unity
  Population stereotypes: we expect certain symbols to mean certain things

Common Warning Symbols

(figure: symbols for poison, flammable, biohazard, alert, radiation, electric shock)


Common Hazard Symbols

(figure: symbols for slip hazard, prohibition, men at work)


