Intelligence Manual 2
APPLIED THINKING
FOR INTELLIGENCE
ANALYSIS
A guide for practitioners
CHARLES VANDEPEER
Published by:
Air Power Development Centre, Department of Defence
PO Box 7932, CANBERRA ACT 2610, AUSTRALIA
Telephone: +61 2 6128 7041 | Facsimile: +61 2 6128 7053
E-mail: [email protected]
Website: www.airforce.gov.au/airpower
Acknowledgements
Many people have been involved in helping to get this book from an idea to
reality. I would like to acknowledge the efforts of those who supported this project
from the outset, in particular Squadron Leader Andrew H, Wing Commander
Colin C and Squadron Leader Stephanie S. Without each of you, this book simply
could not have happened. I would also like to thank Bruce Vandepeer and David
Olney for their consistent support and encouragement; both were instrumental as
sounding-boards during the development of the book. The practical feedback and
suggestions from Group Captain Trotman-Dickenson also helped to ensure the
success of this endeavour.
I would like to thank the many people involved in reviewing the draft and providing
invaluable critique, comments and suggestions. I deliberately sought feedback
from a diverse range of people from numerous fields including former and current
intelligence personnel, academics, operations research scientists and police
analysts. I am indebted to the feedback from Dr Dirk Maclean, Colonel Michael
Foreword
As a specialisation, intelligence is going through a remarkable period of
development and adjustment reflecting the dynamic changes in the world
politically and technologically. Whole new operating environments such as the
cyber domain are rapidly opening up, while the threats and opportunities for
intelligence in the physical domain become more shadowy and nebulous. For those
of us in the military, new levels of intelligence integration into advanced and
semi-autonomous weapons systems will be required in just a few years' time to support
the future data-fused battlespace. However, no matter the environmental changes
or the complexities of the systems, the fundamentals of intelligence analysis remain
and will endure.
Charles Vandepeer is unusual in the Australian context in that he has had
three careers: as an active duty Air Force intelligence officer, as a civilian research
scientist, and latterly as an academic specialising in intelligence analysis.
I cannot think of someone more qualified to pull together the many strands of
what constitutes effective intelligence analysis and synthesise them into a coherent
package.
In Applied Thinking for Intelligence Analysis: A Guide for Practitioners, Charles
has produced a book aimed at the practical level and designed to empower the
intelligence analyst with concepts and tools to achieve a tangible result. A book
focused on applied thinking is long overdue and with this volume Charles fills the
void for a ready reckoner on intelligence analysis. Although principally written
to support introductory training for junior analysts, this easily accessed and
immensely readable book is a singular resource for intelligence professionals both
young and old, and for that matter anyone with an interest in improving their
analytical techniques.
Charles takes the reader on a logical journey through the conceptual foundations
of effective modern intelligence analysis in the early chapters, and introduces and
discusses the real-world problems and constraints that will confront any analyst.
For those of us with some experience in the intelligence business, the dangers of
failing to identify the specific audience and the specific requirements, wasting
precious time and resources and delivering a weak analytical product, will resonate.
In his chapters 'Problems', 'Time and Research' and 'Knowledge, Information and
Evidence', Charles examines the key factors underpinning the intelligence analysis
process and provides thoughtful, practical advice on how to situate, structure and
deliver that analysis.
Abbreviations and Acronyms
NGT: Nominal Group Technique
RAAF: Royal Australian Air Force
Contents
The Air Power Development Centre . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . iii
About the Author . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . iv
Acknowledgements . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . iv
Foreword . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vi
Abbreviations And Acronyms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . viii
Chapter 1
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
Chapter 2
Situating Intelligence Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Importance of intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Expectations of intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Why is intelligence analysis difficult? . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
A working definition of intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
Intelligence analysis as decision-making . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
Chapter 3
Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
Context . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
Defining the Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
Problem Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
Chapter 4
Time and Research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Time . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 19
Research . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
Be Prepared . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
Backcasting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
The Planning Fallacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
Chapter 5
Knowledge, Information and Evidence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Knowledge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Tacit and Explicit Knowledge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
Non-knowledge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
Reasoning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Deductive Reasoning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Inductive reasoning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Abductive Reasoning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Intuition . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Overabundance of Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Information Diversity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Information credibility . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
Context . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
Evidence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
Diagnostic Evidence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
Negative Evidence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
Absence of Evidence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Alexander's Question . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Chapter 6
Language: Analogies and Metaphors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Analogy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
Metaphor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
Chapter 7
Expertise, Prediction and Surprise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Expertise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Prediction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
Prediction Markets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Prediction versus Retrodiction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Surprise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
Anticipation Versus Prediction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
Chapter 8
Complexity, Uncertainty, Unintended Consequences and Assumptions . . . 57
Complexity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Uncertainty . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
Unintended Consequences . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .60
Assumptions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
Chapter 9
Explanations and Estimates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Cause and Effect . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
Interpreting Behaviour . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Statements of Uncertainty . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
Scenarios and Likelihood . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
Chapter 10
Environment and Situation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Work Environment and Social Situation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Conformity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
The Corporate Line . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Our Previous Assessments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Dissent . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
Chapter 11
Critical Thinking and Formal Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
Critical Thinking . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
Formal Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
Nominal Group Technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
Pre-Mortem Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
Indicators and Warnings . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
Chapter 12
Mindsets, Biases and Arguments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Mindsets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Cognitive Biases . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
Arguments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
Chapter 13
Unfinished Intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
Deferred Judgement . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
Unfinished Intelligence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
Critique and Review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Peer Review . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
After Action Reviews and Crew Debriefs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
Analytic Principles . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
Chapter 14
Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
Key Points . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
List of Figures
Figure 2.1: Cognitive challenges facing analysts . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
Figure 3.1: Questions to help understand the context of a problem . . . . . . . . . . . . 11
Figure 3.2: Relevant questions for identifying health risks . . . . . . . . . . . . . . . . . . . . 14
Figure 3.3: Delineation of problems into puzzles, problems and messes . . . . . . . . 17
Figure 4.1: Indicative backcast for buying land and building a house . . . . . . . . . . 23
Figure 4.2: Indicative backcast for delivery of a formal report . . . . . . . . . . . . . . . . . 24
Figure 5.1: Knowable and unknowable based on time . . . . . . . . . . . . . . . . . . . . . . . 29
Figure 5.2: Information into facts, judgements and pertinent unknowns . . . . . . . 30
Figure 5.3: Aspects of knowledge and non-knowledge . . . . . . . . . . . . . . . . . . . . . . . 32
Figure 5.4: Adapted version of the Johari window . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Figure 8.1: Example of mind mapping technique . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Figure 11.1: Benefits of formal methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
List of Tables
Table 3.1: Brief typology of commonly used terms for describing problems . . . . 18
Chapter 1
Introduction
Chapter 2
Situating Intelligence Analysis
Importance of intelligence
Intelligence is important. That is certainly the perception evident in the
formalisation and significant growth of intelligence agencies since World War II.
Governments around the world have established entirely new agencies and
departments in response to actual or perceived threats from state and non-state
actors. Each year, governments spend billions of dollars on intelligence
agencies, reflecting a belief that intelligence is important to national security. The
presumption is that intelligence identifies and prevents threats, informs important
decisions, and helps users of intelligence better understand uncertain and changing
situations.
Expectations of intelligence
Governments, policymakers, military commanders and the public all have
expectations of what intelligence is, what agencies and analysts do, and what
intelligence can provide. There is now a greater public awareness and expectation
of what intelligence agencies can and should provide.1 This has come about for a
number of reasons. One reason is a result of the collective shock experienced by
the public at actually witnessing the images of mass-casualty attacks and their
aftermath (for example, New York and Washington, Bali, Madrid, London and
Boston). When intelligence is thought to have failed, the public's expectations of
what intelligence should deliver become apparent, namely: to be accurate, identify
threats, prevent surprise, and protect the nation and its citizens. Governments'
own actions have also increased public awareness and expectations of intelligence
through the use of intelligence analysis to inform and shape public perceptions
of threats at a national and international level.2 Recent disclosures of intelligence
material have also raised debate over the role of intelligence agencies and
government use of information, a debate that is likely to continue for as long as
intelligence agencies exist.
1 Omand, Sir David, 2009, The National Security Strategy: Implications for the UK intelligence
community, Institute for Public Policy Research, Discussion Paper, London, February, p 5.
What people want intelligence analysts to deliver is based on what they think about
intelligence, which can be based on positive or negative experiences and accurate
or inaccurate perceptions. So before we consider what military commanders,
operators or even other analysts want from intelligence, we need to think about
what these people understand by intelligence and what they expect that intelligence
can deliver. Each person, commander and analyst will have their own expectations
about what intelligence can and cannot do. Some military commanders will have
realistic expectations that analysts can provide support to operations, detailed
tactical information and assessments on threat, and judgements on likely or
plausible enemy actions. Other military commanders might have more unrealistic
expectations, like analysts being able to predict the future, give absolute certainty of
what an enemy will do, or provide assurance of mission success. Still others might
think intelligence is a waste of time, preferring instead to rely solely on their own
judgements about a situation.
This book will assist analysts in developing an understanding of what intelligence
can and cannot do. It is important that we as analysts have a foundational
understanding of what is and what is not possible, as our own understanding
shapes the decisions and judgements that we make on a daily basis. So what do
practitioners and researchers within the field think intelligence analysts should
provide? For a range of views, refer to the table of quotes on page 6.
[Figure 2.1: Cognitive challenges facing analysts: organisational context, inadequate tools, time pressure, synthesising multiple sources of information, complex judgements, data overload, potential for error, high mental workload, and coping with uncertainty.]
Hutchins, Susan G, Pirolli, Peter L & Card, Stuart K, 2007, 'What makes intelligence analysis
difficult?: a cognitive task analysis', in Hoffman, Robert R (ed), Expertise Out of Context, Lawrence
Erlbaum Associates, New York, NY, pp 298–304.
Spellman, Barbara A, 2011, 'Individual reasoning', in Fischhoff, Baruch & Chauvin, Cherie (eds),
Intelligence Analysis: Behavioral and Social Scientific Foundations, The National Academies Press,
Washington, DC, p 117.
Quiggin, Thomas, 2007, Seeing the Invisible: National Security Intelligence in an Uncertain Age, World
Scientific Publishing, Singapore, pp 97–98.
Petersen, Martin, 2011, 'What I learned in 40 years of doing intelligence analysis for US foreign
policymakers', Studies in Intelligence, vol 55, no 1, March, p 19.
Thompson, JR, Hopf-Weichel, R & Geiselman, RE, 1984, The Cognitive Bases of Intelligence Analysis,
US Army Research Institute for the Behavioral and Social Sciences, Alexandria, VA, January, p 4-1.
Davis, Jack, 2002, Improving CIA Analytic Performance: Strategic Warning, Sherman Kent Center for
Intelligence Analysis Occasional Papers, vol 1, no 1, September.
Fingar, Thomas, 2011, 'Analysis in the U.S. intelligence community: missions, masters, and methods',
in Fischhoff, Baruch & Chauvin, Cherie (eds), Intelligence Analysis: Behavioral and Social Scientific
Foundations, The National Academies Press, Washington, DC, p 4.
Royal Australian Air Force, Australian Air Publication 1000D: The Air Power Manual, Sixth
Edition, Air Power Development Centre, Canberra, 2013, p 223.
Lefebvre, Stéphane, 2004, 'A look at intelligence analysis', International Journal of Intelligence and
CounterIntelligence, vol 17, no 2, p 235.
For those familiar with the intelligence cycle (planning, collection, processing, analysis and
production, and dissemination), this book can be seen as focusing primarily on the analysis
and production step. However, the concept of an intelligence cycle is not without criticism, in
particular that the model provides an inaccurate representation of the intelligence process. For a
useful critique refer to Johnston, Dr Rob, 2005, Analytic Culture in the U.S. Intelligence Community:
An Ethnographic Study, The Center for the Study of Intelligence, Washington, DC, Chapter 4.
Cooper, Jeffrey R, 2005, Curing Analytic Pathologies: Pathways to Improved Intelligence Analysis,
Center for the Study of Intelligence, Washington, DC, pp 14–15.
Chapter 3
Problems
Intelligence analysts deal with problems on a daily basis. These problems can be
seen in the myriad of questions that analysts are asked. Familiar questions include
variations of 'What is the threat to X?' or 'What is the adversary going to do?'
A problem has two critical attributes: it is an unknown entity in some context; and
finding or solving this unknown has a value.1 In this chapter, we will consider a
number of aspects of problems, specifically: their context, how they are defined,
and the different types of problems.
When we think back to the questions that intelligence analysts are asked, they can
easily be defined as problems because: the answer is currently unknown, and the
question is considered important to answer. If military commanders already had
the answer to the question, then they would not ask an analyst for an answer (it
would not be a problem). It is worth highlighting that the problem and the question
are not necessarily the same thing. As analysts, we first have to determine exactly
what the problem is, taking the question that has been asked as our cue. For
example, 'What is the threat to X?' is a question, but the problem driving it might
be quite different, namely: Do we go or not go on this mission? Is the risk level too
high? Is this the right mission at all? We will now look at the context, definition
and types of problems in turn.
Context
What do we mean by context? Effectively, the context is the broader meaning, the
surrounding issues that give us an understanding of why the question was asked.
Addressing any problem relies on a broader understanding of the context of the
problem. The less we know about the context of the problem, the more difficult it is
to understand what is actually required and whether or not the analysis we deliver
actually meets a military commander's needs. Some useful initial questions to help
us understand the context of the problem include those shown in Figure 3.1:
Jonassen, David H, 2004, Learning to Solve Problems: An Instructional Design Guide, Pfeiffer, San
Francisco, CA, p 3.
[Figure 3.1: Questions to help understand the context of a problem, such as: What is the problem? Who is the audience and who am I doing it for? What am I being requested to provide? What response is expected? How will the response be used? Why have I been asked this question? Why is this considered a problem? Why now? Why am I the one doing this?]
Understanding the context of the problem also includes knowing what the response
will be used for. The more generic the understanding of how analysis will be used
(ie to aid decision-making), the less useful the analysis, because it will be similarly
generic. A good relationship with the people asking the questions is key, but
unfortunately not always achievable. Poor communication between a requestor and
an analyst risks misunderstanding the problem, wasting resources and delivering
irrelevant analysis. This is why expectations, context and the audience are all
important when understanding and defining the problem itself. As one experienced
analyst argued, weak analytical products are a result of analysts failing to identify
a specific audience and specific question that they need addressed.2 So, wherever
possible, analysts need to get an understanding of the who, what and why of those
requesting the analysis and the problem at hand.
Petersen, Martin, 2011, What I learned in 40 years of doing intelligence analysis for US foreign
policymakers, Studies in Intelligence, vol 55, no 1, March, p 17.
One reason that people go straight to coming up with solutions, even before
understanding the problem, is time pressure. Developing an
understanding of the actual problem saves time and effort by ensuring that we
are researching and analysing the right problem.4 Further, given that intelligence
analysts are often working in teams, defining the problem ensures that everyone
is working towards addressing the same problem. Ultimately, the aim of problem
definition is to come up with a specific question that can be addressed.
It is worth observing that not everyone jumps into a task before defining the
problem. Research suggests that experts go about solving problems differently
than novices by spending more time at the beginning of a task thinking about the
problem itself compared with novices who attempt to immediately find a solution.5
The key point is starting with an accurate understanding of the problem, because
how we define a problem impacts on how we research, analyse and report on the
problem.
Dörner, Dietrich, 1989, The Logic of Failure: Recognizing and Avoiding Error in Complex
Situations, Perseus Books, Cambridge, MA, p 186.
Jones, Morgan D, 1998, The Thinker's Toolkit: 14 Powerful Techniques for Problem Solving, Three
Rivers Press, New York, NY, p 63.
Johnston, Dr Rob, 2005, Analytic Culture in the U.S. Intelligence Community: An Ethnographic
Study, The Center for the Study of Intelligence, Washington, DC, p 64.
For example, imagine a company asks for the most cost-effective way to recruit
new staff. Taken at face value, the approach would be identifying a range of
recruiting strategies and selecting the cheapest. However, this may not actually
address the real problem in relation to staffing. One approach for getting to the
actual problem is the '5 Whys' approach, that is, asking 'Why?' five times in order
to get to the root cause of the problem.6 If we applied this to the company's desire
for cost-effective recruitment, it would look something like this:
1. Why do we want a more cost-effective recruitment strategy? Because we
are spending a lot on recruiting new staff.
2. Why are we spending a lot on recruiting new staff? Because we are having
to replace a lot of our staff.
3. Why are we having to replace a lot of our staff? Because a lot of staff
are leaving the company.
4. Why are a lot of people leaving the company? Because they are unhappy
with the management and leadership.
5. Why are people unhappy with the management and leadership? Because
nobody knows what is going on, how decisions are being made, and
ultimately if the company has a future.
The point of the exercise is to identify the root cause and get to the actual problem.
In this case, we can see that what starts out as a 'recruitment problem' is actually
a 'retention problem' and, ultimately, a communication issue for the leadership and
management.
Another way of better understanding a problem is to restate it in a number of
ways before choosing the statement that best captures the problem.7 The benefit
of a more specific question is that we can better focus our research and it makes
it easier to determine what is relevant and what is not. For example, the problem
'What are the health risks to people travelling overseas?' is enormously broad,
lacking definition and specifics. In contrast, 'What are the health risks for people
travelling to country X?' provides a much more specific and bounded question. With the
problem defined, analysts can then identify relevant questions that they will need to
6 Taiichi Ohno, 1988, Toyota Production System: Beyond Large-Scale Production, Productivity
Press, Portland, OR, pp 17–18.
7 Jones, The Thinker's Toolkit: 14 Powerful Techniques for Problem Solving, Chapter 3.
answer to provide an assessment of the problem.8 For the problem above, a list of
questions might include those shown in Figure 3.2:
[Figure 3.2: Relevant questions for identifying health risks for people travelling to country X: What medical facilities are available in country X? What standard of health care is available? Is there any reporting of specific illnesses or diseases within the country? What are the current health issues within the country? Are there any seasonal health issues, and if so, what are they? What inoculations are recommended for people travelling to the country, and how do these differ from our country? How quickly can people be repatriated if needed?]
Defining the problem and developing a list of relevant questions that need to be
answered saves us time.
8 Petersen, Martin, 1985, 'Managing/teaching new analysts', Studies in Intelligence, vol 30, Fall, p 6,
viewed 4 September 2014, <http://www.nationalsecuritylaw.org/files/received/CIA/PetersenManaging_Teaching_New_Analysts.pdf>.
9 ibid.
Problem Types
Instinctively we know that not all problems are alike. Some problems we can
confidently answer, while for others we can only provide informed guesses or
estimates; some problems are solvable while others are not. We must acknowledge
this at the outset, because the way we understand the problem affects the way we
conceptualise and research it and the type of answer that we provide. If we think
that a problem has a neat, single solution then we will approach it looking for the
correct answer. In contrast, if we think that a problem is complex and constantly
changing, then this too will frame the kind of response that we provide. By
deliberately considering the nature of the problem confronting us we can then
appreciate what is and what is not possible.
One of the critical factors influencing the nature of a problem is time; namely,
when the problem is set. Is the problem set in the past, where we are trying to
understand something that has already happened? Is the problem set in the
present, where we are trying to understand what is happening? Alternatively, is the
problem set in the future, where we are trying to understand what will happen? A
useful approach is to think about this in terms of your own life. You might be most
confident about describing what you have done in the past and what you are doing
right now, but less confident in speculating on what you are going to be doing in the
future. Why? Because the future has not yet happened. In saying that, our level of
uncertainty about the future depends on how far out we are looking. For example,
you might be fairly confident that you know what you will be doing tomorrow, but
very unsure about what you will be doing one year from now.
Now apply this to military operations. We can assess, with some confidence, what
an adversary has done in the past; we know that it has happened. Similarly, we can
assess what that adversary is doing right now, based on observing their behaviour.
However, in assessing what the adversary will do in the future we would have the
least confidence in our judgement, simply because it has not happened. Again,
our uncertainty about an adversary's future actions is based on how far ahead
we are looking. We might be relatively confident about the adversary's behaviour
tomorrow, but far less confident of what they will be doing in a week, a month or a
year from now.
Having considered problems in terms of when they are set, let us look a little more
closely at the nature of the problems themselves. Problems also differ in terms of
the actual content: what is it that we are trying to solve? For example, 'what does
2+2 equal?' is a problem, as is 'what is the enemy going to do?' The first problem is
mathematical; the second is about peoples future behaviours. Both are problems
but the two are very different in subject matter and the method of answering
them. In intelligence analysis, we can expect to get all manner of problems, some
that have a single solution, some with multiple possible solutions, and others for
which we can only provide estimates or conjectures. Neil Browne and Stuart Keeley
point out that arriving at definitive answers to questions very much depends on
the subject matter, and contrast questions about our physical environment with
questions about our social environment. They note that we are more likely to arrive
at definitive answers when dealing with the physical environment because it is
more dependable and predictable than the social world. In contrast, [t]he causes
of human behaviour are so complex that we frequently cannot do much more than
form intelligent guesses about why or when certain behaviour will occur.10
Where does this leave us as analysts? Much of the time, intelligence problems tend
to be people problems. To be more specific, many intelligence problems are
future-based people problems. Therefore, assessing what an adversary will do is difficult,
both because it is set in the future and because it is based on human behaviour.
When we are referring to future behaviour there are multiple plausible scenarios of
what could happen.
This leads us to how people describe different types of problems. Within
Operations Research, there is a commonly used delineation of problem types
into puzzles, problems and messes (Figure 3.3).11
10 Browne, M Neil & Keeley, Stuart M, 2007, Asking the Right Questions: A Guide to Critical
Thinking, Eighth Edition, Pearson Prentice Hall, Upper Saddle River, NJ, p 7.
11 This delineation is described by Michael Pidd, who attributes the concepts to Russell Ackoff's
ideas. Pidd, Michael, 2004, Tools for Thinking: Modelling in Management Science, Second Edition,
Wiley, Chichester, pp 58–62.
[Figure 3.3: Delineation of problems into puzzles, problems and messes. Puzzle: a well-structured, understood problem, usually with a single correct solution that can be identified. Problem: a defined and potentially agreed-upon issue, with several valid solutions but no one correct answer. Mess: an ambiguous, complex and uncertain system of problems, with disagreement over the definition of the problem and no certainty that a solution even exists.]
12 ibid.
Solvable to Unsolvable:
NATO: Puzzles; Problems; Wicked Problems
Treverton: Secrets; Mysteries; Complexities
Ackoff: Puzzles; Problems; Messes
Jones: Simplistic; Deterministic; Random; Indeterminate
Table 3.1: Brief typology of commonly used terms for describing problems
While some problems are solvable, others are not. Some problems are highly
complex, with innumerable interacting parts (such as 'What will country X look
like after the withdrawal of peacekeeping forces?'). These types of problems require a
consideration of the likelihood of a variety of plausible futures, rather than a single
solution. As we find out more about situations, even ones set in the past, we can
find that our understanding changes significantly and what we once thought was
resolved needs to be reconsidered. Consequently, there are times when even
'solved' problems need to be revisited in the light of new information.
Even when analysts do adopt the same terms to describe problems, we will not
necessarily agree on the type of problem that we are addressing. Consequently,
we might find ourselves working on a problem we believe to be unsolvable
alongside someone who believes that the problem is solvable. In this instance, it
is worth clarifying which aspects of the problem each of us believes to be solvable
and which we believe to be unsolvable. The terms used to describe problems are
important in communication amongst analysts and consumers of intelligence; they
are useful when they help to achieve a shared appreciation and avoid potential
misunderstandings.
Chapter 4
Time and Research
Time
Time is a critical factor for intelligence analysts. Time pressure is a constant theme
throughout intelligence literature, with analysts consistently identifying time as
one of the greatest constraints of the job.2 Whether given minutes, hours, days
or weeks, analysts are required to deliver assessments; we will have a time limit
within which to deliver a required analytical output. Unfortunately, these pressures
are unlikely to go away. Indeed, with more information becoming available more
quickly, people expect faster decisions without the time allowed in the past for
thoughtful reflection.3 Rather than simply repeating that time constraints are an
issue, this chapter suggests some approaches that can assist us in making better use
of the limited time available. The point is to recognise that time pressures are going
to be a significant issue and that we can factor this into planning and preparation.
Intelligence analysis is used to support decisions, actions and responses to events.
Consequently, decision-makers will (naturally) seek to maximise the time they have
to make a decision, squeezing the time available for considered analysis. It is worth
Puvathingal, Bess J & Hantula, Donald A, 2012, Revisiting the psychology of intelligence analysis:
from rational actors to adaptive thinkers, American Psychologist, vol 67, no 3, April.
For example, refer to Johnston, Dr Rob, 2005, Analytic Culture in the U.S. Intelligence Community:
An Ethnographic Study, The Center for the Study of Intelligence, Washington, DC, p 13.
Klein, Gary, 1998, Sources of Power: How People Make Decisions, MIT Press, Cambridge, MA, p
279.
highlighting that decisions are often a rolling series rather than a one-off; yet the
observation stands: decision-makers want to have all the information available
as early as possible in the hope of making the most informed and best decisions.
At least some of the issue of decreased response time appears to be people
wanting responses faster, rather than a genuine requirement for faster responses.
Some people will have unrealistic expectations about how long it takes to analyse
information and form a meaningful and considered judgement. In contrast, critical
situations do exist where there is a genuine need for rapid decisions, particularly
when the information is time sensitive (for example, 'person X will be in this
location in three hours').
Time pressures increase the risks of mistakes, confirmatory thinking,4 cognitive
bias,5 and seizing on the first piece of relevant information an analyst finds.6 Time
pressures also decrease analysts use of what they perceive to be lengthy analytic
techniques7 and self-conscious critical thinking approaches.8 In addition to
increasing mistakes, the amount of available time influences:
- whether or not we can submit requests for information from other areas,
or simply go with what is already available
- who and how many people are available to work on the problem
As analysts, we face a dilemma; less time means less depth of analysis, less
information that we can get across, and more chance of mistakes and inaccuracies.
Hastie, Reid, 2011, Group processes in intelligence analysis, in Fischhoff, Baruch & Chauvin,
Cherie (eds), Intelligence Analysis: Behavioral and Social Scientific Foundations, The National
Academies Press, Washington, DC, p 179.
Hall, Wayne Michael & Citrenbaum, Gary, 2010, Intelligence Analysis: How to Think in Complex
Environments, Praeger Security International, Santa Barbara, CA, p 111.
Puvathingal & Hantula, Revisiting the psychology of intelligence analysis: from rational actors to
adaptive thinkers, p 207.
Moore, David T, 2007, Critical Thinking and Intelligence Analysis, Occasional Paper Number
Fourteen, National Defense Intelligence College, Washington, DC, p 66.
Even so, analysis is often expected to be delivered in limited time frames if it is to
be deemed useful.
Given the significant impact of time pressures, it is worth clarifying whether the
time lines are externally set or self-imposed. Organisations and individuals can
develop a habit of setting unnecessarily short time frames, which do not reflect
the actual time lines set by those requiring the analysis. As a result, valuable time
that could have been invested in research and review can be wasted, with reports
often sitting unread for days. Useful indicators of this are deadlines set at close
of business (ie the report will not be read until the next morning, at the earliest) or the end
of the week (ie it will not be read until Monday, at the earliest). So just asking the
question, 'When is the analysis actually required?', can provide valuable
additional hours or days to produce a more robust, detailed and considered
product.
Research
What do I mean by research? Simply identifying and collating relevant information
relating to a specific problem. For intelligence analysts, when to stop researching
is defined by the deadlines we have been given. This is in contrast to many other
fields where research only stops once the problem is resolved. The reason that
many analysts fail to meet deadlines is that they spend too long researching before
attempting to finalise their formal report or brief. For many problems, however,
the research phase could go on endlessly. We can always go deeper, review more
sources, follow more leads, and consider more angles. Therefore, time limits
become invaluable in putting a clear limit on the research, analysis and write-up.
So far, this chapter has only focused on the negative aspects of time pressures.
A positive aspect of time pressure is that time limits give analysts a boundary in
which to work. The rest of this chapter deals with the issue of how we can make the
best use of the time that we do have.
Be Prepared
Taking the motto of the Scouts, the first guidance on dealing with time pressures
is simply to 'Be Prepared'. To paraphrase one agency, the short notice requests
placed on intelligence analysts require them to be able to rapidly determine what
they know on a subject and be able to respond quickly to difficult questions.9 How
does this work in practice? It works by knowing our area, our systems and our
resources, and having an idea of what is happening in the areas on which
decision-makers are likely to be focusing.
As an example, if we are supporting a specific aircraft, we should understand how
the aircraft operates, what are the upcoming deployments or exercises, and what
particular information is of relevance to the aircraft and aircrew. Even if we have
not been looking at exactly the topic that comes up, by being proactive we will
already have developed knowledge that will be useful and help to determine what
questions and information gaps are likely to need to be addressed.
Backcasting
Beyond being prepared, there are also useful planning tools that help us to identify
what we need to do and when we need to do it. One approach is backcasting, a
technique originally used in large-scale planning projects. In backcasting we start
with a desired future end state and work backwards to identify all the tasks that
need to be completed to get from where we are today to where we want to be. The
point is to start with the end in mind and work backwards.10
For instance, today we might be standing in front of a vacant block of land that is
for sale. Our desired future end state is moving into our brand new house that we
have built on that land. Using backcasting, we would start with moving into our
completed house and think back to where we are today, standing in front of the
vacant block. We then identify all the things that need to happen to make that
future a reality (Figure 4.1).
9 Office of the Director of National Intelligence, 2008, Analytic Transformation: Unleashing the
Potential of a Community of Analysts, Office of the Director of National Intelligence, Washington,
DC, September, pp 4–5.
10 Backcasting is also a useful technique to apply when thinking about an adversary, and the steps
and actions that they would need to achieve an end state or outcome.
[Figure 4.1: Indicative backcast for buying land and building a house: get loan pre-approval; negotiate purchase; land survey; develop house plans; approvals for build; select builder; lay foundations; house build; essential services connected; final inspection and approval; move in.]
Of course, the reason that we got onto backcasting is that we are considering how
we can best deal with time pressures. As analysts, we are usually provided with a
time line, so we can add this into the backcast, knowing that the desired end state
(finished report) already has a specified deadline. For example, as an analyst, let
us say that our team is required to deliver a written threat assessment in five days
to support a short notice deployment. Using backcasting, we have a tool that we
can use to identify what we need to do between now and five days' time to get the
report completed. Figure 4.2 provides an indicative example.
As discussed earlier, given the enormous amount of information likely to be
unearthed, it can be quite difficult to stop researching and start writing up. Using
backcasting, we can identify a time when we need to stop researching; and then we
need to stick to it.
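Because a backcast is just arithmetic run backwards from a fixed end point, the planning step can be sketched in a few lines of code. This is a minimal sketch, not from the book: the deadline, task names and durations are illustrative assumptions, and a real plan would also need to account for working hours and non-working days.

```python
from datetime import datetime, timedelta

# Backcasting sketch: start from the deadline (the desired end state) and
# work backwards, assigning each task a start and finish time.
# All task names, durations and the deadline are illustrative assumptions.
deadline = datetime(2024, 5, 10, 12, 0)  # Friday 12:00 - report released

# Tasks in the order they will be performed, with estimated durations.
tasks = [
    ("Confirm requirements, define problem", timedelta(hours=4)),
    ("Research phase", timedelta(hours=20)),
    ("Compile research and write up", timedelta(hours=12)),
    ("Peer review", timedelta(hours=1)),
    ("Incorporate feedback, final edit", timedelta(hours=4)),
    ("Approvals and release", timedelta(hours=3)),
]

# Work backwards: the last task finishes at the deadline, the task before
# it finishes when the last task starts, and so on.
finish = deadline
backcast = []
for name, duration in reversed(tasks):
    start = finish - duration
    backcast.append((name, start, finish))
    finish = start

# Print the schedule in forward order, from first task to the deadline.
for name, start, end in reversed(backcast):
    print(f"{start:%a %d %b %H:%M} -> {end:%a %d %b %H:%M}  {name}")
```

Laid out this way, the 'stop researching' point becomes explicit: it is simply the computed start time of the write-up task.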
The aim of using backcasting is to reduce (we cannot entirely remove) mistakes
attributed to time pressures by having a plan. When working in a team it also
provides a visual tool with tasks and timings to which everyone can refer. Of
course, plans and schedules can be overly optimistic, which leads us to the planning
fallacy.
[Figure 4.2: Indicative backcast for delivery of a formal report. Monday 08:00-12:00: problem received; confirm requirements and expectations; define problem; identify research questions. Monday 12:00 to Wednesday 08:00: research phase. Wednesday 08:00 to Thursday 12:00: compiling research and write-up; draft report complete. Thursday 12:00-13:00: peer review. Thursday 13:00-17:00: incorporate feedback into final report; final read and edit through. Friday 08:00-09:00: approvals process. Friday 09:00-12:00: release report.]
- Identify a reference class of similar completed projects
- Get statistics for these similar projects (how long they took, how much
they cost, what resources they used)
- Compare our project with the reference class to identify the most likely
outcome12
The argument is that reference class forecasting removes the emotional attachment
to our projects and helps overcome issues such as optimism bias by using hard data
from already completed projects, rather than overly optimistic views of our own
uncompleted project.13 Adapting reference class forecasting to intelligence analysis
could look something like the following:
- Identify previous analytical products similar to the one required
- How long did they take to produce? How many people worked on them?
- How much was able to be researched? What was the quality of the
product?
11 For a discussion, refer to Kahneman, Daniel, 2011, Thinking, Fast and Slow, Penguin Books,
London, pp 249–251.
12 Flyvbjerg, Bent, 2006, From Nobel Prize to project management: getting risks right, Project
Management Journal, vol 37, no 3, August, p 8.
13 ibid, p 9.
25
- What is the most likely outcome for this product based on other similar
products? This includes time, resources and quality.
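As a rough sketch of what the comparison step could look like in practice, the code below estimates the likely production time for a new product from the recorded durations of similar past products. The reference data, and the use of the median and quartiles as the summary, are assumptions for the illustration, not drawn from the book.

```python
from statistics import median, quantiles

# Recorded production times (in working days) for previous, similar
# analytical products - the 'reference class'. Values are invented.
reference_class_days = [4, 5, 5, 6, 7, 8, 10, 12]

# Reference class forecasting: rather than estimating from our (typically
# optimistic) view of the new project, take the distribution of outcomes
# from comparable completed projects.
likely = median(reference_class_days)
q1, _, q3 = quantiles(reference_class_days, n=4)

print(f"Most likely duration: about {likely} days")
print(f"Half of similar products took between {q1} and {q3} days")
```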
Chapter 5
Knowledge, Information and Evidence
Despite the proliferation of information, we are all still seeking that elusive
element: knowledge.
Weiss & Wright1
Knowledge
What is knowledge? It is a question that philosophers have considered for
thousands of years (the technical term is epistemology). Like it or not, analysts are
in the 'knowledge business'. Intelligence deals with knowledge, and the types of
problems being addressed are knowledge problems. So we need a working concept
of knowledge. That is, we need a basic understanding of what we know and how we
know it, what we do not know, and even what is knowable and what is not.
There are a number of ways that we can gain knowledge, including through personal
experience, education or training and by research. Personal experience is based on
what we learn from our own individual circumstances and the situations to which
we are exposed. In contrast, education and training are based on learning from
others, while research is based on what is reported by others and on what we discover.
Within intelligence, experiential learning of an event is rare, as we tend to be
dislocated from the object of analysis. As analysts, we will usually only have indirect
experiences of an opponent, situation or event. Instead, experiential learning tends
to relate to how analysts do their job, such as putting together a report, delivering a
brief, learning a new computer program, or using a new analytic technique.
Weiss, Arthur & Wright, Sheila, 2006, Dealing with the unknown: a holistic approach to
marketing and competitive intelligence, Competitive Intelligence Magazine, vol 9, no 5,
SeptemberOctober, p 15.
How does this apply to us as analysts? Analytical products are principally knowledge
about something; for example, knowledge of the specific characteristics and nature
of a threatening group or organisation. The doing of analysis is more about knowing
how to do something; for example, how to write a threat assessment. Finally,
knowing by acquaintance could relate to knowing the people with whom we are
working. So we have identified some types of knowledge and ways we can attain it,
but what actually is knowledge?
A common method has been to take Plato's concept of knowledge as 'justified true
belief'. This leads on to a number of questions, including: What is true? What is
justification? What is belief? The definition is not without criticism, as justified belief
can turn out to be incorrect.3 However, as the debate over defining knowledge
has gone on for thousands of years (and remains ongoing), defining knowledge as
'justified true belief' provides a useful working model. The issue then is what beliefs we
determine to be true and how do we justify the conclusions we reach.
If we accept the argument that for something to be known it must be true and
knowable, we can rule out future events being knowable because they have not
yet happened.4 When it comes to the future, the argument that an intelligence
estimate is a conjecture is useful.5 A conjecture can be defined as a judgement or
statement believed to be true but not yet proven. When assessing the likelihood
and nature of future events, behaviours or situations, we are making a conjecture.
Baggini, Julian & Fosl, Peter S, 2010, The Philosopher's Toolkit: A Compendium of Philosophical
Concepts and Methods, Second Edition, Wiley Blackwell, Chichester, p 167.
For the most famous, refer to Gettier, Edmund L, 1963, 'Is justified true belief knowledge?',
Analysis, vol 23, no 6, June, pp 121–123.
Kahneman, Daniel, 2011, Thinking, Fast and Slow, Penguin Books, London, p 201.
Ben-Israel, Isaac, 1989, Philosophy and methodology of intelligence: the logic of estimate process,
Intelligence and National Security, vol 4, no 4, p 667.
We cannot know; we can only speculate, and justify this speculation based on
information that is available now.
When dealing with the past, we can be presenting knowledge. The event,
behaviour or situation has happened and we hold our understanding to be a true
belief, justified by the information detailing its occurrence. That is not to say that,
in dealing with a previous event, we cannot be wrong; clearly, a person can have
an incorrect understanding of the past. Additionally,
our knowledge of the past is not fixed, but changes as we are exposed to new
information.6 Similarly, with events, behaviours and situations that we believe are
occurring in the present, we argue that our understanding is a true belief, justified
by the information that these things are currently happening. Again, as with
what happened in the past, we can be mistaken in our interpretation of what is
happening in the present. This leads us to the following summary of knowledge:
- We can know what happened in the past (though our knowledge of it can
be correct, incorrect or partial)
- We can know what is happening in the present (though, again, our
interpretation of it can be mistaken)
- We cannot know what will happen in the future; we can only make
conjectures
[Figure 5.1: Knowable and unknowable based on time. The past and the present are knowable; the future is unknowable.]
Thompson, JR, Hopf-Weichel, R & Geiselman, RE, 1984, The Cognitive Bases of Intelligence
Analysis, US Army, Research Institute for the Behavioral and Social Sciences, Alexandria, VA,
January, p 3-7.
Taking Powell's statement, Herbert argues that the best analysts can hope to
achieve is to channel the flood of information into facts, judgements and pertinent
unknowns.8 Visually, we can display this guidance as follows (Figure 5.2):
[Figure 5.2: Information is channelled into facts, judgements and pertinent unknowns.]
- Judgements are our conjectures on what has, is or will happen (but are
currently unproven)
Quoted in Weiner, Tim, 2007, 'Pssst: some hope for spycraft', The New York Times, 9 December,
viewed 4 September 2014, <http://www.nytimes.com/2007/12/09/weekinreview/09weiner.html?pagewanted=all&_r=0>.
Non-knowledge
What we know is a drop. What we don't know is an ocean.
Sir Isaac Newton9
There is much, much more that we do not know than we do know. Once we
recognise this, it helps to bring proportion and humility to our efforts. It is also a
liberating thought. The idea that what we do not know far outweighs what we do
know applies to everyone, no matter what some may claim. As analysts, identifying
what we do not know is often as critical as identifying what we do. Given the vast
amount of information available, pointing out that we do not know something
probably highlights the limits of the organisation's knowledge on a particular
subject. This, in itself, can be invaluable.
There are a number of aspects to this concept of non-knowledge or what we don't
know. It is worth highlighting Donald Rumsfeld's oft-quoted concept of knowledge,
given its frequent use. Though Rumsfeld's concept came in for some ridicule at the
time, it had already been in use in engineering and project management for decades
previously. Further, the approach continues to appear in both the analytical and
academic fields (often with some form of alteration), suggesting that it provides a
useful approach for thinking about what we know and what we do not know. An
amended version of this concept is displayed graphically in Figure 5.3:
[Figure 5.3: Aspects of knowledge and non-knowledge. Known knowns: what we know that we know. Known unknowns: what we know that we don't know. Unknown knowns: what we don't know that we know. Unknown unknowns: what we don't know that we don't know.]
[Figure 5.4: Adapted version of the Johari window. Known to the analyst and known to others: open. Not known to the analyst but known to others: blind. Known to the analyst but not known to others: hidden. Not known to the analyst and not known to others: unknown.]
The more we read and observe, the more we are increasing our knowledge and
becoming aware of new questions to ask and pursue. Even so, the more we learn,
the greater our appreciation of just how much more we do not know. The point to
10 This idea has been advocated by Sheila Wright and David Pickton: see Wright, Sheila & Pickton,
David, 1998, Improved Competitive Strategy through Value Added Competitive Intelligence,
Proceedings of the Third Annual European Conference, Society of Competitive Intelligence
Professionals, Berlin; and Weiss & Wright, 'Dealing with the unknown: a holistic approach to
marketing and competitive intelligence', pp 15–20.
11 Luft, Joseph & Ingham, Harrington, 1970, The Johari window, a graphic model of interpersonal
awareness, in Luft, Joseph, Group Processes: An Introduction to Group Dynamics, Second Edition,
Mayfield Publishing, Palo Alto, CA.
take away is not one of despair but one of humility; we are gaining more knowledge
even as we gain a greater appreciation of our limitations.
Reasoning
Through reasoning we make judgements, arrive at knowledge, and justify why we
believe what we believe. There are different types of reasoning that people use,
which directly affect the confidence that we can have in the judgements that we
reach. It is worth being aware of these to understand the thinking behind these
judgements and the strengths and weaknesses of the approaches.
Deductive Reasoning
In deductive reasoning, the conclusion logically follows from the premise. If the
premises are true then the conclusion must be true. A commonly used syllogistic example is:
All men are mortal.
Socrates is a man.
Therefore, Socrates is mortal.
Most of the time, deductive reasoning involves applying a general rule to specific
examples. Deduction is about crawling slowly towards conclusions, not jumping to
them.12 Deductive reasoning tells us what we already know but have not necessarily
deliberately considered. Given that intelligence largely deals with people-based
problems, it is important to emphasise that deduction is not about determining
future behaviour; one cannot deduce future behaviour based on previous behaviour
because it has not yet happened. Deduction, therefore, is useful in clarifying what
we know.
Inductive Reasoning
In inductive reasoning, the conclusions do not necessarily follow from the
premises, though there is some evidence that they should. An example of how
inductive reasoning works is:
Every swan we have observed has been white.
Therefore, all swans are white.
12 Baggini & Fosl, The Philosophers Toolkit: A Compendium of Philosophical Concepts and Methods,
p 7.
Inductive reasoning is going out on a limb, which may or may not support the weight of our conclusions. Analogies, rules of thumb and typical examples are all types of inductive reasoning.13 Generally, inductive reasoning takes a specific example and attempts to develop a general rule. This is more applicable to intelligence problems: for example, when we use previous behaviour as the basis for an assessment of future behaviour, or when we use a previous mission as the basis for a judgement about a future mission. Nevertheless, just because we have seen something before, it does not necessarily follow that it applies in all similar cases.
Abductive Reasoning
Abductive reasoning arrives at the most likely or plausible explanation in light of the available evidence. It is the conclusion that best fits the information
available to us at the time. Medical diagnoses provide a useful example of abductive
reasoning. Doctors make a diagnosis of the most likely ailment based on the
symptoms being displayed in a patient. These can be life and death decisions, made
within very tight time frames, where there might not be the time for a search for
additional information. Again, abductive reasoning is a useful tool so long as one recognises that its conclusions can be right or wrong.
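To make the mechanics of abduction concrete, the following minimal Python sketch treats 'inference to the best explanation' as scoring candidate explanations against observed symptoms and returning the best fit. The ailments, symptoms and weights are hypothetical, invented purely for illustration; a real diagnostic model would be far richer.

OBSERVED = {"headache", "nausea", "tiredness"}

# How strongly each (hypothetical) candidate explanation accounts for each symptom.
EXPLANATIONS = {
    "influenza":      {"headache": 0.8, "nausea": 0.4, "tiredness": 0.9},
    "food poisoning": {"headache": 0.3, "nausea": 0.9, "tiredness": 0.5},
    "dehydration":    {"headache": 0.7, "nausea": 0.5, "tiredness": 0.7},
}

def best_explanation(observed, explanations):
    # Score each candidate by how well it accounts for what was observed.
    scores = {
        name: sum(fit.get(symptom, 0.0) for symptom in observed)
        for name, fit in explanations.items()
    }
    # Abduction returns the best available fit, which may still be wrong.
    return max(scores, key=scores.get), scores

best, scores = best_explanation(OBSERVED, EXPLANATIONS)
print(best, scores)

The sketch preserves the point made above: the output is only the most plausible explanation given the evidence available at the time, and new evidence can change the ranking entirely.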
Intuition
What about 'gut feel', when we make a judgement based on our intuition? An often-quoted definition of intuition provides a useful guide on when and when not to
trust our instincts. It has been argued that intuition is simply recognition: [t]he
situation has provided a cue; this cue has given the expert access to information
stored in memory, and the information provides the answer. Intuition is nothing
more and nothing less than recognition.14 Therefore, for us to rely on our intuition
we need to have a problem within a stable environment, one that we are able to learn from and in which we can clearly determine cause and effect.15 If we are relying solely on our intuition, it should be based on repeated exposure to a stable environment.
13 ibid, p 9.
14 Simon, Herbert, 1992, What is an explanation of behaviour?, Psychological Science, vol 3, no 3,
May, p 155.
15 Kahneman, Thinking, Fast and Slow, p 241.
The point is not to ignore creative thinking or intuition, but it cannot be enough for
us to say 'I've got a hunch' and leave it at that.
Information
Analysts use information to form, amend and support their judgements. In the
absence of information on a specific problem, we will commence researching
information that we think is relevant to the problem. Information here is defined
as 'data containing meaning'16 in contrast with intelligence, which we previously
defined as information that has been analysed. As analysts, we form and justify
our judgements based upon our understanding of information. This means that
how we interpret information is critical. As has been noted, '[t]he data do not interpret themselves. Humans must take and analyse them or they simply remain a mass of unconnected pieces of information.'17 When reviewing information,
we are attempting to arrive at a coherent and credible explanation or conjecture,
in essence a story. Analysts are attempting to make a coherent story out of
voluminous, diverse and often conflicting information. However, information
presents a number of challenges for analysts.
Overabundance of Information
The increasing volume of information produced on any given topic means that we
will rarely (if ever) be able to get across all relevant information. The ubiquitous use
of information technology in most aspects of our lives has seen the proliferation
of information at such an enormous rate that it has become an issue for society.
Rather than decreasing uncertainty, an overabundance of information has often
served to confuse and create uncertainty. Moreover, despite recent excitement over
big data, a more cautious approach to some of the claims being made has been
suggested.18
Information Diversity
The diversity of information also provides a challenge for analysts. The common term 'all source' should underscore that analysts are dealing with very diverse bits of information from entirely different contexts:
Unlike the domains of science and business, where experts' data are tightly
controlled, the data of intelligence analysis are enormously complex and
multivariate. A HUMINT report is far different from a satellite image, and both
are different, in several ways, from a conversation recorded on a Web log.20
Single-source analysts still face difficulties of interpretation, but they can at least
build up an understanding of the context, limitations and strengths of a particular
source of information (eg satellite imagery). In contrast, all-source analysts are faced with the difficulty of attempting to understand the context of an enormously diverse range of information.21
Information credibility
Assessing the credibility of information is critical for determining what confidence
we should have in the information. Ultimately, we need to ask ourselves, 'Can I trust this information?' Establishing the validity of evidence or information on
which we base our judgements is critical, albeit not always done.22 There are useful
19 Spellman, Barbara A, 2011, Individual reasoning, in Fischhoff, Baruch & Chauvin, Cherie (eds),
Intelligence Analysis: Behavioral and Social Scientific Foundations, The National Academies Press,
Washington, DC, p 129.
20 Herbert, The intelligence analyst as epistemologist, p 680.
21 Thompson, Hopf-Weichel & Geiselman, The Cognitive Bases of Intelligence Analysis, pp 3-7.
22 Moore argues information does not appear to be fully validated within intelligence analysis for a number of reasons, including wishful thinking, initial readings suggesting validity, validity assumed on the basis of past experience, and a decision that validation is not worth the effort. Moore, David T, 2011,
Sensemaking: A Structure for an Intelligence Revolution, National Defense Intelligence College,
Washington, DC, p 146.
questions we can ask to help validate information, looking at both the information
itself as well as the source. For example:
Who or what was the source?
(Example validation questions are drawn from Morgan D Jonesa and Hank Prunckun.b)
a Jones, Morgan D, 1998, The Thinker's Toolkit: 14 Powerful Techniques for Problem Solving, Three Rivers Press, New York, NY, p 180.
b Prunckun, Hank, 2010, Handbook of Scientific Methods of Inquiry for Intelligence Analysis, Scarecrow Press, Lanham, MD, p 29.
Perhaps the best-known criteria for judging the credibility of information and the reliability of its source belong to the Admiralty Scale.23 Under the Admiralty Scale, source
reliability is defined using letters (A to F):
A Completely reliable
B Usually reliable
C Fairly reliable
D Not usually reliable
E Unreliable
F Reliability cannot be judged24
Information credibility is defined numerically (1 to 6):
1 Confirmed by other sources
2 Probably true
3 Possibly true
4 Doubtful
5 Improbable
6 Truth cannot be judged25
23 Ministry of Defence (UK), 2011, Joint Doctrine Publication (JDP) 2-00: Understanding and
Intelligence Support to Joint Operations, Third Edition, Development, Concepts and Doctrine
Centre, Shrivenham, pp 3-20 & 3-21.
24 ibid, p 3-21.
25 ibid.
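To see the scale operationalised, the short Python sketch below transcribes the two lists above and expands a combined rating such as 'B2' into its components. The describe function and the compact letter-number rating format are illustrative assumptions, not doctrine.

RELIABILITY = {
    "A": "completely reliable", "B": "usually reliable", "C": "fairly reliable",
    "D": "not usually reliable", "E": "unreliable", "F": "reliability cannot be judged",
}
CREDIBILITY = {
    "1": "confirmed by other sources", "2": "probably true", "3": "possibly true",
    "4": "doubtful", "5": "improbable", "6": "truth cannot be judged",
}

def describe(rating):
    # Split a combined rating such as 'B2' into its two independent axes.
    source, info = rating[0].upper(), rating[1]
    return f"{rating}: source {RELIABILITY[source]}; information {CREDIBILITY[info]}"

print(describe("B2"))  # B2: source usually reliable; information probably true
print(describe("F3"))  # an unjudgeable source can still carry possibly true information

The design point worth noticing is that the two axes are deliberately independent: a usually reliable source can still report doubtful information, and vice versa.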
Context
As analysts, we bring our own perspectives and context to a problem.27 Analysts will interpret identical information differently, based on their own mindsets, but also on the context of the problem on which they are working and the group, organisation or agency from which they have come. Within a military context, people will no doubt be aware of how different Services (Navy, Army and Air Force), and even different countries' agencies, will interpret exactly the same information differently. The buyer and seller of a house are viewing exactly the same object, yet they consider it from different perspectives and arrive at different conclusions.
Why does this matter? Once we realise that people can interpret the same
information differently, it highlights the importance of getting back to the source
documents and reading them for ourselves. Wherever possible, we should look
at primary data for ourselves rather than rely on already-analysed information.
Where time or access prevents this, there is the risk of extrapolating misunderstandings and misinterpretations, or of having to accept others' judgements at face value.
Evidence
We can define evidence as information that supports an argument or hypothesis.
Evidence is important to consider as it gets back to what we are using to justify
our beliefs. One can consider something as evidence only against a problem,
hypothesis, theory or question; information is taken as evidence of something. The
key question then is, 'What is this information evidence of?' A common approach within intelligence analysis has been the use of Richards J Heuer's Analysis of Competing Hypotheses (ACH), which considers how a number of competing hypotheses are supported or contradicted by the available evidence.28
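The core idea of ACH can be conveyed in a few lines of Python. The sketch below is a simplified illustration rather than Heuer's full method: the hypotheses, evidence items and consistency scores are hypothetical, and a full ACH also weights evidence for credibility and diagnosticity. What the sketch preserves is the distinctive move of ranking hypotheses by how little evidence contradicts them, rather than by how much supports them.

# +1 = consistent with the hypothesis, 0 = neutral, -1 = inconsistent
MATRIX = {
    "H1: exercise only":     {"troop movement": +1, "leave cancelled": -1, "radio silence": -1},
    "H2: preparing attack":  {"troop movement": +1, "leave cancelled": +1, "radio silence": +1},
    "H3: internal security": {"troop movement": +1, "leave cancelled": 0, "radio silence": -1},
}

def rank_by_inconsistency(matrix):
    # Count the evidence items that argue *against* each hypothesis.
    inconsistency = {
        hypothesis: sum(1 for score in evidence.values() if score < 0)
        for hypothesis, evidence in matrix.items()
    }
    # The least-contradicted hypothesis survives; it is not thereby proven.
    return sorted(inconsistency.items(), key=lambda item: item[1])

for hypothesis, count in rank_by_inconsistency(MATRIX):
    print(f"{hypothesis}: {count} inconsistent item(s)")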
Diagnostic Evidence
Information that clearly supports one theory over all others is described as highly
diagnostic evidence.29 This concept comes out of the medical profession and it
is useful to highlight how this might work with a medical example adapted from
Psychology of Intelligence Analysis. If someone goes to the doctor with the following
symptoms: a headache, nausea, a runny nose, and tiredness, these symptoms can
point to any number of possibilities so are of limited diagnostic value. In contrast, if
they also had clear fluid flowing from their ears, this is a highly diagnostic piece of
evidence because it indicates a serious head injury. To put this within an analytical
context, the more diagnostic the evidence, the more useful it is in confirming a
specific hypothesis.30
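Diagnosticity can also be expressed numerically as a likelihood ratio: how much more probable the evidence is under one hypothesis than under a rival. The Python sketch below applies this to the medical example above; the probabilities are invented for illustration only.

def likelihood_ratio(p_e_given_h1, p_e_given_h2):
    # P(E|H1) / P(E|H2): a ratio near 1 means the evidence barely
    # discriminates between the hypotheses, ie it is non-diagnostic.
    return p_e_given_h1 / p_e_given_h2

def update_odds(prior_odds, lr):
    # Bayes in odds form: posterior odds = prior odds x likelihood ratio.
    return prior_odds * lr

# A headache is common under both a serious head injury and a mild illness:
print(likelihood_ratio(0.7, 0.6))    # ~1.2 -> weakly diagnostic
# Clear fluid from the ears is rare except with a serious head injury:
lr = likelihood_ratio(0.5, 0.001)    # 500  -> highly diagnostic
print(update_odds(1 / 100, lr))      # even a 1:100 long shot becomes 5:1 on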
Negative Evidence
The issue of negative evidence is worth briefly touching on. Being able to say
something is not happening can be as important as saying something is, because
28 For more information, refer to Heuer, Richards J, Jr, 1999, Psychology of Intelligence Analysis,
Center for the Study of Intelligence, Washington, DC, Chapter 8. In addition, criticism of the
approach including research into its utility (or lack of ) in not assisting trained analysts overcome
confirmation bias is included in Cheikes, Brant A, Brown, Mark J, Lehner, Paul E & Adelman,
Leonard, 2004, Confirmation Bias in Complex Analyses, Mitre Technical Report, Center for
Integrated Intelligence Systems, Bedford, MA, October. However, like many areas within the field
of intelligence analysis, the research on the strengths and weaknesses of this approach remains
limited. For example, the MITRE research involved just 24 analysts. Consequently, like most
structured techniques within intelligence analysis, we are a long way from definitive conclusions
on the approach. The benefit of the approach is that it makes analysis more transparent, which in
itself is a positive.
29 Heuer, Psychology of Intelligence Analysis, p 45.
30 ibid, pp 45-46.
it can save money, time and resources.31 For example, we can look at a country and confirm that it is not preparing for war if it is not mobilising its military forces. This conclusion is based on a number of observable factors, namely
personnel, weapons and equipment that are not at a higher state of readiness,
are not being deployed, and are not being armed. This differs from an absence of
evidence.
Absence of Evidence
Depending on the problem, the absence of evidence is not necessarily the evidence
of something not occurring. A useful question to ask is, What is happening above
or below the threshold of detection?32 or What can I reasonably expect not to
see in the information, even if it is occurring? The scale of the problem can be
useful in determining the significance of an absence of evidence. Following on from the previous example of confirming whether a country is mobilising for war, the absence of evidence of mobilisation is significant because we are talking about a large-scale event. If a country were mobilising, we would expect to see some evidence of it, regardless of how much it was trying to hide it. In contrast, the absence
of evidence of a terrorist attack being planned does not provide this level of
confidence because it is on a much smaller scale. Further, recent history has
highlighted the difficulty in attempting to detect plans for terrorist attacks.
Alexanders Question
Alexander's Question is a useful technique for problem solving and analysis.33 Alexander's Question is simply, 'What new information would make you change your mind?' By answering this question, we are deliberately forced to
consider our own judgements and assumptions. Alexander's Question also serves
to highlight whether or not we are setting reasonable standards for changing our
position on an issue. Key for us as analysts is to develop falsifiable conjectures
31 Cooper also makes the point that '[p]erhaps the scarcest resource is a senior decisionmaker's attention, which can easily be wasted'. Cooper, Jeffrey R, 2005, Curing Analytic Pathologies:
Pathways to Improved Intelligence Analysis, Center for the Study of Intelligence, Washington, DC,
p 17.
32 Smith, Andrew, 2005, 'Detecting terrorist activity: defining the state's Threshold of Pain', Australian Defence Force Journal, no 168, pp 31-32.
33 Jones, The Thinkers Toolkit: 14 Powerful Techniques for Problem Solving, p 185; and Arkes, Hal R
& Kajdasz, James, 2011, Intuitive theories of behavior, in Fischhoff, Baruch & Chauvin, Cherie
(eds), Intelligence Analysis: Behavioral and Social Scientific Foundations, The National Academies
Press, Washington, DC, p 165.
(ie theories that can actually be proven wrong) rather than theories in which 'everything supports my assessment', a kind of heads-I-win-tails-you-lose approach to analysis. Alexander's Question promotes analytical humility by forcing us to consider that our initial judgement might, in fact, be wrong. This approach also
provides us with the opportunity to look for the information that would show us we
were wrong, and enable us to better calibrate our assessment.
Chapter 6
The words we use to think about and describe things are fundamentally important.
Words help us to explore and expand our knowledge but they can just as easily
trap and constrain our thinking. Two incredibly powerful aspects of language and
thinking that are fundamental to analysis are analogies and metaphors.
Analogy
Analogous reasoning is one of the most popular analytic approaches within
intelligence analysis.1 An analogy is a comparison between two distinct entities
based on their perceived similarities. Analogies reflect the following reasoning:
X is not well understood.
Y is well understood.
X is similar to Y.
Therefore, what is true of Y is likely also true of X.
1 Treverton, Gregory F, 2005, 'Foreword', in Johnston, Dr Rob, Analytic Culture in the U.S.
Intelligence Community: An Ethnographic Study, The Center for the Study of Intelligence,
Washington, DC, p xi.
For example, previous conflicts are used as the basis for understanding the nature and outcomes of current and future conflicts.
Analogies are useful based on the strength of their similarities. Everything can be compared on some level (for example, a bumblebee and a nuclear bomb can both cause damage), but it is the strength of the similarities that is important. In this respect, analogous reasoning operates as a form of inductive reasoning. Philip Tetlock provides some sound guidance on the use of analogies: use multiple analogies for a situation, and identify how situations are alike as well as how they are different.2 So, when we hear somebody using an historical analogy to support an argument, we should ask: How are the two situations alike? How are they different? And what other analogies might apply?
Research into forecasting has found that those better at guessing what will happen in the future draw on multiple analogies for any situation. Those who use multiple analogies when making predictions are still not necessarily very good, but they are better than those who rely on a single analogy.3
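Tetlock's advice can be caricatured in code. In the Python sketch below, a forecast built from a single favourite analogy is contrasted with one pooled across several analogous cases, each weighted by its similarity to the current situation. The cases, similarity scores and outcomes are entirely hypothetical.

# (similarity to the current situation on 0-1, outcome observed: 1 = it happened)
PAST_CASES = [
    (0.9, 1.0),
    (0.6, 0.0),
    (0.5, 1.0),
    (0.3, 0.0),
]

def single_analogy_forecast(cases):
    # Copy the outcome of the single most similar case.
    return max(cases)[1]

def multiple_analogy_forecast(cases):
    # Similarity-weighted average across all analogous cases.
    total_similarity = sum(similarity for similarity, _ in cases)
    return sum(similarity * outcome for similarity, outcome in cases) / total_similarity

print(single_analogy_forecast(PAST_CASES))    # 1.0   -> a definitive call
print(multiple_analogy_forecast(PAST_CASES))  # ~0.61 -> a hedged estimate

Neither number is authoritative; the contrast simply shows how a single analogy invites an all-or-nothing judgement where a pool of analogies invites a calibrated one.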
This brings us to the question: how unusual is the situation that we are seeking to understand? If the situation is rarely seen then we will have fewer analogies to draw on, and less confidence in our use of analogous reasoning. In contrast, if the
situation is seen more frequently we will have many analogies to compare it to, and
will be more confident in the conclusions we arrive at with analogous reasoning.
Often intelligence analysts are dealing with the first type of situation, relatively
rare or unique events (unconventional warfare, terrorist attacks, and insurgencies).
Because these events occur rarely, there is a risk of using a small number of
similar events to come up with an overconfident assessment of what will happen.
2 Tetlock, Philip E, 2005, Expert Political Judgement: How Good Is It? How Can We Know?,
Princeton University Press, Princeton, NJ, p 92.
3 ibid.
Metaphor
'It's a jungle out there.'
Aristotle described a command of metaphors as 'the mark of genius', for to make good metaphors 'implies an eye for resemblances'. Metaphors are a type of analogy in that they compare two things, but go further in claiming that on some level they are the same. For example, 'It's a jungle out there' claims that society is a life and death struggle for survival. Metaphors are usually so ingrained in our language
death struggle for survival. Metaphors are usually so ingrained in our language
that we use them without being conscious of them. So we can use metaphors
like assumptions, unquestioningly accepting the validity of the similarities being
claimed.
Metaphors within intelligence are plentiful: intelligence illuminates, intelligence is
the front line, information is wheat or chaff, information is a signal or noise. Taking
the metaphor of information as either wheat or chaff, we accept information as
either useful or irrelevant, overlooking that the metaphor masks the fact that
information can be useful for one question while irrelevant for another
(but chaff can never change to wheat). The same applies to the signal and noise
metaphor. Whether or not information is a signal or merely noise depends entirely on the problem at hand; information can be both. Though these might
seem overly pedantic examples, the point is that when we unconsciously accept a
metaphor we accept its validity entirely and all the assumptions that come with it.
Certainly, metaphors can have a powerful influence over the way that those inside
and outside of intelligence think about analysis.
The connect-the-dots metaphor was used by the 9/11 Commission to describe
intelligence analysis, and continues to be used. Yet, it has also been highlighted that
this is an overly simplistic metaphor for what analysts actually do.5 The concept is
4 For a detailed analysis on stratagem in modern warfare refer to Whaley, Barton, 2007, Stratagem: Deception and Surprise in War, Artech House, Boston, MA.
5 Puvathingal, Bess J & Hantula, Donald A, 2012, 'Revisiting the psychology of intelligence analysis: from rational actors to adaptive thinkers', American Psychologist, vol 67, no 3, April, pp 199-210.
taken from a children's activity in which a line is drawn between clearly numbered
dots on a page to reveal a picture. As has been observed, the metaphor:
assumes that all the dots in a given puzzle are present and the task facing
the analyst is simply to join them in order to solve the puzzle. In reality, the
intelligence analyst is more likely to be confronted by a picture that is unclear
precisely because it contains too few dots. The job of the analyst is to make
sense of the picture that sight of the missing dots would bringi.e. reveal the
full picture.6
Another popular metaphor within the field is that of intelligence analysis as a jigsaw
puzzle. There are a number of positives with the metaphor: everybody understands what a puzzle is; a puzzle suggests that structure and patterns exist and that situations can be resolved; and it provides a mental picture for junior analysts.
However, the puzzle metaphor contains a number of questionable assumptions.
The metaphor assumes that we are dealing with a fixed, unchanging picture
rather than a dynamic situation. This overlooks the fact that our understanding
of situations changes (often entirely), rather than being revealed one neat piece
at a time. Additionally, the metaphor claims that each new piece of information
fits precisely, without contradiction, and makes the picture clearer. Indeed,
the metaphor suggests that there is no need for interpretation; once correctly
assembled, the picture is self-evident to everyone.
The problem with many metaphors like 'connect-the-dots', 'puzzle' and 'wheat and chaff' is that they present intelligence analysis as a simple problem: fixed, unchanging, straightforward and solvable. They suggest that intelligence analysis
is simply drawing a line, connecting some puzzle pieces, separating some objects
and the situation will become obvious. Whenever we use metaphors (or hear them
used), rather than accepting them at face value, we should think them through. This
includes identifying the assumptions that they contain, why they work and why
they fall down.
6 Phythian, Mark, 2009, 'Intelligence analysis today and tomorrow', Security Challenges, vol 5, no 1, Autumn, p 70.
Chapter 7
Expertise
We all have an idea of what an expert is. They are somebody who has extensive
knowledge and skills in a particular field. They are the kind of person that we go to
for answers and the kind of person whose opinion we rely upon. Within the field of
intelligence analysis, this concept of an expert is critical, both because, as analysts,
we draw upon the opinions and judgements of those considered to be experts and
because at some stage people will come to see us as experts. Even after a very short time, we might actually find ourselves the most knowledgeable person on the specific problem or situation that we are analysing.1 The important point
is for us as analysts to understand the limits of expertise and attach appropriate
confidence to our assessments based on what is knowable and what is not.
Before we simply take experts' opinions at face value, or before our own heads get too big thinking of how we might someday be considered an expert, let us actually look a little more deeply at the concept of expertise. Analysts deal daily with reports, statements and assessments by people purporting to be experts. Understanding when we can rely on somebody else's expertise and when we cannot
is important. This also helps us to appreciate areas in which we might consider
ourselves expert and areas where we should not. So what do we mean by experts,
and when should we pay attention to what they have to say?
A lot of research has already gone into what expertise is and what makes someone
an expert. Several factors that have been identified as defining an expert include:
1 Petersen, Martin, 1985, 'Managing/teaching new analysts', Studies in Intelligence, vol 30, Fall,
pp 34, viewed 4 September 2014, <http://www.nationalsecuritylaw.org/files/received/CIA/
Petersen-Managing_Teaching_New_Analysts.pdf>.
knowledge, traits, skills, strategies, and the characteristics of the task that they
are doing.2 Of these, research suggests that task characteristics are critical in
determining whether or not people can be considered to be experts. The task
characteristics are different depending upon the particular domain (ie subject
area or field); for example, physics, mathematics, psychology, intelligence analysis,
engineering, or medicine. When looking at performance in terms of expertise we
are talking of the ability to perform a task competently, something that regularly
includes the ability to make accurate judgements within a subject area.3
In considering the literature on expertise, James Shanteau found the following
domains in which good expert performance had been observed: weather forecasters,
livestock judges, astronomers, test pilots, soil judges, chess masters, physicists,
mathematicians, accountants, grain inspectors, imagery analysts, and insurance
analysts. Poor expert performance had been observed in: clinical psychologists,
psychiatrists, astrologers, student admissions, court judges, behavioural researchers,
counsellors, personnel selectors, parole officers, lie detector judges, intelligence
analysts, and stockbrokers. Of interest, nurses, physicians and auditors had been
observed displaying both good and poor expert performance (a point we will
discuss). The point of these findings is that it is not so much about the expert as it
is about the subject area. Of those areas in which people demonstrated poor levels
of expertise, one of the consistent factors was that they were making decisions
about people's behaviour rather than things. The people-based problem areas were
not producing experts.4
Identifying cause and effect in human behaviour is inherently difficult. This is
because people can react entirely differently to identical influences, and even
differently to the same situation. In terms of research, controlling variables to
allow robust experiment design to identify cause and effect in human behaviour
is extremely difficult. We might make generic statements about future human
behaviour at the collective level (that is, there will be wars in future; people will continue to obey the law while some others will break it) without knowing specific situations ahead of time. The difficulty with judgements about human behaviour is that the domain often fails to provide the cues needed for timely feedback, and at a collective level is subject to too many unpredictable events and decisions at the same time as
2 Shanteau, James, 1992, 'Competence in experts: the role of task characteristics', Organizational Behavior and Human Decision Processes, vol 53, no 2, p 257.
3 ibid.
4 ibid, p 258.
it is subject to known trends and forces.5 Further, the actions of our own forces and
governments will change the basis for original analysis, making it more difficult to
identify cause and effect.6 Does this mean that analysts cannot be experts?
To answer this question, we can consider Shanteau's research showing that three professions (nurses, physicians and auditors) displayed characteristics of experts in some tasks and non-experts in others. Daniel Kahneman and Gary Klein referred to this as 'fractionated expertise', arguing that this is common across professions, with expertise able to be developed where the focus is on hard data rather than soft data.7
Taking this concept, it is worth suggesting some areas where expertise might be
developed and others where analysts will be unlikely to achieve expertise. We can
put more confidence in things rather than people, eg what weapons are available,
but not necessarily how they will be used. These areas of expertise could include
organisations, orders of battle, tactics and doctrine, weapons, and historical
behaviours. However, we are unlikely to be experts in terms of predicting an
adversary's future behaviour or their reactions to external influences. To verify this, we need only look to prewar assessments (of any conflict) of an adversary's future behaviour and contrast these with what actually happened. So, as analysts we might know a great deal about a conflict,8 an organisation, a country's
military and weapons systems, and may be expert on these. We can also develop
expertise in the application of analytic techniques and methods themselves, without
being an expert in the domain to which they are applied. When dealing with human
behaviour, if we believe that we are expert on judging how people will behave then
we are likely to come unstuck.
The observation that expertise does not transfer across domains and fields has been
consistently made within the literature but is worth deliberately stating here to
highlight the importance of this to analysts. Being an expert in one area does not
make us an expert in another. This is an important factor for analysts to consider, both in terms of who is making an assessment and on what, and when we as analysts ourselves begin looking at new areas or subjects. For example, being an
5 Moore, David T, 2011, Sensemaking: A Structure for an Intelligence Revolution, National Defense Intelligence College, Washington, DC, p 80.
6 Hastie, Reid, 2011, 'Group processes in intelligence analysis', in Fischhoff, Baruch & Chauvin, Cherie (eds), Intelligence Analysis: Behavioral and Social Scientific Foundations, The National Academies Press, Washington, DC, p 172.
7 Kahneman, Daniel & Klein, Gary, 2009, 'Conditions for intuitive expertise: a failure to disagree', American Psychologist, vol 64, no 6, September, p 522.
8 ibid, p 523.
expert in the technical functioning of a weapons system does not make that person
an expert in the how, why or when a group or individual might use that weapon.
Rob Johnston illustrates this, pointing out that being an expert chess player does
not automatically make that person an expert poker player (though both are
games), and a biochemist cannot simply go and perform neurosurgery (though
both look at human physiology). As Johnston notes, 'the more complex a task, the more specialized and exclusive is the knowledge required to perform that task'. Johnston's conclusion is highly pertinent for us as analysts: 'an expert may know his specific domain, such as economics or leadership analysis, quite thoroughly, but that may still not permit him to divine an adversary's intention, which the adversary may not himself know.'9 Having considered how areas involving making
judgements about human behaviour lacked true experts, this should provide us
with a basis for considering the area of prediction.
Prediction
'there is nothing usual about the future.'
Nassim Taleb10
Given the frequent use of the term prediction within the field, consideration of what
is and is not predictable is worthwhile. Concepts such as Predictive Battlespace
Awareness encourage the perception that intelligence can achieve prediction, even
in complex and dynamic human environments.11 There are essentially two types
of change that must be predicted: regular, or cyclical, changes (eg the changing
seasons); and discontinuous changes, those that occur on a one-time, ad hoc basis.12 We can expect people to be able to predict recurring natural phenomena or situations where clear causal relationships can be established, but where changes are unusual or ad hoc events then prediction becomes virtually impossible.13
9 Johnston, Dr Rob, 2005, Analytic Culture in the U.S. Intelligence Community: An Ethnographic
Study, The Center for the Study of Intelligence, Washington, DC, pp 63 & 66.
10 Taleb, Nassim Nicholas, 2010, The Black Swan: The Impact of the Highly Improbable, Penguin
Books, London, p 135.
11 For a critique of concepts like Predictive Battlespace Awareness, refer to Lewis, Major Dennis,
2004, Predictive Analysis: An Unnecessary Risk in the Contemporary Operating Environment, US
Army School of Advanced Military Studies, Fort Leavenworth, KS, pp i & 22.
12 Mintzberg, Henry, 1994, The Rise and Fall of Strategic Planning, The Free Press, New York, NY, p 228, referred to in Lewis, Predictive Analysis: An Unnecessary Risk in the Contemporary Operating Environment, pp 23-24.
13 Lewis, Predictive Analysis: An Unnecessary Risk in the Contemporary Operating Environment, pp 23-24.
So where do we stand on predicting future events? People have been trying and
researching this problem for years, and when it comes to predicting peoples
behaviour and dealing with people-based problems the findings are not positive.
Nate Silver, who is known for a string of successful predictions of election outcomes
and voter tendencies in the United States, makes the following observation: 'There is no reason to conclude that the affairs of man are becoming more predictable. The opposite may well be true. The same sciences that uncover the laws of nature are making the organization of society more complex.'14 TN Dupuy in his
book Numbers, Prediction, and War argues, '[t]here is no known methodology, no conceivable methodology, that can accurately predict future events. This of course applies to all models used for predictive purposes.'15 Again, the key issue
relates to the problem of identifying causal relationships that provide a firm basis
for predictions of future behaviour. Moreover, even when we know the causes of
events, this does not make these events predictable. Tetlock highlights that the
US National Transportation Safety Board (NTSB) identifies a number of common
causes of aircraft accidents, including tired pilots, bad weather, uncertain and
cryptic communication, breakdown in radio communication, and people panicking
in the face of death. After an accident the NTSB can pick out the combination
of causes of a disaster and explain what happened, but it cannot (and does not) attempt to predict the 'if' or 'when' of future aircraft accidents.16
The following activity is worthwhile to highlight the difficulties of prediction, as
well as the impact that the distance of time has on our confidence in making such
predictions. Ask yourself the following questions in terms of both one week from
today and five years from today:
Will I be in a relationship?
14 Silver, Nate, 2012, The Signal and the Noise: Why So Many Predictions Fail But Some Dont,
Penguin Press, New York, NY, p 448.
15 Dupuy, TN, 1985, Numbers, Prediction, and War: The Use of History to Evaluate and Predict the
Outcome of Armed Conflict, Hero Books, Fairfax, VA, p 147, quoted in Lewis, Predictive Analysis:
An Unnecessary Risk in the Contemporary Operating Environment, p 20.
16 Tetlock, Philip E, 2005, Expert Political Judgement: How Good Is It? How Can We Know?,
Princeton University Press, Princeton, NJ, p 35.
What will have been the most significant events to have directly impacted
my life over the period?
If we are honest with ourselves, the activity will highlight just how difficult
prediction is, even if you are the best-placed person to make the prediction. The
activity encourages us to think about our own internal motivations and actions as
well as external factors, highlighting that we are influenced by our own deliberate
behaviour as well as external events. The activity also emphasises that the longer
the period of time the more difficult it is to make a confident judgement. However,
that is not to suggest that even guessing one week out is easy. For example, while
we might not have changed jobs a week from today, we will still be challenged by
the question, What will have been the most significant events to have directly
impacted my life over this week? The activity also encourages humility by
recognising that if it is difficult to predict our own behaviour due to numerous
external and unidentified factors, how much harder is it for us as analysts looking at
an adversary from the outside.
So do analysts attempt prediction within the field? The most common predictions within analysis are often unconsidered statements about an opponent's future behaviour. These predictions usually appear only verbally in discussions and debates, and take the form of definitive statements that an opponent 'won't do that' or 'will do this'. The problem with such statements is that they actually encourage surprise, because the person making them has entirely disregarded all but one behaviour, instead of recognising that there might be many plausible actions that an adversary might take.
Like any field, intelligence analysis has many confident (and at times overly
confident) analysts when it comes to judgements about what will happen in
the future. This confidence can be actively encouraged (even highly regarded)
within organisations. The problem is that research has shown that high levels of
confidence do not necessarily indicate accurate predictions.17 Even in the face of an
incorrect prediction, people may very well continue being overly confident simply
17 Slovic, Paul, 1973, Behavioral problems of adhering to a decision policy, paper presented at the
Institute for Quantitative Research in Finance in May 1973, Napa, CA, p 5.
by explaining away failed predictions as beyond their control, nearly right, mostly right, or just wrong about the timing (and that it will still happen).18
At the macro level, anticipating the types of attacks that might occur or the kinds
of warfare in which militaries are likely to engage is very different to predicting a
specific timing and location of an attack or identifying the protagonists that will
be involved. Otherwise these attacks could be prevented, conflicts avoided or
militaries entirely prepared, equipped and trained for a specific type of conflict
against a specific enemy. As Moore notes, broad anticipation of trends, even
generic concerns, are not predictions, highlighting that:
one might detect indicators suggesting that an upcoming event similar to
one in the past is possible, likely, or even reasonable. On this basis, one could,
for example, have anticipated that sooner or later foreign terrorists would again
attack the United States by targeting some high-value building or event, such
as the World Trade Center. This is a far cry from predicting that Al Qaeda
terrorists would fly airplanes into the World Trade Center and the Pentagon on
the morning of 11 September 2001.19
Where does this leave us as analysts? Perhaps the best approach is to distinguish
between a prediction and a conjecture20 and look at judgements of future behaviour as conjectures (hypotheses) rather than predictions (certainties). The term conjecture underscores the idea of refutability and a recognition that these judgements are not certain and are open to alternative explanations.
Prediction Markets
Given recent attention to prediction markets, it is worth briefly touching on these.
Prediction markets involve groups of people speculating on the outcomes of
usually well-defined, short-term future events, such as political elections, product
delivery dates, commodity pricing or sporting outcomes. As Hastie observes, 'prediction markets are restricted to applications in which a well-defined outcome set to occur in the near future can be verified'.21 Prediction markets can certainly
form a source of information for analysts to consider if looking at relevant or
related topics. Nevertheless, they lack many of the basic requirements expected
18 Tetlock, Expert Political Judgement: How Good Is It? How Can We Know?, pp 129143.
19 Moore, Sensemaking: A Structure for an Intelligence Revolution, p 40.
20 Ben-Israel, Isaac, 1989, Philosophy and methodology of intelligence: the logic of estimate process,
Intelligence and National Security, vol 4, no 4, p 667.
21 Hastie, Group processes in intelligence analysis, p 188.
of analysts who are making judgements that inform often critical and significant
decisions and actions. For example, prediction markets lack the transparency
expected of analytical reporting, including: Who is making the judgement? Why are
they making the judgement? What information are they using? What information
are they deliberately omitting? What information are they ignorant of? There have
been concerns raised over an inability to understand the rationale behind forecasts,
the potential for prediction markets to be manipulated, and the ethics of using
prediction markets for national security issues.22 Ultimately, what government
would justify making potentially significant decisions or taking actions based upon
the suggested outcome of a prediction market?
22 Heuer, Richards J, Jr & Pherson, Randolf H, 2011, Structured Analytic Techniques For Intelligence
Analysis, CQ Press, Washington, DC, p 210.
23 Tetlock, Expert Political Judgement: How Good Is It? How Can We Know?, p 35.
24 Taleb, The Black Swan: The Impact of the Highly Improbable, p 321.
25 ibid.
Surprise
It is natural for us to hold certain expectations about our short, medium and
long-term futures. These can include what we are going to do for lunch, how our
sporting team will perform this season, and what we will do in retirement. We
all have expectations about how the future will play out. These expectations are
often based on previous experiences, meaning that simply by holding expectations
we will be surprised when the future does not follow these expectations. One
example is the Japanese airborne attack on Darwin on 19 February 1942. Despite
the Japanese previously conducting an attack on Pearl Harbor two months earlier,
the attack on Darwin still came as a surprise. With expectations based on previous experience, and as there had never been an attack on Darwin, the expectation was that 19 February would be no different. Once the first attack on Darwin occurred, people's expectations of the future changed based on their new experience, with a greater expectation of further attacks.
While some things might turn out the way we expected, or within bounds with which we are comfortable, at other times the future will be nothing like we expected. The
result? Surprise. If we pay attention, we will notice how we are surprised every single
day because our perceived future never quite matches the reality that unfolds. The
scale of the surprise and the consequences of being incorrect will mostly be minor:
we do not get our usual car park, somebody with whom we work was off sick, an
unexpected email arrived, we were given a new task, or there was a long line at the
shop (or a short one). Each of these actually represents a surprise in the form of an
unexpected or unanticipated future. Therein lies the surprise, the future turns out
differently to what we expected, however small or insignificant. So where does this
lead us in terms of analysis, prediction and dealing with human behaviour?
26 ibid, pp 211213.
27 Moore, Sensemaking: A Structure for an Intelligence Revolution, p 129.
Chapter 8
Complexity
The term complex frequently appears within the intelligence field to describe any
number of situations, operations, issues and problems.1 We could define complexity
as relating to a situation, issue or topic that is inherently complicated, often because
of multiple interacting parts. Yet, we are often made to believe that even the most
complex situations can be adequately explained in a 30-second television report or
a three-paragraph article. Instead, even when we look at what appear to be simple
situations, the more we look the more we become aware of their complexity. Heuer
provides some practical guidance for dealing with complexity:
There are two basic tools for dealing with complexity in analysis: decomposition and externalization.

Decomposition means breaking a problem down into its component parts ...

Externalization means getting the decomposed problem out of one's head and down on paper or on a computer screen in some simplified form that shows the main variables, parameters, or elements of the problem and how they relate to each other.2

1 We are differentiating here from 'complex' as relating to technical equipment and weaponry which, though complicated in terms of numerous interacting parts and technological applications, is a bound problem. That is, these systems can be understood, albeit potentially with a lot of effort. In contrast, how a group or individual might adapt such weapons to a specific conflict, and the timing, locations and factors influencing their use, might be a complex problem.
There are a number of tools that can be used to decompose and externalise a
problem, for example mind mapping. Mind mapping, popularised by Tony Buzan,
is a useful visualisation tool for displaying numerous aspects of a problem and their
relationships.3 Let us take a relatively simple problem: 'Should I buy a car?' The question is a common one, a decision most of us have already had to make. While a simple problem, even relatively straightforward issues can become quite complex
when we identify all of the factors that can potentially influence this decision
(Figure 8.1).
Figure 8.1: A mind map of the question 'Should I buy a car?', branching into: cost (initial purchase price; finance by cash, loan or lease; and operating costs such as fuel, insurance, registration and maintenance); type (new or second hand; size, such as small, wagon or SUV; make and model); advice (family and friends, a mechanic, magazines, websites); alternatives (public transport, bicycle, motorbike, rental car, taxi); and reason (work, image, freedom, travel, saving time, taking people places).
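The same decomposition can be externalised in a few lines of code just as easily as on paper. The Python sketch below uses a cut-down, hypothetical version of the factor tree in Figure 8.1 and prints the decomposed problem as an indented outline, so that the whole structure is visible at once.

PROBLEM = {
    "Should I buy a car?": {
        "Cost": ["purchase price", "finance", "operating costs", "insurance"],
        "Type": ["new or second hand", "size", "make and model"],
        "Alternatives": ["public transport", "bicycle", "rental car", "taxi"],
        "Reason": ["work", "travel", "freedom", "take people places"],
    }
}

def externalise(node, depth=0):
    # Write the decomposed problem out of one's head as an indented outline.
    if isinstance(node, dict):
        for heading, children in node.items():
            print("  " * depth + heading)
            externalise(children, depth + 1)
    else:
        for factor in node:
            print("  " * depth + factor)

externalise(PROBLEM)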
When it comes to intelligence problems, the situations we deal with will usually contain far greater levels of complexity. Further, these situations are dynamic, changing even as we analyse them.
Heuer, Richards J, Jr, 1999, Psychology of Intelligence Analysis, Center for the Study of Intelligence,
Washington, DC, pp 8586.
Uncertainty
If complexity relates to a situation, we can say that uncertainty relates to our own
understanding. A useful definition of uncertainty is 'what we do not know or understand about a given situation'.4 As analysts, we should understand that uncertainty is normal; we will always have limits in our knowledge. We can seek to
reduce our uncertainty, but we will never entirely eliminate it.
In conflict there already exists a term that captures this uncertainty, namely the fog
of war. Whaley defines this 'inspired phrase' as 'the state of uncertainty resulting from the inability of a military information system to either accurately or speedily monitor the events of battle'.5 As analysts, our understanding of situations will also be incomplete because of time pressures to make judgements, an over- or under-abundance of information, and the often-changing nature of situations.
Information on people's stated future intentions can be useful in reducing
uncertainty about the future, so long as we understand that people who have
decided on an action might not carry it through to completion. Certainly,
intended behaviour might not occur as planned, even to the surprise of those
planning to undertake it. History indicates that at least one third of large-scale
military operations are unable to meet their own time lines, with delays due
to unrealistic planning or external factors, rather than attempts at surprise or
deception.6 Additionally, people frequently change their minds; they might decide
to do nothing or something entirely different. People will act and react in entirely
unanticipated ways based on emotional responses rather than calculated logic,
adding to our uncertainty of what will happen. As has been observed, Countries,
armies, and individuals are not mechanical, and much more of human nature
4 Schmitt, Major John F & Klein, Gary A, 1996, 'Fighting in the fog: dealing with battlefield uncertainty', Marine Corps Gazette, vol 80, no 8, August, p 63.
5 Whaley, Barton, 2007, Stratagem: Deception and Surprise in War, Artech House, Boston, MA, p 136.
6 ibid, p 94.
involves emotion rather than logic'.7 This leads us to the issue of unintended
consequences.
Unintended Consequences
Unintended consequences affect both the complexity of situations and our own
uncertainty about the situation. People will often react differently from how we might expect. Why is this important? Because we are often analysing how an
opponent (well-defined or otherwise) might react to us. Adversaries are living and
thinking entities that will change their behaviour based on our actions or other
unanticipated factors.
How people will react to threats or incentives is not predictable. Steven Levitt and Stephen Dubner argue that 'one of the most powerful laws in the universe is the law of unintended consequences'. To illustrate the point they describe how many governments around the world have started to base their rubbish pick-up fees on volume, with the expectation that if people had to pay for each extra bag of
on volumewith the expectation that if people had to pay for each extra bag of
garbage they would have a strong incentive to produce less garbage. However, this
new way of pricing resulted in a number of unexpected consequences, including
people stuffing more rubbish in their bags, the dumping of garbage in the woods,
sewers in Germany being infested with rats because so much uneaten food was
being flushed down the toilet, an increase in backyard rubbish burning in Ireland,
and increased hospitalisation of people accidentally setting themselves on fire when
burning rubbish.8 While there is no guarantee that we will be able to identify
unintended consequences before they occur, simply recognising that actions can
have unintended consequences is a good start. Keeping an open mind is better than
ignoring the possibility that people may not react the way we anticipate.
Assumptions
Assumptions have been described as the most dangerous form of knowledge. Why?
Because an assumption carries with it unconsidered information, knowledge that
is not subject to thought or critique. However, assumptions are a fact of life; we all
have them and we all rely on them. Within intelligence analysis, assumptions are
Lanning, Lt Col (Ret) Michael Lee, 1996, Senseless Secrets: The Failures of U.S. Military
Intelligence, From George Washington to the Present, Carol Publishing Group, Secaucus, NJ, p 297.
Levitt, Steven & Dubner, Stephen, 2009, SuperFreakonomics, Allen Lane, New York, NY, pp xii &
139.
critical because of their potential consequences. The best that we can do is identify
them and make them explicit.9
Whenever we read a piece of analysis or an article, or watch a report, it will
always contain assumptions. These are not necessarily a matter of people trying
to mislead but can simply be a reflection of people's world views, perspectives and
opinions. The point is that assumptions exist and, as analysts, we need to be aware
of them and identify them. There are really two forms of assumptions to identify:
assumptions reflected in what is written or stated; and assumptions made in terms
of what is not written or stated. While not suggesting that assumptions are lies, the delineation between lies of commission and lies of omission appears relevant to identifying assumptions. That is, there are assumptions of commission, contained in what is written or stated, and assumptions of omission, contained in what is left unwritten or unstated.
Chapter 9
We all like to provide explanations for behaviour, situations and events. This
powerful instinct influences the way we interpret situations and the judgements
about future behaviour that we make. In life, we look for patterns and explanations
for events and behaviour to help us understand our world. This concept of cause
and effect influences our analysis as we seek coherency and explanations out of a
mass of diverse and contradictory information on a specific problem. The more
coherent the story we develop the more convincing it will be, but that does not
mean that it is accurate. As Morgan D Jones notes:
We feel the need to find explanations for everything, regardless of whether the
explanations are accurate.
It isn't that we humans like to have explanations for things; we must have
explanations. We are an explaining species. Explanations, by making sense
of an uncertain world, apparently render the circumstances of life more
predictable and thus diminish our anxieties about what the future may hold.1
Explaining cause and effect in human behaviour appears most promising when we are looking backwards and multiple possibilities have already been narrowed to a single event. However, this ignores that even if we know the event, we will often disagree
about the actual causes. Further, identical stimuli can produce different effects. As
an example, if we were to give a group of people $100 each, they will use this money
in entirely different ways. Some people would save the money, some would spend it
(and those that do would spend it on different things), some might give it away, and some might do combinations of some or all of the above.
Interpreting Behaviour
Because we are pattern-seeking beings, we are also fooled by apparent patterns in
situations and activities around us, even when these often involve unintentional
or random behaviours. Thus, we can infer far more intentionality than is warranted from poorly considered, accidental or even uncoordinated behaviour. One example of over-analysing observed behaviour, described by Thomas Gilovich, relates to German
V-1 rocket attacks on London in World War II. The V-1 rocket was an unmanned
flying bomb, which the Germans used to attack England between June and October
1944. A number of theories were developed by Londoners as to the Germans' intentions and the rationale behind where these rockets were landing (including that the Germans were avoiding neighbourhoods where their spies were living). After the
1 Jones, Morgan D, 1998, The Thinker's Toolkit: 14 Powerful Techniques for Problem Solving, Three Rivers Press, New York, NY, p 34.
war, the locations of these bombings were analysed and the distribution was found to be consistent with chance; the patterns Londoners perceived did not reflect deliberate German targeting.2
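The general lesson, that uniform chance still throws up apparent clusters, is easy to demonstrate. The Python sketch below is not a model of the V-1 data; it simply scatters points uniformly at random over a grid of equal 'districts' (the grid size and point count are arbitrary choices) and reports how uneven the result looks.

import random
from collections import Counter

random.seed(1)  # fixed seed so the run is repeatable
GRID = 12       # a 12 x 12 map of equal districts
IMPACTS = 288   # points scattered uniformly at random: 2 per district on average

cells = Counter(
    (random.randrange(GRID), random.randrange(GRID)) for _ in range(IMPACTS)
)
counts = [cells.get((x, y), 0) for x in range(GRID) for y in range(GRID)]
print("busiest district:", max(counts), "hits")
print("untouched districts:", counts.count(0))
# Despite every district being equally likely, a typical run leaves a few
# districts with several times the average and many with none at all, a
# pattern that invites explanations it does not have.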
2 Gilovich, Thomas, 1991, How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life, The Free Press, New York, NY, pp 19-20, discussed in Thaler, Richard H & Sunstein, Cass R, 2008, Nudge: Improving Decisions About Health, Wealth, and Happiness, Penguin Books, London, pp 30-32.
3 ibid.
Consider the 2013 Boston Marathon bombings, in which two bombs were detonated within seconds of each other, several hundred metres apart at the end of the marathon course. An analyst could argue very quickly that the bombings were coordinated and deliberately targeted the Boston Marathon. It is highly unlikely that two independent groups or individuals would coincidentally plant bombs several hundred metres apart to go off on exactly the same day, at exactly the same time, during the same major event. As analysts, we should neither disregard patterns and coincidences nor assign deliberate design to every level of human behaviour. Deliberate behaviour can be well coordinated, but at other times outcomes can appear more clearly planned or coordinated than was actually the case.
Statements of Uncertainty
We have previously discussed how uncertainty relates to our lack of knowledge
about a given situation, emphasising that uncertainty is normal and unavoidable.
What we will look at now is a method for capturing this uncertainty in our analysis.
The issue of uncertainty in analysts' use of language has been highlighted within the field since (at least) Sherman Kent's 1964 article 'Words of Estimative Probability'. Kent highlighted that people interpret words differently, particularly when asked to put a numerical estimate against words commonly used in intelligence analysis reporting. The example Kent gave was interpretations of the likelihood meant by the term 'serious possibility', observing that it was taken to mean anything from 20 to 80 per cent certainty, depending on the individual being asked.4
Words alone can be interpreted differently by people in terms of the certainty
that they are believed to convey. People can read what they want into these words
and mistake our uncertainty for confidence and our confidence for uncertainty.
Consequently, the argument that using odds ratios or numerical probability ranges
should be standard practice appears sound.5 That does not mean that assigning
numerical probabilities is easy. To assist, Kent provided seven terms with proposed
numerical ranges:
Certainty (100%)
Almost Certain (93%, give or take about 6%)
Probable (75%, give or take about 12%)
Chances About Even (50%, give or take about 10%)
Probably Not (30%, give or take about 10%)
Almost Certainly Not (7%, give or take about 5%)
Impossibility (0%)6
4 Kent, Sherman, 1964, 'Words of Estimative Probability', Center for the Study of Intelligence, Washington, DC, viewed 4 September 2014, <https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/sherman-kent-and-the-board-of-national-estimates-collected-essays/6words.html>.
5 Heuer, Richards J, Jr, 1999, The Psychology of Intelligence Analysis, Center for the Study of Intelligence, Washington, DC, p 183.
7 A popular suggestion has been the use of Bayesian probability in intelligence analysis for assigning numerical probabilities to hypotheses and scenarios.
8 Kahneman, Daniel, 2011, Thinking, Fast and Slow, Penguin Books, London, p 160.
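Kent's chart lends itself to a simple lookup. In the Python sketch below the ranges are smoothed so that they tile the whole 0-100 scale without gaps (the exact boundary values are therefore an interpolation for illustration, not Kent's), and a stated numerical probability is translated back into his estimative language.

KENT_TERMS = [
    # (lower %, upper %, estimative term)
    (100, 100, "certain"),
    (87, 99, "almost certain"),
    (63, 86, "probable"),
    (40, 62, "chances about even"),
    (20, 39, "probably not"),
    (2, 19, "almost certainly not"),
    (0, 1, "impossible"),
]

def to_words(probability_pct):
    # Translate a numerical probability (0-100) into Kent's estimative language.
    for low, high, term in KENT_TERMS:
        if low <= probability_pct <= high:
            return term
    raise ValueError("probability must be between 0 and 100")

print(to_words(75))  # probable
print(to_words(30))  # probably not
print(to_words(7))   # almost certainly not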
Chapter 10
1 Zimbardo, Philip, 2011, The Lucifer Effect: How Good People Turn Evil, Random House, New York, NY, pp 8 & 211-212.
2 For example, refer to the BBC Prison Study, viewed 4 September 2014, <http://www.bbcprisonstudy.org>.
observation remains that social settings and the organisational environment can
have a significant impact on an individual's behaviour.3
As analysts, we will be heavily influenced by the organisational culture and social
behaviours of the groups in which we work. Most of us will not have a choice of the
teams to which we are assigned or the people with whom we work. Consequently,
social and organisational factors are part of the analytic process and we cannot
separate the cognitive aspects of intelligence analysis from the cultural context
in which we work.4 What this suggests is the importance of deliberately thinking
about the environment in which we work, and the influence that it has on the way
we behave and interpret the world.
In any work environment, but particularly a military one, there are pressures
that exist, both explicit and implied. Examples of environmental and situational
influences facing us include: working in a hierarchy (as evidenced by uniforms, rank,
medals, etc); group pressures; commanders' expectations; pressure for early closure;
pressure to conform to existing analysis; limited or no feedback on analysis; and a
high level of responsibility. Where analysts are deployed, we can also experience
fatigue, long work hours, potentially high-consequence decisions, and isolation
from external views. By being conscious of these pressures, we can avoid
accepting them unquestioningly and adopt a more considered approach in our
behaviour and analysis.
Conformity
Rob Johnston highlights two types of conformity pressures that we face as analysts:
the pressure to conform to a corporate judgement; and the pressure on analysts to
conform to their own previous assessments.5 Both of these are worth considering
in some detail.
For example, refer to Walter Mischel as discussed in Arkes, Hal R & Kajdasz, James, 2011,
Intuitive theories of behavior, in Fischhoff, Baruch & Chauvin, Cherie (eds), Intelligence Analysis:
Behavioral and Social Scientific Foundations, The National Academies Press, Washington, DC, pp
144–145.
Johnston, Dr Rob, 2005, Analytic Culture in the U.S. Intelligence Community: An Ethnographic
Study, The Center for the Study of Intelligence, Washington, DC, p 6.
ibid, pp 22–24.
ibid, p 23.
ibid, pp 22–24.
Ben-Zvi, Abraham, 1976, Hindsight and foresight: a conceptual framework for the analysis
of surprise attacks, World Politics, vol 28, no 3, April, p 386, quoted in Brei, William S, 2005,
Getting intelligence right: the power of logical procedure, in Lightfoot, James E (ed), Learning
With Professionals: Selected Works from the Joint Military Intelligence College, Joint Military
Intelligence College, Washington, DC, p 63.
this issue when he observes that '[a]n analyst can change an opinion based on new
information or by revisiting old information with a new hypothesis; in so doing,
however, he or she perceives a loss of trust and respect among those with whom the
original judgement was shared. Along with this perceived loss of trust, the analyst
senses a loss of social capital, or power, within his or her group.'9
As analysts, we need to make judgements based on available information and have
the common sense to change our judgements in response to new information. The
recommendation is to follow the facts, not our ego. While initially uncomfortable,
it is worthwhile for an analyst to develop a reputation for following the information
where it leads rather than maintaining a position in the face of information
indicating alternative explanations. As we do, people will then develop trust in
our assessments. Given the unpredictability of human behaviour, the constant flow
of new information, and the uncertainty of conflict situations, we should expect
assessments to change as our understanding of a situation improves.
Dissent
The importance of considering alternative views and contrary opinions is regularly
lauded within intelligence analysis. In practice, this should take the form of analysts
discussing and debating genuinely held differing opinions on a problem (not simply
being disagreeable). Considering dissenting and differing opinions is a valuable
means of considering a broader range of perspectives and judgements than might
otherwise be identified. Indeed, the desire for group consensus is an important
factor in poor group decisions (groupthink); while group consensus is perceived as
a success, it is often indicative of a failure to consider alternatives or the negative
aspects of the group's position.10 A willingness to consider a diverse range of
perspectives is an advantage for a team presented with a difficult problem. The
performance of problem-solving groups has been shown to improve where group
members had independently thought of correct solutions to a problem, rather than
considering solutions only within a group setting.11
12 ibid, p 184.
13 For example, refer to Schulz-Hardt et al, 2006, Group decision making in hidden profile
situations: dissent as a facilitator for decision quality, Journal of Personality and Social Psychology,
vol 91, no 6, December, pp 1080–1093.
Chapter 11
Critical Thinking
critical thinking is ultimately about the process of forming well-justified
true conclusions (i.e., knowledge).
Noel Hendrickson1
Much of this book has been about critical thinking. We have looked at ways to
develop considered and robust judgements as well as being conscious of how we
are approaching problems. If we define critical thinking as purposeful reflective
judgement, we can think about cognitive skills that critical thinkers display, such as
evaluation, interpretation, inference, analysis, explanation, and self-regulation (ie
evaluating how we are making judgements).2 There are a number of reasons why
adopting a critical thinking approach to intelligence analysis is important. These
include:
Facione, Peter A, 2011, Critical Thinking: What It Is and Why It Counts, Insight Assessment,
San Jose, CA, viewed 4 September 2014, <www.insightassessment.com/CT-Resources/Critical-Thinking-What-It-Is-and-Why-It-Counts>.
Intelligence analysis provides the basis for important and potentially wide-reaching
decisions. This is why we need to employ a considered, critical and defensible
approach. One way of adopting a critical approach is by using formal methods,
which encourage us to think critically about our thinking, our judgements, and
ultimately our assessments.
Formal Methods
Structured analytic techniques can mitigate some of the human cognitive
limitations, side-step some of the well-known analytic pitfalls, and explicitly
confront the problems associated with unquestioned assumptions and
mental models. They can ensure that assumptions, preconceptions, and
mental models are not taken for granted but are explicitly examined and
tested.
Richards J Heuer, Jr & Randolph H Pherson3
Formal methods, or structured analytical techniques, are analytical methods that
can be consistently applied to a variety of problems. Most formal methods are
not unique to intelligence analysis, but have been developed in other fields and
adopted by analysts. Many of these techniques broadly fall under the field of Operations
Analysis or Operations Research, in particular 'soft' Operations Research or
qualitative techniques, in contrast to 'hard' Operations Research or quantitative
techniques. One benefit of adopting formal methods is that many have already been
applied to problems in other fields, with their usefulness evaluated, peer reviewed
and results formally published.
Formal methods do not guarantee that by using them we will produce the
'right' answer. Remember, we are often dealing with complex, uncertain and
changing situations, usually involving people. It might not matter how good our
analysis is; things can happen that are quite simply unforeseeable. So if formal
methods cannot guarantee success, then why would we choose to use them?
Graphically, this question is answered below (Figure 11.1).
Heuer, Richards J, Jr & Pherson, Randolph H, 2011, Structured Analytic Techniques For Intelligence
Analysis, CQ Press, Washington, DC, p 24.
Figure 11.1: Benefits of formal methods (tested methods; provide structure; make thinking explicit; make thinking purposeful and considered; reveal biases; reveal assumptions; traceable conclusions; repeatable; add credibility)
Petersen, Martin, March 2011, What I learned in 40 years of doing intelligence analysis for US
foreign policymakers, Studies in Intelligence, vol 55, no 1, p 17.
Treverton, Gregory F, 2005, Foreword, in Johnston, Dr Rob, Analytic Culture in the U.S.
Intelligence Community: An Ethnographic Study, The Center for the Study of Intelligence,
Washington, DC, p xi.
The formal methods included in this book are ones that can be used individually
or as part of a small team, require limited resources (often just a pen and paper),
and can be used quickly. Five formal methods have been included in this book:
Backcasting (described in Chapter 4), Mind Mapping (described in Chapter 8),
Nominal Group Technique, Pre-Mortem Analysis, and Indicators of Change, with
the last three described here.
Delbecq, AL & Van de Ven, AH, 1971, A group process model for problem identification and
program planning, Journal of Applied Behavioral Science, vol 7, no 4, July, pp 466–492.
Kohn, Nicholas W & Smith, Steven M, 2011, Collaborative fixation: effects of others' ideas on
brainstorming, Applied Cognitive Psychology, vol 25, no 3, p 359.
ibid, pp 359–371.
Pre-Mortem Analysis
Most people are aware of what a post-mortem is: a medical examination of a body
to identify a cause of death. Pre-Mortem Analysis is an approach developed by
Gary Klein for considering why something might go wrong before the event rather
than after it. A pre-mortem starts with the assumption that a project has failed
and asks people to 'identify plausible reasons to explain why it did go wrong'.9
The method assists in proactively identifying potential causes of failure before
they happen and encourages people to develop and implement strategies to avoid
such a failure. Pre-Mortem Analysis is a useful technique for intelligence analysts
to consider ways in which our assessments might turn out to be wrong, and has
been recommended by a number of authors.10 Analysts would use a pre-mortem
when considering their analysis or assessments of a future situation. Analysts
would assume that the analysis was wrong and identify plausible reasons why. In
conducting a pre-mortem within a group setting, Nominal Group Technique is the
recommended approach. Again, this ensures that everyones input is collected and
as many reasons as possible are identified. Pre-Mortem Analysis is a particularly
useful tool for considering how threat assessments might be incorrect. For
example, where analysts believe that the level of threat to personnel is 'Low', a
pre-mortem approach starts with the assumption that the threat level was actually
'High', and seeks to identify reasons why the threat level changed so dramatically.
Reasons for a dramatic change in threat within a country might include terrorism,
natural disasters, health issues (eg severe acute respiratory syndrome (SARS)),
crime, a breakdown in law and order, specific targeting of personnel, civil war, or
conventional war. If done well, the results of a pre-mortem can be used as the basis
for another formal method, developing Indicators and Warnings.
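As an illustration only (the helper and its names are hypothetical, not part of the published method), a pre-mortem run with Nominal Group Technique-style silent inputs could be captured along these lines:

    from collections import Counter

    def pre_mortem(silent_inputs):
        # Assume the assessment failed; pool and rank the reasons why.
        # One vote per analyst per reason, collected before any discussion,
        # in the spirit of Nominal Group Technique.
        tally = Counter()
        for reasons in silent_inputs.values():
            tally.update(set(r.strip().lower() for r in reasons))
        return tally.most_common()

    inputs = {
        "analyst_a": ["specific targeting of personnel", "breakdown in law and order"],
        "analyst_b": ["terrorism", "breakdown in law and order"],
        "analyst_c": ["terrorism", "natural disaster"],
    }
    for reason, votes in pre_mortem(inputs):
        print(votes, reason)

The ranked output gives a starting point for the Indicators and Warnings work mentioned above.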
Klein, Gary, 1998, Sources of Power: How People Make Decisions, MIT Press, Cambridge, MA,
pp 71–72.
10 For example, Heuer & Pherson, Structured Analytic Techniques For Intelligence Analysis; and
Hall, Wayne Michael & Citrenbaum, Gary, 2010, Intelligence Analysis: How to Think in Complex
Environments, Praeger Security International, Santa Barbara, CA.
11 Ministry of Defence, 2013, Red Teaming Guide, Second Edition, Development, Concepts and
Doctrine Centre, Shrivenham, p A-15.
12 ibid, pp A-15 & A-16.
13 Heuer & Pherson, Structured Analytic Techniques For Intelligence Analysis, p 136.
14 Ministry of Defence, Red Teaming Guide, p A-15.
Chapter 12
Warnings over the pitfalls of mindsets and cognitive biases frequently appear in
discussions about intelligence analysis. By highlighting mindsets and common
biases, the aim is to make us more aware of them and minimise their negative
impacts. Mindsets and biases are also reflected in arguments. While arguments can
be useful in ensuring that our analysis is considered and robust, some arguments
can actually hinder clear thinking.
Mindsets
There is no such thing as context-free decision making. All judgments and
decisions rest on the way we see and interpret the world.
Scott Plous1
Mindset is a term frequently used within intelligence analysis. Dictionary
definitions of mindset include 'a mental attitude determining how we interpret
situations', 'a fixed mental attitude', 'fixed state of mind' and 'ideas and attitudes with
which a person approaches a situation'. A mindset relates to how we think, in particular
how we see and interpret the world. Mindsets are normal; we all use them to enable
us to incorporate new information, learn from experiences and adapt to situations.
However, our mindsets can also hinder us. People have a tendency to perceive what
they expect to perceive; these expectations subconsciously tell us what to look for,
what is important, and how to interpret what is seen.2
Plous, Scott, 1993, The Psychology of Judgment and Decision Making, McGraw-Hill, New York,
NY, p 13.
Heuer, Richards J, Jr, 2005, Limits of intelligence analysis, Orbis, vol 49, no 1, Winter, p 80.
The way that we immediately perceive and interpret the actions and behaviours
of a potential adversary reflects our mindset. While accepting that we do things
by accident, we can tend to view every action of an adversary as deliberate,
calculated and intentional; we interpret every blink as a wink.3 For example, if
a country decides to purchase a new fleet of fighter aircraft, our own immediate
interpretation of these actions will likely reflect our mindset. We could see these
actions as evidence that they are preparing for war. However, there are numerous
possible explanations, not all of which are sinister, including: the country actually
feels threatened by their neighbours; they are updating ageing fighters; they
want to improve their interoperability with other countries; they have
been offered the aircraft at a great price (too good to refuse); or they have
been meaning to do it for a while but only now have the finances. The breadth of
plausible explanations shows why we need to maintain a questioning attitude if we
are to accurately assess the situation. While we cannot get away from mindsets, it
does not mean that we accept them unquestioningly.
Cognitive Biases
Another concept frequently discussed within the intelligence literature is that of
biases, specifically biases in the way that we think (cognitive biases). Each of us will
have been exposed to different types of cognitive biases in our lives, even fallen
into a few of them ourselves. Consequently, the kinds of biases that we have already
experienced are similar (if not identical) to those we encounter within intelligence
analysis.
We can think of cognitive biases as inaccuracies or misinterpretations in the way
we remember, think about, interpret or explain things. These can be things that
have already happened, are happening, or we thought were going to happen. It
might be when we were overconfident about a decision, relied on somebody else's
opinion, or simply made an assumption. By considering common cognitive biases,
we can better understand how we can go wrong, how others can go wrong, and
what we should deliberately look out for. Analysts are not alone in being subject
to cognitive bias. Those reading our analysis, including military commanders,
operators, and policymakers are all vulnerable to cognitive biases.
Research into the types of cognitive biases, and their frequency, within intelligence
analysis is ongoing. Even existing research has at times produced mixed results
Kurtz, Cynthia F & Snowden, David J, 2003, The new dynamics of strategy: sense-making in a
complex and complicated world, IBM Systems Journal, vol 42, no 3, p 463.
on the types of biases that analysts within the field actually display.4 So what is
included here is a list of commonly discussed biases that people are known to display
and that are, therefore, likely to be evident within intelligence analysis to some degree.
For example, refer to the discussion in Puvathingal, Bess J & Hantula, Donald A, 2012, Revisiting
the psychology of intelligence analysis: from rational actors to adaptive thinkers, American
Psychologist, vol 67, no 3, April, pp 199210.
Heuer, Richards J, Jr, 1999, Psychology of Intelligence Analysis, Center for the Study of Intelligence,
Washington, DC, p 44.
Risen, Jane & Gilovich, Thomas, 2007, Informal logical fallacies, in Sternberg, Robert J,
Roediger III, Henry L & Halpern, Diane F (eds), Critical Thinking in Psychology, Cambridge
University Press, Cambridge, pp 112–113.
Taleb, Nassim Nicholas, 2010, The Black Swan: The Impact of the Highly Improbable, Penguin
Books, London, pp 63–64 & 79.
While our biases are not limited to these, those described here are some of the
most commonly referred to and provide a starting point for being aware of how our
thinking can be flawed.
Arguments
Analysts encounter arguments on a daily basis over just about every aspect of
analysis: the accuracy of a hypothesis, evaluation of evidence, the rationale for
Dörner, Dietrich, 1989, The Logic of Failure: Recognizing and Avoiding Error in Complex
Situations, Perseus Books, Cambridge, MA, p 188.
10 Baron, Jonathan & Hershey, John C, 1988, Outcome bias in decision evaluation, Journal of
Personality and Social Psychology, vol 54, no 4, April, pp 569–579.
11 Charles Allen, quoted in Cooper, Jeffrey R, 2005, Curing Analytic Pathologies: Pathways to
Improved Intelligence Analysis, Center for the Study of Intelligence, Washington, DC, p 38.
12 Kahneman, Daniel, 2011, Thinking, Fast and Slow, Penguin Books, London, Chapter 12.
assessments, and even the best way to present findings. Arguments are valuable for
encouraging us to consider alternative perspectives and explanations, and arrive at
reasoned judgements. Mostly, arguments are a give-and-take of ideas in which we
are forced to consider, justify and even reconsider our assumptions, perceptions
and conclusions. However, not all forms of argument are helpful or useful in
achieving better analysis.
Poor arguments, or as we could label them 'conversation killers', are those arguments
that avoid the actual assessment, judgement or logical process. Instead, they focus
on other aspects that actually have little or nothing to do with the judgement
itself. These conversation killers are used to shut down debate, ignore alternatives
and encourage poorly reasoned judgements. We will no doubt have already heard
many of these and, unfortunately, have actually used some of them ourselves. By
recognising these conversation killers, the intention is that we will try to avoid
these in future and focus on the actual issues within the argument, not on the
peripherals. As difficult as it is, we need to recognise that a good judgement can
survive critique, questioning and challenges.
The following arguments are usually designed to kill a conversation instead of
opening up discussion or encouraging alternative ideas. We can see that they are
not legitimate ways of arguing, and some possible counters to them are included.
The first three conversation killers are particularly evident in public discourse.
Indeed, watching debates and arguments in the media is a useful way to learn to
recognise these. These three poor arguments are:
Stating that an argument is simplistic. This tactic again avoids the core
argument but tries to undermine the argument based on simplicity, as
nobody wants to be perceived as simple or stupid. Certainly, as discussed
in biases, people can oversimplify a problem. However, attacking an
argument as simple (which can be an entirely subjective term) still avoids
the substance of the argument itself.
Default to experience. This argument avoids the core issue and defaults
to somebody's experience as having the final say. This is apparent in
statements such as 'In my experience …', 'When I was …', and 'That didn't
happen when we …' (ie so it will not happen now). The problem is that
experience does not necessarily equate to accurate perception. One
legitimate question is whether previous experiences are actually
relevant to the problem at hand. Additionally, relying on experience
means that we need a lot of it, we need to have learnt the right lessons
from it, and the lessons should remain relevant.
Flat out denial. This is where an argument is rejected entirely out of hand.
This is apparent in statements like 'You haven't convinced me', 'You're
wrong' and 'That won't happen', particularly where there is no attempt
at discussing which points the person disagrees with. Nevertheless, an
assessment can be accurate regardless of whether or not it is believed by
those hearing or reading it. For example, a person may not accept that the
earth is round, but their lack of belief does not mean that the world will
change to fit their perceptions.
What would they know. This is obviously attacking the person, not the
idea, assessment or argument. You will note that while this should be a
question (opening up discussion), it is more often used as a statement
(used to shut down debate). Again, this fails to address the strength of an
argument itself and ignores that an argument does not rest on the person
delivering it but on the logic and validity of the premises and conclusions.
The mysterious 'it'. This relates to the offhanded remark often used to
entirely disregard an individual's, group's or organisation's assessment with
the statement 'they just don't get it'. This is similar to attacking the person,
and does not relate to the argument but focuses on the person or group
making the judgement. When we hear this statement, we can try getting
the person to explain what 'it' is. Too often, they are unable to produce
a coherent description and 'it' turns out to be something other than the
analysis itself.
Chapter 13
Unfinished Intelligence
Deferred Judgement
We commonly begin our analysis of a problem by formulating our
conclusions; we thus start at what should be the end of the analytic process.
Morgan D. Jones1
As difficult as it is for all of us, deferring our judgement is an important skill to
practise. It is easy for us to immediately come up with a theory or a conclusion
when presented with a problem to analyse. The shortcoming of this approach
is that we fall into the attitude of 'We've got our answer, now we just need to
prove it'. Instead, rather than jumping to a conclusion, the better approach is to
try to maintain an open mind, consider multiple possibilities, and see where the
information takes us. This is easier said than done, but there are consequences
to making up our mind too quickly. We have already spoken about confirmation
bias where we make the information fit into our first theory or conclusion, and
filter out information that contradicts our theory. This is similar to the concept of
'seizing and freezing', in which we seize on an initial conclusion and freeze out all
contradictory information.2
Once we have committed to a position, it appears to be very difficult to shift from
this and it requires far more evidence to move us from our initial judgement. As
Jones, Morgan D, 1998, The Thinker's Toolkit: 14 Powerful Techniques for Problem Solving, Three
Rivers Press, New York, NY, p 11.
Arie Kruglanski and Donna Webster, discussed in Arkes, Hal R & Kajdasz, James, 2011, Intuitive
theories of behavior, in Fischhoff, Baruch & Chauvin, Cherie (eds), Intelligence Analysis:
Behavioral and Social Scientific Foundations, The National Academies Press, Washington, DC,
pp 154–155.
has been argued, '[p]eople form impressions on the basis of very little information,
but once formed, they do not reject or change them unless they obtain rather solid
evidence'.3 We can also become emotionally invested in our initial judgements if
we perceive a loss of face in changing our position.4 Instead, we might think of
a comparison with weather forecasting and sports betting: forecasters in both of
these fields constantly update and amend their forecasts and odds in light of new
information. Even then, they can get it wrong.
Keeping an open mind is difficult, particularly in a time-compressed environment.
Additionally, this lack of closure makes us uncomfortable as we have to reconsider
what we might have already accepted.5 Nevertheless, if we are striving for
accuracy, then following where the information takes us appears far more logical
than determining where the information should lead. Perhaps the best approach to
keeping an open mind is to practise the attitude of asking 'What does this mean?'
before concluding 'This is what it means'.
Unfinished Intelligence
One of the terms regularly used in intelligence analysis is 'finished intelligence'.
Finished intelligence is the analytical product that we develop and distribute,
usually in the form of a report, assessment or brief. However, the idea of finished
intelligence potentially hinders rather than helps us. Why? Many of the situations
intelligence analysts deal with remain ongoing and unfinished long after the analysis
has been published.
Threats change, adapt and react to our own behaviours and tactics; people go on
making decisions and altering their behaviour; and history keeps on being made.
Because intelligence analysis deals with people-based problems, these situations
continue. It is our published assessments that become fixed and permanent, not
necessarily the situation. Consequently, even though we do need to commit to an
assessment, a better approach is to think of most intelligence analysis as unfinished.
By thinking of intelligence as unfinished, we are reminded of the need to rethink
and revisit our original (even published) assessments. Ultimately, we do not want
our personnel thinking that they are safe and secure if the situation on the
ground has changed.
Heuer, Richards J, Jr, 2005, Limits of intelligence analysis, Orbis, vol 49, no 1, Winter, pp 83–84.
Johnston, Dr Rob, 2005, Analytic Culture in the U.S. Intelligence Community: An Ethnographic
Study, The Center for the Study of Intelligence, Washington, DC, pp 22–24.
Peer Review
Within the scientific and academic communities, emphasis is placed on the peer
review of articles and papers prior to their acceptance for publication. The idea
behind peer review is that it provides an independent and impartial critique of
research and findings by other people within the field. By subjecting work to
external critique, the aim is to produce high-quality and credible research. Though
peer review has both strengths and weaknesses, the field of intelligence analysis
does not have a similarly ingrained culture of independent or external critique.
Some of this is due to secrecy and limits on access to information and analysis. The
problem is that these limits actually decrease the diversity of perspectives brought
to a particular situation or issue. Consequently, any opportunity for peer review and
critique by those removed from the immediate analysis should be encouraged.
Using a peer review approach prior to publishing analysis is worthwhile, even if
only to identify weaknesses in the analysis, unanswered questions or alternative
perspectives. Given the constraints on time and the organisational approval
processes, peer review is likely to be done by people we already know. Indeed, it
might be us proactively approaching peers to request feedback on our analysis. In
this case, a couple of guidelines are worth following. First, we need to identify peers
who will provide honest and constructive feedback, not people who will tell us what
they think we want to hear. Second, we should not set the scene for our reviewers
by over-explaining the product before they have read it; the analytical product
should be self-contained and speak for itself. Finally, when we get feedback, we
A number of authors make the argument for formal critique and after action reviews, including
Jeffrey R Cooper and Rob Johnston.
need to genuinely thank the person for their time and effort and actually listen to
what they have to say. While we might not accept all the feedback, or make all the
recommended changes (which is fine), finding a peer who will genuinely critique
our work is invaluable. Indeed, we might find ourselves being able to assist others
by providing feedback. Then we will appreciate both sides of peer review, receiving
and giving feedback.
Given how often we work in teams, the crew debrief or after action review approach
is a particularly useful technique for reviewing performance. These also align
with calls for intelligence analysis to adopt institutionalised self-examination and
systematic lessons-learned processes.8 In terms of analysis, such a process would
involve asking: What analysis was accurate? What was inaccurate? Did the
analysis meet the client's needs? Why or why not? And what can be incorporated
into future analysis? Such a review would also highlight how long the analysis took,
the scope and depth of research undertaken, and the total resources required.
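As a sketch of how such a review might be logged consistently from one task to the next (the field names are assumptions for illustration, not a prescribed format):

    from dataclasses import dataclass, field

    @dataclass
    class AnalysisReview:
        task: str
        accurate: list = field(default_factory=list)    # what analysis was accurate?
        inaccurate: list = field(default_factory=list)  # what was inaccurate?
        met_client_needs: bool = False                  # did it meet the client's needs?
        why: str = ""                                   # why or why not?
        lessons: list = field(default_factory=list)     # what to incorporate next time
        hours_taken: float = 0.0                        # how long the analysis took
        sources_consulted: int = 0                      # scope and depth of research

    review = AnalysisReview(
        task="Country X threat assessment",
        accurate=["overall threat level"],
        inaccurate=["timing of civil unrest"],
        met_client_needs=True,
        why="delivered before the planning deadline",
        lessons=["revisit indicators monthly"],
        hours_taken=12.0,
        sources_consulted=23,
    )

Even a record this simple makes the lessons-learned process systematic rather than ad hoc.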
Analytic Principles
The final topic of this book is the concept of analytical principles. The purpose is
to deliberately think about what principles we as analysts believe should guide our
behaviour. Research has indicated that professional standards and codes of ethics,
Cooper, Jeffrey R, 2005, Curing Analytic Pathologies: Pathways to Improved Intelligence Analysis,
Center for the Study of Intelligence, Washington, DC, pp 5–6 & 54.
Ariely, Dan, 2009, Predictably Irrational: The Hidden Forces that Shape Our Decisions, Harper,
London, pp 206–215.
10 Brei, William S, 2005, Getting intelligence right: the power of logical procedure, in Lightfoot,
James E (ed), Learning With Professionals: Selected Works from the Joint Military Intelligence
College, Joint Military Intelligence College, Washington, DC, p 51.
Chapter 14
Summary
The knowledge and methods described in this book are designed to meet the needs
of junior analysts entering the field and act as a reference for more experienced
analysts. The insights and techniques were included because they can
assist us in dealing with a diverse range of analytical problems, whether operating
as an individual analyst or as part of a team.
While intelligence analysis supports decision-makers, it is itself a form of decision-making. Intelligence analysis is a continual process of forming judgements based on
available information while dealing with inherent uncertainty. The problems facing
us as analysts are often complex, people-based and future-oriented, making them
challenging on a number of levels. Regardless of the inherent difficulty, intelligence
analysis is important as it provides the basis for significant decisions and actions.
This reinforces the need for us to ensure that our thinking is considered, rigorous,
defensible and, as much as possible, transparent.
This book provides practical approaches and insights to encourage us to think
critically about situations, recognise cognitive biases, consider alternatives, and be
able to justify our analytical judgements. In essence, it is about being able to make
reasoned and supported judgements. As analysts, we should be able to understand
and explain how and why we reached a certain conclusion, consider alternative
possibilities, and reconsider assessments in light of new information.
Key Points
Time. Be prepared, backcast a time line, stick to it and plan the task
realistically.
Evidence. What does the evidence support? And what would it take to
change my mind?
Cause and Effect. We can see patterns that do not exist and look for
explanations that are not there. Similar causes can produce different
effects.
Environments and social situations influence us, but we are not slaves to
them.
Formal methods aid critical thinking and make our thinking explicit and
transparent.
Mindsets. We cannot get away from them, but we can acknowledge them.
References
Andrew, Christopher, 2004, Intelligence, international relations and under-theorisation, Intelligence and National Security, vol 19, no 2, Summer, pp
170–184
Ariely, Dan, 2009, Predictably Irrational: The Hidden Forces that Shape Our
Decisions, Harper, London
Arkes, Hal R & Kajdasz, James, 2011, Intuitive theories of behavior, in Fischhoff,
Baruch & Chauvin, Cherie (eds), Intelligence Analysis: Behavioral and Social
Scientific Foundations, The National Academies Press, Washington, DC,
pp 143–168
Baggini, Julian & Fosl, Peter S, 2010, The Philosopher's Toolkit: A Compendium
of Philosophical Concepts and Methods, Second Edition, Wiley Blackwell,
Chichester
Baron, Jonathan & Hershey, John C, 1988, Outcome bias in decision evaluation,
Journal of Personality and Social Psychology, vol 54, no 4, April, pp 569–579
BBC Prison Study, viewed 4 September 2014, <www.bbcprisonstudy.org>
Ben-Israel, Isaac, 1989, Philosophy and methodology of intelligence: the logic of
estimate process, Intelligence and National Security, vol 4, no 4, pp 660–718
Betts, Richard K, 1978, Analysis, war and decision: why intelligence failures are
inevitable, World Politics, vol 31, no 1, October, pp 61–89
Brei, William S, 2005, Getting intelligence right: the power of logical procedure,
in Lightfoot, James E (ed), Learning With Professionals: Selected Works from
the Joint Military Intelligence College, Joint Military Intelligence College,
Washington, DC, pp 47–76
Browne, M Neil & Keeley, Stuart M, 2007, Asking the Right Questions: A Guide to
Critical Thinking, Eighth Edition, Pearson Prentice Hall, Upper Saddle River, NJ
Buzan, Tony, 1997, Use Your Head, BBC Books, London
Cooper, Jeffrey R, 2005, Curing Analytic Pathologies: Pathways to Improved
Intelligence Analysis, Center for the Study of Intelligence, Washington, DC
Daase, Christopher & Kessler, Oliver, 2007, Knowns and unknowns in the war
on terror: uncertainty and the political construction of danger, Security
Dialogue, vol 38, no 4, pp 411–434
Heuer, Richards J, Jr, 1999, Psychology of Intelligence Analysis, Center for the Study
of Intelligence, Washington, DC
Heuer, Richards J, Jr, 2005, Limits of intelligence analysis, Orbis, vol 49, no 1,
Winter, pp 75–94
Heuer, Richards J, Jr & Pherson, Randolph H, 2011, Structured Analytic Techniques
For Intelligence Analysis, CQ Press, Washington, DC
Hutchins, Susan G, Pirolli, Peter L & Card, Stuart K, 2007, What makes intelligence
analysis difficult?: a cognitive task analysis, in Hoffman, Robert R (ed),
Expertise out of Context, Lawrence Erlbaum Associates, New York, NY, pp
298–304
Johnson, Loch K, 2007, An introduction to the intelligence studies literature, in
Johnson, Loch K (ed), Strategic Intelligence Volume 1: Understanding the
Hidden Side of Government, Praeger Security International, Westport, CT
Johnston, Dr Rob, 2005, Analytic Culture in the U.S. Intelligence Community: An
Ethnographic Study, The Center for the Study of Intelligence, Washington, DC
Jonassen, David H, 2004, Learning to Solve Problems: An Instructional Design
Guide, Pfeiffer, San Francisco, CA
Jones, Morgan D, 1998, The Thinker's Toolkit: 14 Powerful Techniques for Problem
Solving, Three Rivers Press, New York, NY
Kahneman, Daniel & Klein, Gary, 2009, Conditions for intuitive expertise: a failure
to disagree, American Psychologist, vol 64, no 6, September, pp 515–526
Kahneman, Daniel, 2011, Thinking, Fast and Slow, Penguin Books, London
Kent, Sherman, 1964, Words of Estimative Probability, Center for the Study of
Intelligence, Washington, DC, viewed 4 September 2014, <https://www.cia.gov/library/center-for-the-study-of-intelligence/csi-publications/books-and-monographs/sherman-kent-and-the-board-of-national-estimates-collected-essays/6words.html>
Klein, Gary, 1998, Sources of Power: How People Make Decisions, MIT Press,
Cambridge, MA
Kohn, Nicholas W & Smith, Steven M, 2011, Collaborative fixation: effects of
others' ideas on brainstorming, Applied Cognitive Psychology, vol 25, no 3, pp
359–371
Kurtz, Cynthia F & Snowden, David J, 2003, The new dynamics of strategy: sense-making in a complex and complicated world, IBM Systems Journal, vol 42,
no 3, pp 462–483
Lanning, Lt Col (Ret) Michael Lee, 1996, Senseless Secrets: The Failures of
U.S. Military Intelligence, From George Washington to the Present, Carol
Publishing Group, Secaucus, NJ
Lefebvre, Stéphane, 2004, A look at intelligence analysis, International Journal of
Intelligence and CounterIntelligence, vol 17, no 2, pp 231–264
Levitt, Steven & Dubner, Stephen, 2009, SuperFreakonomics, Allen Lane, New
York, NY
Lewis, Major Dennis, 2004, Predictive Analysis: An Unnecessary Risk in the
Contemporary Operating Environment, US Army School of Advanced
Military Studies, Fort Leavenworth, KS
Luft, Joseph & Ingham, Harrington, 1970, The Johari window, a graphic model of
interpersonal awareness, in Luft, Joseph, Group Processes: An Introduction to
Group Dynamics, Second Edition, Mayfield Publishing, Palo Alto, CA
Ministry of Defence (UK), 2011, Joint Doctrine Publication (JDP) 2-00:
Understanding and Intelligence Support to Joint Operations, Third Edition,
Development, Concepts and Doctrine Centre, Shrivenham
Moore, David T, 2007, Critical Thinking and Intelligence Analysis, Occasional
Paper Number Fourteen, National Defense Intelligence College, Washington,
DC
Moore, David T, 2011, Sensemaking: A Structure for an Intelligence Revolution,
National Defense Intelligence College, Washington, DC
North Atlantic Treaty Organization, 2012, NATO guide for Judgement-Based
Operational Analysis in Defence Decision Making: Analyst-Oriented Volume:
Code of Best Practice for Soft Operational Analysis, NATO Research and
Technology Organization, Neuilly-sur-Seine Cedex
Office of the Director of National Intelligence, 2008, Analytic Transformation:
Unleashing the Potential of a Community of Analysts, Office of the Director of
National Intelligence, Washington, DC, September
Ohno, Taiichi, 1988, Toyota Production System: Beyond Large-Scale Production,
Productivity Press, Portland, OR
Omand, Sir David, 2009, The National Security Strategy: Implications for the UK
intelligence community, Institute for Public Policy Research, Discussion Paper,
February
Petersen, Martin, 1985, Managing/teaching new analysts, Studies in
Intelligence, vol 30, Fall, pp 1–9, viewed 4 September 2014, <http://www.nationalsecuritylaw.org/files/received/CIA/Petersen-Managing_Teaching_New_Analysts.pdf>
Petersen, Martin, 2011, What I learned in 40 years of doing intelligence analysis for
US foreign policymakers, Studies in Intelligence, vol 55, no 1, March, pp 13–20
Phythian, Mark, 2009, Intelligence analysis today and tomorrow, Security
Challenges, vol 5, no 1, Autumn, pp 67–83
Pidd, Michael, 2004, Tools for Thinking: Modelling in Management Science, Second
Edition, Wiley, Chichester
Prunckun, Hank, 2010, Handbook of Scientific Methods of Inquiry for Intelligence
Analysis, Scarecrow Press, Lanham, MD
Puvathingal, Bess J & Hantula, Donald A, 2012, Revisiting the psychology of
intelligence analysis: from rational actors to adaptive thinkers, American
Psychologist, vol 67, no 3, April, pp 199–210
Quiggin, Thomas, 2007, Seeing the Invisible: National Security Intelligence in an
Uncertain Age, World Scientific Publishing, Singapore
Schmitt, Major John F & Klein, Gary A, 1996, Fighting in the fog: dealing with
battlefield uncertainty, Marine Corps Gazette, vol 80, no 8, August, pp 62–69
Schulz-Hardt, Stefan, Brodbeck, Felix C, Mojzisch, Andreas, Kerschreiter, Rudolf
& Frey, Dieter, 2006, Group decision making in hidden profile situations:
dissent as a facilitator for decision quality, Journal of Personality and Social
Psychology, vol 91, no 6, December, pp 1080–1093
Shanteau, James, 1992, Competence in experts: the role of task characteristics,
Organizational Behavior and Human Decision Processes, vol 53, no 2, pp 252–266
Silver, Nate, 2012, The Signal and the Noise: Why So Many Predictions Fail But
Some Don't, Penguin Press, New York, NY
Simon, Herbert, 1992, What is an explanation of behaviour?, Psychological
Science, vol 3, no 3, May, pp 150–161
Zimbardo, Philip, 2011, The Lucifer Effect: How Good People Turn Evil, Random
House, New York, NY