Typology of Human Extinction Risks


The typology is organized as a grid. Rows give the main adverse factor of the catastrophe; columns give a possible timeline, pairing each era with the class of risk that emerges in it:

- 100,000 BC: Natural risks
- 20th century: Natural risks activated by human activity
- Beginning of the 21st century: Risks based on known technologies
- 2030: Risks based on new technologies
- 2050: Superintelligence as the final technology
- 3000: Remote and hypothetical risks

Row 1. Explosions (main adverse factor: giant energy release)

Natural risks:
- Big asteroid impact (~20 km in size)
- Supervolcano explosion; flood basalt event
- Gamma-ray burst; nearby supernova and its neutrino flux
- Solar flare or coronal mass ejection during a period of weak geomagnetic field
- Sun-grazing comet leading to a large solar flare
- Catastrophic degassing of the ocean
- Hypercane (disturbance of the stability of the atmosphere)
- Warming of the Earth's core by WIMPs (Rampino), followed by its catastrophic degassing or collapse
- Activation of the Galactic central black hole
- Relict black holes
- Collapse of the false vacuum

Natural risks activated by human activity:
- Accidental triggering of a natural catastrophe
- Runaway global warming (Venusian scenario)
- New Ice Age
- Disruption of the electric grid because of a solar flare

Risks based on known technologies (nuclear weapons):
- Nuclear war
- Artificial explosion of supervolcanoes
- Nuclear winter
- Artificial nuclear winter caused by explosions in forests
- Artificial runaway global warming caused by nuclear explosions on the Arctic shelf and the resulting methane release

Risks based on new technologies:
- Simple ways of creating nuclear weapons
- Accident at a collider: micro black hole, false vacuum decay, strangelet
- Catastrophic magma release due to crust penetration (Stevenson probe)
- Creation of a superbomb
- Artificial diversion of asteroids toward Earth using precise calculations of celestial mechanics ("space billiards")
- Artificial explosions in space

Superintelligence as the final technology:
- War with AI:
  - Different types of AI Friendliness
  - Several AIs fight each other for world dominance
  - Nuclear war against an AI, or against attempts to create one
  - Military drones rebel
- Unfriendly AI destroys humanity:
  - As a response to a perceived risk to the AI itself
  - To get resources for its goals (paperclip maximizer)
  - By realizing an incorrectly formulated Friendliness (smile maximizer)
  - Fatal error in a late-stage AI (AI halting)
- Failure of a control system

Remote and hypothetical risks:
- Thermonuclear detonation of large gas planets
- Artificial black holes
- Encounter with alien intelligence:
  - Downloading an alien AI via SETI
  - Extraterrestrials arrive on Earth
  - METI attracts dangerous extraterrestrials
  - Beings from other dimensions
  - We live in a simulation, and it is switched off or is built to model catastrophes

Row 2. Competitors (main adverse factor: intelligence)

Natural risks:
- Another type of humans (as Homo sapiens were for the Neanderthals)
- Another type of intelligent species

Impairment of intelligence (disjunction):
- Ignorance of and neglect toward existential risks
- Dangerous memes (fascism, fundamentalism)
- Cognitive biases
- Risky decisions (nuclear war)
- Tragedy of the commons (arms race)

Risks based on known technologies:
- Super-addictive drug (brain stimulation)
- Malfunction in a robotic army causes mass destruction
- Computer virus with narrow AI uses home robots and bioprinters
- Dangerous meme (like ISIS)
- GMO parasite that causes mass destructive behaviour
- Accidental nuclear war

Risks based on new technologies:
- Human enhancement leads to competing species

Row 3. Replication (main adverse factor: dangerous organisms)

Natural risks:
- Pandemic
- Super pest
- Dangerous predator
- Dangerous carrier (e.g., mosquitoes)
- Classical species extinction

Natural risks activated by human activity:
- Overpopulation followed by collapse
- Genetic degradation:
  - Reduction of the average level of intelligence in the population
  - Decline in fertility
  - Accumulation of deleterious mutations
- Antibiotic-resistant bacteria
- Ecology: accumulation of new toxins in the biosphere and its collapse
- Resource depletion

Risks based on known technologies (biological weapons):
- Dangerous pandemic (e.g., mutated flu)
- Mistake of biodefense (contaminated vaccination, SV40)
- Artificial pest
- Omnivorous bacteria and fungi
- Toxin-producing GMO organisms (dioxin, VX)
- Cancer cell line adapted to live in the population (HeLa)

Risks based on new technologies:
- Synthetic biology

Row 4. Poisoning (main adverse factor: the atmosphere)

- Atmospheric composition change (disappearance of oxygen)
- Release of toxic gases (volcanism, methane hydrates, dissolved gases in the ocean)
- Global contamination:
  - Deliberate chemical contamination (dioxin build-up and release)
  - Deliberate destruction of all nuclear reactors using nuclear weapons
  - Autocatalytic reaction like Ice-9; artificial prions

Row 5. Combined scenarios

- The confluence of a set of many adverse circumstances
- Changing environment and competitors
- Chain of natural disasters
- Epidemic followed by degradation and extinction
- Degradation of the biosphere's ability to regenerate, followed by the loss of technology and starvation
- Death of crops from a super pest, then hunger and war
- War as a trigger:
  - Nuclear war leads to the use or release of biological weapons
  - World war leads to an arms race and the creation of Doomsday machines
- System crisis caused by many factors

Probability:
- Natural risks: from 1 in 1,000,000 to 1 in 100,000 per year, based on historic risks and the survival times of other species
- Natural risks activated by human activity: small, since such risks are long-term; more dangerous risks from new technologies could materialize much earlier
The roadmap is based on the book Risks of Human Extinction in the 21st Century (Alexey Turchin and Michael Anissimov).
The most important risks are shown in bold; the most hypothetical are in italic. This roadmap is accompanied by another one about ways of preventing existential risks.
Alexey Turchin, 2015, GNU-like license. Free copying and design editing; discuss major updates with Alexey Turchin.
http://immortality-roadmap.com/ (last updated version available there)
Proofreading and advice: Michael Anissimov

Risks based on new technologies

Synthetic biology:
- Multipandemic based on genetic engineering (10 to 100 lethal viruses)
- Home bioprinters and biohackers
- Artificial carriers (e.g., flies)
- Green goo (the biosphere is eaten by artificial bioreplicators and viruses)
- Fast-replicating killers (poisonous locusts)

Nanotech:
- Grey goo (unlimited replication)
- Nanotech weapons (small robots)
- Universal nanoprinter (creates all other weapons)

Artificial global contamination (Doomsday machines):
- Cobalt superbomb with global contamination
- Destruction of the electric grid by nuclear explosions in the stratosphere
- Several Doomsday machines with contradictory triggering conditions
- Artificial penetration of the Earth's core with a reactor probe

Control systems failure:
- As a result of a civilizational crisis, a geoengineering system of thermoregulation fails and sharp global warming results (Seth Baum)
- Failure of a Bioshield or Nanoshield, which malfunctions and threatens humanity or the biosphere

Catastrophe on the level of mind and values:
- Humans are uploaded but become philosophical zombies without qualia
- Mass unemployment caused by automation leads to ennui and suicide
- A computer virus is used for a mass attack on widely implanted brain-interface chips
- Whimpers (values contamination): the value system of posthumanity moves away from ours
- Evil AI whose goal is to maximize the suffering of as many humans as possible (worse than "I Have No Mouth, and I Must Scream")

Remote and hypothetical risks

Extraterrestrial robots:
- ET nanorobot replicators
- ET berserker robots, which kill a civilization after it crosses a certain threshold

Unknown physical laws:
- Weapons based on them
- Unexpected results of experiments
- Dangerous natural phenomena
- End of the Universe: Big Rip, vacuum state transition
- Unknown unknowns

Complexity crisis:
- Unpredictability, chaos, and black swans lead to catastrophe

Catalysts of bad scenarios:
- A Cold War leads to the creation of an Unfriendly AI
- An asteroid impact causes an accidental nuclear war
- Scientists studying deadly viruses accidentally release them

Probability:
- Risks based on known technologies: 0.1% to 1% a year
- Risks based on new technologies: 10% to 30% in total
- Superintelligence: 50% in total
- Remote and hypothetical risks: small
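The roadmap mixes per-year figures (natural and known-technology risks) with total figures (new technologies, superintelligence). A per-year rate compounds over a horizon as 1 - (1 - p)^n. The sketch below, which assumes each year's risk is independent and constant (the rates are the roadmap's own estimates; the function name is mine), shows why the two kinds of figures differ so sharply:

```python
def cumulative_risk(annual_prob: float, years: int) -> float:
    """Probability of at least one catastrophe over `years` years,
    assuming an independent, constant annual probability."""
    return 1.0 - (1.0 - annual_prob) ** years

# Natural risks: the roadmap's 1/1,000,000 to 1/100,000 per year,
# accumulated over a century.
natural = (cumulative_risk(1e-6, 100), cumulative_risk(1e-5, 100))

# Risks from known technologies at the roadmap's 0.1% to 1% per year.
known_tech = (cumulative_risk(0.001, 100), cumulative_risk(0.01, 100))

print(f"natural risks over 100 years: {natural[0]:.4%} to {natural[1]:.4%}")
print(f"known-technology risks over 100 years: {known_tech[0]:.1%} to {known_tech[1]:.1%}")
```

Even a 1% annual risk compounds to roughly a 63% chance over a century, comparable to the flat totals the roadmap assigns to new technologies and superintelligence.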
