Week 11 Lesson 3
Compared with 20th-century weapons of mass destruction, 21st-century technologies are much more readily available to individuals or small groups, and having knowledge alone is sufficient to enable their deployment.

5.1. Scientist Critics
- Joy traces his worries to a discussion he had with Ray Kurzweil at a conference in 1998. He had read an early draft of Kurzweil’s The Age of Spiritual Machines: When Computers Exceed Human Intelligence and found it deeply disturbing.
- Subsequently, he encountered arguments by the Unabomber, Ted Kaczynski. Kaczynski argued that if machines do all of society’s work, as they inevitably will, then we can either:
a. let the machines make all the decisions; or
b. maintain human control over the machines.
- If we choose “a,” then we are at the mercy of our machines. It is not that we would give them control or that they would take control; rather, we might become so dependent on them that we would have to accept their commands. Needless to say, Joy doesn’t like this scenario.
- If we choose “b,” then control would be in the hands of an elite, and the masses would be unnecessary. In that case, the tiny elite would:
1. exterminate the masses;
2. reduce their birthrate so they slowly become extinct; or
3. become benevolent shepherds to the masses.
- The first two scenarios entail our extinction, but even the third option is bad. In this last scenario, the elite would fulfill all the physical and psychological needs of the masses while at the same time engineering the masses to sublimate their desire for power. In this case, the masses might be happy, but they wouldn’t be free.
- Joy finds these arguments both convincing and troubling. About this time, Joy read Hans Moravec’s book Robot: Mere Machine to Transcendent Mind, where he found predictions similar to Kurzweil’s. Joy was especially concerned by Moravec’s claim that technological superiors always defeat technological inferiors, as well as his claim that humans will become extinct as they merge with the robots. Disturbed, Joy consulted other computer scientists, who, for the most part, agreed with these predictions.
- In addition, Joy’s vision of the future presupposes that robots and humans will remain separate creatures, a view explicitly rejected by robotics expert Rodney Brooks and others. If Brooks is correct, humans will gradually incorporate technology into their own bodies, thus eliminating the situation that Joy envisions.

Joy’s dilemma and argument
● His worries focus on the transforming technologies of the 21st century: genetics, nanotechnology, and robotics (GNR). Joy claims that we will soon achieve the computing power necessary to implement some of the scenarios envisioned by Kurzweil and Moravec, but worries that we overestimate our design abilities. Such hubris may lead to disaster. Summarized below are some of the arguments from his article, along with his opponents’ responses.
● Joy’s big fish eat little fish argument quotes robotics pioneer Hans Moravec: “Biological species almost never survive encounters with superior competitors.” He suggests we will be driven to extinction by our superior robotic descendants.
● Joy’s vision of the future presupposes that robots and humans will remain separate creatures, a view explicitly rejected by robotics expert Rodney Brooks and others. Thus, we don’t know that robots will be the bigger fish, that they will eat us even if they are, or that there will even be distinct fishes.
● Joy’s mad scientist argument describes a molecular biologist who “constructs and disseminates a new and highly contagious plague that kills widely but selectively.”
● Joy’s lack of control argument focuses on the self-replicating nature of GNR. Self-replication amplifies the danger of GNR: “A bomb is blown up only once—but one bot can become many, and quickly get out of control.” First of all, bombs replicate; they just don’t replicate by themselves. Joy’s concern must be not with replication but with self-replication.
● Robotic self-replication appears to be out of our control, as compared with our own or other humans’ self-replication.
● Joy fears that robots might replicate and then enslave us.
● Joy may be correct that “uncontrolled self-replication in these newer technologies runs a risk of substantial damage in the physical world,” but so too does the “uncontrolled self-replication” of humans, their biological tendencies, their hatreds, and their ideologies.
● Joy’s easy access argument claims that 20th-century technologies (nuclear, biological, and chemical, or NBC) required access to rare “raw materials and highly protected information,” while 21st-century technologies “are widely within the reach of individuals or small groups.”
➢ This means that “knowledge alone will enable the use of them,” a phenomenon that Joy terms “knowledge-enabled mass destruction (KMD).”
● Joy’s technologies make things worse argument:
- As for genetic engineering, I know of no
reason—short of childish pleas not to play
God—to impede our increasing abilities to
perfect our bodies, eliminate disease, and
prevent deformity.
● As for nanotechnology, Joy eloquently writes of how “engines of creation” may transform into “engines of destruction.”
● Joy gives us no reason whatsoever to share his fear over the fact that, whereas NBC technologies had largely military uses and were developed by governments, GNR technologies have commercial uses and are being developed by corporations.
● Joy’s it’s never been this bad argument:
➢ “this is the first moment in the history of
our planet when any species by its
voluntary actions has become a danger
to itself.” Thus, humans are a greater
threat to themselves now than ever
before.
● A basic difficulty with Joy’s article is this: he
mistakenly accepts the notion that technology
rules people rather than the reverse. But if we
can control our technology, there is another
solution to our dilemmas. We can use our
technology to change ourselves: to make
ourselves more ethical, cautious, insightful, and
intelligent.
Aftermath
- After the publication of the article, Bill Joy suggested
assessing technologies to gauge their implicit dangers,
as well as having scientists refuse to work on
technologies that have the potential to cause harm.
- In the 15th anniversary issue of Wired in 2008, Lucas Graves’s article reported that genetics, nanotechnology, and robotics technologies had not reached the level that would make Bill Joy’s scenario come to pass.