Bootstrapping: Douglas Engelbart, Coevolution, and the Origins of Personal Computing
Ebook · 523 pages · 7 hours

About this ebook

Bootstrapping analyzes the genesis of personal computing from both technological and social perspectives, through a close study of the pathbreaking work of one researcher, Douglas Engelbart. In his lab at the Stanford Research Institute in the 1960s, Engelbart, along with a small team of researchers, developed some of the cornerstones of personal computing as we know it, including the mouse, the windowed user interface, and hypertext. Today, all these technologies are well known, even taken for granted, but the assumptions and motivations behind their invention are not. Bootstrapping establishes Douglas Engelbart's contribution through a detailed history of both the material and the symbolic constitution of his system's human-computer interface in the context of the computer research community in the United States in the 1960s and 1970s.

Engelbart felt that the complexity of many of the world's problems was becoming overwhelming, and the time for solving these problems was becoming shorter and shorter. What was needed, he determined, was a system that would augment human intelligence, co-transforming or co-evolving both humans and the machines they use. He sought a systematic way to think and organize this coevolution in an effort to discover a path on which a radical technological improvement could lead to a radical improvement in how to make people work effectively. What was involved in Engelbart's project was not just the invention of a computerized system that would enable humans, acting together, to manage complexity, but the invention of a new kind of human, "the user." What he ultimately envisioned was a "bootstrapping" process by which those who actually invented the hardware and software of this new system would simultaneously reinvent the human in a new form.

The book also offers a careful narrative of the collapse of Engelbart's laboratory at Stanford Research Institute, and the further translation of Engelbart's vision. It shows that Engelbart's ultimate goal of coevolution came to be translated in terms of technological progress and human adaptation to supposedly user-friendly technologies. At a time of the massive diffusion of the World Wide Web, Bootstrapping recalls the early experiments and original ideals that led to today's "information revolution."

Language: English
Release date: Dec 1, 2000
ISBN: 9781503618367

    Book preview

    Bootstrapping - Thierry Bardini

    WRITING SCIENCE

    EDITORS Timothy Lenoir and Hans Ulrich Gumbrecht

    BOOTSTRAPPING

    Douglas Engelbart, Coevolution, and the Origins of Personal Computing

    THIERRY BARDINI

    STANFORD UNIVERSITY PRESS

    STANFORD, CALIFORNIA

    Stanford University Press

    Stanford, California

    © 2000 by the Board of Trustees of the Leland Stanford Junior University

    Library of Congress Cataloging-in-Publication Data

    Bardini, Thierry.

    Bootstrapping : Douglas Engelbart, coevolution, and the origins of personal computing / Thierry Bardini.

    p. cm—(Writing science)

    Includes bibliographical references and index.

    ISBN 0-8047-3723-1 (alk. paper)—ISBN 0-8047-3871-8 (paper : alk. paper)— ISBN 9781503618367 (ebook)

    1. Microcomputers—History. 2. Human-computer interaction. 3. User interfaces (Computer systems). I. Title. II. Series.

    QA76.17 .B37 2000

    004.16'09—dc21

    00-056360

    This book is printed on acid-free, recycled paper.

    Original printing 2000

    Last figure below indicates year of this printing:

    09    08    07    06    05    04    03    02    01    00

    Printed in the United States of America

    CONTENTS

    Illustrations

    Preface

    Introduction: Douglas Engelbart’s Crusade for the Augmentation of Human Intellect

    1. Language and the Body

    2. The Chord Keyset and the QWERTY Keyboard

    3. The Invention of the Mouse

    4. Inventing the Virtual User

    5. SRI and the oN-Line System

    6. The Arrival of the Real User and the Beginning of the End

    7. Of Mice and Man: ARPANET, E-mail, and est

    Coda: Where Hand and Memory Can Meet Again

    Appendix: Personnel at Engelbart's SRI Lab

    Notes

    Works Cited

    Index

    ILLUSTRATIONS

    1-1. Engelbart’s Portrayal of the H-LAM/T System

    2-1. Engelbart’s Chord Keyset

    2-2. Cooke and Wheatstone’s Five-Needle Telegraph, 1837

    2-3. Francis’s Machine, 1857

    2-4. The Sholes, Glidden, and Soulé Machine of 1868

    2-5. Hughes’s Type-Printing Telegraph

    2-6. John Pratt’s Typewriter

    2-7. The Sholes and Glidden Machine of 1873

    2-8. The Remington Shift-Key Typewriter

    3-1. Memex, Front View

    3-2. Memex’s Twin Screens

    3-3. The Whirlwind Light Gun

    3-4. The RAND Tablet

    3-5. A Mathematical Representation of the Integral

    3-6. Maxwell’s Illustration

    3-7. The Operating Principle of the Planimeter

    3-8. Illustrations for the Original Patent of the Mouse

    4-1. The Pre–Cognitive Science View of the Interface

    4-2. The Mental-Models View of the Interface

    4-3. A Simple Model of the Interface, ca. 1989

    4-4. The Knee Control

    4-5. Front View of an Early Three-Button Mouse

    5-1. The Special Devices Channel

    5-2. The Herman Miller NLS Console

    6-1. The Network That Developed the Personal Interface

    6-2. The Flex Machine

    6-3. The Original Desktop, 1976

    C-1. The Genesis of the Personal Interface

    8 pages of photos

    PREFACE

    Instead of starting from the individuality of the technical object, or even from its specificity, which is very unstable, to try to define the laws of its genesis in the framework of this individuality or specificity, it is better to invert the problem: it is from the criterion of the genesis that we can define the individuality and the specificity of the technical object: the technical object is not this or that thing, given hic et nunc, but that which is generated.

    —GILBERT SIMONDON, Du mode d’existence des objets techniques

    How did the creators of personal computer technology envision those who would use it? How did they perceive the future of computers within the larger society? What technical options were included, or excluded, from the hardware, systems software, and applications on the basis of these representations? How have technical designs based upon the values and visions of early technical innovators shaped the way users integrate present-day computers into their work?

    To understand the answers to these questions, and with them, the origins of personal computing, it is necessary to begin by understanding the contributions of Douglas Engelbart and the concerns that motivated them. Famous and revered among his peers, Engelbart is one of the most misunderstood and perhaps least-known computer pioneers. This book proposes to remedy this, and not only for the sake of a case study or to claim a spot for Douglas Engelbart in the pantheon of the computer revolution, but also because such an enterprise teaches us many lessons in the development, diffusion, and effect of the defining technology of the twentieth century: the computer.

    This book is intended for various audiences and answers different expectations accordingly. A first type of reader will find in the book ample historical results about the genesis of personal computing and a definitive account of Engelbart’s research program at his lab, the Augmentation Research Center at the Stanford Research Institute. For this historically inclined reader, the interest of the book will stem from a well-documented thesis about the achievement and significance of the ARC laboratory that will go further than the published versions, which are generally characterized by a lack of theoretical focus. It shows how and why a significant part of what defines life as we now live it came into being.

    A second type of reader will consider the book as my contribution to ongoing debates in sociology of science and technology or communication. For this scholarly reader, the strength and value of the book also will reside in the way the case study is carried out in order to cast a new light, based on an informed multidisciplinary perspective, on the sociology of science and technology. For the communications scholar, this book is informed by the current debates on the future of communication technologies and audiences and proposes an innovative argument to answer the fundamental question of the relationship between technology and user.

    The research reported in this book started, oddly enough, with a report for the United Nations Food and Agriculture Organization (Rogers, Bardini, and Singhal 1992). I helped Dr. Everett M. Rogers advise this institution about the potential uses of microcomputers in the South. It resulted in more questions and frustrations than answers, and especially in one crucial question: what is a microcomputer?

    Terry Winograd (1990, 443) once said that we may sometimes forget how new it all is. . . . Not so long ago, there were just a few odd psychologists concerned with ‘human factors,’ who happened to study how people used computers. How new indeed: most current personal computer users do not realize that they have more computer power at their fingertips than NASA had to send a man on the moon. But if the computing technology is new, Winograd is right to say that the interest in its human users is even newer.

    The call for a better understanding of the human side of the computing technology, however, has been made repeatedly since the mid-1980’s. Jonathan Grudin, for instance, noted that the effectiveness of computers as agents in the world will increase in step with their greater understanding of us. For that reason, work to develop an understanding of people will remain at the very heart of the computer’s development. It is the engine of change (1990, 267). My work intends to develop such an understanding of people in connection to computers: not an understanding of the cognitive and physical processes of the individual user, but rather an understanding of users as collective entities emerging through time. It focuses on the history of the forms of agency in the human-computer interaction from a sociological perspective. It is a genealogy of the human-computer interface.

    The earliest notion of an interface comes from the Greek prosōpon, a face facing another face (Heim 1993, 78). The interpersonal and unmediated prosōpon remains the ultimate model of the computer interface, the dream of a transparent, unintrusive medium. In computer-mediated communication, the interface has progressively emerged as a fundamental notion. Michael Dertouzos once noted that when computers first appeared, input-output commands were minor afterthoughts to cohesive, often well-crafted and occasionally pretentious programming languages. Today, these commands occupy over 70 percent of programming systems’ instructions (1990, 1). Jonathan Grudin realized that the term user interface is technologically-centered and that in the engineering perspective that gave birth to this notion, "the equation of the ‘user interface’ to software and I/O devices means, ironically, that 'user interface' denotes the computer interface to the user . . . not the user's interface to the computer" (1993, 112, 115, emphasis in the original).

    In this book, I look at the emergence of the personal computer interface in both senses, not just as the emergence of a technology independent of those who develop it and those who are thought of as using it. The slow and sometimes painful process of imagining the personal computer was not just a process of technological innovation, independent of uses and users, as tends to be the norm in the historical accounts of the development of the computer. In its inception, as the career of Douglas Engelbart shows, the development of the personal computer interface was a technology by and about people.

    In more traditional accounts, the computer is first a batch-processing machine, an information-processing device that processes data, usually coded in punch cards, in huge batches. In a second phase, computing time is shared among users who can run specific tasks on the same computer simultaneously. In a third phase, each user has access to a dedicated stand-alone machine that sits on his or her desktop. And finally, the stand-alone workstations of the previous phase are connected into a network.

    Such a way to describe the evolution of computing focuses on the specific characteristics of the computer at a given time and usually puts the emphasis on a technological innovation that allowed the passage from one phase to the next: the time-sharing operating system, for example, the desktop metaphor of the human-computer interface, or packet switching network technologies.¹ While these innovations obviously contributed greatly to shaping the history of computing, the dynamic of personalization that characterizes the evolution of computing since the late 1940’s played an equally important role. I describe the progressive construction of the user as a person, or, what sometimes amounted to the same thing, how the computer eventually got a personality. The creators of personal computer technology linked their innovations to ideologies or representations that explained and justified their designs. Those visions have become invisible, latent assumptions to the latter-day users of the personal computer, even as they shape these users’ activities and attitudes. It is my task here to make them visible once again.

    Like Steve Shapin, and unlike some postmodernist and reflexivist friends, I write in the confident conviction that I am less interesting than the subjects I write ‘about,’ and accordingly, [that] much of this book can be read in the mode of old-fashioned historical realism (1995, xv). Everything presented as a citation in this book is real: somebody, whoever that is, told it to me in one way or another. I happen to have met most of the people I quote in this book, except the dead ones and the very famous others.

    The story I tell begins not long before I was born, and I finish it around the time I was twenty years old, long before I had any interest in what I am writing about now. Many coincidences made my writing possible, most of which were encounters with people. As a sociologist, my pleasure is to meet people, hear them, and then write about them. This book is a testimony to my respect for them. I wish for you to hear their voices.

    I am not an ironic, self-reflexive narrator, and my purpose is not to reveal the ideology of representation. These are very valuable means and purposes that deserve serious consideration, but these are not my means and purposes.² Susan Leigh Star is right: "power is about whose metaphor brings worlds together, and holds them here" (1991, 52, emphasis in the original). This book is about power and marginality: my own power and marginality, as well as the respective power and marginality of the actors I represent in this book. When you read my account, my power is much stronger than the power of these represented actors. I decide who speaks and when. My respect for the people and things I represent is paralleled by the most abject and nevertheless constitutive lack of respect: I can silence them, transform their meaning by misquoting them, and so on. I assume this responsibility, and "trying my best" only means that I commit myself not to enact such horrors intentionally. Remaining errors and other unintentional betrayals are my responsibility.

    In the remainder of the book, I will not write about power directly, which means that I will always write about powers. Indeed, this whole book is about powers: powers of the user, of the designer, of the analyst. In the framework I present here, power is dispersed into multiple sites, and all these dispersed powers are tied together in the same dynamic, mutually constitutive: the attempt to control the central uncertainty of innovative practices. In this process, the actors and the author are playing symmetrical parts, all collaborating in the same process of cultural production and diffusion.³

    My account is no more and no less definitive than the actors’. It is just different. I do not necessarily use the same resources, I do not necessarily intend my description to reach the same audience. I do not need to compare my narrative to the actors’ narratives and refer to an allusive metaposition. There is no metaposition, there are only discourses—and maybe authors behind them—who want us to believe that they have more depth, that they know better. I know for a fact that I do not know better. I am a priori not more or less shy, agnostic or else informed than the actors I represent. You will be the judge.

    This book is witness to the respect I owe to a few individuals: Gabriel Degert, of ENSA Montpellier, who gave me a taste of interdisciplinary scholarly work in the social sciences and nurtured my passion for sociology in a hostile environment; Rigas Arvanitis, of ORSTOM Caracas, who opened new directions of research for me and introduced me to the relativist sociology of science and technology; Everett M. Rogers, of the University of New Mexico, who gave me the opportunity to do my job in the United States, was always ready to discuss and listen, and opened to me the doors of Silicon Valley; James R. Taylor, who gave me the opportunity to teach and do research in the best conditions possible and, therefore, to tackle such a crazy project as writing this book; and, last but not least, Douglas Engelbart, of the Bootstrap Institute, who agreed to answer my questions and cheerfully helped me in writing this book.

    This book would not have existed without the patience and understanding of the people who told me their stories: Don Andrews, Bob Belleville, Peter Deutsch, Bill English, Charles Irby, Alan Kay, Butler Lampson, Harvey Lehtman, Ted Nelson, George Pake, Jeff Rulifson, Dave Smith, Robert Taylor, Keith Uncapher, Jacques Vallée, Smokey Wallace, and Jim Warren. Thank you all, and I sincerely hope that you will occasionally find your voice in these pages.

    My deepest thank-yous go to my development editor, Bud Bynack, who made a book out of my manuscript, and to my editor, Nathan MacBrien, who always knew how to keep his cool when I did not keep mine. Thank you both for believing in this book and making it live up to your belief. I also want to acknowledge the contribution of the following colleagues who greatly helped in the making of this book: Frank Biocca and the participants in the 1993 International Communication Association panel that he organized, Harry Collins and the participants in the 1995 Fourth Bath Quinquennial Workshop, Tim Lenoir and the participants in the 1995 Stanford Technology and Economics Workshop, Henry Lowood, Michael Century, Peter Salus, Michael Friedewald, John Staudenmaier, Line Grenier, Aude Dufresne, Jean-Claude Guédon, Serge Proulx, Patrice Flichy, August Horvath, Toru Nakayama, and Tom Valente. I also thank my graduate students for their (much needed) patience and support while I was writing this book and all the undergraduate students at the Department of Communication at the University of Montreal, on whom I shamelessly tried some of the ideas of this book.

    On an even more personal note, I would like to thank Lucy Ring, Riton V. and Maurice Dantec, my partners in crime, and Éric Le Ménédeu, mon peintre préféré, for their always wise advice, and Adriana Di Polo, for her support in the early stages of the project. Finally, thank you to Fabienne Lucet for standing by me when the book became one, through the tough times of revision and the glorious days, too, pour ton oreille patiente et ton sourire lumineux.

    Some of the chapters or sections of this book appeared previously elsewhere in a shortened version. The Introduction and Chapter 1 were presented at the Fourth Bath Quinquennial Science Studies Workshop, Bath, England, July 27-31, 1995. Chapters 2 and 3 appeared in French in Réseaux 87, January-February 1998. Chapter 4 appeared in the Journal of Computer Mediated Communication 3, no. 2 (September 1997), and in French in Réseaux 76 (summer 1996). Parts of Chapter 6 appeared in the Journal of Communication 45, no. 3 (summer 1995). I thank all the editors of these journals for granting me permission to reprint and update parts of these publications.

    Developments in Computer Technology, 1943–1964

    Developments in Computer Technology, 1969–1984 (Computers are shown above the line; software and components, below)

    INTRODUCTION

    Douglas Engelbart’s Crusade for the Augmentation of Human Intellect

    Journal entry 37. Thoughts of the Brain are experienced by us as arrangements and rearrangements—change—in a physical universe; but in fact it is really information and information processing that we substantialize. We do not merely see its thoughts as objects, but rather as movement, or, more precisely, the placement of objects: how they become linked to one another. But we cannot read the patterns of arrangement; we cannot extract the information in it—i.e., it as information, which is what it is. The linking and relinking of objects by the Brain is actually a language, but not a language like ours (since it is addressing itself and not someone or something outside itself).

    —PHILIP K. DICK, Valis

    Very few people outside the computer industry know Douglas Engelbart, the leading figure of the Augmentation of Human Intellect project, and among those people, many still credit him only with technological innovations like the mouse, the outline processor, the electronic-mail system, or, sometimes, the windowed user interface. These indeed are major innovations, and today they have become pervasive in the environments in which people work and play. But Douglas Engelbart never really gets credit for the larger contribution that he worked to create: an integrative and comprehensive framework that ties together the technological and social aspects of personal computing technology.

    Engelbart articulated a vision of the world in which these pervasive innovations are supposed to find their proper place. He and the other innovators of this new technology defined its future on the basis of their own aspirations and ideologies. Those aspirations included nothing less than the development, via the interface between computers and their users, of a new kind of person, one better equipped to deal with the increasing complexities of the modern world. In pursuing this vision, they created the conditions, both symbolic and material, that prescribe the possibilities and limits of the technology for the users of personal computer technology today.

    Engelbart and his associates conceived of the personal interface as a hybrid entity derived from both the human and nonhuman participants. That is, it was understood to operate by means of representations, symbolic and material, derived from both, some appearing electronically via integrated circuits and display screens, some deriving from the physical and mental abilities of the people that the designers of the technology imagined using and benefiting from them.¹ The story of the personal interface thus is twofold. It is the story of a technological innovation. At the same time, it is the story of how Douglas Engelbart and the other designers of that technology conceived of the people who would use it. That part of the story involves how they understood humans to live and work, to think and act. It also involves more than that: how they believed humans could do better at living and working, thinking and acting. Both aspects of the story meet in what Douglas Engelbart always called his crusade.

    In the 1950’s, computing technology was still in its early stages, characterized by massive machines devoted to number crunching. Input-and-output technology was rudimentary, only one user could access the computing power at a time, and a very few users indeed did so. The number of computers in use started to grow rapidly in the second half of the 1950’s, from a few hundred worldwide in 1955 to approximately five thousand in 1959 (Norberg and O’Neill 1996, 75). Two intertwined trends at that time slowly started to shape the future of computing. First, the increased use of computers by large governmental and business organizations (and with it, the inception of their cultural presence and prestige), was driven in part by the development of programs and languages that made it possible to take the computer out of the research laboratory and put it to use. Second, with the development of these programs and languages, a caste of professionals, the computer coders or programmers, started to appear. It was the time of the mainframe computer priesthood, who employed the power of the machine to serve the very large corporation or the military.

    In the first trend, programmers began to acquire increased control over their machines via new layers of code imposed over the fundamental machine code, which is made of strings of binary digits or bits (0 or 1). Even if programming had improved dramatically with the use of stored electronic programs, rather than the hardware plugboards of the 1940’s, the programs themselves still consisted of bits in machine code, and programming was exceedingly difficult, involving putting together long, unintuitive chains of 1s and 0s. By 1955, though, a relatively old idea, the use of metaprograms called assemblers and compilers, which worked as translators between some sort of more natural human language and machine code, finally became an idea whose time had come.

    The computer pioneers had thought of just such a possibility. Alan Turing, for instance, had developed a symbolic language close to an assembly language and had even written that actually one could communicate with these machines in any language provided it was an exact language, i.e. in principle one should be able to communicate in any symbolic logic, provided that the machines were given instruction tables which would enable it to interpret that logical system (quoted in Hodges 1992 [1983], 358). Indeed, the need for higher-level programs and languages was recognized from the mid-1940’s. BINAC and UNIVAC, two of the first digital computers, used short code, a system similar to Turing’s instruction tables in symbolic language, and Alick Glennie, a programmer for the Manchester Mark I computer, wrote in 1952 that to make it easy one must make coding comprehensible. Present notations have many disadvantages; all are incomprehensible to the novice, they are all different (one for each machine) and they are never easy to read. It is quite difficult to decipher coded programs even with notes and even if you yourself made the program several months ago (quoted in Lubar 1993, 359).

    The first compilers appeared with the first digital computers of the 1950’s: the A-0 compiler of the UNIVAC, and the compiler designed by J. Halcombe Laning and Neal Zierler for the Whirlwind computer in 1953 (Lubar 1993, 360).² But a major step in making the computer useful in the real world occurred between 1954 and 1957 with the development of the first widely diffused high-level programming language: IBM FORmula TRANslator, FORTRAN for short.³ FORTRAN was designed for scientific users and soon became the standard programming language for scientific applications. It was soon followed by another language specifically designed to become the standard for business applications: COmmon Business Oriented Language, or COBOL. Even if these two languages were much closer to natural human languages, they still had two significant drawbacks by 1960: they were linear languages, much more like a spoken language than a language for thought, and their community of users was still relatively small. What is more, it was giving some signs that it wanted to stay small. The computer priesthood proved unwilling to give up its privileged position between the computer and the supplicants who wanted to make use of its power.⁴

    Programming languages remained separate languages, still unnatural and arcane to average people. In the second trend, the programmers thus became a dedicated group of professionals, at ease with the difficulties of computer programming and the seemingly arcane nature of computer languages, a necessary intermediary between the computer’s power and its end users. In 1957, there were fifteen thousand of them in the United States alone. By then, most of them had a science or math background, often a Ph.D. in mathematics (Lubar, 361).

    At first, computers had been used by individual programmers who ran their programs one at a time in scheduled sessions. Because of the difficulties of the management of the computer peripherals (tape drives, card readers, card punches, printers), the computer processing unit was active only a small part of the time of the scheduled session (see Norberg and O’Neill 1996, 770). Along with the rise of computer languages and the community of computer programmers, the mid-1950’s saw two crucial innovations that dramatically changed this situation and the nature of computing: the development of operating systems and the use of batch processing. The first innovation took care of the intricacies of managing input-output devices, while the second was a more economical way to manage computer time. Both consolidated the central position of computer specialists in the way computers were put to practical use.

    The development of operating systems meant that a machine is run by an operating system, not by an [individual] operator (Fred Brooks, quoted in Lubar 1993, 367). And in batch processing, programs were run in batches, not, as previously, one at a time. So the individual user was displaced by computer operations departments that oversaw the running of the machine by the system software, with the computer and its attending priesthood usually isolated in a "computer center" far from the end users of the computer, whether they were able to program or not.

    Such was the state of computing at the beginning of the 1960’s, the era that brought revolutions—or attempts at revolutions—in many areas of life. Computing was to be one area where change was most dramatic. From an arcane exercise carried out by a close-knit community of specialists, computing became available to increasingly large segments of the population, an easily used means to a plethora of ends. How—and why—that revolution came about can best be seen by examining the career of one of its most important agents, Douglas Engelbart. Both because of Engelbart’s efforts and, in some ways, despite them, the personal computer is what it is today.

    FROM TECHNICIAN TO CRUSADER

    The Genesis of the Augmentation of Human Intellect

    Douglas C. Engelbart was born in Portland, Oregon, in 1925, the second of three children, to a couple of Scandinavian and German descent. His father was an electrical engineer and owned a radio shop there. He died when Douglas Engelbart was nine years old. This early loss shaped the young Engelbart’s personality in two different ways: the electrical engineering background of his father provided an early influence, even if the advent of the Depression did not exactly allow a strong relationship at this level, and, perhaps more importantly, the loss itself had repercussions for his sometimes vexed relationships with later sources of authority:

    I didn’t have any clarity on what I’d like to do, because my father, during the Depression, had to work very hard, and what I remember is his coming home and eating dinner and going back to finish repairing radios. . . . I was nine, when he died—that’s too young to die. . . . I realized some years later, when I got to college, that I must have been looking for that [some sort of role model after his father’s death]. I’d become quite disappointed midway through the first semester, in one professor or another, and finally started realizing I would like them to be the father I didn’t have, instead of being a professor. (Engelbart 1996)

    Douglas Engelbart was a bright child who sailed through most of his school years with no apparent difficulties. He graduated from high school and spent two years at Oregon State University in Corvallis, where, with higher education on a wartime footing, he was trained as a radar technician before he was drafted into the U.S. Navy. (He was still in high school at the time of Pearl Harbor.) The radar training he received in college and the navy proved to be central for the rest of his career and first triggered an absolute fascination in his young mind:

    I’d hear these rumors among the kids about this thing called radar, and that the navy had this program where they would train you by having you study, and you’d go behind closed fences and they’d take the books out of vaults and teach you and then search you when you left, and put the books back in the vaults. It all sounded so dramatic. Whatever the secret stuff, radar, was, it intrigued me. So I started saying, Well, I think I’ll sort of prepare, so that when I go into the service, maybe that’s what I can do. And that’s what I ended up doing. . . . It was pretty challenging to me to learn. . . . Without knowing the math and the physics underneath it, you could get a model for how it’s operating so then you could understand how to service it, and troubleshoot, and repair. I would usually be groping for a deeper understanding. . . . We were technicians. We weren’t aimed for being officers, we were a bunch of enlisted men being technicians. But it was challenging to learn that much, and put it together. (Engelbart 1996)

    Douglas Engelbart was in the navy from 1944 to 1946 and was stationed for a year at the Philippines Sea Frontier, in Manila Bay. (His ship had embarked on VJ day, August 13, 1945.) He did not have to fight, and learned his trade instead. After the war, he went back to Oregon State to finish his degree in electrical engineering. He received his Bachelor of Science diploma in 1948 and then took a job at the Ames Navy Research Center in Mountain View, California, where he stayed for three years, from 1948 to 1950. He was recruited as an electrical engineer in the electrical section, a service-and-support group that helped develop specifications. It was a mixture of maintenance and building, basically a line job in electrical engineering. He was not dealing with computers at all. As Engelbart puts it: I’ll tell you what a computer was in those days. It was an underpaid woman sitting with a hand calculator, and they’d have rooms full of them, and that’s how they’d get their computing done. So you’d say ‘What’s your job?’ ‘I’m a computer’ (Engelbart 1996).

    Even at that time, however, computing was no longer being limited to human computing. During the three years Engelbart spent at the Ames Research Laboratory, IBM assembled their SSEC electromechanical computer, which first ran a stored program in January 1948. Manchester University’s Mark I prototype ran the first fully electronic stored program on June 21 of the same year. Research was under way on the EDSAC (at Cambridge University), UNIVAC and BINAC (first at the Moore School, and then at John P. Eckert’s and John W. Mauchly’s Electronic Control Company), ILLIAC (at the University of Illinois), JOHNNIAC (at the RAND Corporation in Santa Monica, California), MANIAC (at Los Alamos Laboratories, New Mexico), and most importantly, WHIRLWIND (at MIT).

    In the United States, these computing projects were all connected to military research and development activities in one way or another. By 1948, the Office of Naval Research was funding 40 percent of all basic research, and most advances in computing research were classified.⁷ In August 1949, the Soviet Union exploded an atomic bomb, and the Cold War climate settled in, without any prospect for a thaw. Internally, the Committees on Un-American Activities of the House and Senate soon turned this climate into a general atmosphere of suspicion. All these conditions did not help diffuse a general awareness of computing’s state of the art.⁸

    Furthermore, Engelbart’s awareness of computing research was also limited by the fact that on the West Coast, this research was mostly concentrated at RAND. A part of the military-industrial complex, RAND (an acronym for Research and Development) was founded in 1946 as a joint venture between the U.S. Air Force and Douglas Aircraft. In 1948, it separated from Douglas and became an independent, nonprofit think tank devoted to studying techniques of air warfare. During the 1950’s, the air force funded RAND for approximately ten million dollars a year, and at its peak, in 1957, the institution had 2,605 employees (Edwards 1996, 165). It was definitely not in the business of disseminating research on computing.

    Flash

    In the narrative that Engelbart provides to describe how he decided to get involved in computing research, the actual vision of what he wanted to accomplish and even the understanding of how he wanted to accomplish it came in an instantaneous insight, yet at the same time was a complex development that encompassed most aspects of his personal and professional life at the time. Its result was his lifelong crusade. The narrative itself is a remarkable tale of an intuition produced almost as an act of will as the result of hard work. It is a tale with deep resonances in the American tradition of self-made technological innovators, from Edison, Bell, and the Wright Brothers onward, and beyond that, in the tradition of self-reliance and self-invention, from Ralph Waldo Emerson to Fitzgerald’s Jay Gatsby.

    As Engelbart describes this period, I was never the kind that would push everybody into talking about what I wanted to talk about. I guess I was looking around watching people and soaking it up (Engelbart 1996). He was opening himself to various professional and moral discourses, trying to figure out a set of personal goals for his life. There was a reason for this serious reflection. After three years of working at a steady job, Engelbart had become engaged, in December 1950, at the age of twenty-five:

    I can just remember one half-hour driving to work, one day after I got engaged; that was a turning point. . . . I had all this excitement, Oh, I’m engaged! And I was riding to work, and I said to myself, Well, let’s see, I’d better get my mind on work. What am I gonna do today? Oh, well, gee, that’s not terribly exciting. Is there anything this week that I can look forward to that’s in any way a little bit exciting? And suddenly I just realized that on ahead of me there were very nice people, it was a good place to work . . . but there was no excitement. I could have noticed that two years earlier, but being a bachelor, and busy trying to fill the rest of my life, I guess, it didn’t really dawn on me. But it just dawned on me then. . . . By the time I got to work, I had this realization that I didn’t have any more goals, and that getting married and living happily ever after was the last of my goals, which isn’t very surprising. The Depression kids were likely to grow up getting a steady job, and getting married and having a family, that’s about all. I was literally embarrassed to realize that. I was twenty-five. It was December 10th or 11th, 1950. For some reason, I just picked that as an explicit, conscious thing to do; I had to figure out a good set of professional goals. (Engelbart 1996)

    In many ways, Engelbart indeed can be seen as a representative of the generation of Depression kids, a generation born in adverse conditions and coming of age during and just after World War II. Not just for Engelbart, but for many Americans of his generation, getting a steady job, and getting married and having a family, were important, but it was equally important to them that they never agree that’s about all there is to life.

    The historical conditions of postwar America created a specific cultural background that framed issues of material well-being, money, power, and morals in general. These issues revolved around what to do with their lives—what ends to serve, and on what terms, at what cost. Both the New Deal and World War II had altered the American institutional landscape. As Richard Hofstadter noted, the Second World War, like the first, increased the need for experts, not only the sort the New Deal employed but also men from previously untapped fields of scholarship—even classicists and archeologists were suddenly thought important because of their knowledge of the Mediterranean area (1962, 421). But experts who were taken up into the matrices of large institutions and organizations thereby sacrificed the self-reliant autonomy that American culture so strongly valorized and that was epitomized by the free intellectuals of the Emersonian tradition. As Ross Evans Paulson has noted:

    The separation of the academics from the free intellectuals in the late 1930s and 1940s was accelerated by government assistance to higher education. . . . The balance of power in intellectual matters gradually shifted. The free intellectual became the outsider; academia swallowed the poet, the writer, the playwright, the philosopher. . . . A pervasive anti-intellectualism made the very notion of the free, unattached and critical individual seem somehow subversive. The free intellectual survived, if at all, as an exile, a supplicant for foundation grants and fellowships or as a foundation executive or expert.

    The children of the Depression and World War II, prematurely turned adults, returned to a civilian life fundamentally altered in its institutional spaces and social roles. Outsiders of all sorts, intellectual, artistic, and even technical, had to decide what ends to serve, and how much, or how little, to compromise their autonomy and self-identity in serving them. For many, the problem of what to do with their lives in this new situation was exacerbated by a sense that they faced a paradoxical situation in which, with the dawn of the nuclear age and the arms race, science and technology had been the key to winning what was starting to look to many like a Pyrrhic victory. However, the idealistic opening of a new era also was full of hopes, fears, and a sense of moral
