In Defense of a Liberal Education (W. W. Norton & Company, 2015)
In Defense of a Liberal Education
FAREED ZAKARIA
Contents
1: Coming to America
2: A Brief History of Liberal Education
3: Learning to Think
4: The Natural Aristocracy
5: Knowledge and Power
6: In Defense of Today’s Youth
Notes
Acknowledgments
In Defense of a
Liberal Education
1
Coming to America
IF YOU WANT to live a good life these days, you know what you’re
supposed to do. Get into college but then drop out. Spend your days
learning computer science and your nights coding. Start a technology
company and take it public. That’s the new American dream. If you’re not
quite that adventurous, you could major in electrical engineering.
What you are not supposed to do is study the liberal arts. Around the
world, the idea of a broad-based “liberal” education is closely tied to the
United States and its great universities and colleges. But in America itself, a
liberal education is out of favor. In an age defined by technology and
globalization, everyone is talking about skills-based learning. Politicians,
businesspeople, and even many educators see it as the only way for the
nation to stay competitive. They urge students to stop dreaming and start
thinking practically about the skills they will need in the workplace. An
open-ended exploration of knowledge is seen as a road to nowhere.
A classic liberal education has few defenders. Conservatives fume that it
is too, well, liberal (though the term has no partisan meaning). Liberals
worry it is too elitist. Students wonder what they would do with a degree in
psychology. And parents fear that it will cost them their life savings.
This growing unease is apparent in the numbers. As college enrollment
has grown in recent decades, the percentage of students majoring in
subjects like English and philosophy has declined sharply. In 1971, for
example, 7.6 percent of all bachelor’s degrees were awarded in English
language and literature. By 2012, that number had fallen to 3.0 percent.
During the same period, the percentage of business majors in the
undergraduate population rose from 13.7 to 20.5.
Some believe this pattern makes sense—that new entrants into higher
education might simply prefer job training to the liberal arts. Perhaps. But
in earlier periods of educational expansion, this was not the case. In the
1950s and 1960s, for instance, students saw college as more than a glorified
trade school. Newcomers, often from lower-middle-class backgrounds and
immigrant families with little education, enthusiastically embraced the
liberal arts. They saw it as a gateway to a career, and also as a way to
assimilate into American culture. “I have to speak absolutely perfect
English,” says Philip Roth’s character Alex Portnoy, the son of immigrants
and hero of the novel Portnoy’s Complaint. Majors like English and history
grew in popularity precisely during the decades of mass growth in
American higher education.
The great danger facing American higher education is not that too many
students are studying the liberal arts. Here are the data. In the 2011–12
academic year, 52 percent of American undergraduates were enrolled in
two-year or less-than-two-year colleges, and 48 percent were enrolled in
four-year institutions. At two-year colleges, the most popular area of study
was health professions and related sciences (23.3 percent). An additional
11.7 percent of students studied business, management, and marketing. At
four-year colleges, the pattern was the same. Business led the list of majors,
accounting for 18.9 percent of students, and health was second, accounting
for 13.4 percent. Another estimate found that only a third of all bachelor’s
degree recipients study fields that could be classified as the liberal arts. And
only about 1.8 percent of all undergraduates attend classic liberal arts
colleges like Amherst, Swarthmore, and Pomona.
As you can see, we do not have an oversupply of students studying
history, literature, philosophy, or physics and math for that matter. Most
students specialize in fields they see as directly related to the job market.
It’s true that more Americans need technical training, and all
Americans need greater scientific literacy. But the drumbeat of talk about
skills and jobs has not lured people into engineering and biology—not
everyone has the aptitude for science—so much as it has made them
nervously forsake the humanities and take courses in business and
communications. Many of these students might well have been better off
taking a richer, deeper set of courses in subjects they found fascinating—
and supplementing it, as we all should, with some basic knowledge of
computers and math. In any event, what is clear is that the gap in technical
training is not being caused by the small percentage of students who choose
four-year degrees in the liberal arts.
Whatever the facts, the assaults continue and have moved from the
realm of rhetoric to action. The governors of Texas, Florida, North Carolina,
and Wisconsin have announced that they do not intend to keep subsidizing
the liberal arts at state-funded universities. “Is it a vital interest of the state
to have more anthropologists?” Florida’s Rick Scott asked. “I don’t think
so.” Wisconsin is planning to cut money from subjects that don’t train
students for a specific job right out of college. “How many PhDs in
philosophy do I need to subsidize?” the radio show host William Bennett
asked North Carolina’s Patrick McCrory, a sentiment with which McCrory
enthusiastically agreed. (Ironically, Bennett himself has a PhD in
philosophy, which appears to have trained him well for his multiple careers
in government, media, nonprofits, and the private sector.)
It isn’t only Republicans on the offensive. Everyone’s eager to promote
the type of education that might lead directly to a job. In a speech in
January 2014, President Barack Obama said, “I promise you, folks can
make a lot more, potentially, with skilled manufacturing or the trades than
they might with an art history degree.” He later apologized for what he
described as a “glib” comment, but Obama has expressed similar sentiments
during his presidency. His concern—that in today’s world, college
graduates need to focus on the tools that will get them good jobs—is shared
by many liberals, as well as conservatives and independents. The
irrelevance of a liberal education is an idea that has achieved that rare status
in Washington: bipartisan agreement.
The attacks have an effect. There is today a loss of coherence and
purpose surrounding the idea of a liberal education. Its proponents are
defensive about its virtues, while its opponents are convinced that it is at
best an expensive luxury, at worst actively counterproductive. Does it really
make sense to study English in the age of apps?
In a sense, the question is un-American. For much of its history,
America was distinctive in providing an education to all that was not skills
based. In their comprehensive study of education, the Harvard economists
Claudia Goldin and Lawrence Katz note that, historically, Britain, France,
and Germany tested children at a young age, educated only a few, and put
them through a narrow program designed specifically to impart a set of
skills thought to be key to their professions. “The American system,” they
write, “can be characterized as open, forgiving, lacking universal standards,
and having an academic yet practical curriculum.” America did not embrace
the European model of specific training and apprenticeships because
Americans moved constantly, to new cities, counties, and territories in
search of new opportunities. They were not rooted in geographic locations
with long-established trades and guilds that offered the only path forward.
They were also part of an economy that was new and dynamic, so that
technology kept changing the nature of work and with it the requirements
for jobs. Few wanted to lock themselves into a single industry for life.
Finally, Goldin and Katz argue, while a general education was more
expensive than specialized training, the cost for the former was not paid by
students or their parents. The United States was the first country to publicly
fund mass, general education, first at the secondary-school level and then in
college. Even now, higher education in America is a much broader and
richer universe than anywhere else. Today a high school student can go to
one of fourteen hundred institutions in the United States that offer a
traditional bachelor’s degree, and another fifteen hundred with a more
limited course of study. Goldin and Katz point out that on a per capita basis,
Britain has only half as many undergraduate institutions and Germany just
one-third. Those who seek to reorient U.S. higher education into something
more focused and technical should keep in mind that they would be
abandoning what has been historically distinctive, even unique, in the
American approach to higher education.
And yet, I get it. I understand America’s current obsession. I grew up in
India in the 1960s and 1970s, when a skills-based education was seen as the
only path to a good career. Indians in those days had an almost mystical
faith in the power of technology. It had been embedded in the country’s
DNA since it gained independence in 1947. Jawaharlal Nehru, India’s first
prime minister, was fervent in his faith in big engineering projects. He
believed that India could move out of its economic backwardness only by
embracing technology, and he did everything he could during his fourteen
years in office to leave that stamp on the nation. A Fabian socialist, Nehru
had watched with admiration as the Soviet Union jump-started its economy
in just a few decades by following such a path. (Lenin once famously
remarked, “Communism is Soviet power plus the electrification of the
whole country.”) Nehru described India’s new hydroelectric dams as
“temples of the new age.”
I attended a private day school in Bombay (now Mumbai), the
Cathedral and John Connon School. When founded by British missionaries
in the Victorian era, the school had been imbued with a broad, humanistic
approach to education. It still had some of that outlook when I was there,
but the country’s mood was feverishly practical. The 1970s was a tough
decade everywhere economically, but especially in India. And though it was
a private school, the tuition was low, and Cathedral catered to a broad cross
section of the middle class. As a result, all my peers and their parents were
anxious about job prospects. The assumption made by almost everyone at
school was that engineering and medicine were the two best careers. The
real question was, which one would you pursue?
At age sixteen, we had to choose one of three academic streams:
science, commerce, or the humanities. We all took a set of board exams that
year—a remnant of the British educational model—that helped determine
our trajectory. In those days, the choices were obvious. The smart kids
would go into science, the rich kids would do commerce, and the girls
would take the humanities. (Obviously I’m exaggerating, but not by that
much.) Without giving the topic much thought, I streamed into the sciences.
At the end of twelfth grade, we took another set of exams. These were
the big ones. They determined our educational future, as we were reminded
again and again. Grades in school, class participation, extracurricular
projects, and teachers’ recommendations—all were deemed irrelevant
compared to the exam scores. Almost all colleges admitted students based
solely on these numbers. In fact, engineering colleges asked for scores in
only three subjects: physics, chemistry, and mathematics. Similarly, medical
schools would ask for results in just physics, chemistry, and biology. No
one cared what you got in English literature. The Indian Institutes of
Technology (IITs)—the most prestigious engineering colleges in the
country—narrowed the admissions criteria even further. They administered
their own entrance test, choosing applicants entirely on the basis of its
results.
The increased emphasis on technology and practicality in the 1970s was
in part due to domestic factors: inflation had soared, the economy had
slumped, and the private sector was crippled by nationalizations and
regulations. Another big shift, however, took place far from India’s borders.
Until the 1970s, the top British universities offered scholarships to bright
Indian students—a legacy of the raj. But as Britain went through its own
hellish economic times that decade—placed under formal receivership in
1976 by the International Monetary Fund—money for foreign scholarships
dried up. In an earlier era, some of the brightest graduates from India might
have gone on to Oxford, Cambridge, and the University of London. Without
outside money to pay for that education, they stayed home.
But culture follows power. As Britain’s economic decline made its
universities less attractive, colleges in the United States were rising in
wealth and ambition. At my school, people started to notice that American
universities had begun offering generous scholarships to foreign students.
And we soon began to hear from early trailblazers about the distinctly
American approach to learning. A friend from my neighborhood who had
gone to Cornell came back in the summers bursting with enthusiasm about
his time there. He told us of the incredible variety of courses that students
could take no matter what their major. He also told tales of the richness of
college life. I remember listening to him describe a film society at Cornell
that held screenings and discussions of classics by Ingmar Bergman and
Federico Fellini. I had never heard of Bergman or Fellini, but I was amazed
that watching movies was considered an integral part of higher education.
Could college really be that much fun?
My parents did not push me to specialize. My father had been deeply
interested in history and politics ever since he was a young boy. He had
been orphaned at a young age but managed to get financial assistance that
put him through high school and college. In 1944, he received a scholarship
to attend the University of London. He arrived during the worst of the
German bombardment, with V-2 rockets raining down on the city. On the long
boat ride to England, the crew told him he was crazy. One member even
asked, “Haven’t you read the newspapers? People are leaving London by
the thousands right now. Why would you go there?” But my father was
determined to get an education. History was his passion, and he worked
toward a PhD in that subject. But he needed a clearer path to a profession.
So, in addition, he obtained a law degree that would allow him to become a
barrister upon his return to Bombay.
Though my mother was raised in better circumstances, she also faced a
setback at a young age—her father died when she was eight. She briefly
attended a college unusual for India at the time—a liberal arts school in the
northern part of the country called the Isabella Thoburn College, founded in
1870 by an American Methodist missionary of that name. Though her
education was cut short when she returned home to look after her widowed
mother, my mother never forgot the place. She often fondly reminisced
about its broad and engaging curriculum.
My parents’ careers were varied and diverse. My father started out as a
lawyer before moving into politics and later founding a variety of colleges.
He also created a small manufacturing company (to pay the bills) and
always wrote books and essays. My mother began as a social worker and
then became a journalist, working for newspapers and magazines. (She
resigned from her last position in journalism last year, 2014, at the age of
seventy-eight.) Neither of them insisted on early specialization. In
retrospect, my parents must have worried about our future prospects—
everyone else was worried. But to our good fortune, they did not project
that particular anxiety on us.
My brother, Arshad, took the first big step. He was two years older than
I and fantastically accomplished academically. (He was also a very good
athlete, which made following in his footsteps challenging.) He had the
kind of scores on his board exams that would have easily placed him in the
top engineering programs in the country. Or he could have taken the IIT
exam, which he certainly would have aced. In fact, he decided not to do any
of that and instead applied to American universities. A couple of his friends
considered doing the same, but no one quite knew how the process worked.
We learned, for example, that applicants had to take something called the
Scholastic Aptitude Test, but we didn’t know much about it. (Remember,
this is 1980 in India. There was no Google. In fact, there was no color
television.) We found a pamphlet about the test at the United States
Information Service, the cultural branch of the U.S. embassy. It said that
because the SAT was an aptitude test, there was no need to study for it. So,
my brother didn’t. On the day the test was scheduled, he walked into the
makeshift exam center in Bombay, an almost empty room in one of the
local colleges, and took the test.
It’s difficult to convince people today how novel and risky an idea it
was at the time to apply to schools in the United States. The system was still
foreign and distant. People didn’t really know what it meant to get into a
good American university or how that would translate into a career in India.
The Harvard alumni in Bombay in the 1970s were by no means a “Who’s
Who” of the influential and wealthy. Rather, they were an eclectic mix of
people who either had spent time abroad (because their parents had foreign
postings) or had some connection to America. A few friends of ours had
ventured to the United States already, but because they hadn’t yet graduated
or looked for jobs, their experiences were of little guidance.
My brother had no idea if the admissions departments at American
colleges would understand the Indian system or know how to interpret his
report cards and recommendations. He also had no real Plan B. If he didn’t
take the slot offered by engineering schools, he wouldn’t be able to get back
in line the next year. In fact, things were so unclear to us that we didn’t even
realize American colleges required applications a full year in advance. As a
result, he involuntarily took a gap year between school and college, waiting
around to find out whether he got in anywhere.
As it happened, Arshad got in everywhere. He picked the top of the
heap—accepting a scholarship offer from Harvard. While we were all
thrilled and impressed, many friends remained apprehensive when told the
news. It sounded prestigious to say you were going to attend Harvard, but
would the education actually translate into a career?
My mother traveled to the United States to drop my brother off in the
fall of 1981, an uneasy time in American history. The mood was still more
1970s malaise than 1980s boom. The country was in the midst of the worst
recession since the Great Depression. Vietnam and Watergate had shattered
the nation’s confidence. In our minds, the Soviet Union was ascendant.
Riots, protests, and urban violence had turned American cities into
places of genuine danger. Our images of New York came from Charles
Bronson movies and news reports of crack and crime.
All of this was especially alarming to Indians. The country’s traditional
society had interpreted the 1960s and 1970s as a period of decay in
American culture, as young people became morally lax, self-indulgent,
permissive, and, perhaps most worrisome, rebellious. The idea that
American youth had become disrespectful toward their elders was utterly
unnerving to Indian parents. Most believed that any child who traveled to
the United States would quickly cast aside family, faith, and tradition for
sex, drugs, and rock and roll. If you sent your kids to America, you had to
brace yourselves for the prospect that you might “lose” them.
In his first few weeks abroad, Arshad was, probably like all newcomers
to Harvard, a bit nervous. My mother, on the other hand, returned from her
trip clear of any anxiety. She was enchanted with the United States, its
college campuses, and the undergraduate experience. She turned her
observations into an article for the Times of India titled “The Other
America.” In it, she described how concerned she had been before the trip
about permissiveness, drugs, and rebellion at American colleges. She then
went on to explain how impressed she was after actually spending time on a
campus to find that the place focused on education, hard work, and
extracurricular activities. The students she met were bright, motivated, and,
to her surprise, quite respectful. She met parents who were tearfully bidding
their children good-bye, talking about their next visit, or planning a
Thanksgiving reunion. “I feel I am in India,” she wrote. “Could this be the
heartless America where family ties have lost their hold?”
Indians had it all wrong about the United States, my mother continued.
She tried to explain why they read so much bad news about the country.
“America is an open society as no other. So they expose their ‘failings’ too
as no other,” she wrote. “[Americans] cheerfully join in the talk of their
own decline. But the decline is relative to America’s own previous strength.
It remains the world’s largest economy; it still disposes of the greatest
military might the world has known; refugees from terror still continue to
seek shelter in this land of immigrants. It spends millions of dollars in the
hope that someone, somewhere may make a valuable contribution to
knowledge. America remains the yardstick by which we judge America.”
As you can see, she was hooked.
In those years, it was fashionable in elite Indian circles to denounce the
United States for its imperialism and hegemony. During the Cold War, the
Indian government routinely sided with the Soviet Union. Indira Gandhi,
the populist prime minister, would often blame India’s troubles on the
“foreign hand,” a reference to the CIA. But my mother has always been
stubbornly pro-American. When my father was alive, he would sometimes
criticize America for its crimes and blunders, partly to needle my brother
and me and partly because, as one who had struggled for India’s
independence, he had absorbed the worldview of his closest allies, who
were all on the left. Yet my mother remained unmoved, completely
convinced that the United States was a land of amazing vitality and virtue.
(I suspect it’s what has helped her accept the fact that her sons chose the
country as their home.)
Along with photographs and information brochures from her trip, my
mother also brought back Harvard’s course book. For me, it was an
astonishing document. Instead of a thin pamphlet containing a dry list of
subjects, as one would find at Indian universities, it was a bulging volume
overflowing with ideas. It listed hundreds of classes in all kinds of fields.
And the course descriptions were written like advertisements—as if the
teachers wanted you to join them on an intellectual adventure. I read
through the book, amazed that students didn’t have to choose a major in
advance and that they could take poetry and physics and history and
economics. From eight thousand miles away, with little knowledge and no
experience, I was falling in love with the idea of a liberal education.
I decided to follow in my brother’s footsteps and didn’t pursue the
Indian options available to me. I took the SAT and wrote the required essays
and applications. If you had asked me why I was so determined to go to the
United States, I couldn’t have given you a coherent response. Indian
universities seemed limiting and limited. I thought about applying to British
universities, but I would have needed a scholarship and few existed. The
idea of “reading” just one subject at Oxford or a narrow set of subjects at
Cambridge seemed less interesting when compared with the dazzling array
of opportunities at the Ivy League schools. And, of course, there was the
allure of America.
I had always been fascinated by America. I had visited once as a
teenager, but most of my knowledge about the country came from
Hollywood. The Indian market was too poor and distant to get newly
released movies, but we watched the ones that did arrive, a few years
delayed—anything from The Poseidon Adventure to Kramer vs. Kramer—
as well as old classics, like the Laurel and Hardy comedies, which I loved.
Television arrived in the country in the mid-1970s, initially with just one
government-run black-and-white channel that mostly aired documentaries
on the glories of Indian agriculture. Every Sunday night, my family would
gather around the television set to watch the one unadulterated piece of
entertainment it would air, a Bollywood movie. Preceding that was a single
episode of I Love Lucy, presumably all that Indian television could afford to
import from the United States. Everyone watched it with pleasure, laughing
along with Lucy and her crazy family. To this day, I have a soft spot for that
show.
By the late 1970s, technology had begun to bring more of the West to
India. A few of my friends had video recorders, and after a while, so did we.
It was impossible to acquire actual copies of American movies and shows,
but we did get many bootlegged versions. Somewhere in the United States,
a relative would tape the latest television shows and send them to the family
back home. These bootlegged Betamax tapes would be passed around in
Bombay like samizdat publications in the Soviet Union.
The hottest show at the time was Dallas, which we all devoured. The
scenes during the opening credits were my window into the American
dream: shining shots of gleaming skyscrapers, helicopters landing in office
parks, men in ten-gallon hats getting in and out of cavernous Cadillacs. And
Victoria Principal—she was certainly part of my American dream.
Whatever the newspapers said about problems in the United States, who
could believe it with these images flashing across the screen? America
seemed vast, energetic, and wealthy. Everything happened in Technicolor
there.
The U.S. Information Service, set up to promote American culture and
ideas during the Cold War, would hold screenings of older American
classics. A friend and I would often attend these showings. There, in a small
room in Bombay, sitting amid aging expats, I was introduced to
Hollywood’s golden age. I kept a scrapbook on these movies, from It
Happened One Night to Adam and Eve to How the West Was Won. In a
sense, they were my first real introduction to American history. And they
added to my sense of the country as the world’s most exciting place.
Let me be honest, though: while the soft attraction was great, so was the
cold cash. My parents were well-paid professionals, but India was one of
the poorest countries in the world. Their annual salaries combined would
have equaled just half of one year’s tuition abroad. At the time, American
colleges did not offer need-blind admissions to foreign students like me—
the schools all had much smaller endowments in those days—but they did
distribute merit scholarships. And if you were admitted, they worked out a
combination of grants, loans, and on-campus jobs that would allow you to
attend. My brother’s reports from Harvard were that between his
scholarship and a campus job, he was making do quite well. He even had
enough money for books and incidental expenses. Yet realizing that I
needed not only admission but also a scholarship added to my anxiety.
I got very lucky and ended up going to Yale. I have no idea why they let
me in or why I chose to go there. I marvel today at college-bound American
kids who take two or three trips to campuses, sit in on classes, have long
discussions with counselors, and watch student theater productions—all to
decide where to go to college. In comparison, I made an utterly uninformed
choice from halfway around the world. I didn’t get into Harvard, but I was
fortunate to be able to choose between Princeton and Yale and couldn’t
really decide. I knew little about either. If I made a list of each university’s
objective merits, which I did, Princeton usually came out on top. It was
smaller and richer and had offered me a bigger scholarship. Everyone had
heard of it in India because of Albert Einstein. Very few knew of Yale. This
seems hard to believe, but Yale really was quite obscure in India. My father,
like many Indians, couldn’t pronounce the name, and to his dying day he
called it “Ale.” In general, American universities that have great name
recognition in India—and in Asia more generally—are those with strong
engineering programs, science departments, and business schools. These
were not Yale’s strengths.
Eventually, I decided to use the only mechanism I could think of: a coin
toss. Heads, I would go to Yale; tails, I would go to Princeton. I flipped the
coin. It was tails. So I decided to make it a “best of three” and tossed again.
I don’t remember if Yale won the coin toss at that point or if I kept going
until it did. But in doing the exercise, I realized that I wanted to go to Yale. I
don’t quite know why. It is an example of the power of intuition. Though
obviously both are great institutions, Yale was the perfect place for me. I
knew something at the time that I couldn’t explain or even understand.
Yale offered then, and still does now, a rigorous first-year academic
program called Directed Studies. It is a sweeping survey of the Western
literary and philosophical tradition from ancient Greece to modernity. This
seemed like a wonderful opportunity for a kid from India. It would have
introduced me to a number of great Western classics that I had heard about
but never read. You had to apply to be able to take the courses, which I did.
Some months later, I was thrilled to get a note informing me that I had been
accepted into the program.
I chickened out. When I got to Yale, it was time for me to finalize my
choices. I tallied up the subjects that I believed I had to take—courses like
math, computer programming, and physics—and realized that if I were
going to enroll in Directed Studies, it would fill up most of my schedule. I
panicked at the idea of committing so completely to something that seemed
so impractical. I remember thinking to myself, “When people ask me in
India over the summer about my courses, I could talk about computers and
math. How would I explain this?” So I dropped Directed Studies and signed
up for courses that seemed more sensible.
In my first year, however, I allowed myself to pick one class simply out
of sheer interest. The course was a popular lecture on the history of the
Cold War, taught by a political science professor named H. Bradford
Westerfield. His lectures were packed with vivid details and delivered with
gusto. I was hooked.
International politics and economics had always appealed to me. As a
teenager in India, I would avidly read the major international newspapers
and magazines, which sometimes arrived weeks after they were published.
The great global drama of the times was the clash of the superpowers, and it
echoed in India, a country that was torn between the two camps. I
remember devouring the excerpts of Henry Kissinger’s memoirs when they
came out, though I’m sure I didn’t understand them. (I was fifteen at the
time.) Yet I never thought that one studied these kinds of subjects in
college. I had assumed that I would major in something that was practical,
technical, and job oriented. I could always read newspapers on the side.
Westerfield’s course, however, made me realize that I should take my
passion seriously, even without being sure what it might lead to in terms of
a profession. That spring, I declared my major in history. I was going to get
a liberal education.
But still, I couldn’t have answered the question, what is a liberal
education?
3
Learning to Think
WHEN YOU HEAR someone extol the benefits of a liberal education, you
will probably hear him or her say that “it teaches you how to think.” I’m
sure that’s true. But for me, the central virtue of a liberal education is that it
teaches you how to write, and writing makes you think. Whatever you do in
life, the ability to write clearly, cleanly, and reasonably quickly will prove
to be an invaluable skill.
In my freshman year of college, I took an English composition course.
My teacher, an elderly Englishman with a sharp wit and an even sharper red
pencil, was a tough grader. He would return my essays with dozens of
comments written in the margins, each one highlighting something that was
vague or confusing or poorly articulated. I realized that in coming from
India, I was pretty good at taking tests and regurgitating things I had
memorized; I was not so good at expressing my own ideas. By the time I
got to college, I had taken many, many exams but written almost no papers.
That was not unusual even at a good high school in Asia in the 1970s, and
it’s still true in many places there today.
Over the course of that semester, I found myself starting to make the
connection between my thoughts and words. It was hard. Being forced to
write clearly means, first, you have to think clearly. I began to recognize
that the two processes are inextricably intertwined. In what is probably an
apocryphal story, when the columnist Walter Lippmann was once asked his
views on a particular topic, he is said to have replied, “I don’t know what I
think on that one. I haven’t written about it yet.”
In modern philosophy, there is a great debate as to which comes first—
thought or language. Do we think abstractly and then put those ideas into
words, or do we think in words that then create a scaffolding of thought? I
can speak only from my own experience. When I begin to write, I realize
that my “thoughts” are usually a jumble of half-formed ideas strung
together, with gaping holes between them. It is the act of writing that forces
me to sort them out. Writing the first draft of a column or an essay is an
expression of self-knowledge—learning just what I think about a topic,
whether there is a logical sequence to my ideas, and whether the conclusion
flows from the facts at hand. No matter who you are—a politician, a
businessperson, a lawyer, a historian, or a novelist—writing forces you to
make choices and brings clarity and order to your ideas.
If you think this has no earthly use, ask Jeff Bezos, the founder of
Amazon. Bezos insists that his senior executives write memos, often as long
as six printed pages, and begins senior-management meetings with a period
of quiet time, sometimes as long as thirty minutes, while everyone reads the
“narratives” to themselves and makes notes on them. If a memo proposes a new
product or strategy, it must take the form of a press release, using
simple, jargon-free language so that a layperson can understand it. In an
interview with Fortune’s Adam Lashinsky, Bezos said, “Full sentences are
harder to write. They have verbs. The paragraphs have topic sentences.
There is no way to write a six-page, narratively structured memo and not
have clear thinking.”
Norman Augustine, reflecting on his years as the CEO of Lockheed
Martin, recalled that “the firm I led at the end of my formal business career
employed some one hundred eighty thousand people, mostly college
graduates, of whom over eighty thousand were engineers or scientists. I
have concluded that one of the stronger correlations with advancement
through the management ranks was the ability of an individual to express
clearly his or her thoughts in writing.”
The second great advantage of a liberal education is that it teaches you
how to speak. The Yale-NUS report states that the college wants to make
“articulate communication” central to its intellectual experience. That
involves writing, of course, but also the ability to give compelling verbal
explanations of, say, scientific experiments or to deliver presentations
before small and large groups. At the deepest level, articulate
communication helps you to speak your mind. This doesn’t mean spouting
anything and everything you’re thinking at any given moment. It means
learning to understand your own mind, to filter out underdeveloped ideas,
and then to express to the outside world your thoughts, arranged in some
logical order.
Another difference that struck me between school in India and college
in the United States was that talking was an important component of my
grade. My professors were going to judge me on my ability to think through
the subject matter and to present my analysis and conclusions—out loud.
The seminar, a form of teaching and learning at the heart of liberal
education, helps you to read, analyze, and dissect. Above all, it helps you to
express yourself. And this emphasis on “articulate communication” is
reinforced in the many extracurricular activities that surround every liberal
arts college—theater, debate, political unions, student government, protest
groups. In order to be successful in life, you often have to gain your peers’
attention and convince them of your cause, sometimes in a five-minute
elevator pitch.
The study and practice of speech actually figured far more prominently
in the early centuries of liberal education. Rhetoric was among the most
important subjects taught—often the most important. It was intimately
connected not only with philosophy but also with governance and action. In
the centuries before print, oral communication was at the center of public
and professional life. The eighteenth- and nineteenth-century college
curricula in Britain and the United States maintained that emphasis on
oratory.
In the twentieth century, as research became the major focus of large
universities, and the printed text became the dominant method of mass
communication, the emphasis on speech faded, especially in the United
States. In Great Britain, public speaking remains prominent in a tradition of
poetry recitation and elocution, debate and declamation. At the center of
Britain’s political life stands the House of Commons, a venue in which the
ability to thrust and parry verbally gains a politician notice by his or her
peers. That’s why so many Britons sound intelligent, lucid, and witty—it’s
not just the accent. The rise of television and digital video have made verbal
fluency useful, sometimes crucial. Whether for public or private
communication, the ability to articulate your thoughts clearly will prove to
be a tremendous strength. No matter how strong your idea, you have to be
able to convince others to get behind it.
A related method of learning through the ages has been something that
is often thought of as pure pleasure—conversation. “Conversation,” a
former president of Yale, A. Whitney Griswold, wrote, “is the oldest form
of instruction of the human race,” defining it as “the great creative art
whereby man translates feeling into reason and shares with his fellow man
those innermost thoughts and ideals of which civilization is made.” The
scientist and philosopher Alfred North Whitehead once confessed that
“outside of the book-knowledge which is necessary to our professional
training, I think I got most of my development from the good conversation
to which I have always had the luck to have access.” This is probably the
insight behind the “open-plan office” that encourages meetings, chats, and
conversation throughout the workday. For my part, I have found that
interviewing people, exchanging views with peers and friends, and arguing
at editorial meetings have been crucial to learning.
That brings me to the third great strength of a liberal education: it
teaches you how to learn. I now realize that what I gained from college and
graduate school, far more lasting than any specific set of facts or piece of
knowledge, has been the understanding of how to acquire knowledge on my
own. I learned how to read an essay closely, search for new sources, find
data to prove or disprove a hypothesis, and detect an author’s prejudices. I
learned how to read a book fast and still get its essence. I learned to ask
questions, present an opposing view, take notes, and, nowadays, watch
speeches, lectures, and interviews as they stream across my computer. And
most of all, I learned that learning was a pleasure—a great adventure of
exploration.
Whatever job you take, the specific subjects you studied in college will
probably prove somewhat irrelevant to the day-to-day work you will do
soon after you graduate. And even if they are relevant, that will change.
People who learned to write code for computers just ten years ago now
confront a new world of apps and mobile devices. What remain constant are
the skills you acquire and the methods you learn to approach problems.
Given how quickly industries and professions are evolving these days, you
will need to apply these skills to new challenges all the time. Learning and
re-learning, tooling and retooling are at the heart of the modern economy.
Drew Faust, president of Harvard University, has pointed out that a liberal
education should give people the skills “that will help them get ready for
their sixth job, not their first job.”
You might also need to experiment with varieties of intelligence, not
just one. Howard Gardner, a developmental psychologist and expert on
education, has posited that there are at least eight kinds of intelligence:
linguistic, logical-mathematical, spatial, musical, bodily-kinesthetic,
naturalistic, intrapersonal, and interpersonal. To be properly prepared for
today’s world, students must experience several methods of learning
conducive to these various intelligences. America’s loose and open system
of higher education allows for this kind of experimentation. This is what
prompted Gardner to write, “There is a joke in my trade that one should go
to infant school in France, preschool in Italy, primary school in Japan,
secondary school in Germany, and college or university in the United
States.”
Thomas Cech—Nobel Prize–winning chemist and graduate of Grinnell
College, a classic liberal arts school—makes a sports analogy to illustrate a
similar insight. Just as athletes do exercises unrelated to their own sport, so
students should study fields outside their academic area of focus. “Cross-
training may exercise key muscle groups more effectively than spending the
same amount of time working out in the sport of interest,” Cech writes.
“Analogously, a liberal arts education encourages scientists to improve their
‘competitive edge’ by cross-training in the humanities or arts. Such
academic cross-training develops a student’s ability to collect and organize
facts and opinions, to analyze them and weigh their value, and to articulate
an argument, and it may develop these skills more effectively than writing
yet another lab report.”
Gardner argues that in the future, students will focus even more on
modes of thinking. After all, with facts being just a Google search away,
why waste brain cells memorizing them? He notes that the best thinking
often happens when ideas, fields, and disciplines collide, in a setting where
cultures rub up against one another. In the same vein, he rejects a great-
books approach to learning—more so than I would. The point of education,
in his view, is not to stock students’ minds with antique furniture, but to
help them gain the intellectual skills they require to build their own set of
chairs and tables. He would favor a curriculum that exposes students to
different ways of thinking—observational, analytic, aesthetic, teamwork
oriented, and so on (which sounds a lot like the Yale-NUS program). Such a
curriculum is now known to produce results. Drawing on his knowledge of
psychology and neuroscience, Gardner asserts that “it borders on
malpractice to design education that is backward-looking and that ignores
what we now understand about how the mind constructs and reconstructs
knowledge.”
Technology and engineering involve extraordinary explorations of ideas
and thought, something that is often lost because of their real-world
application. They are scientifically fascinating, whether or not they will
make you rich. I remember being amazed by the first computers I saw in
India in the 1970s, but I didn’t have any sense that they would produce
lucrative new industries. In those days, the computer programming I
learned involved using punch cards and mastering FORTRAN, a language
long since dead. Even in that cumbersome format, the machine’s incredible
power was evident. It was also fun to learn something so new. Computers
have transformed the world in ways that are now blindingly obvious. But
with all the money surrounding them, we can easily forget the intellectual
pleasure they can give. Big data, artificial intelligence, and mobile
computing all might produce great new companies, but they also take us
into areas of knowledge where we have never been before. And whether or
not that makes someone a billionaire, it is a thrilling intellectual journey
that asks profound questions about the nature of the mind—a return in some
ways to the idea of science as a branch of philosophy.
Even technical skills by themselves are a wonderful manifestation of
human ingenuity. But they don’t have to be praised at the expense of
humanities, as they often are today. Engineering is not better than art
history. Society needs both, often in combination. When unveiling a new
edition of the iPad, Steve Jobs explained that “it is in Apple’s DNA that
technology alone is not enough. It’s technology married with liberal arts,
married with the humanities, that yields us the result that makes our hearts
sing.”
That marriage is not simply a matter of adding design to technology.
Consider the case of Facebook. Mark Zuckerberg was a classic liberal arts
student who also happened to be passionately interested in computers. He
studied ancient Greek intensively in high school and was a psychology
major when he attended college. The crucial insights that made Facebook
the giant it is today have as much to do with psychology as they do with
technology. In interviews and talks, Zuckerberg has often pointed out that
before Facebook was created, most people shielded their identities on the
Internet. The Internet was a land of anonymity. Facebook’s insight was that
you could create a culture of real identities, where people would voluntarily
expose themselves to their friends, and this would become a transformative
platform. Of course, Zuckerberg understands computers deeply and now
uses great coders to put his ideas into practice, but his understanding of
human psychology was key to his success. In his own words, Facebook is
“as much psychology and sociology as it is technology.”
Technology and liberal education go hand in hand in business today.
Twenty years ago, tech companies might have survived simply as industrial
product manufacturers. Now they have to be at the cutting edge of design,
marketing, and social networking. Many other companies also direct much
of their attention toward these fields, since manufacturing is increasingly
commoditized. You can make a sneaker equally well in many parts of the
world. But you can’t sell it for three hundred dollars unless you have built a
story around it. The same is true for cars, clothes, and coffee. The value
added is in the brand—how it is imagined, presented, sold, and sustained.
Bruce Nussbaum, an expert on innovation, wrote in a 2005 essay in
Businessweek that the “Knowledge Economy as we know it is being
eclipsed by something new—call it the Creativity Economy. . . . What was
once central to corporations—price, quality, and much of the left-brain,
digitized analytical work associated with knowledge—is fast being shipped
off to lower-paid, highly trained Chinese and Indians, as well as
Hungarians, Czechs, and Russians. Increasingly, the new core competence
is creativity—the right-brain stuff that smart companies are now harnessing
to generate top-line growth. . . . It isn’t just about math and science
anymore. It’s about creativity, imagination, and, above all, innovation.”
David Autor, the MIT economist who has most carefully studied the
impact of technology and globalization on jobs, writes that “human tasks
that have proved most amenable to computerization are those that follow
explicit, codifiable procedures—such as multiplication—where computers
now vastly exceed human labor in speed, quality, accuracy, and cost
efficiency. Tasks that have proved most vexing to automate are those that
demand flexibility, judgment, and common sense—skills that we
understand only tacitly—for example, developing a hypothesis or
organizing a closet. In these tasks, computers are often less sophisticated
than preschool age children.” This doesn’t in any way detract from the need
for training in technology, but it does suggest that as we work with
computers—which is really the future of all work—the most valuable skills
will be the ones that are uniquely human, that computers cannot quite figure
out—yet.
Autor divides the job market into three slices. A Fast Company article
nicely summarizes his research. “At the bottom of the market, there’s a
growing number of service sector jobs that require hands-on interaction in
unpredictable environments—driving a bus, cooking food, caring for
children or the elderly. These are impossible to outsource or replace with
technology,” it notes. The middle tier is made up of jobs that are white
collar but are also routine. They involve information processing, form
filing, fact finding, data entry, and simple data analysis. These are white-
collar jobs in insurance, banking, and law, and they are increasingly being
done better by machines. “At the top of the market are the jobs that
everyone wants. And guess what?” the article says, perhaps more
optimistically than Autor himself might, “These are the jobs that graduates
of the American educational system are well prepared for. [They] require
creativity, problem solving, decision making, persuasive arguing, and
management skills.” Vinod Khosla, a Silicon Valley venture capitalist,
argues that machine learning will replace many human jobs, but even he
believes that work involving complex creativity, emotional intelligence, and
value judgments will continue to be done by humans.
And then there is the most influential industry in the United States—
entertainment, one of the greatest global growth sectors. A 2012 industry
report titled The Sky Is Rising presented data showing that all business
related to entertainment had maintained an upward trajectory, through
recessions and recoveries. Between 1995 and 2009, the number of feature
films made worldwide more than quadrupled. Between 2008 and 2011, the
number of Americans playing video games jumped about two and a half
times. Even in book publishing, revenues rose 5.6 percent between the
recession years of 2008 and 2010. Music and television as well—everything
in the sector is up. This is an industry that employs millions around the
world, continues to grow, and enriches economies and cultures. And at its
heart are stories, images, words, and songs. Often these artistic elements are
further embellished by technology—as in the films The Lord of the Rings
and Frozen. Regardless of how these films are made, it is clear that much of
the production of entertainment requires a background and expertise in one
or several of the liberal arts.
So there is a value to writing and music and design and art. But what
about art history? What is the best response to President Obama and so
many others who worry about the purpose of an academic degree in
subjects as seemingly obscure as art history and anthropology? To be fair to
the president, his emphasis was on the many millions of Americans who are
more inclined to obtain some kind of skills-based training than a liberal
education. Perhaps those people would be better off learning a specific
technical skill rather than enrolling in a preprofessional-sounding major like
“business.” But for those who do find that their passion is art history or
anthropology, and take it seriously, there are real rewards in the outside
world. Both those fields often require the intensive study of several
languages and cultures, experience working in foreign countries, an eye for
aesthetics, and the ability to translate from one medium or culture to
another. Most of these skills could be useful in any number of professions
in today’s globalized age. They force you to look at people and objects from
a variety of perspectives. As Howard Gardner’s research demonstrates, this
kind of exposure trains various kinds of intelligence, making you a more
creative and aware person.
Consider the experience of Dr. Irwin Braverman of the Yale Medical
School. In 1998, when he was teaching young medical students who were
residents at an affiliated hospital, Dr. Braverman discovered that their
powers of observation and diagnosis were weak. His novel solution was to
take them to an art gallery. He teamed up with Linda Friedlander, curator of
the Yale Center for British Art, to design a visual tutorial for one hundred
students. They asked the students to examine paintings, forcing them to
unpack the many layers of detail and meaning in a good work of art.
Braverman found that students performed demonstrably better at diagnosis
after taking the class—so much so that twenty other medical schools have
followed his example.
While this may sound like the quixotic idea of one professor, there are
data to support the value of rounded or lateral thinking in the workforce. In
2013, the American Association of Colleges and Universities published a
survey showing that 74 percent of employers would recommend a good
liberal education to students as the best way to prepare for today’s global
economy. When students graduate, those with engineering degrees start out
with higher salaries—as they should, given that they possess a tangible
skill-set that can be instantly applied within a company. But over time, the
wage gap between engineers and other professionals narrows, especially for
liberal arts students who go on to get a professional degree. In fact, one
recent study found that students from a set of liberal arts colleges were
more likely than their peers at other institutions of higher education to
obtain doctorates in sciences, presumably because they possess an acute
curiosity and sense of academic adventure. As I noted, a liberal education
might encourage student interest in scientific subjects for their inherent
intellectual value, rather than their value in the marketplace. And that might
have its own payoffs over time in terms of basic research and scientific
advancement.
Norman Augustine (the former Lockheed Martin CEO) stressed the
importance of both scientific skills and humanistic thought:
So what does business need from our educational system? One answer is that it needs more
employees who excel in science and engineering. . . . But that is only the beginning; one
cannot live by equations alone. The need is increasing for workers with greater foreign-
language skills and an expanded knowledge of economics, history, and geography. And who
wants a technology-driven economy if those who drive it are not grounded in such fields as
ethics? . . .
Certainly when it comes to life’s major decisions, would it not be well for the leaders and
employees of our government and our nation’s firms to have knowledge of the thoughts of the
world’s great philosophers and the provocative dilemmas found in the works of great authors
and playwrights? I believe the answer is a resounding “yes.”