Matthew Might, a computer science professor at the University of Utah, writes: “Every fall, I explain to a fresh batch of Ph.D. students what a Ph.D. is. It’s hard to describe it in words. So, I use pictures.” Here, then, is Matt’s Illustrated Guide:
Imagine a circle that contains all of human knowledge:
By the time you finish elementary school, you know a little:
By the time you finish high school, you know a bit more:
With a bachelor’s degree, you gain a specialty:
A master’s degree deepens that specialty:
Reading research papers takes you to the edge of human knowledge:
Once you’re at the boundary, you focus:
You push at the boundary for a few years:
Until one day, the boundary gives way:
And, that dent you’ve made is called a Ph.D.:
Of course, the world looks different to you now:
Last week, we showed you a clip of Bill Gates speaking at the recent Techonomy conference in Lake Tahoe. Our comments concentrated on a shorter segment where Gates talks about the coming transformation of education – about how the internet will start displacing the traditional university within five years. That clip figures into a larger talk, now fully available online, called “Reinventing Capitalism: How to Jumpstart What the Marketplace Can’t” (48 minutes). And it puts Gates’ views on education (not to mention his overall philanthropic work) into a larger context. What’s generally on display here is his limitless faith that science and technology can solve the world’s problems. It’s an approach that makes perfect sense for ridding the world of malaria. But it’s potentially a double-edged sword for education. You can watch the full talk above, or view it here. (His full comments on education & technology come around the 21-minute mark, and again later on.) You can also learn more about what Gates is reading, watching and listening to on his website.
Next month’s edition of Fast Company (available online now) brings you a big, glowing tribute to TED and its TED Talks. It’s a lovefest in print, the kind that sells magazines. And, along the way, Anya Kamenetz (author of DIY U) makes some big claims for TED. Let me start with this one:
I would go so far as to argue that [TED’s] creating a new Harvard — the first new top-prestige education brand in more than 100 years.
Of course TED doesn’t look like a regular Ivy League college. It doesn’t have any buildings; it doesn’t grant degrees. It doesn’t have singing groups or secret societies, and as far as I know it hasn’t inspired any strange drinking games.
Still, if you were starting a top university today, what would it look like? You would start by gathering the very best minds from around the world, from every discipline. Since we’re living in an age of abundant, not scarce, information, you’d curate the lectures carefully, with a focus on the new and original, rather than offer a course on every possible topic. You’d create a sustainable economic model by focusing on technological rather than physical infrastructure, and by getting people of means to pay for a specialized experience. You’d also construct a robust network so people could access resources whenever and from wherever they like, and you’d give them the tools to collaborate beyond the lecture hall. Why not fulfill the university’s millennium-old mission by sharing ideas as freely and as widely as possible?
TED, the new Harvard. The new university. It’s a nice idea … until you think about it for a few moments. Will watching 18-minute lectures – ones that barely scratch the surface of an expert’s knowledge – really teach you much? And when the 18 minutes are over, will the experts stick around and help you become a critical thinker, which is the main undertaking of the modern university, after all? (Will they assign the papers where you grapple with the difficult ideas? Will they make sure your arguments are sound? That your writing is lucid? Or will they even expand on their brief lectures and teach you something in-depth?) Nope, you’ll get none of that. The experts will give their 18-minute talks, and then they’re gone. Ultimately, Kamenetz seems to know she’s overreaching. She eventually circles around to say, “Sure, these talks have their limits as an educational medium. An 18-minute presentation, no matter how expert, can’t accommodate anything overly theoretical or technical — the format is more congenial to Freakonomics than economics.” And so the whole initial, catchy premise falls apart. (Maura Johnston rightly makes this point too, among other good ones, in her must-read reaction to the “breathless” Fast Company article.)
I have no beef with TED. Quite the contrary, I’m a big fan of their open lectures. (Get the full list here.) And you can’t blame TED when others read too much into what they do. But, echoing points made last week, I do have an issue with commentators reducing education to watching TV. So a quick request to the “edupunks” and “edupreneurs” out there. As you’re democratizing education and lowering tuition through technology, could you make sure that whatever you’re finally offering is an education in more than mere name? You feel me?
NOTE: Anya Kamenetz, the author of the Fast Company article, offers a response in the comments below. In fairness to her, please give them a read. We also have a little follow up.
Speaking at the Techonomy conference in Lake Tahoe last week, Bill Gates argued that the cost of college needs to come down, and the only way to accomplish this is through technology and lessening the importance of “place-based” colleges. That’s how you keep college education open to all. During the talk, he went further and asserted, “Five years from now, on the Web for free, you’ll be able to find the best lectures in the world. It will be better than any single university.”
To be sure, I don’t dispute this particular point. You can already find hundreds of free courses online, and that’s part of our reason for being. But, as I have frequently reminded people, listening to lectures doesn’t mean you’re getting a rounded education. Lectures inform you. They’re great in that way. But you get an education when you couple lectures with readings, when you chew over ideas in a discussion section, when you analyze the lectures and readings in critical papers, when you take exams that force you to synthesize everything you’ve learned during the entire semester, etc. Right now, it is very hard to accomplish this online. On a relative basis, e‑learning tools have evolved strikingly slowly during the past decade. The widely deployed tools are often still clunky and rudimentary. And it still takes considerable time, money and labor to produce a truly excellent online course. (At least that’s what I have found during my ten years in the space.) Will we make progress here? Yes. Would I welcome it? Of course. But will we offer a substantive and highly scalable online alternative in five years? Very doubtful, unless a catalyst comes along who can dramatically sweep away the existing major players (who just bog things down) and introduce some serious innovation. Mr. Gates, are you that catalyst?
Harvard University has now released version 2.0 of OpenScholar, an open source software package that lets scholars build personal and project-oriented web sites in a matter of minutes. It’s a quick, easy, and free solution (minus one meaningful caveat below) that allows academics to build an online home for their “CV, bio, publications, blogs, announcements, links, image galleries, class materials,” and even submit publications to online repositories, such as Google Scholar. You can see an example of OpenScholar in action here.
Now here’s the one important rub. Before a professor can start using OpenScholar, someone on the university’s IT staff will need to install the software on its servers. Harvard doesn’t host the solution. The video above and Wired Campus offer more details …
In 2008, writer Nicholas Carr published an essay in The Atlantic with the provocative headline, “Is Google Making Us Stupid?” Carr’s thesis was that the Internet, for all its immediate and obvious benefits, was also doing us some harm. It was robbing us of our ability to read deeply and concentrate on long texts.
“Immersing myself in a book or a lengthy article used to be easy,” Carr wrote. “My mind would get caught up in the narrative or the turns of the argument, and I’d spend hours strolling through long stretches of prose. That’s rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do.” The habits acquired reading hypertext – skimming and skipping rapidly from one item to the next – stayed with Carr even when he was away from his computer. “Once I was a scuba diver in a sea of words,” he wrote. “Now I zip along the surface like a guy on a Jet Ski.”
Carr found that many people he knew — “literary types, most of them” — were noticing the same thing. Frequent use of the Net seemed to weaken one’s capacity for reading long, fully developed texts. Carr began to worry about the consequences. If we lose our ability to read deeply, might we also lose our ability to think deeply?
Two years later Carr is back with a book, The Shallows: What the Internet Is Doing to Our Brains, which explores that question in depth. To understand what is going on, he writes, we have to look beyond the content. “Media work their magic, or their mischief, on the nervous system itself,” Carr writes. “Our focus on a medium’s content can blind us to these deep effects. We’re too busy being dazzled or disturbed by the programming to notice what’s going on inside our heads.”
In The Shallows, Carr describes numerous scientific studies that lend support to his claim that Web surfing has adverse cognitive consequences. For example, research has shown that readers of hypertext have more difficulty understanding and remembering what they have read than readers of traditional, “linear” text. In multiple studies, the distraction of hyperlinks was shown to hinder comprehension.
Other studies have tracked the movement of readers’ eyes and revealed that Web readers typically do not read line-by-line, the way they would if they were reading a printed text. Instead, their eyes trace out a pattern resembling the letter F. The eyes typically begin by following a few lines all the way across, then skim part-way across a few more lines before drifting downward along the left-hand side of the text. Jakob Nielsen, a Danish Web usability expert who conducted some of the early eye-tracking studies, puts it succinctly: “How do users read on the web? They don’t.”
The patterns of thought that go along with reading habits such as these – superficial, scattered, perpetually distracted – can have serious consequences even when we’re not online, argues Carr. He cites recent brain research showing that neural connections are significantly refigured, or “re-mapped,” as a consequence of mental experience – especially repetitive experience. Carr quotes a blog entry written by neuroscientist Michael Merzenich: “When culture drives changes in the ways that we engage our brains, it creates DIFFERENT brains.”
The Shallows, like Carr’s earlier magazine article, has sparked considerable public debate – much of it polarized. As the book came out last week, The New York Times began a series, “Your Brain on Computers,” examining some of the issues raised by Carr. On Friday, Harvard psychologist and cognitive scientist Steven Pinker entered the fray. “Experience does not revamp the basic information-processing capacities of the brain,” Pinker wrote in the Times. “Far from making us stupid, these technologies are the only things that will keep us smart.”
Carr issued a response, arguing that Pinker was “too quick to dismiss people’s concerns over the Internet’s influence on their intellectual lives.” He quoted the work of another psychologist: “As Patricia Greenfield, the UCLA developmental psychologist, wrote in a Science article last year, research suggests that our growing use of screen-based media is weakening our ‘higher-order cognitive processes,’ including ‘abstract vocabulary, mindfulness, reflection, inductive problem solving, critical thinking, and imagination.’”
Wherever one stands in the debate, Carr has challenged us to do precisely what he says is becoming more difficult to do: pause, reflect, and meditate on the matter. We spoke with Carr by email.
Open Culture: When did you first begin to suspect that the Internet was changing the way you think?
Nicholas Carr: It was sometime during 2007. I had been using the Net, with increasing intensity, for more than a decade, and it began to dawn on me that there might be a connection between all the time I spend clicking links and exchanging emails and the erosion of my ability to concentrate. When I sat down to read a book, or just to think, I found it hard to maintain my focus – my mind wanted to keep clicking and surfing.
Open Culture: There is something addictive – almost like slot machines – about surfing the Web, isn’t there?
Nicholas Carr: I’m not sure whether it rises to the level of addiction, but the web certainly tends to inspire compulsive behavior. There are a few reasons for that, but one of the big ones is that human beings crave new information. So if we’re given the opportunity to get a new bit of information – and it doesn’t much matter whether it’s trivial or important – we’ll go for it. Since on the Web new information is always just a click away, it becomes hard to break free of the flow. We keep clicking, keep checking email, keep Googling, and so on. That desire for new stuff is amplified by the fact that a lot of the information flowing through our computers or our cell phones these days has a social component – it takes the form of messages or updates from people we know. If we disconnect, we can feel like we’re missing out on the conversation, and because we’re social beings that feeling can be unendurable.
Open Culture: Your essay in The Atlantic caused quite a stir. There’s even a long Wikipedia page on it, which is unusual for a magazine article. Were you surprised by the reaction?
Nicholas Carr: The Wikipedia article is quite good, I think. But, yes, I was surprised. I had seen the article as a rather modest personal essay, but it clearly struck a chord. I received notes from scores of people saying that they were having similar experiences to my own and were very concerned about the Net’s influence. Of course, I also received quite a few messages saying I was full of baloney.
Open Culture: It’s been almost two years since the article appeared. Since then, what have you learned about how the Internet is affecting our brains?
Nicholas Carr: The reaction to the piece led me to look beyond the personal and anecdotal, to see what science and history might tell us about the cognitive and cultural effects of a popular medium like the Internet. A lot of what I discovered was disturbing. Many studies suggest that the Net, and screen-based technologies in general, encourage a distracted way of thinking that impedes comprehension and learning, even as they give us access to a huge amount of valuable information. What I also found is that, to understand the Net’s influence, you really have to look at it in the context of technology’s effects on the intellectual history of humankind, going all the way back to the development of maps and devices for timekeeping. It’s a fascinating story.
Open Culture: Do you sense that people’s attitudes toward the Internet are beginning to change?

Nicholas Carr: I think they are. The Web is now about 20 years old. Up until recently, we’ve been dazzled by its riches and conveniences – for good reason. Now, though, I think we’re becoming more aware of the costs that go along with the benefits, of what we lose when we spend so much time staring into screens. I sense that people, or at least some people, are beginning to recognize the limits of online life. They’re craving more control over their attention and their time.
Open Culture: In the book you quote Marshall McLuhan, who famously wrote that “the medium is the message,” and that the content served up by a medium is merely “the juicy piece of meat carried by the burglar to distract the watchdog of the mind.” How does this relate to what’s happening with the Web?
Nicholas Carr: It’s natural that, when a new medium comes along, we focus on the content it provides us – the shows on the TV and radio, the stories in newspapers and magazines – and don’t pay much heed to its deeper effects on cognition and culture. Popular media tend to be very good at seducing “the watchdog of the mind,” as McLuhan put it. McLuhan’s intent was to get the watchdog to pay attention to what the burglar was stealing. That’s pretty much my intent, too.
Open Culture: What is the burglar stealing, and how?
Nicholas Carr: Our more attentive, solitary modes of thinking – contemplation, reflection, introspection, and the like. We’re training our brains to be more adept at skimming and scanning and surfing – and those are certainly valuable ways of thinking – but we’re neglecting our quieter modes of thought. And when you don’t exercise a habit of mind, you slowly begin to lose it.
Open Culture: In the book you write about “neuroplasticity.” What is that?
Nicholas Carr: It used to be assumed that the structure of the human brain was fixed by the end of childhood. But what we’ve learned over the last 40 years is that even the adult human brain is constantly changing, at a cellular level, to adapt to changes in circumstances and experiences. We can assume, therefore, that the changes in our habits of thought produced by the Net and related media are also producing actual biological changes in our brain. I argue that that’s likely one of the reasons why our distracted, nervous, skimming forms of thinking stay with us even when we turn off our computers.
Open Culture: In a recent interview in The Atlantic, you said, “There seems to be a redefinition of our idea of intelligence.” What did you mean by that?
Nicholas Carr: We used to think of the gathering of information as only the first stage of thinking. The second and more important stage was thinking deeply about the information we gathered, connecting it to the other information stored in our heads in order to build personal knowledge and even wisdom. Now, I sense that we’re increasingly defining intelligence as merely the act of gathering – as a matter of “accessing” as much information as possible. We’re beginning to lose sight of the deep thinking stage, which requires concentration, quiet, and a degree of solitude.
Open Culture: Some people have suggested we’re moving inexorably toward a kind of global intelligence, or “hive mind,” in which individual human minds are the worker bees. Given the benefits of collectivization, would that be a bad thing? Perhaps our individual minds are being re-wired for a greater collective intelligence.
Nicholas Carr: What’s interesting about our minds, I believe, is what’s least bee-like about them. I’m not sure what “collective intelligence” means, but if I were to define it I’d say it’s synonymous with “culture.” And culture emanates from many individual minds, working alone or in concert.
Open Culture: Speaking of culture, some of your critics have suggested that behind your argument lies a nostalgia for the days when the literary intelligentsia were the cultural elite. In response to your Atlantic essay, Clay Shirky wrote, “Having lost its actual centrality some time ago, the literary world is now losing its normative hold on culture as well. The threat isn’t that people will stop reading War and Peace. That day is long since past. The threat is that people will stop genuflecting to the idea of reading War and Peace.” How do you respond to that?
Nicholas Carr: I would be lying if I didn’t confess to being saddened by the much-reduced place of literature and literary writers in our culture. I personally see great works of literature – including, yes, Tolstoy’s – as being not only among the most profound achievements of human culture but also deeply inspiring and enlightening on a personal level. Shirky is a very smart man, but I find his comments about literature in general and Tolstoy in particular to be expressions of an apparently fashionable form of philistinism. I have yet to discover anything on the Web that has the emotional and intellectual resonance of, say, Thomas Hardy’s Jude the Obscure or the poems of Robert Frost.
Open Culture: If we are sacrificing our reflective, contemplative faculties, what do you think will be the long-term consequences, both for the quality of individual lives and for society at large?
Nicholas Carr: Well, as the title of my book makes pretty clear, I think we’re shifting toward shallower, less interesting intellectual lives and, more generally, a shallower culture. That doesn’t mean we’re getting dumber or stupider. It means that the emphasis of our thought is shifting away from the more contemplative and solitary modes of thought that I believe give richness and distinctiveness to our thoughts and even our selves. I fully understand that there are plenty of other people who don’t value the quieter modes of thought and will hence dismiss my concerns, but I think there are many other people who, like me, sense a hollowing out of intellectual life.
Open Culture: Your book is basically descriptive, rather than prescriptive. You don’t offer any solutions. Are you pessimistic?
Nicholas Carr: I’m not optimistic. But what I’ve tried to do in The Shallows is to describe carefully what I believe is going on, in hopes that it will raise people’s awareness. Raising awareness is the most valuable prescription I can offer as a writer.
Open Culture: In your own life, are you doing anything to combat the problems you describe?
Nicholas Carr: I’m trying to cut back on my use of the Net. In order to regain the concentration necessary to write my book, I curtailed my use of e‑mail, didn’t use my cell phone, dropped my Facebook and Twitter accounts, and mothballed my blog. It helped enormously. My thinking became much calmer and more focused. I have to confess, though, that I’ve been drifting back to my old habits. I haven’t given up the fight, though.
This article was contributed by Mike Springer, a journalist in Cambridge, Massachusetts.
Reading the press lately, you’d think the American university system is the next mortgage market. And the humanities? They’re toxic debt. Here’s a quick recap of the grim parade of stories:
Last week, The New York Times set the stage with an article detailing how students are drowning in debt, raising two questions: Can students still afford America’s expensive universities? And will banks keep making these loans? The Washington Examiner goes further and bluntly asks: Is a Higher Education Bubble about to Burst?
Next, in The New Yorker, a widely read article offers this factoid: During the coming decade, most of the sectors adding jobs in the US won’t require a college degree. So some academics (yes, academics) are left wondering, “why not save the money and put it towards a house?” Or, put differently, is a college education really worth the money?
The meme continued yesterday with David Brooks musing in an opinion piece: “When the going gets tough, the tough take accounting. When the job market worsens, many students figure they can’t indulge in an English or a history major. They have to study something that will lead directly to a job.” “There already has been a nearly 50 percent drop in the portion of liberal arts majors over the past generation, and that trend is bound to accelerate.” So why bother with a humanities education? Brooks tries to make his best case, and it’s not a bad one. But I’m not sure that a younger generation is listening. And if you listen to this 2008 interview with Harold Bloom, maybe they shouldn’t be.
Fast forward a generation, and you might hardly recognize the humanities. Big data is here, and it’s allowing tech-savvy students to take a whole new approach to “reading” texts. Using Google’s digital library and other tools backed by high-powered computing, students can now quantitatively analyze large bodies of literature and draw new conclusions about the evolution of ideas, language, and culture. (More on this here.) Some worry that these “stat-happy quants” risk taking “the human out of the humanities.” Others (myself included) suspect that this approach could enliven the humanities, allowing scholars to focus on new methods and questions. How “big data” is transforming the humanities (and the sciences too) is the subject of six articles appearing in The Chronicle of Higher Education. Let me highlight them for you:
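First, though, for readers curious what this kind of quantitative “reading” actually looks like in practice, here is a minimal sketch in Python. It is purely illustrative: the corpus folder, the file-naming scheme, and the tracked words are invented for the example, and no particular project mentioned above works exactly this way.

```python
# A minimal, purely illustrative "distant reading" sketch: track how often a few
# chosen words appear, per decade, across a folder of plain-text files.
# Hypothetical assumptions: the files live in a local "corpus/" folder and are
# named with the publication year as a prefix, e.g. "1851_moby_dick.txt".

import re
from collections import Counter
from pathlib import Path

CORPUS_DIR = Path("corpus")                       # hypothetical corpus folder
TRACKED_WORDS = ["railway", "telegraph", "soul"]  # arbitrary example terms


def tokenize(text):
    """Lowercase the text and split it into alphabetic word tokens."""
    return re.findall(r"[a-z]+", text.lower())


def decade_of(filename):
    """Read a four-digit year from the start of the filename, rounded down to the decade."""
    match = re.match(r"(\d{4})", filename)
    return (int(match.group(1)) // 10) * 10 if match else None


tokens_per_decade = Counter()                       # decade -> total tokens seen
hits = {word: Counter() for word in TRACKED_WORDS}  # word -> decade -> raw count

for path in CORPUS_DIR.glob("*.txt"):
    decade = decade_of(path.name)
    if decade is None:
        continue
    tokens = tokenize(path.read_text(encoding="utf-8", errors="ignore"))
    tokens_per_decade[decade] += len(tokens)
    counts = Counter(tokens)
    for word in TRACKED_WORDS:
        hits[word][decade] += counts[word]

# Report each word's relative frequency (occurrences per 10,000 tokens) by decade,
# so decades with more surviving text don't automatically dominate the picture.
for word in TRACKED_WORDS:
    trend = {
        decade: round(10_000 * hits[word][decade] / total, 2)
        for decade, total in sorted(tokens_per_decade.items())
        if total
    }
    print(word, trend)
```

Projects built on Google’s digitized books do essentially the same thing – counting and normalizing word occurrences over time – just at vastly larger scale and with far more careful corpus construction; the sketch simply shows the basic idea of turning “reading” into counting and comparing.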