At 30 years old, I definitely consider myself part of the Facebook generation. Zuckerberg’s brainchild hit the ‘net when I was a senior in college, and by then I was already well acquainted with e-mail, chat rooms, text-messaging, and all the multifarious precursors to today’s social media. I text, I post, I chat, I even snapchat: in these respects, I’m an utterly unremarkable member of my society.
But I also happen to be a college professor and a molder of young minds. And, far from indulging the technology-driven spirit of the times, I make my students work as students have always worked. They read Seneca, Pascal, Tolstoy, and Schopenhauer. They are obliged to turn in papers by hand; they must come to office hours to speak with me about their grades; they are even, and this is most anachronistic of all, required to attend class. Physical presence is key to every aspect of their learning experience, be it my hovering, breathing presence in the classroom or the office, the cohort of 30 or so warm bodies that shows up for lecture twice a week, or the more abstract form of embodiment conveyed by the weight of a book.
If certain commentators are to be believed, however, this embodied notion of learning is on its way out in American higher education. Writing for The American Interest’s January/February 2013 edition, the recent Yale graduate Nathan Harden offers the following ominous prognostications about the future of university instruction in our digital age:
“In fifty years, if not much sooner, half of the roughly 4,500 colleges and universities now operating in the United States will have ceased to exist. The technology driving this change is already at work, and nothing can stop it. The future looks like this: Access to college-level education will be free for everyone; the residential college campus will become largely obsolete; tens of thousands of professors will lose their jobs; the bachelor’s degree will become increasingly irrelevant; and ten years from now Harvard will enroll ten million students.”
On Harden’s account, one of the principal reasons for this portended transformation, which is already being partially implemented by such institutions as Harvard and MIT, is that the cost of college is increasingly out of proportion with its perceived economic benefit. As the American job market has become more competitive, the cost of a degree has increased, and only the most naïve of students still believe that a college education is a universally redeemable ticket to middle-class prosperity. The weighing up of costs and benefits involved in earning a college degree will lead inevitably to a re-evaluation of the current higher education model. Luxury residence halls, face-to-face interaction between professors and students, ivied brick walls — these will all be things of the past once the much-heralded education bubble finally bursts. What will replace them are massively populated, inexpensive online courses and lectures, prerecorded by the very best lecturers and administered by those hordes of professors and other academics not quite sexy or charismatic enough to warrant virtual celebrity.
Anyone who thinks Harden’s predictions are a little too ambitious (not to mention deeply disturbing, at least for college professors who don’t fancy the idea of working in a grading factory) need not worry — they most likely are. What Harden forgets — and indeed, what just about everyone prophesying the eclipse of face-to-face interaction in a virtual world forgets — is that human beings are, above all else, bodies, and that to lead full, happy, and meaningful lives, we need other bodies. Let’s consider the following examples of how technologies of virtualization have failed to triumph over our species’ thirst for physical presence.
1. The Giant Head. Some older readers may recall a famous article in Reader’s Digest from the late 1950s featuring an illustration of a massive human head connected to minuscule arms and legs. What was the thesis of that article? The tech junkies of the time believed that in the future technology would become so advanced that human beings would no longer need to use their bodies, leading to a swelling of the brain and a shriveling of our appendages. Many also foretold a time when food supplements would replace food. Wouldn’t it be great, they asked, if instead of spending hours preparing and eating meals, we could nourish ourselves in just a few seconds? No one at the time seemed to consider that human beings might not want to do any of this — that we might enjoy using our bodies, eating, and the like. In the half-century since these predictions were made, restaurants have proliferated, and heads haven’t grown one bit.
2. Live Theater. When I was a kid, there were hardly any live theaters in my hometown of Bakersfield, Calif. Now there are about ten. Many people used to believe that movies had sounded the death knell for live theater, but today the latter enjoys just as much prestige as it did 100 years ago, if not more. I recently had the good fortune to see Kevin Spacey’s production of Richard III. I’ll remember his performance for the rest of my life — it had never occurred to me that acting could be so visceral, so violent, so physical. How many of us can say the same thing about movies? Again, those who foretold the demise of live theater never reckoned that people might just plain like seeing living bodies move around and speak on the stage, and that no amount of special effects could compensate for the lack of real flesh and blood.
3. The Myth of Social Media. This myth holds that virtual, online, or technologically mediated interactions are in the process of replacing face-to-face interactions. Most people never take the time to think about what the world would be like if this were really the case. I live in a small college town, and I can assure anyone interested in such things that student interactions on Friday and Saturday nights are plenty physical — sometimes I can hear them from across the lake! Social media does little more than provide a way of sharing information that enhances the intimacy of eventual physical contact. Anyone who doesn’t know this doesn’t understand the technology.
Of course, people like Harden will point to other sectors of the economy where technological innovation has erased thousands of jobs. People don’t need information from stockbrokers or travel agents to make decent decisions about travel or investment anymore, so why should a living, breathing professor be necessary to convey the sort of information one gets out of a college education? If that information can be distributed more cheaply thanks to virtualization, why should students be expected to bear the extra expense of classroom education?
The answer to this question is so elementary that the objection prompting it is almost hard to take seriously. The truth is that education is not simply the conveying of information. In fact, it is probably only marginally that. How many people remember most of what they learned in college? Only very few, I would guess. The benefit of a classroom education is that it keeps students under a certain amount of mental pressure, forces them to think on the spot, and obliges them to explain themselves to other people who are physically present. Information is afoot in these interactions, but so are wisdom, passion, empathy, and a whole host of other viscera that only an embodied teacher or student can properly convey.
How effective, for instance, do we imagine an online church experience would be compared to the real thing? Is it reasonable to think that a virtual tour of the cathedral at Chartres would be as spiritually moving as being there? We should also consider that many students might simply enjoy the physical classroom and their interaction with peers and professors — or at least they might recognize that they learn better under these conditions. The costs of classroom education may be soaring out of proportion at present, but this is not a verdict on the education itself.
So let’s ask — what developments are behind these grim auguries of the collapse of America’s higher education model? Some of it undoubtedly has to do with politics. Many commentators on the right (and perhaps Harden is one of them) would likely cheer the dismantlement of a system whose values are often perceived as far left of center. If taking education online can put “tenured radicals” out of work, then why not welcome it? At the same time, however, just as many moderate and left-leaning thinkers have joined the chorus of those predicting the failure of higher education (for instance, see Thomas Friedman’s recent writings in The New York Times), and it would be simplistic to chalk up this latest round of doom-peddling to politics.
The real culprit, I suggest, is what, for lack of a better term, we might call Appleism. Innocent in principle but nefarious in practice, the doctrine of Appleism holds that increases in technological capability are synonymous with increases in human happiness. Anything that can be put on a screen is better than what can be seen with the naked eye. The passage of electrons through a cathode-ray tube is equivalent to passage from a lower to a higher state of being. Proponents of Appleism hold out technology as an intrinsic good; they are the sorts of folks who compulsively buy the latest Apple product, simply on principle.
We can point to fiscal insolvency all we want, but one has difficulty believing that Harden’s and others’ vision of a fully or almost-fully online education is not also the product of society’s limitless fascination with virtualization. Proponents of the current craze ought to think carefully about the human costs of technology before enthusiastically proclaiming the end of a system. Such a collapse would leave hundreds of thousands of people without work and students cheated out of a quality education, and it would further contribute to the creation of a world where virtualization is always and everywhere heralded, without qualification or questioning, as an unequivocal good.
Author Bio: Louis Betty is an assistant professor of French at the University of Wisconsin-Whitewater.