Over the past two decades we have placed the outcomes of higher education under scrutiny. Accrediting agencies make the assessment of learning a key to appraising institutions. We scholars make our voices heard on the matter, and politicians have grown curious about undergraduates. In the first decade of the new millennium, Secretary of Education Margaret Spellings’s Commission on the Future of Higher Education spent three years collecting data on the best way to measure the effect of a college education.
Academics, accreditors, and government officials are all working to improve the performance of schools. Their efforts represent diverse perspectives, but the research pivots on cognitive psychology: the study of learning, knowledge acquisition, and skill development. Even authors in other fields gravitate toward cognitive psychology when they turn their attention to higher education. For example, Richard Arum and Josipa Roksa are both sociologists, but their 2011 work Academically Adrift relies on psychometrics: they studied the cognitive skills developed en route to the baccalaureate.
This scenario raises a question. Is cognitive psychology the most appropriate tool to employ in the study of education? The discipline of psychology offers a valuable perspective, but postsecondary life is more than a psychological enterprise. Education is a social institution. Attending a college engages you with one of our cultural traditions, a process akin to taking part in a wedding, funeral, or bar or bat mitzvah. Physically, campuses resemble other kinds of sacred ground. They sometimes even look like churches. And I cannot think of a good reason for anyone to wear a mortarboard, aside from adhering to the demands of a deep-rooted rite of passage.
Anthropologists suggest that culture influences behavior. Traditions shape our thoughts and determine our tendencies. Customs affect the choices we make—the foods we eat, the clothes we wear, and the cars we drive. No doubt our culture shaped our interest in the cognitive outcomes of education.
We are individualistic. As a nation, we look inward when we look for answers to questions. We practice “pop psychology” but we have no comparable “popular anthropology.” Psychological research dominates discussions of higher education because our culture steers our attention in that direction, but accumulated evidence suggests that cognitive development is a poor lens through which to view postsecondary schools.
An Anthropological Approach
Since the publication of Kenneth A. Feldman and Theodore M. Newcomb’s 1969 book, The Impact of College on Students, researchers have demonstrated that changes in ethics, self-concept, personality, and aesthetic preferences are longer lasting and more significant than what are often short-lived advances in skills or knowledge. Unfortunately, what we usually describe as learning is ephemeral. Undergraduates quickly forget what they learned when they were cramming for midterms and final exams. In Academically Adrift, Arum and Roksa report that after three semesters, 45 percent of the students they studied did not show sustained improvement in writing or critical thinking.
Over the course of an education, however, students may come to love ideas or reading or the elegance of mathematics or the craft of writing artful sentences. In the words of the eminent psychologist B. F. Skinner, “Education is what survives when what has been learned has been forgotten.” When our institutions are at their best, they turn students into educated people with the values, status, and opportunities that we associate with the college degree.
In view of our individualistic tendencies, it makes sense that we see education as a means to personal skill enhancement, but the focus on learning has left a hole in our understanding of the function that education serves in our society. At this juncture, and for immediate political reasons, we need a social and cultural conception of colleges and universities. To that end, we could gain from the methods of anthropologists, but the field of anthropology has been maligned of late, and by the very people who could benefit from its insights.
The Sarasota Herald-Tribune quoted Florida governor Rick Scott as asking, “Is it a vital interest of the state to have more anthropologists?” He answered his own question, “I don’t think so.” The interview focused on the need for higher education leaders to channel public funds toward fields with the potential to produce technology and jobs: math, science, and engineering. Shortly after the publication of the Herald-Tribune article, Scott took part in a radio broadcast, in which he suggested of anthropology that “it’s a great degree if people want to get it, but we don’t need them here.”
Cultural studies are qualitative in nature. Anthropologists work in the medium of talk or narrative. In our technology- and data-driven lives, we do not hold this form of scholarship in high regard, but a qualitative approach is necessary where our thoughts on education are concerned. The field of anthropology surely could help us to understand the kinds of people who graduate from our institutions and why society applauds them. Personhood, after all, has an aesthetic component. We cannot reduce identities to numerical data, nor can we understand the human self through the use of metrics. Becoming an educated person involves the creation of a new set of stories to tell yourself, a new understanding of who you are. Becoming an educated person is an art.
Rites of Passage
The anthropologist Victor Turner conducted seminal studies on the process of taking part in rituals. During his fieldwork, he spent time with the Ndembu people of northwestern Zambia. Through his association with the Ndembu, Turner created a model for understanding the steps through which we pass on the way to elevating our place in society. Only a handful of scholars make use of Turner’s principles in research on education, but the concepts lend themselves to a discussion of postsecondary schools.
Turner described three steps in the process of upgrading status: “separation from the mainstream, a period of marginalization or liminality, and finally, aggregation,” where we return to the group in a new capacity. The liminal, from the Latin limen, or “threshold,” is crucial to the process of becoming educated. Turner describes the liminal state as “betwixt and between.” For a period of four or more years, traditional undergraduates at residential colleges and universities are “neither here nor there.”
Each autumn, sons and daughters leave the homes of their youth to establish themselves on a campus. In the new environment, they become submerged in the culture of college: new rules and values, sex, all-night study sessions, alternative music, alcohol, and extracurricular sports. These are the rituals in which we engage as we move from one stage of life to another. On a personal level, these rites become part of our own biographies or life stories. They are also the means that a society uses to evolve and reinvent itself. As a ritual, higher education affords us a chance to pull away from social and cultural expectations. Immersed in the liminal state, students prepare for new roles and, while they are unmoored from the responsibilities of professions, enjoy the luxury of time spent in reflection on the norms and conventions they left behind.
In the era before cognitive ability came to dominate discussions of undergraduate development, Alexander Astin reviewed the studies of student outcomes conducted between 1977 and 1993. He used survey data to give us a partial sense of the changes that students undergo through the course of a baccalaureate. They develop a positive self-image and a greater sense of intellectual competence. Students increase their levels of social activism, their commitment to solving environmental problems, and their interest in developing a meaningful philosophy of life. In short, they become different people.
Astin’s work suggests that the specific learning outcomes associated with courses are not the strongest determinant of student change. In his classic What Matters in College?, he concludes, “The student’s peer group is the single most potent source of influence on growth and development during the undergraduate years.” As educators, we assume that students enroll in our classes for the sake of the learning outcomes listed on our syllabi. The truth is that learning outcomes are actually a small part of the endeavor. The postsecondary ritual is a large and life-changing experience.
Generally speaking, we do not conceive of education as a rite of passage. As a result, our traditions have grown susceptible to challenges that also ignore this reality. Without a cultural conception of education, it has been possible to minimize our customs and reduce education to training. In particular, when students take courses on the Internet, we dispense with the cultural portion of education. In online learning, we assess the knowledge and skills that students can demonstrate, momentarily, while logged on to the website for a class. Questions about who you are or what you become are not central to the venture. A recent advertisement for the online Western Governors University suggests that the university measures “how much you learn, not how much time you spend in class.” From a purely cognitive standpoint, the assertion is logical; but there is also reason to believe that spending time in class is one of the more important elements of becoming an educated person. In addition to time spent with peers, face-to-face contact with faculty members also ranks as a significant determinant of student change.
Higher Education as a Cultural Process
During the past twenty years, we have taken a complicated, rich, and historic rite of passage and reduced it to its most basic utilitarian purpose. In the process, we have ignored the tools we could have used to explain or understand what we lost in the bargain.
It might help to imagine another rite of passage in the same terms we have been using to describe the practice of going to college. Consider marriage. What if we limited weddings to their most basic function? If all that matters, with regard to the institution of marriage, is that two people agree to a set of vows, then we could easily conduct weddings over the Internet. An appropriate person could send copies of the vows to both the bride and groom. They would receive them on their smartphones, and after reading the text, both parties could type the words “I do.”
It is questionable whether weddings would have the same meaning if we reduced them to their most basic component and conducted them online. Education, as a cultural process, is similarly irreducible. Even those who believe that cognitive skills are the most important outcome of education should acknowledge that it is hard to build skills without an accompanying change in one’s identity.
Educators spend a good deal of energy testing critical-thinking ability and, frankly, are frustrated with the results. One reason we have difficulty producing critical thinking is that we separate thinking from thinkers. We treat critical thinking as if it were a free-floating ability when, in fact, it is a function of oneself or one’s identity. Critical thinking is a way of positioning oneself toward a problem. For critical thinking to take place, students must first come to think of themselves as people who are willing to take a critical stance in relation to an issue.
Similarly, we conceive of writing as a skill, but good writing is possible only when students come to think of themselves as writers. Let me use myself as an example. I have enjoyed a modest level of success as an author. Still, if you compared first drafts of my books and articles with those I produced as a senior in high school, you would find a close match with respect to the level of quality. From what I can tell, my ability has not changed since my late teens, but good writing does not simply result from talent or abilities. Good writing is the result of editing, revising, and rewriting. My native ability has not improved since I was a teenager, but I have a different identity today. As an undergraduate, I started to think of myself as a writer; that changed everything. To be a writer, I found out, takes a willingness to edit your own work until you fall asleep with your face stuck to loose pages of manuscript.
Becoming a critical thinker, a writer, and an educated person is a cultural process. When it is at its best, the process is formative, even transformative. It involves beginning as a neophyte and then becoming an expert or a practitioner of one kind or another— and recognizing the process as a time-honored ritual. Cognitive development is part of the process, but our real task in education is to create an environment where students can become writers, scientists, historians, biologists, and mathematicians. Each of these professions amounts to more than a skill set. Each is also more than a career. These are identities rooted in the self-concepts of our graduates.
I serve as a consultant-evaluator for the nation’s largest regional accrediting agency. Under the auspices of the Higher Learning Commission, twice a year I visit schools and implore them to measure what students learn. Even when a college’s mission statement suggests that the institution exists to produce a certain kind of person—“good citizens,” for example—we educators tend to relieve ourselves of the duty to understand the identity shift that occurs during the course of an education.
Scholars with a variety of backgrounds have done a good deal to assess the skills that students develop, or fail to develop, in college. In the end, however, the important but ignored questions of education have to do with personhood. The economist Howard Bowen once suggested, “The impact of higher education is likely to be determined more by the kind of people college graduates become than by what they know when they leave college.” To understand the change that students undergo, we need to solicit stories from them, and for that, we will likely need ethnographers and a commitment to understanding the processes and rites of education and not simply their quantifiable outcomes.
Efforts to collect students’ stories will also serve graduates well if we continue to value job placement as one sign of success. Few employers conduct performance appraisals as part of the hiring process. During interviews, people are asked to talk about themselves. Job interviews, in this sense, are exercises in storytelling.
Staffing demands are shifting fast in the twenty-first century. Corporations can train and retrain employees for work in their organizations, but they cannot make an educated person. For that, we need institutions of education.
Creating the kinds of people who can successfully inhabit the future is a serious undertaking. The weight of the task should force us to wonder what, exactly, we are trying to achieve. Will we supply the workforce with a corps of single-minded technocrats? Or will we usher in the next renaissance? To accomplish the latter, we will need to view our graduates as people instead of skill sets. We will have to give thought to the norms they observe, the values they share, and the roles they are able to fill. In other words, we will have to study their sense of self—the stories they use to make sense of who they are and where they fit in our society.
On assignment for the Higher Learning Commission, I once served on a team that granted initial accreditation to White Earth College, a Native American institution located in west-central Minnesota. One of the distinctive features of the college was a practice that required each student to tell and record a “graduation story.” I listened to a sample of the stories during the time I spent on campus. They struck me as the germ of a custom that could help us plot our course to a better future.
Author Bio: Chad Hanson serves as chair of the Department of Sociology and Social Work at Casper College. He is the author of The Community College and the Good Society and a forthcoming collection of essays, Trout Streams of the Heart. His e-mail address is [email protected].