Technology can be a tricky thing to handle. You might know how to make iPads, but that does not necessarily mean that you know how to use them. Universities are another case: they are the primary source of technological innovation but less good at making the leap to apply it to what they do best – teaching.
E-learning is discussed in academia but there is a way to go. According to a survey published in August by the Chronicle of Higher Education and the Pew Research Center, only half of 1,055 US university and college presidents surveyed thought online courses offer value equal to classroom-based teaching (though this was well above the proportion of the public who think so). It is not so surprising that universities are cautious about relinquishing the old educational model: libraries, old textbooks and face-to-face lectures. This is at heart a concern about the quality of the student experience, though that debate has only just begun.
It is also true, however, that three-quarters of the presidents reported offering online courses. Nevertheless, the first attempt by a prestigious higher education institution to embrace e-learning on a truly massive scale is important. It arises from a place where innovation abounds: Stanford University. Dr Sebastian Thrun, a German-born professor of Computer Science, and Dr Peter Norvig, Director of Research at Google, are offering this autumn an online course, Introduction to Artificial Intelligence, free of charge to anyone in the world (classes started on 10 October and enrolment is now closed). According to the New York Times, almost 130,000 people from more than 190 countries signed up. Those completing the course will receive a ‘statement of accomplishment’ but not a grade or credit. This class is the first part of a three-course project at Stanford’s Department of Computer Science which aims to make technology skills accessible to a wider audience.
The medium is the message
It is argued that the real innovation in IT is not the technology itself but its applications. This may be true but it misses a crucial point. Technology does not just transfer content; it is content. In a broader sense, the term encompasses everything that is not purely human. As the Canadian scholar Marshall McLuhan put it, whatever constitutes an extension of our nervous system is a form of technology. Language is a tool and the phonetic alphabet is a tool within another tool, and so on.
McLuhan became famous in the 1960s for his revolutionary theories on the impact of technology on human history and culture. According to McLuhan and scholars such as Walter Ong, technology is not just a means for mankind to achieve measurable goals, but a crucial historical factor that constantly reshapes the way we apprehend ourselves and our environment.
Digital minds, print mindsets
McLuhan had a special interest in the uses of technology in education. He was one of the first to point out that our models of education are based on old ideas, shaped by the technological bias of the print era.
This is even more evident today; the gap between the use of digital technology in everyday life and in higher education has become very wide. Students entering tertiary education this year know how to play music on an iPod or video-chat on Skype, but are less comfortable with libraries. Print is not their cup of tea, as shown by the declining sales of newspapers among younger people. And when they get to the classroom, they have to deal with the typical features of the previous technological paradigm: notes, textbooks and essays.
This educational model is not in tune with the digital era. Electronic media allow readers to become authors and publishers themselves. Bloggers have changed everything in journalism – with impacts that range from useful to malign. According to last week’s Economist, technology is adapting to the people who use it rather than forcing them to adapt to it. Fans of the i-culture make apps that suit their personal needs and lifestyle. And as the same publication reports, researchers at Gartner predict that by 2014 around a quarter of business apps will be created by non-IT staff. Education apps made by students are part of this shift to ‘personalised’ computing.
Project Gutenberg has made more than 36,000 e-books available free online since the 1990s. Wikipedia shows that knowledge is no longer only passed from an authority to a large number of recipients, as in the print era, but can be shared by multiple users in a network. And as McLuhan observed, older media become content for the new ones. When TV appeared, film constituted a big part of its programming. This is being reproduced now, as TV sitcoms are uploaded onto YouTube. The e-book is another manifestation of media’s trait of acquiring new uses through innovation.
Connecting the dots
The use of digital technology in higher education will enhance interconnectedness, synchronicity and non-linear structures. Knowledge will be shared online by groups, rather than unidirectionally to the masses. The process of learning will be more collaborative and perhaps less competitive than it currently is. Social media could play a huge role in such a shift: they are the best tool we have at the moment for such communication.
Cathy N Davidson, a professor of interdisciplinary studies at Duke University, pointed out in a recent article in the Chronicle that students tend to interact, collaborate and be creative when using online networks. She suggests a new way of learning and working based on multitasking: allowing students to distribute various parts of a specific task, focus on what they are good at and then reconnect the dots. The web makes this education model effective – ‘everything links to everything, and all of it is available all the time’, says Professor Davidson. This is quite different from being taught to finish one task before starting the next.
McLuhan’s argument was that a ‘circular’, rather than assembly-line, approach would be crucial for higher education to provide workers with skills appropriate to the new patterns of industrial production. Thus, we need a new way of learning that favours polymathy over specialisation. The non-linear bias of the electric circuit takes us there. New patterns of classifying higher education and different types of degrees will be required. As e-learning evolves, students will be able to pick unusual combinations of majors, such as French and Artificial Intelligence. That will be more difficult for the more rigidly structured continental and Asian higher education systems.
Flexibility is the keyword for universities seeking to understand the new era and profit from it. Other innovators in e-learning are either experimenting with, or considering, self-paced online programmes with no conventional instructors. There is no question that radically lower costs are driving this.
Teaching the teachers
Technology affects power relations, and many academics may not like the implications for their authority within their institutions. One of McLuhan’s most interesting arguments was that modern industry would have to put emphasis on design, not as superfluous ornament but as an integral part of the product. In his own words, ‘the artist tends now to move from the ivory tower to the control tower of society’ (Understanding Media, 1964). As it happens, the more we participate in the communication process, the more creative and design-conscious we become. McLuhan died in 1980 and would have been 100 this year. He never saw an iPad and probably never heard of Steve Jobs. But he would have found Jobs’s machines fascinating.