According to tech entrepreneur Patrick Collison and economist Tyler Cowen, academia needs a new discipline called “progress studies.” But their proposal overlooks two crucial facts: human progress has been an object of study for centuries, and innovators ignorant of that scholarship have had devastating effects on the planet and society.
When Collison and Cowen write that “progress itself is understudied,” historians balk. Progress has been a backbone of historical narratives; its origins, consequences and limits have been objects of historical analysis for generations.
We should be wary about Collison and Cowen’s advice to study the “successful” and train the “brilliant” in order to speed up “progress.” Neither recent nor more distant history suggests that these terms have neutral definitions. To the contrary, they have often been excuses for colonial expropriation and social exclusion, and sometimes alibis for democratic and environmental catastrophe.
Our apprehension about progress studies derives from our scholarship in the history and philosophy of science, and in particular in the history of scientific, colonial and social engineering projects in the era of the scientific revolution and the Enlightenment — arguably, the period that gave us our modern idea of progress in the first place.
Even earlier, the conditions and causes of what Francis Bacon called “human progress and empowerment” were established subjects of interdisciplinary study. At the turn of the 17th century, the Italian jurist and antiquarian Guido Pancirolli wrote an influential history of ancient and modern knowledge and invention. First published in Italian in 1612, it was translated into Latin, French and English. Pancirolli’s work — as historian of science Vera Keller has shown — made history a tool of innovation at the outset of the scientific revolution.
Pancirolli was not alone. In 1668, Royal Society Fellow Joseph Glanvill published Plus Ultra, promising to trace “the progress and advancement of knowledge” from Aristotle to “the most remarkable late improvements of practical, useful learning,” so as “to encourage philosophical endeavours.” In proposing to move past “mere comprehension” of progress to “the deeper goal of speeding it up,” Collison and Cowen add little to Glanvill’s pitch.
What is progress?
Historians of economy, society, science and technology such as Joel Mokyr, Joyce Appleby, Margaret Jacob and Paul Slack have given ample space to the makings of modern progress, along with related ideas Collison and Cowen echo: discovery, invention, improvement, transformation.
Works on the Scientific Revolution, Enlightenment and Industrial Revolution are replete with explorations of what Collison and Cowen call the “distribution of progress” and its causes. Indeed, accounting for “the combination of economic, technological, scientific, cultural and organizational advancement” that distinguishes “first-world” or “developed” societies has been a central question of the social sciences since their formation over a century ago.
Scholarship also underlines the complexities of progress and the underside of “successful people, organizations, institutions, policies and cultures,” problems that progress studies defines out of existence. Progress towards what? Success for whom? By what measure? At what cost?
Studies of scientific and technological projects highlight the mixed results of past disruptions. Eric Ash’s work on fen drainage in 17th century England reveals the destruction technology wrought on local environments and ways of life. Anya Zilberstein shows that 18th century colonization fostered positive views of anthropogenic climate change. Keller and Koji Yamamoto have probed the tensions between private and public interest that progress constantly involved.
Such research tells us that “progress” is a situated and often interested claim about human efforts, not a natural good or a divine gift. It needs critical assessment, not headlong zeal.
Progress at what cost?
Even today, the zeal continues apace, bringing with it unintended effects and attendant remorse. In a trend that Audrey Watters terms “the regret industry,” tech leaders who leapt before they looked are lining up to confess the harmful consequences of their innovations. The disposition among tech entrepreneurs to act first and do their homework later (if at all) has spawned countless start-ups, but it has also caused real destruction and harm.
“As tech more and more takes center stage, and techies themselves begin to speak out publicly, there’s increasingly this ‘I told you so’ or ‘there’s a whole field for this’ backlash, usually from academics (see replies to this tweet).”
— Antonio García Martínez (@antoniogm), July 30, 2019
Facebook founder Mark Zuckerberg’s 2018 testimony before the U.S. Congress in the wake of the Cambridge Analytica scandal drew the public’s attention to the social media platform’s complicity in mining its users’ data without their consent. By that time, at least six high-profile Facebook executives, developers and investors had already left the company and issued public regrets for its devastating effects on elections, privacy and media.
Since then, Facebook has with some success worked to reduce the spread of so-called “fake news.” However, it remains a host and accelerant for misinformation, and the misinformation previously spread on the site continues to have lasting effects on journalism and democratic institutions.
Facebook is not the only site that has had devastating effects on society. Twitter’s ease of use has led to the rise of troll farms that spread not only political misinformation, but also white supremacy and Islamophobia. Former Google design ethicist Tristan Harris has criticized companies like Google and YouTube for earning a fortune in what he calls the “attention economy” — the monetization of human attention through algorithms that lure users to conspiracy theories and clickbait, and addictive technology designed to keep us constantly scrolling.
While we have focused here on social and political harms, tech innovation both directly and indirectly contributes to what is arguably humanity’s most serious existential threat, the climate crisis. It is worth remembering that this particular crisis has its origins in the Industrial Revolution.
Social media indirectly contributes to climate change by fostering climate misinformation. Moreover, by weakening democratic institutions, it aids in the election of candidates supported by climate change deniers. More than this, though, information technology directly contributes to climate change through the massive energy demands of the internet and the devices that we use to access it.
Collison and Cowen rightly note that “areas of study have expanded greatly since the early European universities were formed,” with older disciplines spawning new disciplines and subfields, and “many subsequent transformative discoveries.” But when they urge that progress studies could “concoct policies and prescriptions… that increase the efficacy, productivity and innovative capacity of human organizations,” they forget that university teaching and research require not only innovation but due caution.
Universities’ core mission is scholarship in the service of society. The evolution of university disciplines should emerge not from self-styled “progress engineers” but from research and teaching that balances optimism and curiosity with critical thinking and responsible engagement of perspectives from across the disciplines.
Author Bios: Shannon Dea is a Professor of Philosophy at the University of Waterloo and Ted McCormick is an Associate Professor of History at Concordia University