Schools globally have scrambled to adopt or expand use of technology to minimize learning disruptions related to COVID-19. Educational technology has long posed serious privacy and equality problems, and these problems are now reaching a boiling point. Hasty choices now could have long-term impacts.
We are part of a seven-year research initiative, The eQuality Project, which examines young people’s experiences with privacy and equality in digitally networked environments. We are focused on how technology companies gather and use data about young people in ways that affect their well-being and relationships.
We have documented a number of significant issues that are possible byproducts of the data practices of educational technology (edtech) companies. These include corporate tracking of student activities both inside and outside the classroom, discrimination against young people from marginalized communities, students' loss of autonomy due to ongoing monitoring of their activities and the sale of student data to third parties, often for the purpose of advertising to them.
According to the complaint, #Google “is using its services to create face templates and ‘voiceprints’ of children […] through a program in which the search giant provides school districts […] free access to G Suite for Education apps” #privacy #edtech https://t.co/grw1oquoiY
— The eQuality Project (@eQuality_ca) April 7, 2020
We also find that schools are not always aware of or attuned to the range of online privacy and security implications. This is perhaps compounded by the fact that privacy notices and terms of service agreements are rife with vagueness, legalese and double-speak.
The large amounts of personal and transactional information some companies collect can also open students up to privacy invasions by future employers. Such collection vastly increases the potential for educational surveillance of students through their datafication.
Continuity in education
Schools have been exploring and using technological solutions to ensure continuity in students’ educational experiences. These solutions include taking classes online and engaging with (or deepening engagement with) online learning tools.
They also include expanding the use of videoconferencing technologies for virtual meetings involving teachers, administrators, students and parents.
Given the substantial time and money being invested, it is likely that today’s choices will affect privacy and equality in education long after this pandemic ends. Making choices now that respect students’ privacy is an investment in their futures and the future of our educational systems.
K-12 schools rapidly moving online
Universities generally have online education platforms already integrated into the institution’s information and communication infrastructure, with established privacy and security protections for student and faculty data.
But in Canada’s K-12 systems, when schools closed due to COVID-19, there was significant variation between regions in how distance learning and edtech were used.
This means school boards and educators have faced choices about how to rapidly move into online platforms and services that are quick to implement, can accommodate lots of students and are user-friendly. In some cases, decisions are being made in a piecemeal way.
Platforms can be used for education even if they aren’t designed for it. But they don’t necessarily have adequate privacy and security protections, particularly if educators use free as opposed to paid versions.
Educators should carefully consider the surveillance and privacy risks associated with use of these platforms, and should understand the privacy policies and terms and conditions.
Zoom and Skype are two of the most familiar and user-friendly videoconferencing platforms for communicating with a number of people at once — and are versatile enough to accommodate educational instruction and discussion.
It’s not surprising that some educators are using these familiar platforms. But they should be aware that the platforms collect a great deal of personal information about students. This leads to potential long-term risks to student privacy and autonomy.
For example, the March 18, 2020, version of Zoom’s K-12 Privacy and Security policy stated that data collected from K-12 students includes:
- a user’s name and other similar identifiers;
- a student’s school;
- the student’s device, network and internet connection; and
- the student’s use of the Zoom platform, including actions taken, date and time, frequency, duration, quantity, quality, network connectivity, and performance information related to logins, clicks, messages, contacts, content viewed and shared, calls, use of video and screen sharing, meetings and cloud recording.
Consent for Zoom’s information collection is given by the “School Subscriber” — typically the students’ school, school district or teacher — on behalf of parents and students.
Children’s rights to consent
This fails to recognize the rights that students have under the UN Convention on the Rights of the Child, most particularly their right to participate in decisions affecting them. It also leaves open the possibility that educators, in their own well-intentioned efforts to bring quality, interactive lessons to their students, are “consenting” beyond what they’re authorized to do.
But it’s not the job of individual educators to dig through legal terms and decide what kinds of protections students will or will not have with respect to their data.
Instead, ministries, departments of education and school districts need to offer clear guidance to help educators navigate decisions about educational technology. In some instances, school districts may have or adopt a list of approved products or platforms.
Ministries, departments of education and school districts should also begin to think carefully about how edtech enables new forms of student surveillance by the school itself — among other things, by allowing schools to monitor the websites students access, the words they type and their online conversations.
To start, they can look to third parties — such as Media Smarts, Common Sense Media, Consortium for School Networking and Future of Privacy Forum — that have updated information on privacy and data protection practices of edtech products and services.
Policymakers should support teachers, administrators and school boards in insisting that edtech companies default to privacy-respecting practices.
We stand on the precipice of magnifying existing problems exponentially unless those responsible for education pause. We are already dangerously close to what The Economist has dubbed “the coronopticon” — a brave new age of surveillance and data control catalyzed by hasty tech decisions under COVID-19. Decisions about technology in the classroom need not move us even closer.
Author Bios: Jane Bailey is Professor of Law and Co-Leader of The eQuality Project at L’Université d’Ottawa/University of Ottawa; Jacquelyn Burkell is Associate Vice-President, Research, at Western University; Priscilla Regan is Professor in the Schar School of Policy and Government at George Mason University; and Valerie Steeves is Full Professor in the Department of Criminology at L’Université d’Ottawa/University of Ottawa.