We are witnessing the emergence of a new generation of disinformation that is upending our points of reference.
Indeed, concern over the growing popularity of deepfakes (known in French as "hypertrucages" when malicious) has led author and synthetic media consultant Nina Schick to announce an imminent "infocalypse."
Deepfakes are audio and video recordings of real people saying and doing things they have never said or done, created using machine learning algorithms.
For example, the Dalí Museum has achieved the feat of synthetically reviving Salvador Dalí so that he can welcome visitors to his exhibition and take a selfie with them.
In a darker example, a woman used the same technology to create fake videos of her daughter's cheerleading rivals in order to discredit them.
The real problem with malicious deepfakes therefore lies in the unethical use of a technology that is becoming increasingly sophisticated and accessible. It allows ill-intentioned individuals and organizations to create and spread deception, while our society is not equipped to distinguish true from false, much less to act with discernment against this phenomenon.
Our team of researchers and students in education, educational technology and design at Université Laval and Concordia University has been funded by a joint initiative of Canadian Heritage and the Social Sciences and Humanities Research Council (SSHRC) to explore digital collective agency as an essential skill for thwarting deepfakes.
Human agency, as defined by psychology researcher Albert Bandura, is our ability to influence the course of events and our environment through our actions. It implies intentionality, forethought (the ability to project oneself into the future), self-reactiveness and self-reflectiveness. Applied to digital environments, this skill involves moving from passively watching deepfakes being shared on social networks to actively adopting online strategies to denounce these deceptions and counter them with facts.
The mutation of disinformation
Following an exchange with invited experts in research, journalism and social media about solutions to "flatten the curve" of disinformation amplified by the health crisis, Quebec's Chief Scientist, Rémi Quirion, came away with one observation: there is an urgent need for training about disinformation, starting in primary school.
At the same time, during the webinar "Do your research: the cognitive economy of disinformation," Sébastien Tremblay, a cognitive science researcher at Université Laval, highlighted the role of training people in critical thinking with regard to digital content.
This skill is also cited in Quebec's digital competency framework. Stemming from the Ministry of Education and Higher Education's digital action plan for education and higher education, this framework defines digital competency and proposes strategies for integrating it into training programs.
Clearly, finding solutions to disinformation that go beyond the interventions offered by internet giants, such as deactivating fake accounts and removing deceptive content, is essential.
Misinformation derails conversations and prevents people from engaging with specific issues. It allows conspiracy theories to take hold in the minds of vulnerable individuals. As the Pizzagate example demonstrates, the myths generated by disinformation campaigns persist even after they have been debunked.
The infodemic that prevailed during the health crisis proved uncontrollable.
The gravity of the situation, and the importance of acting quickly and collectively in the face of this threat, is illustrated by anti-vaccine movements that endanger public health by spreading misleading information. Ève Dubé, a researcher at the Institut national de santé publique du Québec and the CHU de Québec Research Centre, affirms that "the pandemic has provided an opportunity for a coming together of conspiracy theorists and anti-vaccine activists."
But if misinformation is not new, why should the emergence of deepfakes worry us?
As Giorgio Patrini, CEO and chief scientist at Deeptrace, points out, the rise of this phenomenon now forces us to doubt the veracity of any audiovisual content. Denials become ever more credible and the "liar's dividend" more powerful: anyone can easily call irrefutable facts into question.
Additionally, videos are particularly effective at triggering emotional reactions, since we generally regard video as compelling proof of truthfulness.
Solutions to counter deepfakes
Ashish Jaiman, director of technology operations at Microsoft, explains that interventions to counter malicious deepfakes boil down to four broad categories: regulation and legislative action, platform policies and governance, technological tools, and media education.
While debates continue about which solutions to adopt, education has the advantage of addressing the problem at its source. Tom van de Weghe, a deepfake researcher at Stanford University, speaks of the need to build our society's resilience, to inoculate it so that it develops a better radar for deepfakes.
In this spirit, several awareness campaigns use well-known personalities, such as actor Tom Cruise, to warn the general public about the risks of malicious uses of synthetic media.
Among initiatives that expose deepfakes, in 2018 Jordan Peele, actor and founder of Monkeypaw Productions, published a deepfake of President Obama, one of the first awareness videos to reach millions of views. More recently, the citizen movement On Est Prêt published a deepfake of President Macron calling for a "real climate law." And the Channel 4 team showed Queen Elizabeth II dancing on a table in a mock 2020 Christmas address.
Meanwhile, using a chatbot, Alexis De Lancer, host of the Radio-Canada program Les Décrypteurs, introduces visitors to deepfakes, and the Reuters news agency offers online training on manipulated media content in 16 languages.
Digital collective agency
Yet despite these awareness-raising initiatives, the challenges posed by deepfakes are enormous. There is no miracle solution, and all stakeholders must mobilize to curb the problem.
Once we identify the factors that lead to people’s vulnerability to disinformation, we can better intervene.
Until then, it is necessary to equip citizens not only to protect themselves, but also to use their digital skills to take action and protect their families, their communities and our society within digital environments.
Deepfakes and other variants of disinformation spread quickly and pervasively, making them almost impossible to contain. As a society, it is high time we moved from reaction to empowerment through collective digital action.
Author bios: Nadia Naffi is an assistant professor of educational technology and holds the Chair in Educational Leadership in Innovative Pedagogical Practices in Digital Contexts at Université Laval. Ann-Louise Davidson is Concordia University Research Chair in Maker Culture and an associate professor of educational technology at Concordia University. François Berger is a research assistant, also at Université Laval.