
In Media and Information Literacy (MIL) courses, the emphasis is ostensibly placed on “critical thinking,” that is, the ability to take a step back from information and form a well-founded opinion.
But what about the skill of “informed consumption,” that is, the ability to reflect on one’s consumption, choices, needs, and budgets with full knowledge of the facts, and to assert one’s rights? This skill is far more often neglected. Yet it is included in the Council of Europe’s digital citizenship education program.
A landmark case won by a consumer against Meta’s (Instagram and Facebook) and YouTube’s (a Google subsidiary within Alphabet) apps illustrates the power of this skill. The jury ruled in favor of the plaintiff, finding that these platforms had harmed her through their product design, whose features and functionality led to mental health problems.
This is the first time that digital addiction has been recognized as a severe dependency leading to compulsive behavior, without substance use. This opens up possibilities for action in classrooms. How can we turn the attention economy against those who manipulate and monetize it?
In media and information literacy (MIL), critiquing the attention economy requires understanding the strategies platforms use to capture and maintain user interest, ensuring advertisers’ messages leave a strong and repeated impression and encourage purchases. The skill of “informed consumption” reminds users that they are not simply an audience, but both consumers and citizens.
Treating the user as a consumer and citizen, rather than as a mere user, requires recognizing their capacity for autonomous action, i.e., recourse, and their capacity for collective action, i.e., protest, or even conflict.
Defining “informed consumption”
For American consumers, the avenues for recourse are clear and well-defined, allowing both legal action and customer complaints against a recalcitrant company. This has undeniable benefits for some: KGM could receive $6 million from Meta and YouTube in compensatory and punitive damages.
For European consumers, the means of action are less clearly defined, even though the platforms are overseen by the Directorate-General for Communications Networks, Content and Technology (DG Connect), which handles both content and its transmission. The repertoire of actions against platforms is largely controlled by states, which tends to disempower, or even absolve, consumers and, consequently, citizens.
The regulations introduced by the Digital Services Act (DSA) (2024) require algorithmic transparency: very large platforms (more than 45 million monthly active users) must publish quarterly reports detailing their recommendation algorithms and amplification metrics, which exposes how these features work without, however, forcing the platforms to change them.
On the user side, the DSA creates the status of trusted flaggers: organizations recognized for their expertise in detecting, identifying, and reporting illegal or harmful content. The solution for the consumer is therefore to identify these flaggers and contact them. In France, the media regulator, ARCOM, designates these bodies, which can act on behalf of users. Addictions France, for example, is one of them, as is Indecosa CGT, which advocates for consumer rights.
Questioning the figure of the user
The platforms seek to keep the user in a physical and psychological state conducive to screen use. This relationship establishes a direct sharing contract, often sealed by “general terms and conditions of use” (GTC) that are unreadable, especially for minors, and that give the platforms broad control over the data handed over in exchange for “free” access.
In the United States, this sharing agreement was tested in 1996 and again in 2012 through “web blackouts”: internet companies displayed a black screen on their homepages or suspended their services to protest proposed legislation that threatened their freedom of commerce. In 1996, they thus obtained immunity under Section 230 of the Communications Decency Act. In 2012, they succeeded in having the SOPA anti-piracy bill withdrawn, a bill that aimed to expand copyright enforcement powers to combat online infringement.
In the name of freedom of expression and online consumption, they encouraged users to sign petitions, boycott companies that did not follow the movement, and demonstrate in the streets to show their support for the platforms and thus obtain the withdrawal of the proposed legislation. In Europe, this repertoire of actions was used far less.
But the conditions of use have deteriorated, and with them the status of users, who have become hidden workers, contributing their attention to enrich the platforms. Likes are turned into hard cash through search engine optimization and product placement; content creation amounts to a form of unpaid writing, which contravenes labor law, especially when it involves minors.
Raising awareness of the role of advertisers and shareholders
In media and information literacy (MIL), it is also a matter of revealing what really goes on behind the scenes. Advertisers and shareholders are two invisible but crucial players in the attention economy. As the primary source of funding for media in a competitive commercial system, advertising dictates attention-grabbing features, such as recommendation algorithms, push notifications, infinite-scroll design, viral formats, and clickbait headlines.
The attention economy is based on this continuous flow, which is conducive to addiction. The sale of advertising space finances the “free” web through features such as behavioral targeting, data buying and sharing, and so on. The price of these advertising units depends on the platform’s ability to guarantee prolonged exposure (viewing time, engagement rate).
Users must be informed of the enormous sums involved and of the duopoly of Meta and Google in online advertising, recently joined by Amazon, which forces all other internet players, including audiovisual media and the press, to submit to their terms.
Advertising is also an increasingly concentrated sector, with five or six media agencies controlling it and reaching understandings among themselves, particularly regarding the sale of online advertising space. Their system is complex and opaque, making the value chain (created or destroyed) difficult to understand for users, including content creators paid per click.
Added to this is the role of the shareholders of these platforms, most of which went public between 2004 (Google) and 2012 (Facebook), with founders retaining control of both the capital and the voting rights. These actors, the agencies and the shareholders, have helped break the sharing agreement with the average user. This breakdown became apparent to the general public with the platforms’ about-face and their alignment with Donald Trump’s ultra-liberal policies in 2025. It reveals the growing divide between the tech elite and the progressive base of users who believe in social progress driven by digital networks.
Holding platforms accountable
Design features are not inevitable. This is demonstrated by the thousands of lawsuits filed by young people, individually, but also collectively by school districts and state attorneys general (before the U.S. District Court for the Northern District of California, for example).
These actions will face resistance from the platforms, which systematically appeal, arguing that the individual impact of digital addiction is difficult to measure and demonstrate legally in the absence of broad scientific consensus. But they signal a new era for users. Beyond legal recourse (for defective product design or addictive features), the doctrine of “duty of care” is evolving. It opens the door to ethical design initiatives, such as disabling likes, screen-time limits, and pause notifications, all aimed at giving users back control.
These cases highlight the need for international cooperation to harmonize functional standards, as well as the age of digital majority. There is generally a two-year gap between the United States and Europe, which benefits American platforms. The Children’s Online Privacy Protection Act (COPPA), enacted in 1998 and revised in 2013 and again in 2025, sets the age of digital majority at 13 in the United States, while in France, for example, it is 15.
These cases highlight the potential for international class action lawsuits, especially since the Representative Actions Directive (2020), which represents a “new deal” for consumers, particularly regarding unfair commercial practices and abusive contract terms. European consumers could join forces with their American counterparts and push for changes to screen functionality.
The KGM case brings into sharper focus the ban on smartphones in schools, advocated by several countries worldwide. The ban leaves media and information literacy (MIL) practitioners skeptical, as it is unenforceable without the necessary legal, technical, and educational support. Recognizing the responsibility of platforms for designing addictive products is more promising… provided that consumers take action, with or without government support.
It is indeed the design of social media applications that must change, for minors as well as adults. In terms of resilience, the effectiveness of both the smartphone ban and the democratic shield will depend on the implementation of “informed consumption” skills, through media and information literacy (MIL), from a very young age.
Author Bio: Divina Frau-Meigs is Professor of Information and Communication Sciences at Sorbonne Nouvelle University