TikTok may be harming teens’ mental health, with risks extending as far as suicide. The company faces lawsuits in US courts and scrutiny over its TikTok Lite Rewards program. The lawsuits revealed that it was aware of the negative effects on mental health, which its algorithm reinforces. In Europe, regulators have forced the withdrawal of the rewards program.
TikTok Lite Rewards is a rewards program integrated into TikTok Lite, the lighter version of TikTok. Users earn points by watching videos, inviting friends, following creators, or simply opening the app every day, and there are additional campaigns tied to specific events. These points can then be exchanged for gifts or money.
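To make the mechanics described above concrete, here is a minimal sketch of how such a points ledger could work. The action names, point values and redemption rate are purely illustrative assumptions, not TikTok’s actual parameters.

```python
# Hypothetical sketch of a rewards-program points ledger.
# Action names, point values and the redemption rate are illustrative
# assumptions, not TikTok's actual parameters.

POINTS_PER_ACTION = {
    "watch_video": 1,
    "daily_open": 50,
    "follow_creator": 10,
    "invite_friend": 500,
}

REDEMPTION_RATE = 0.001  # assumed: 1,000 points ~ 1 unit of gift or cash value


class RewardsLedger:
    """Tracks points earned through in-app actions."""

    def __init__(self) -> None:
        self.points = 0

    def record(self, action: str, count: int = 1) -> int:
        """Credit points for an action and return the new balance."""
        self.points += POINTS_PER_ACTION.get(action, 0) * count
        return self.points

    def redeemable_value(self) -> float:
        """Convert the current balance into its gift or cash equivalent."""
        return self.points * REDEMPTION_RATE


if __name__ == "__main__":
    ledger = RewardsLedger()
    ledger.record("daily_open")
    ledger.record("watch_video", count=260)  # the viewing threshold cited in the US lawsuit
    ledger.record("invite_friend")
    print(ledger.points, round(ledger.redeemable_value(), 2))
```

The point of the sketch is simply that every routine action feeds a balance convertible into value, which is what ties ordinary viewing behavior to a monetary incentive.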
Internal TikTok documents produced during a US lawsuit show that executives were aware of the app’s effects on teens: a user can become addicted to TikTok after watching just 260 videos, which can take as little as 35 minutes, an average of roughly eight seconds per video!
The mental health effects stem from the rewards and likes, sources of instant gratification that encourage a constant search for approval online. This economic model fosters anxiety, inferiority complexes and a distorted perception of reality among young people.
The algorithm amplifies this effect by pushing profiles that match narrow beauty standards and devaluing those deemed “unattractive.” Added to this are the app’s beauty filters, which TikTok itself acknowledges are toxic and harm young people’s body image.
The company knew its algorithm could quickly push users, especially teens, toward negative content related to depression or suicide. Flawed algorithmic moderation also exposed young people to dangerous videos, such as self-harm content, that racked up tens of thousands of views before being removed.
Because the company knew this and failed to protect users, more than a dozen US states and the District of Columbia are suing it, alleging that the app harms children’s mental health because it is designed to be used compulsively and excessively: the addictive effect, they argue, is the company’s responsibility.
The societal impact of rewards programs on social networks
Several dimensions of our society are at stake: data protection, impact on young people, ethics and digital overconsumption.
The hunt for personal data is a well-known issue with gaming apps. Programs like TikTok Lite Rewards collect personal data to personalize the rewards and content offered. This raises questions of privacy and information security, especially since minors are not always aware of the implications of such sharing. TikTok has also been slow to delete the accounts of children under 13.
In exchange for rewards, users provide information about their online behavior. This surveillance becomes normalized as an everyday practice, which can shape how young people perceive privacy and their rights online and lead them to accept levels of surveillance they might not have tolerated in other contexts.
Rewards encourage users to spend more time on the app, a strategy that fosters digital addiction and hyperconnection, especially among young people, who are more vulnerable to instant reward mechanisms (age is negatively correlated with habitual or addictive smartphone behavior). Yet the time management tools TikTok introduced, which were supposed to reduce usage, had only a minimal effect, cutting daily use by about 1 minute 30 seconds.
TikTok Lite Rewards uses gamification techniques to maximize user engagement, driving repetitive behavior and exploiting psychological mechanisms such as delayed gratification to retain users. These practices are sometimes considered manipulative because they push users to consume content continuously, without their always being aware of this influence.
Users are exposed to promotional messages that encourage them to spend money impulsively to obtain exclusive benefits. This gamification of mobile payment has social and economic consequences for both the users and the providers of these applications.
The long-term impacts on young people’s digital habits and perceptions of social values resulting from these programs are still uncertain, but it is clear that they require critical reflection on the part of users and legislators.
Europe in action: making virtual spaces safer for children in the face of the digital giants
Last August, the European Commission secured the permanent withdrawal of the TikTok Lite Rewards service to ensure compliance with the Digital Services Act (DSA), including the requirement to carry out a risk assessment before an app goes live.
The decision, published on October 4, concerns TikTok’s failure to identify, analyze and assess systemic risks and to take reasonable, proportionate and effective mitigation measures. TikTok then committed to withdrawing TikTok Lite Rewards from Spain and France and to not (re)introducing a similar program in the EU.
The European regulation on artificial intelligence (the AI Act, or RIA) further regulates certain aspects of this type of program by prohibiting practices that manipulate behavior. The prohibition covers any AI system that exploits the vulnerabilities of a person or a group of people, which could include limits on the use of gamification or addictive notifications to drive engagement. The penalty can reach 35 million euros or 7% of global annual turnover, whichever is higher.
The RIA also requires AI systems to be transparent: if TikTok’s rewards are based on predictive algorithms or personalized recommendations, the company should disclose how those rewards are calculated, giving users greater visibility. Algorithmic transparency is essential for understanding and interpreting an AI model’s decisions or choices: it lets users see the recommendation mechanisms at work, reducing the potential for manipulation.
Digital actors must respect the rule of law and human rights, including under the principles of non-discrimination and equality. AI legislation aims to protect young audiences who are vulnerable, including by avoiding the exploitation of cognitive biases, particularly those that influence young people’s behaviour. This could include limitations on the intensity of notifications, reminders to limit prolonged use, and the obligation to offer security settings adapted to minors.
European laws are generally ahead of the curve on data protection and digital rights, but their enforcement is a challenge, especially in the face of global companies. Moreover, AI laws in Europe will not prevent foreign companies from continuing these practices outside Europe.
This application model, which activates the brain’s reward system, is also used by other apps and games such as Candy Crush, Pokémon Go, Coin Master, Clash of Clans, Zynga Poker and Duolingo.
Author Bio: Nathalie Devillier is a Doctor of International Law.