How can we design safer technology for children and adolescents?


Children and adolescents are the group at greatest risk in the digital environment. Not only do they spend the most time online, but the potential impacts of these risks are also more severe for them.

On the one hand, it is important that minors themselves, as well as their parents, educators, the healthcare professionals who care for them, and the rest of their environment, understand what these risks are and how they manifest themselves, so that they can detect them and implement strategies to avoid or mitigate them. On the other hand, the way many digital products are designed, implemented, and offered also needs to change. Only in this way will new generations be able to maintain a healthy relationship with technology and take full advantage of its potential.

The 5 Cs

There are different classifications of the risks that technology can pose to children and adolescents, but one of the most widespread is the “5 Cs” typology, proposed by the Organisation for Economic Co-operation and Development (OECD) in 2011 and revised in 2021.

This name corresponds to the acronym formed by the initials of the different risk categories:

  • Content: These are risks that arise when minors are exposed, as passive subjects, to illegal, age-inappropriate, or harmful content. This also includes hateful content, harmful advice, or misinformation specifically targeted at them that exploits their vulnerabilities.
  • Conduct: These risks arise when children and adolescents become active subjects and use technology to engage in activities that pose a risk to others; in other words, they are the ones who generate or disseminate illegal, inappropriate, harmful, hateful, or misleading content. Current examples include sexting (sending self-produced photographs or videos with sexual connotations via mobile phone or another camera device) and the creation and dissemination of pornographic deepfakes.
  • Contact: In this case, the risks relate to social interaction that occurs through digital means, and minors are, once again, passive subjects. They arise when contact established through technology (social media, video game chats, messaging apps, etc.) turns them into targets of harassment, hatred, violence, or any other harmful action. This category also includes grooming, in which an adult poses as a minor to contact other minors, build a relationship of trust, and exert some form of control or blackmail over them, usually for sexual purposes.
  • Consumer: In the context of the digital economy, minors are sometimes targeted with advertising for products that are not suitable for them (tobacco, alcohol, dating or gambling apps), that exploits their vulnerabilities (lack of experience or maturity, credulity, etc.), or that they cannot readily identify as advertising (for example, when delivered through influencers). They may also be profiled based on their habits, tastes, or preferences once they are identified as consumers or potential future consumers. Or they may be asked to consent to agreements or contracts that are not beneficial to them, or to make decisions that are not appropriate for their age (decisions that should be made by their parents or guardians).
  • Cross-Cutting: This category encompasses highly heterogeneous risks that cut across the other four categories and can have significant impacts on childhood and adolescence. They represent perhaps the greatest challenge today.

Among these risks we can distinguish:

– Privacy risks: This refers to overexposure caused by social media, whether through posts made by minors themselves or by those around them, for example, through sharing by their own parents or other relevant adults (sharenting). It also includes risks arising from the increasingly widespread processing of personal data in educational settings or during leisure activities, including the use of biometric access control systems.

– Risks associated with new technologies: Children and adolescents are, in many cases, among the groups most likely to test and adopt technological innovations, both because of their natural inclination to be “up to date” and because the technology sector is particularly interested in this age group as the customers of the future. We are referring to smart toys, geolocation devices such as watches or bracelets, virtual reality headsets, devices that process neurodata to play video games, the use of artificial intelligence, and so on.

– Risks associated with mental and physical health: This category had to be explicitly included due to growing scientific evidence that excessive or abusive technology use has a significant impact on children’s physical and mental health. This type of problematic use is closely related to what are known as addictive design patterns, which are very common in today’s digital products.

Addictive design patterns

An addictive pattern is “a characteristic, attribute, or design practice that determines a specific way of using platforms, applications, or digital services, leading users to spend significantly more time on them, or to engage with them at a significantly higher level, than is expected, convenient, or healthy for them.”

It’s important to keep in mind that, in many cases, the customers of digital products aren’t the users themselves, but other companies that pay to show them advertising or deliver their messages. Therefore, the more time a user spends online, and the higher their level of engagement, the greater the benefit for the digital product provider.

On the one hand, more personal data about the user can be collected or inferred, data that has direct value for the real customers, but also indirect value: the better the provider knows the user, the more personalized and effective the advertising and messages it delivers can be. On the other hand, the more time a user spends online, the more advertising and messages can be delivered. In short, it is about transforming user attention and behavior into a monetizable asset; hence the interest in keeping users online as much as possible.

And how do you achieve this? By incorporating addictive patterns into product design, such as pull-to-refresh, infinite scrolling, infinite streaming, periodic rewards, autoplay, push notifications, countdowns, algorithmic recommendations, persuasive language, cognitive overload, previews, and progress bars.
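To make one of these mechanics concrete, the sketch below (in TypeScript; FeedItem, fetchNextPage, and the other names are hypothetical, not taken from any real product) contrasts how an infinite-scrolling feed is typically wired with a variant that restores a deliberate stopping point:

```typescript
// Sketch contrasting an infinite-scrolling feed with a design that has a
// deliberate stopping point. All names (FeedItem, fetchNextPage, etc.)
// are hypothetical and for illustration only.

interface FeedItem {
  id: string;
  content: string;
}

// Assumed API: returns the next page of items for a given cursor, with a
// null cursor once the feed is exhausted.
declare function fetchNextPage(
  cursor: string | null
): Promise<{ items: FeedItem[]; nextCursor: string | null }>;

interface FeedState {
  cursor: string | null;
  items: FeedItem[];
  pagesViewed: number;
}

// Addictive variant: as the user nears the bottom, more content is
// silently appended, so the feed never visibly ends.
async function onScrollNearBottom(state: FeedState): Promise<void> {
  const page = await fetchNextPage(state.cursor);
  state.items.push(...page.items); // the list grows without any user decision
  state.cursor = page.nextCursor;
}

// Safer variant: loading more is an explicit action, and consumption is
// made visible so the user can decide to stop.
async function onLoadMoreClicked(state: FeedState): Promise<void> {
  state.pagesViewed += 1; // surfaced in the UI, e.g. "You have viewed 5 pages"
  const page = await fetchNextPage(state.cursor);
  state.items.push(...page.items);
  state.cursor = page.nextCursor; // a null cursor means the feed has a real end
}
```

The difference is small in code but large in effect: in the first variant the decision to continue is made by the product, in the second by the user.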

More than 30 such patterns have been identified to date, and they can be found on social media, audiovisual content platforms, video games, and even learning apps, to name just a few examples: they are ubiquitous. All of these products could function just as well without these patterns, which are implemented consciously and deliberately to serve the business model of certain internet providers and players, despite the risks they pose to people. These are not risks inherent to the technology itself, but rather risks that could be avoided if it were designed differently.

Impacts

The aforementioned design patterns promote addictive behavioral disorders and non-substance addictions, as well as problematic use or abuse, comparable in some respects to substance use disorders. Although they affect all users, they pose a particularly high risk to children and adolescents.

The effects of these behavioral disorders can be physical (musculoskeletal pain, altered perception of one’s emotional state, brain damage) and psychological (depression, anxiety and stress, loneliness, low self-esteem and life satisfaction, eating and sleep disorders).

At early ages, these effects are not only direct but also indirect: time spent online is time not spent on other activities, such as being with friends or family, playing sports, reading, or playing outdoors.

Various studies suggest that the increase in mental health problems could be related to all of these behavioral disorders, both as cause and effect. In other words, mental health problems make children more vulnerable to addictive behaviors, and these behaviors, encouraged by designs that aim to increase screen time and online use, in turn worsen mental health problems.

It is important to keep in mind that some of the triggers of these technology-related disorders, such as anxiety, loneliness, low self-esteem, social pressure, or fear of missing out, are very common among adolescents. Furthermore, the impacts of these disorders often end up affecting academic performance, which in turn can strain family relationships or lower life satisfaction. Again, we see a cycle in which the same factors can be both cause and effect, amplifying each other.

Especially vulnerable

On the other hand, many studies agree that adolescents and young adults are especially vulnerable to addictive patterns simply because of how much time they spend exposed to them. Their higher level of digital competence can give them a false sense of security, and their parents (also affected by addictive patterns) often fail to model healthy technology use. Added to this is the fact that many have not yet developed their own judgment, or sufficient self-control, to build self-protection strategies.

To protect children and adolescents from addictive patterns, professionals advise combining awareness (both their own and that of those around them) of all the aforementioned risks, self-discipline and self-control, and other practical strategies based on conscious and thoughtful decision-making. These decisions concern the selection of devices and their uses (including defining spaces, schedules, and time limits for those uses) and the applications installed and how they are configured (always opting for a safe default configuration that disables notifications, patterns such as autoplay, and so on).
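As an illustration of what such a safe default configuration could look like, here is a minimal sketch in TypeScript; the AppSettings shape and all of its field names are hypothetical, not taken from any real product:

```typescript
// Hypothetical AppSettings shape illustrating a "safe by default"
// configuration: every attention-seeking feature starts disabled and must
// be consciously enabled.

interface AppSettings {
  pushNotifications: boolean; // no unsolicited interruptions
  autoplayNextVideo: boolean; // no automatic content chaining
  infiniteScroll: boolean; // feeds end instead of refilling silently
  dailyTimeLimitMinutes: number | null; // null disables the limit
  quietHours: { start: string; end: string } | null; // e.g. overnight
}

// Defaults chosen to protect the user rather than to maximize engagement.
const safeDefaults: AppSettings = {
  pushNotifications: false,
  autoplayNextVideo: false,
  infiniteScroll: false,
  dailyTimeLimitMinutes: 60,
  quietHours: { start: "21:00", end: "08:00" },
};

// Moving away from the safe defaults is an explicit decision, which
// families can review when agreeing on rules for device use.
function updateSetting<K extends keyof AppSettings>(
  settings: AppSettings,
  key: K,
  value: AppSettings[K]
): AppSettings {
  console.info(`Setting "${String(key)}" changed explicitly by the user`);
  const updated = { ...settings };
  updated[key] = value;
  return updated;
}
```

The point is not the specific fields but the principle: protective values are the starting point, and every step away from them is an explicit, visible choice.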

But the entire responsibility should not be left in the hands of minors and their families. Governments, regulators, judicial authorities, and supervisory authorities must assume their obligations to ensure that children and adolescents can take advantage of the opportunities offered by the digital space while being adequately protected from the risks it poses.

And, obviously, the technology industry must assume its obligations to protect children and adolescents and design technology differently than it currently does, complying with current regulations on personal data protection, artificial intelligence, and digital services, to mention just a few relevant areas.

Author Bio: Marta Beltrán is Head of the Scientific Department of the Spanish Data Protection Agency; professor on leave at Rey Juan Carlos University
