“A better Internet starts with you: more connected, more secure.” Under this motto, Safer Internet Day 2021 is being celebrated. This year we are all being asked to get involved in this task: children and young people, families, teachers and schools, as well as companies and political leaders.
Platforms are asked to help create a better Internet by ensuring that there is positive content on their channels and by offering safe services to their users.
“Keeping our community safe is our highest priority. We have reached an agreement with the Garante (the Italian data protection authority) and from today we will take additional measures to support our community in Italy. As of February 9, we will send each user in Italy through our ‘age gate’ process again, and only users over 13 will be able to continue using the application after passing this process. In addition, we are developing a new ‘report’ button to allow users to report accounts that may belong to children under the age of 13, which will be reviewed by our team and removed if necessary. There is no finish line when it comes to protecting our users, especially the youngest, and our work in this important area does not stop. Therefore, we continue to invest in the people, processes and technology that help us maintain our community as a safe space for positive and creative expression.”
Alexandra Evans, Head of Child Safety, TikTok Europe.
This is the statement released by the Chinese platform TikTok after the Italian ruling issued over the death of a 10-year-old Italian girl as a result of one of the viral challenges, the #BlackOutChallenge.
What content is published on TikTok?
On TikTok we can see videos of dances, of young (and not-so-young) people lip-syncing, magic tricks, crafts, cooking recipes, beauty tutorials and countless other activities. But also, and increasingly, social and political activism and educational or health content.
That is why so many universities, companies and political parties have seen an opportunity to create and publish content in a clear, simple and direct way for their target audience: adolescents and young people.
TikTok, like other applications and social networks, is a platform whose main objective is to keep users on it for as long as possible in order to monetize them, whether through advertising or through the collection of data on users and their behavior.
Its success is due to the way in which, thanks to artificial intelligence, its algorithm collects and uses user data to improve their experience. As brands compete for users’ attention, personalized content has become paramount.
A viral phenomenon
But for some years now it has become fashionable for famous people, such as influencers and YouTubers, as well as for anonymous individuals, to take part in a challenge, whether for a charitable cause or just for fun. These viral trends are shared via instant messaging or social networks, and each user contributes their own version of the challenge.
Well-designed, purposeful challenges can help achieve worthy objectives, such as raising awareness about caring for the environment, fighting for the rights of the most vulnerable, equal opportunities or education (with tags like #TodosPorElClima, #TheRealChallenge and #DanceForChange).
The problem arises when challenges appear that endanger the health or physical integrity of users, especially minors. Some of the most dangerous challenges are spreading today on TikTok, where thousands upon thousands of teens dare each other to play dangerous games.
This is how, in recent months, the #BenadrylChallenge, the #KnockOutChallenge, the #SupergluelipsChallenge and, made sadly famous by the death of at least two minors, the #BlackOutChallenge have gone viral on the app.
Minors on the platform
According to article 7.1 of Organic Law 3/2018, of December 5, on the Protection of Personal Data and Guarantee of Digital Rights (LOPDGDD), in Spain the minimum age at which a minor can consent to the processing of their personal data (for example, by having a profile on social networks) is 14.
In the case of TikTok, the platform itself specifies in its privacy policy that it is not directed at children under 13 and, in addition, urges us to report through a form if we believe it holds or collects personal data about a child under that age.
In fact, its rating in the download stores, both iOS and Android, is 12+, so that families know it is not an application for children under that age.
But how is it verified that a minor is not lying when entering their age, or that they really have the (mandatory) parental consent? Quite simply, the platform trusts the honesty and maturity of all minors. And this is the big problem with minimum access ages on platforms: there is no effective age-verification system.
As a Statista study shows, the presence of younger children on social networks is a reality: 47.7% of children aged 4 (yes, we have written that correctly) to 15 are on Instagram and 37.7% are on TikTok. That is why it is so important that families monitor minors’ activity when they use their tablets or smartphones, whether through a parental control tool or through family conversations.
Increasing security measures
After the legal battle in the United States and the sanction imposed in 2019, the Chinese platform introduced a series of measures to limit minors’ access to inappropriate content and contact with strangers, prohibiting pranks, online harassment and challenges that could cause injury.
In addition, children from 13 to 16 use a restricted version of TikTok, with content appropriate to their age and with private messages between users blocked. The problem, however, is that this cannot be applied to the accounts of all those minors who did not enter their real age when registering.
In addition, in November TikTok expanded the Family Pairing function, which allows us to link our son or daughter’s account to our own in order to manage or control features such as the time they spend connected, who can see their profile, who can comment on their videos, who can send them messages and who can see the content they like.
Content and privacy control
On January 13, the Chinese platform further strengthened the privacy and security of its youngest users:
- “All accounts belonging to users between 13 and 15 years old will automatically change their settings so that they are private by default” (although, as noted above, this cannot be applied to the accounts of minors who did not enter their real age when signing up).
- “Videos created by children under 16 can only be commented on by Friends or by Nobody” (the Everyone option disappears).
- “The Duet and Stitch options will only be available to those over 16 years old.”
- “The option to allow others to download your videos will only be available for content created by users over 16 years old.”
- “The option to recommend your account to others will be disabled by default for those users who are between 13 and 15 years old.”
These new updates join earlier changes, which barred users under 16 from using direct messages and hosting live broadcasts, and restricted the purchase, sending and receiving of virtual gifts to users over 18.
But although there are new functions to block content, messaging or users, the risks certainly do not disappear. Minors may still access challenges one way or another, whether on the platform itself or because friends send them via WhatsApp. Monitoring their activity on the phone at an early age is essential.
It may even be a good option to use the application on a parent’s phone or tablet so that the little ones can see the content they like without having to create an account (you can download the app and view videos and hashtags without one), if we consider that they are not yet ready for their own.
Educating is the best tool
Italy’s blocking of the platform has opened a deep debate. Legally, European countries, including Spain, can request this emergency measure (to protect minors’ data, since it is not possible to verify the age of users) under the new General Data Protection Regulation (GDPR).
The answer should not be to block or ban platforms, because if it is not this one, there will be another. Parental mediation and digital education at home are always the best formula to ensure that our children make safe and healthy use of technology.
Author Bio: Laura Cuesta Cano is Head of Communication and Digital Content at PAD Service and a professor at Camilo José Cela University.