TikTok and the protection of minors
The European Commission opens a formal procedure against TikTok over a possible breach of the new Digital Services Act
On February 17th, 2024, the Digital Services Act, a landmark piece of European Union legislation aimed at making the online environment safer, fairer and more transparent, came into force for all online intermediaries in the EU.
The Digital Services Act improves the protection of European Union users against illegal goods and content and strengthens respect for their rights on the online platforms where they interact with other users, share information or buy products.
Platforms and users alike have new responsibilities. All online platforms with users established in the European Union, with the exception of micro and small enterprises that employ fewer than fifty people and have an annual turnover below 10 million euros, are obliged to implement measures to fight illegal content, including illegal goods and services.
These responsibilities include the obligation to protect minors, including a total prohibition on targeting minors with advertising based on profiling or personal data. Users must be informed about the advertisements they receive, including why they are shown to them and who funded them, and advertising that targets users based on sensitive data, such as political or religious beliefs or sexual preferences, is prohibited. Platforms must also provide a justification to users affected by a content moderation decision, such as the deletion of content or the suspension of an account, include this justification in the Digital Services Act transparency database, and give users access to a complaint mechanism to challenge content moderation decisions. It is compulsory to publish a report on content moderation procedures at least once a year. The general conditions and parameters used by their content recommendation systems must be clear, and platforms are obliged to designate a point of contact for the authorities as well as for users.
In this way, the Digital Services Act protects users against illegal content, counterfeit products and misinformation. It promotes transparency in content moderation decisions and in platforms’ algorithms, encourages competition in the digital market and creates a safer, fairer and more transparent online environment for all EU users.
Within this regulatory framework, on February 19th, 2024, the European Commission opened a formal procedure to assess whether TikTok is in breach of the Digital Services Act in four main areas: the protection of minors, advertising transparency, researchers’ access to data, and the risk management of addictive design and harmful content.
The decision is based on the preliminary investigation conducted to date, including an analysis of the risk assessment report submitted by TikTok in September 2023, as well as TikTok’s responses to the Commission’s formal requests for information (on illegal content, the protection of minors and access to data).
Regarding the protection of minors, the European Commission questions the effectiveness of TikTok’s age verification tools and of its default privacy settings for minors. The mitigation measures applied in this area, in particular the age verification tools TikTok uses to prevent minors from accessing inappropriate content, may not be reasonable, proportionate and effective.
With regard to advertising transparency, the investigation focuses on the lack of a searchable repository of advertisements and possible deficiencies in the information provided to users.
In the area of access to data, the European Commission considers that TikTok does not provide researchers with transparent access to public data on the platform.
Finally, in the area of risk management, the European Commission examines whether TikTok is taking sufficient measures to mitigate the addictive risks of its design, including its algorithmic systems, which can stimulate behavioral addictions, and to prevent the dissemination of harmful content.
This assessment is necessary to counter the potential risks to the fundamental right to physical and mental well-being and to respect for the rights of the child, as well as the platform’s impact on radicalization processes.
If these breaches are proven, TikTok would face significant sanctions, such as fines of up to 6% of its global annual turnover or, ultimately, the suspension of its services in the European Union.
The opening of a formal procedure authorizes the Commission to take additional enforcement measures, such as interim measures and non-compliance decisions. The Commission is also empowered to accept commitments made by TikTok to remedy the matters subject to the procedure.
What we can say is that the European Commission’s decision will set an important precedent for the application of the Digital Services Act, and that the case could have a significant impact on TikTok’s business model and its operations within Europe.
The European Commission is taking a tough stance against big tech platforms to ensure compliance with the Digital Services Act, and the TikTok investigation highlights the challenges of regulating online platforms and the influence they exert on society, and specifically on minors, on their interactions and their development as individuals.
It remains to be seen whether the opening of this formal procedure against TikTok will prove an important step for the protection of online users in the European Union, and whether the outcome of the case will have a significant impact on the European digital landscape.