El Periodic D'Andorra

TikTok and the protection of minors

The European Commission opens a formal procedure against TikTok for a breach of the new Digital Services Act

- Marta Ambor, President and founder of Andblockchain

On February 17th, 2024, the Digital Services Act, a landmark piece of European Union legislation aimed at making the online environment safer, fairer and more transparent, came into force for all online intermediaries in the EU.

The Digital Services Act improves the protection of European Union users against illegal goods and content and strengthens respect for their rights on the online platforms where they interact with other users, share information or buy products.

Both platforms and users have new responsibilities. All online platforms with users established in the European Union, with the exception of small and micro-enterprises that employ fewer than fifty people and have an annual turnover below 10,000,000 euros, are obliged to implement measures to fight illegal content, including illegal goods and services.

These responsibilities include the obligation to protect minors, notably a total prohibition on targeting them with advertising based on profiling or personal data. Users must be informed about the advertisements they receive, including why they are shown to them and who funded them, and advertising that targets users on the basis of sensitive data, such as political or religious beliefs or sexual orientation, is prohibited. Platforms must also give users affected by a content moderation decision, such as the deletion of content or the suspension of an account, a statement of reasons, include that statement in the Digital Services Act transparency database, and give users access to a complaint mechanism to challenge such decisions. They are compelled to publish a report on their content moderation procedures at least once a year, to make clear their general terms and conditions and the parameters used by their content recommendation systems, and to designate a point of contact for the authorities as well as for users.

In this way, the Digital Services Act protects users against illegal content, counterfeit products and misinformation. It promotes transparency in content moderation decisions and in platforms' algorithms, encourages competition in the digital market, and creates a safer, fairer and more transparent online environment for all EU users.

Within this regulatory framework, on February 19th, 2024, the European Commission opened a formal procedure to assess whether TikTok is in breach of the Digital Services Act in four main areas: the protection of minors, advertising transparency, researchers' access to data, and the risk management of addictive design and harmful content.

The decision is based on the preliminary investigation conducted to date, including an analysis of the risk assessment report submitted by TikTok in September 2023, as well as TikTok's responses to the Commission's formal requests for information on illegal content, the protection of minors and access to data.

Regarding the protection of minors, the European Commission questions the effectiveness of TikTok's age verification tools and of its default privacy settings for minors. The mitigation measures applied in this area, in particular the age verification tools TikTok uses to prevent minors from accessing inappropriate content, may not be reasonable, proportionate and effective.

With regard to advertising transparency, the investigation focuses on the lack of a searchable repository of advertisements and on possible deficiencies in the information provided to users.

In the area of access to data, the European Commission considers that TikTok does not provide researchers with transparent access to public data on the platform.

And finally, on risk management, the European Commission is examining whether TikTok takes sufficient measures to mitigate the addictive risks of its design, including its algorithmic systems, which can stimulate behavioural addictions, and to prevent the dissemination of harmful content.

This assessment is necessary to counter the potential risks to the exercise of fundamental rights, including the physical and mental wellbeing of the person and respect for the rights of the child, as well as the platform's impact on radicalization processes.

If these breaches are proven, TikTok would face significant sanctions, such as fines of up to 6% of its worldwide annual turnover or, ultimately, the suspension of its services in the European Union.

The opening of a formal procedure empowers the Commission to take additional enforcement measures, such as interim measures and non-compliance decisions. The Commission may also accept commitments offered by TikTok to remedy the matters subject to the proceeding.

What we can say is that the European Commission's decision will set an important precedent for the application of the Digital Services Act, and that the case could have a significant impact on TikTok's business model and its operations within Europe.

The European Commission is taking a tough stance against big tech platforms to ensure compliance with the Digital Services Act, and the TikTok investigation highlights the challenges of regulating online platforms and the influence they exert on society, and on minors in particular, shaping how they interact and how they define themselves as people.

It remains to be seen whether the opening of this formal procedure against TikTok will prove an important step for the protection of online users in the European Union, and whether the outcome of the case will have a significant impact on the European digital landscape.
