Jan 24, 2025

TikTok’s journey to Digital Services Act compliance: formal proceedings, TikTok Lite, and potential fines

TikTok’s compliance journey under the DSA has been a rollercoaster of requests, formal proceedings, and legal battles. Let’s dive into the challenges it faces on the path to compliance.

TikTok and the EU's DSA

TikTok is facing intense scrutiny in the US, where it was temporarily banned for 14 hours on January 19, 2025, and still risks a full ban unless it’s sold by the upcoming deadline. Meanwhile, the platform is also navigating challenges in the European Union as the Digital Services Act takes full effect.

Over the past two years, TikTok has faced more formal requests, investigations, and compliance deadlines from the European Commission than any other online service.

This article dives into the series of events that shaped TikTok’s journey under the DSA—from missed deadlines and financial levies to transparency measures and collaborations to improve safety.

What is the DSA?

The Digital Services Act (DSA) is a regulation introduced by the European Union on November 16, 2022, aimed at making the online space safer for everyone. It protects users’ rights while fostering fair competition, allowing businesses of all sizes to succeed with their audiences.

The DSA has a broad reach. It applies not only to EU-based companies or those with branches in the EU but also to any business offering digital services to EU users, regardless of where they are located.

Its scope covers all types of internet intermediaries, including platforms with user-generated content, search engines, social networks, e-commerce platforms, hosting services, and other online services.

The DSA prioritizes moderating criminal content (including copyright infringements), enhancing transparency, and ensuring algorithms are safe for minors. A summary of its requirements is available here.

Additionally, the DSA introduces tailored obligations, holding larger and more complex online services to higher standards of responsibility. 

Platforms with at least 45 million monthly active users in the EU are classified as Very Large Online Platforms (VLOPs) or Very Large Search Engines (VLOSEs), requiring them to meet stricter compliance standards.

According to the European Commission, VLOPs include popular platforms such as Alibaba AliExpress, Amazon Store, Apple App Store, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Pornhub, Snapchat, Stripchat, TikTok, Twitter (X), XVideos, Wikipedia, YouTube, and Zalando. Very Large Search Engines include Bing and Google Search. TikTok was designated as a VLOP on April 25, 2023.

In addition to the initial set of obligations, VLOPs must not only moderate criminal content but also prevent its upload to their platforms. You can find the full list of additional obligations here.

The DSA’s rules officially came into full force on February 17, 2024. 

The official requests to TikTok

Every time the European Commission issues an official request to a platform under the DSA, it also publishes a press release. This lets us track and analyze the history of TikTok’s compliance and see the main challenges the tech giant faces.

19 October 2023

The European Commission requested information from TikTok under the DSA. The focus was on TikTok’s actions to tackle illegal content, such as terrorist material, hate speech, and disinformation, as well as measures to protect minors online.

TikTok was asked to respond by October 25, 2023, for crisis-related questions and by November 8, 2023, for issues about elections and minors. 

9 November 2023

The European Commission sent formal requests to TikTok (and YouTube) to investigate how the platforms had complied with obligations to protect minors, including assessing and mitigating risks to their mental and physical health and managing minors’ use of their services.

Both platforms were required to respond by November 30, 2023.

18 January 2024

The European Commission formally requested TikTok to provide information regarding the platform’s compliance with data access obligations for researchers. This requirement ensures researchers have timely access to publicly available data on TikTok, fostering transparency and accountability, especially ahead of critical events like elections.

TikTok, along with 16 other Very Large Online Platforms and Search Engines, was required to respond by February 8, 2024. The Commission planned to evaluate the replies to determine further steps.

19 February 2024

The European Commission investigated whether TikTok broke the rules of the DSA in areas like protecting minors, being transparent about ads, giving researchers access to data, and handling risks from addictive designs and harmful content.

This investigation started after reviewing TikTok’s risk report from September 2023 and its earlier responses to the Commission. The focus was on whether TikTok:

  • Properly assessed and reduced risks from features like algorithms that might cause addiction or lead users down harmful content paths.
  • Used effective tools to verify users’ ages and protected minors’ privacy and safety with secure default settings.
  • Created a reliable way to track ads shown on the platform.
  • Allowed researchers access to public data as required by the DSA.

If TikTok failed in these areas, it could have broken several DSA rules. The investigation aimed to find the truth but didn’t assume any outcome. Other possible issues, like spreading illegal content, and actions by other regulatory bodies, were not affected by this case.

14 March 2024

The European Commission sent a formal request to TikTok to investigate how it managed risks related to generative AI. The focus was on issues like AI providing false information ("hallucinations"), the spread of deepfakes, and automated tools that could mislead voters.

The Commission also asked TikTok for documents and details about its risk assessments and steps to handle generative AI’s impact on elections, illegal content, user rights, gender-based violence, minors’ safety, mental health, data protection, consumer rights, and intellectual property.

TikTok was given deadlines of April 5, 2024, for election-related questions and April 26, 2024, for other issues.

17 April 2024

The European Commission requested information from TikTok about the launch of TikTok Lite in France and Spain to make sure TikTok conducted the required risk assessment before releasing the app in the EU.

TikTok Lite introduced a rewards program allowing users aged 18+ to earn points for tasks like watching videos or inviting friends, which could be exchanged for vouchers or virtual currency.

The Commission’s concerns included the impact of TikTok Lite’s "Task and Reward Lite" program on minors’ safety and users’ mental health, particularly regarding addictive behaviors. TikTok was also asked to provide details about measures it had implemented to reduce these risks.

TikTok was required to submit the risk assessment within 24 hours and provide additional information by April 26, 2024. The Commission planned to review the replies and decide on further steps, including potential fines for incomplete or misleading information.

22 April 2024

The same month, the European Commission opened a second investigation into TikTok to examine whether the launch of TikTok Lite in France and Spain violated EU rules. The DSA requires large platforms to assess and mitigate risks before introducing new features likely to have significant impacts.

TikTok allegedly launched TikTok Lite’s "Task and Reward Program" without properly assessing risks, particularly its potential to promote addictive behaviors. This is especially troubling for minors, given the lack of effective age verification and previous concerns about the platform’s addictive design.

The investigation focused on whether TikTok:

  • Completed the required risk assessment before launching TikTok Lite.
  • Implemented adequate measures to address risks, such as impacts on users’ mental health, especially minors.

TikTok missed the April 18, 2024 deadline to submit the risk assessment, prompting the Commission to demand compliance by April 23 for the report and May 3 for additional information. Failure to comply could result in fines of up to 1% of annual income or turnover, or periodic penalties of up to 5% of average daily income or turnover.
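
To get a rough sense of scale, these caps can be sketched in a few lines of Python. The €20 billion revenue figure below is an assumption for illustration only, not TikTok’s actual figure:

```python
# Hypothetical illustration of the DSA penalty ceilings described above.
# The revenue figure is invented; real fines are set by the Commission case by case.

def max_fine(annual_revenue: float) -> float:
    """Ceiling for a fine over missing or misleading information: 1% of annual revenue."""
    return 0.01 * annual_revenue

def max_daily_penalty(average_daily_revenue: float) -> float:
    """Ceiling for a periodic penalty: 5% of average daily revenue."""
    return 0.05 * average_daily_revenue

annual = 20_000_000_000  # assumed annual revenue of €20 billion, purely illustrative
print(f"Maximum one-off fine:  €{max_fine(annual):,.0f}")
print(f"Maximum daily penalty: €{max_daily_penalty(annual / 365):,.0f}")
```

Even under these conservative assumptions, the exposure runs to hundreds of millions of euros, which helps explain why platforms take these deadlines seriously.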

Due to TikTok’s failure to address these risks, the Commission also considered suspending TikTok Lite’s rewards program in the EU to protect users, particularly minors. TikTok was given until April 24 to present its defense before a final decision on the suspension.

24 April 2024

The European Commission acknowledged TikTok’s decision to suspend the "Task and Reward Program" of TikTok Lite in France and Spain for 60 days, starting April 24 for new users and April 28 for all users. TikTok also paused the app’s rollout in other EU countries.

Two formal investigations into TikTok, including the one about TikTok Lite, were still ongoing.

EU Commissioner Thierry Breton stated that TikTok’s decision to suspend the program was noted, but the investigation into the app’s addictiveness and compliance with the DSA will continue. He emphasized that children should not be used as “guinea pigs” for social media.

5 August 2024

Following the formal proceedings opened on April 22, the European Commission made TikTok’s commitment to permanently withdraw the TikTok Lite Rewards program from the EU legally binding.

TikTok committed to:

  • Permanently removing the TikTok Lite Rewards program from the EU.
  • Not launching any similar programs to bypass this decision.

The Commission closed the formal proceedings, marking the first case closed under the DSA and the first time it accepted commitments from a platform in such proceedings.

2 October 2024

Once again the European Commission requested information from TikTok about its recommender systems. TikTok was asked to explain how it prevents manipulation by malicious actors and mitigates risks to elections, media pluralism, and civic discourse, which could be amplified by its algorithms.

29 November 2024

The European Commission sent TikTok an information request related to the ongoing Romanian elections. TikTok was asked to provide detailed explanations on how it assessed and managed risks of information manipulation, including inauthentic or automated exploitation and risks from its recommender systems. The Commission also sought details on how TikTok allows public scrutiny and third-party access to data to monitor election-related risks.

5 December 2024

During the Romanian elections, the European Commission increased its monitoring of TikTok under the DSA. It issued a “retention order” requiring TikTok to preserve data related to risks affecting elections and civic discourse in the EU. This step aimed to secure evidence for potential investigations into TikTok’s compliance with the DSA.

The Commission emphasized its commitment to enforcing the DSA diligently and coordinating with regulators across Europe to address risks like systematic inauthentic activity.

17 December 2024

The European Commission launched new formal proceedings against TikTok for potentially breaching the DSA during the Romanian presidential elections. The investigation focused on TikTok’s failure to assess and mitigate risks tied to election integrity, including manipulation of its recommender systems and policies on political ads.

Commission President Ursula von der Leyen emphasized the importance of protecting EU democracies from foreign interference, stating, "In the EU, all online platforms, including TikTok, must be held accountable."

TikTok shared an official statement outlining the steps it took to ensure the platform’s integrity during the Romanian elections. Initially released on December 6, 2024, the statement was updated on December 17, 2024, and January 7, 2025.

Throughout December, TikTok removed millions of fake likes and followers, shut down thousands of accounts impersonating Romanian government officials and politicians, and eliminated numerous fake profiles. In addition, TikTok partnered with Funky Citizens to promote media literacy and fight misinformation during the election period, reinforcing its commitment to a safe and transparent online environment.

TikTok's compliance

As a VLOP, TikTok faced the added responsibility of financially supporting the enforcement of the DSA. In 2024, the EU planned to collect approximately €45 million from major online platforms to fund initiatives such as removing illegal content and improving child protection online. VLOPs were required to contribute, with the levy capped at 0.05% of their annual profit.

Shortly after the DSA took full effect in February 2024, TikTok launched legal action against the European Union, challenging the financial levy.

TikTok criticized the EU Commission’s calculation method as flawed, though it did not disclose its levy amount.

Content moderation measures on TikTok

On October 24, 2024, TikTok released its third Transparency report under the DSA, covering content moderation measures from January to June 2024.

Key highlights from the report included:

  • Removing over 22 million pieces of content and banning more than 5 million accounts across the EU for policy violations.
  • Acting on nearly 29% of around 144,000 illegal content reports, addressing both policy breaches and local law violations.
  • Enhancing automated moderation, which flagged and removed 80% of violative videos, up from 62% in 2023, and deploying over 6,000 moderators fluent in all EU languages.

Automated moderation

TikTok uses advanced automated systems to proactively detect and remove content that violates its policies before it’s viewed or shared. When a user uploads content, it undergoes an automated review, visible only to the uploader during this process. If potential violations are detected, the content is either removed automatically or flagged for human review, especially for clear-cut violations.

TikTok’s automated tools include:

  • Computer Vision Models: Identify harmful objects, logos, or emblems associated with extremist or hate groups.
  • Keyword Lists & Audio Models: Detect policy-violating text or audio content, informed by expert partners like fact-checkers.
  • De-duplication & Hashing: Recognize and block re-uploads of known violative content, in partnership with external groups like Tech Against Terrorism for hate or extremist content.
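
TikTok does not publish the internals of its matching systems, but the core de-duplication idea can be sketched with exact-match hashing in Python. The blocklist contents and function names below are illustrative, and production systems of this kind typically use perceptual hashes that also survive re-encoding and cropping, which this sketch does not:

```python
import hashlib

# Minimal sketch of hash-based de-duplication: known violative files are stored
# as SHA-256 digests, and every new upload is fingerprinted and checked against
# that set. Only byte-identical copies match here; perceptual hashing would be
# needed to catch re-encoded or edited versions.

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

# Hypothetical blocklist seeded with one previously removed clip.
known_violative = {fingerprint(b"previously-removed-clip-bytes")}

def is_known_violative(upload: bytes) -> bool:
    return fingerprint(upload) in known_violative

print(is_known_violative(b"previously-removed-clip-bytes"))  # exact re-upload: True
print(is_known_violative(b"new-harmless-clip-bytes"))        # unseen content: False
```

The appeal of this design is that checking an upload is a constant-time set lookup, which is what makes screening every upload at platform scale feasible.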

TikTok is committed to improving these systems to remove harmful content at scale while minimizing errors. Users can appeal removals if they believe a mistake was made. As of the first half of 2024, TikTok’s automated moderation systems had an accuracy rate of 99.1% and a low error rate of 0.9%.

Manual moderation

TikTok’s human moderators work alongside automated systems to ensure fair and consistent content reviews, considering context and nuance that technology may miss. Human moderation helps improve the automated systems by providing feedback that strengthens detection capabilities. This collaboration reduces the number of distressing videos moderators must review, allowing them to focus on more complex issues like misinformation, hate speech, and harassment.

Moderators’ responsibilities include:

  • Reviewing Flagged Content: When automated systems flag potentially violative content but can’t make an automatic decision, moderators step in for further review, using tools to identify problematic objects like extremist symbols.
  • Community Reports: Users can report content or accounts they believe violate policies, which moderators then review. However, most content is proactively detected before being reported.
  • Reviewing Popular Content: Moderators check videos that gain significant popularity to prevent widespread exposure of harmful content.
  • Appeal Assessments: Users can appeal content removals or account restrictions, and moderators assess whether to reinstate the content or account.

TikTok’s priorities

On November 27, 2024, TikTok updated its monthly active user count in the EU and confirmed that 175 million people from the EU visit the platform every month. It also outlined its priorities and product updates, along with collaborations to enhance safety and security.

Age assurance measures
TikTok enforces a 13+ age requirement with neutral age gates and technology to detect underage accounts. Around 6 million underage accounts are removed monthly. TikTok also partnered with industry experts to improve age assurance policies.

Changes to effects based on teen feedback
In response to teens’ concerns, TikTok restricted certain appearance-altering effects for users under 18. The platform now provides clearer information about how effects alter a user’s appearance and has updated its guidelines for creators.

Enhanced helpline resources
In 13 European countries, TikTok connects users who report harmful content to local helplines offering expert support. This initiative follows a successful pilot in France recognized as a best practice for combating cyberbullying.

Data security initiatives
TikTok’s "Project Clover" includes migrating European user data to a new data center in Norway. The initiative also involves third-party monitoring by NCC Group to ensure robust data protection and security.

Cultural and safety commitment
TikTok remains committed to fostering a safe, authentic community, setting new safety and security standards to maintain its role as a platform for entertainment, learning, and connection.

The DSA compliance with ContentCore

For platforms like TikTok, meeting the content moderation requirements of the DSA can be especially challenging.

Building and maintaining human moderation teams, training, and developing in-house software all require continuous support and resources.

For platforms dealing with user-generated videos, ContentCore offers a seamless solution. WebKyte’s ready-to-use tool identifies copyrighted and known harmful video content, automatically scanning every upload to detect violations without disrupting the experience for users and creators.

Summary

TikTok’s compliance journey under the DSA has been a whirlwind of requests, investigations, and legal battles—proving that the story is far from over. We’ll be watching closely as events continue to unfold.

From content moderation and user protection to transparency and algorithm accountability, the European Commission has made it clear that the Digital Services Act is no mere formality.

Curious about how other platforms are handling similar challenges? Check out our blog post for more insights.

By Sava
