The European Commission vs. social media and video platforms: the Digital Services Act in action

With the explosive growth of online platforms over the past two decades, it was only a matter of time before authorities stepped in to regulate these services. As a result, we have the EU Digital Services Act (DSA), introduced by the European Commission to set new standards for content moderation and transparency. Now, more than two years later, we can observe how platforms are adapting to this new regulatory landscape.

In this article, we dive into the basics of DSA compliance, platforms’ obligations, and the notices issued by the European Commission to uncover the key challenges faced by online services.

As a provider of automatic content recognition for social media and video platforms, we at WebKyte primarily focus on the challenges of these types of platforms.

The Digital Services Act explained

The DSA is a European Union regulation, which entered into force on November 16, 2022, aimed at establishing a safer online environment and balancing the interests of users, consumers, and internet intermediaries.

In this safer digital environment, users’ rights are protected while businesses, small and large, have an equal chance to succeed with their audiences.

The DSA applies directly in all countries of the European Union. What’s more, it applies not only to companies established in Europe or operating branches there, but to all companies offering digital services to users in the European Union.

The DSA covers all types of internet intermediaries, namely ‘mere conduit’, caching, and hosting services. This includes social networks, video platforms, search engines, e-commerce services, and other online services.

According to the DSA, an online platform is a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service.

The rules of the DSA have applied in full since 17 February 2024.

The DSA requirements

The DSA mainly sets out provisions on handling complaints about illegal content, mandatory clauses in user agreements, and transparency and accountability.

The requirements include:

▪️ Provide user-friendly mechanisms that allow users or entities to report illegal content on a platform (see the sketch after this list);

▪️ Prioritise the processing of reports submitted by so-called «trusted flaggers»;

▪️ Provide users with detailed information about the reasons when their content is restricted or removed;

▪️ Provide features for users to appeal content moderation decisions within a platform;

▪️ Quickly inform law enforcement authorities if platforms become aware of any information giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person has taken place, is taking place or is likely to take place;

▪️ Redesign their UX/UI elements to ensure a high level of privacy, security, and safety of minors;

▪️ Ensure that interfaces are not designed in a way that deceives or manipulates users; no dark patterns are allowed;

▪️ Clearly flag ads on the interface;

▪️ Stop showing targeted ads based on sensitive data (such as ethnic origin, political opinions or sexual orientation), or targeted at minors;

▪️ Have clearly written and easy-to-understand terms and conditions and act in a diligent, objective and proportionate manner when applying them;

▪️ Publish transparency reports on their content moderation processes and results once a year.

One more requirement set out by the DSA is that online platforms have to submit information about their users upon the authorities’ requests and notify the affected users that such a request has been received.
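To make the first requirement more concrete, below is a minimal sketch of what a notice-and-action submission could look like. The field names and validation logic are hypothetical; they loosely mirror the elements Article 16 of the DSA expects a notice to contain, such as an explanation of why the content is considered illegal, its exact URL, the notifier’s contact details, and a good-faith statement.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class IllegalContentNotice:
    """One user-submitted notice, loosely mirroring the elements listed in DSA Art. 16."""
    content_url: str            # exact electronic location of the allegedly illegal content
    explanation: str            # why the notifier considers the content illegal
    notifier_name: str          # may be omitted for certain offences
    notifier_email: str
    good_faith_statement: bool  # confirmation the notice is accurate and complete

def validate_notice(notice: IllegalContentNotice) -> list:
    """Return a list of problems; an empty list means the notice can be processed."""
    problems = []
    if not notice.content_url.startswith(("http://", "https://")):
        problems.append("content_url must be an exact URL")
    if not notice.explanation.strip():
        problems.append("an explanation of the alleged illegality is required")
    if not notice.good_faith_statement:
        problems.append("a good-faith confirmation is required")
    return problems

# Example: a platform receives a notice and checks whether it is actionable.
notice = IllegalContentNotice(
    content_url="https://example-platform.eu/videos/12345",
    explanation="Full copy of a copyrighted film uploaded without authorisation.",
    notifier_name="Jane Doe",
    notifier_email="jane@example.com",
    good_faith_statement=True,
)
print(validate_notice(notice) or "notice accepted")
print(json.dumps(asdict(notice), indent=2))
```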

VLOPs as the main targets

The DSA also provides a tiered set of obligations: the larger and more complex the service, the more responsibilities it carries.

Thus, platforms with at least 45 million monthly active users in the European Union are deemed very large online platforms (VLOPs) or very large online search engines (VLOSEs), and they must comply with additional obligations.

According to the European Commission Press Corner, the Very Large Online Platforms are Alibaba AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Pornhub, Snapchat, Stripchat, TikTok, Twitter (X), Xvideos, Wikipedia, YouTube, and Zalando, and the Very Large Online Search Engines are Bing and Google Search.

Additional obligations for the VLOPs

Mitigate Risks
Implement measures to prevent illegal content (e.g., copyright infringements) and rights violations. This includes updating terms of service, user interfaces, content moderation practices, and algorithms as needed.

Assess Risks
Identify and analyze systemic risks related to illegal content and threats to fundamental rights. Submit risk assessments to the European Commission within four months of designation and make them public within one year.

Strengthen Processes
Enhance internal systems, resources, testing, and oversight to effectively detect and address systemic risks.

Undergo Audits
Ensure that risk assessments and compliance with the DSA are externally and independently audited on an annual basis.

Share Ad Data
Publish public repositories of all advertisements served on their platforms.

Provide Data Access
Grant researchers, including vetted ones, access to publicly available data to ensure transparency and accountability.

Increase Transparency
Publish biannual transparency reports covering content moderation and risk management, along with annual reports on systemic risks and audit results.

Appoint Compliance Teams
Establish dedicated compliance functions to oversee adherence to DSA obligations.

Prioritize Child Safety
Design interfaces, recommender systems, and terms to prioritize children’s well-being, including implementing age verification tools to block minors from accessing pornographic content.

Assess Risks to Minors
Incorporate the impact on children’s mental and physical health into risk assessments.

One key additional responsibility for VLOPs is to financially support the enforcement of the DSA. The EU plans to collect approximately €45 million in 2024 from major online platforms to oversee compliance with the regulation. This funding supports initiatives to remove illegal and harmful content and enhance child protection online. Platforms and search engines with over 45 million EU users are required to share these costs, capped at 0.05% of their annual profit.
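To illustrate the arithmetic behind such a levy, here is a small hypothetical sketch of a usage-proportional allocation with a profit cap. Only the €45 million total and the 0.05% cap come from the figures above; the platform names, user counts, and profits are made-up placeholders.

```python
# Hypothetical illustration of how a supervisory levy with a profit cap could be allocated.
# Only the 45M EUR total and the 0.05% cap come from the article; the platform names,
# user counts, and profits below are made-up placeholders.

TOTAL_LEVY_EUR = 45_000_000
CAP_RATE = 0.0005  # 0.05% of annual profit

platforms = {
    # name: (monthly active EU users, annual profit in EUR) -- placeholder figures
    "Platform A": (260_000_000, 40_000_000_000),
    "Platform B": (150_000_000, 2_000_000_000),
    "Platform C": (100_000_000, 500_000_000),
}

total_users = sum(users for users, _ in platforms.values())

for name, (users, profit) in platforms.items():
    uncapped_share = TOTAL_LEVY_EUR * users / total_users  # proportional to the user base
    capped_share = min(uncapped_share, CAP_RATE * profit)  # never above 0.05% of profit
    print(f"{name}: uncapped {uncapped_share:,.0f} EUR, after cap {capped_share:,.0f} EUR")
```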

VLOPs against the DSA

Soon after the DSA came into full force in February 2024, Meta and TikTok initiated legal actions against the European Union over the financial levy designed to support the enforcement of the DSA.

Meta argues that the levy is inequitable, with some companies bearing a disproportionate share. Meta’s expected contribution for 2024 is €11 million (almost a quarter of the total levy), while TikTok criticized the EU Commission’s calculation method as flawed, though it did not disclose its levy amount.

During summer 2023, Zalando and Amazon filed lawsuits challenging their designation as Very Large Online Platforms under the DSA. Zalando claimed errors in applying the DSA, vague rules, unequal treatment, and disproportionate interference with its rights. Amazon alleged discrimination and violations of fundamental rights tied to requirements like ad repositories and non-profiling recommender options. Amazon also requested interim measures to suspend obligations until the court’s decision. Both cases highlight platform resistance to DSA compliance demands.

On March 1, 2024, Aylo Freesites, Pornhub’s parent company, sued the European Commission over its designation as a «very large platform» under the DSA. Aylo argues this violates principles of fairness and infringes on business freedoms by requiring an ad repository revealing user identities. The company seeks to annul the designation, exclude itself from these obligations, and have the Commission cover legal costs. This case highlights ongoing tensions between platforms and the DSA’s stringent regulations.

UGC platforms under pressure

On 18 January 2024, the European Commission sent formal information requests to 17 VLOPs and VLOSEs under the DSA, including Pinterest, TikTok, Instagram, and Snap. The requests focused on their compliance with providing researchers access to publicly available data, a key requirement for accountability and transparency. This access was crucial for monitoring illegal content, particularly ahead of national and EU elections. The platforms had until 8 February 2024 to respond, after which the Commission would assess further steps.

On 14 March 2024, the European Commission requested information from Bing, Google Search, and six VLOPs, including Facebook, TikTok, Snap, and YouTube, about their measures to address risks from generative AI. The inquiry focused on issues like AI «hallucinations», deepfakes, and automated voter manipulation, as well as impacts on elections, illegal content, fundamental rights, and child protection.

On 2 October 2024, the European Commission requested information from YouTube, Snapchat, and TikTok under the DSA about their recommender systems. The inquiry focused on how these systems influence users’ mental health, spread harmful content, and impact elections, civic discourse, and minors’ safety. Platforms were asked to detail their algorithms, including risks like addictive behavior, content «rabbit holes», and illegal content promotion. Responses were due by 15 November 2024, with potential fines for incomplete or misleading replies and formal proceedings if non-compliance persisted.

The Commission also regularly sends notices to specific platforms. Let’s take a look at the reasons why popular social media platforms and video hosting services were questioned by the EU over the past year.

Requests to LinkedIn

On 14 March 2024, the European Commission requested information from LinkedIn under the DSA regarding its compliance with the ban on ads based on profiling using sensitive personal data. LinkedIn was required to respond by 5 April 2024, with potential fines for incomplete or misleading replies.

On 7 June 2024, the European Commission acknowledged LinkedIn’s decision to disable the feature allowing advertisers to target EU users based on their LinkedIn Group membership.

LinkedIn’s move marked a voluntary step toward compliance, and the Commission committed to monitoring its implementation. Commissioner Thierry Breton praised the DSA’s impact, emphasizing its role in driving meaningful change in digital advertising.

Requests to Snapchat

On 10 November 2023, the European Commission requested information from Snap under the DSA about its measures to protect minors online. The inquiry focused on risk assessments and mitigation steps addressing mental and physical health risks, as well as minors’ use of the platform. Snap was required to respond by 1 December 2023, with potential fines for incomplete or misleading replies.

It appears the issue was resolved, as there have been no further legal actions or public updates regarding this request.

Requests to Pornhub, Stripchat, and XVideos

On 13 June 2024, the European Commission requested information from Pornhub, XVideos, and Stripchat under the DSA. The inquiry focused on measures to protect minors, prevent the spread of illegal content and gender-based violence, and implement effective age assurance mechanisms. The platforms were also asked to detail their internal compliance structures, including independent teams and compliance officers, to address systemic risks. Responses were due by 4 July 2024, with potential fines or further action for incomplete or misleading replies. These platforms submitted their first risk assessment reports in April 2024, following their designation as Very Large Online Platforms.

On 18 October 2024, the European Commission issued a second request for information under the DSA to Pornhub, Stripchat, and XVideos, focusing on transparency reporting and advertisement repositories. The platforms were asked to clarify their content moderation practices, including court orders, notices, complaint systems, and automated tools. They were also required to detail their content moderation teams’ qualifications and linguistic expertise, as well as the accuracy of their automated systems.

Additionally, the Commission requested improvements to their public ad repositories, citing concerns that they lack the search functionality, multicriteria queries, and API tools required by the DSA. The platforms had to respond by 7 November 2024 or face potential fines or proceedings for non-compliance. This followed an earlier inquiry into their measures for protecting minors and addressing illegal content.
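For context, Article 39 of the DSA requires VLOPs’ ad repositories to support searching with multicriteria queries and access through application programming interfaces. A hypothetical query against such a repository might look like the sketch below; the endpoint and parameter names are purely illustrative and do not correspond to any platform’s real API.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical ad repository endpoint; real platforms publish their own URLs and schemas.
BASE_URL = "https://ads.example-platform.eu/api/v1/ads"

# A multicriteria query: who paid for the ad, when it ran, and which audience it targeted.
params = {
    "advertiser": "Example Brand GmbH",
    "date_from": "2024-06-01",
    "date_to": "2024-06-30",
    "targeting_country": "DE",
    "page_size": 50,
}

url = f"{BASE_URL}?{urllib.parse.urlencode(params)}"
try:
    with urllib.request.urlopen(url, timeout=10) as response:
        ads = json.load(response)
    for ad in ads.get("results", []):
        print(ad.get("id"), ad.get("advertiser"), ad.get("reach"))
except OSError as exc:
    print(f"Repository not reachable (expected for this made-up URL): {exc}")
```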

Requests to X/Twitter

The European Commission formally requested information from X (formerly Twitter) under the DSA, investigating allegations of spreading illegal content and disinformation, including terrorist content, hate speech, and violent material. The inquiry also examined X’s compliance with DSA provisions on handling illegal content notices, complaint processes, risk assessment, and mitigation measures.

As a designated Very Large Online Platform, X has been required to adhere to the full DSA framework since August 2023, addressing risks like disinformation, gender-based violence, threats to public security, and impacts on mental health and fundamental rights.

X was tasked with providing information on its crisis response protocol by 18 October 2023 and addressing broader compliance measures by 31 October 2023. 

On 18 December 2023, the European Commission launched formal proceedings against X/Twitter for suspected breaches of the DSA. The investigation focused on risk management, content moderation, dark patterns, advertising transparency, and researcher data access.

The Commission examined X’s measures to counter illegal content, transparency in ads and data access, and concerns about deceptive design linked to subscription features like Blue checks.

This marked the first formal enforcement under the DSA, three years after its proposal. The proceedings aimed to gather further evidence and determine the next steps but did not prejudge the final outcome.

On 8 May 2024, the European Commission requested detailed information from X under the DSA regarding its content moderation resources and risk assessments related to generative AI.

The inquiry followed X’s latest Transparency report, which revealed a 20% reduction in its content moderation team and a drop in linguistic coverage within the EU from 11 languages to 7. The Commission sought further details on these changes and their impact on X’s ability to address illegal content and protect fundamental rights. It also requested insights into risk assessments and mitigation measures for generative AI’s effects on elections and harmful content.

On 12 July 2024, the European Commission shared its preliminary findings with X, stating that the platform likely breached the DSA in areas related to dark patterns, advertising transparency, and data access for researchers.

The Commission’s investigation involved analyzing internal documents, consulting experts, and working with national Digital Services Coordinators. It identified potential non-compliance with Articles 25, 39, and 40(12) of the DSA, which focus on transparency and accountability in content moderation and advertising.

If confirmed, the Commission could issue a non-compliance decision, imposing fines of up to 6% of X’s global annual revenue and requiring corrective actions.

On the same day, 12 July 2024, Elon Musk, X’s CEO, reacted strongly to the EU’s accusations against X over blocking researcher data and flaws in its ad database. He claimed the European Commission proposed an «illegal secret deal» for X to censor speech in exchange for avoiding fines. Musk didn’t elaborate on whether other platforms were involved but soon announced plans to challenge the EU in court, stating, «We look forward to a very public battle in court, so that the people of Europe can know the truth».

The Commission denied all the accusations.

Requests to Telegram

The Commission has been seeking to designate Telegram as a very large online platform. Back in May 2024, Telegram was compliant with the DSA’s basic obligations as an intermediary service and even had a dedicated webpage about it.

The EU aims to classify Telegram as a very large online platform, joining the ranks of TikTok, LinkedIn, Pinterest, and others with over 45 million monthly active users in the EU. With Telegram reporting 41 million users in the region back in February 2024, it’s likely just a matter of months before this happens.

Once designated as a very large online platform, Telegram will face additional obligations, such as conducting annual risk assessments and paying an annual fee to the EU, capped at 0.05% of its annual profit, for DSA compliance supervision.

Telegram remaining something of a dark horse seems to add extra pressure, as the Commission seeks more oversight and the ability to intervene with the platform more effectively.

Requests to Meta and TikTok

The European Commission has issued several official requests to Meta and TikTok concerning their DSA compliance. We’ll delve deeper into the notices and their implications for these two major platforms in an upcoming article.

A platform that couldn't comply

The Czech content-sharing platform Ulož decided to change its business model because of the enactment of the DSA.

Ulož was a website that allowed users to upload different files, including music and videos, and those files could easily be downloaded by other users. The problem was that users could upload copyrighted materials without the rightsholders’ permission. Under the DSA, ‘actual knowledge of illegal activity’ is one of the criteria for establishing a platform’s liability.

Thus, Ulož announced that as of December 1, 2023, it was turning from a file-sharing service into a cloud storage service, where users can only keep and download files they have uploaded themselves. As of October 2024, it has almost 40 times less traffic than in October 2022.

How to comply

One of the main focuses of the DSA is the moderation of illegal content, including copyright infringements. For VLOPs, the obligations are not only to remove such content upon notice but also to prevent its upload.

ContentCore by WebKyte is an automated content recognition tool that helps platforms with user-generated content to detect copyright violations and duplicates of known harmful videos. Using advanced video fingerprinting and matching algorithms, ContentCore efficiently scans uploaded videos for copyright issues and duplicates.
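For readers curious about the general idea behind video fingerprinting, here is a deliberately simplified sketch of perceptual frame hashing and sequence matching. It illustrates the concept only; it is not ContentCore’s actual implementation, and the helper functions and thresholds are assumptions made for the example.

```python
# A toy illustration of perceptual video fingerprinting and matching.
# It only shows the general idea: reduce each frame to a compact perceptual hash,
# then compare the hash sequences of an upload and a reference video.

from typing import List

Frame = List[List[int]]  # a frame as rows of grayscale pixel values (0-255)

def dhash(frame: Frame) -> int:
    """Difference hash: one bit per adjacent-pixel comparison, robust to brightness shifts."""
    bits = 0
    for row in frame:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def fingerprint(frames: List[Frame]) -> List[int]:
    return [dhash(frame) for frame in frames]

def similarity(query: List[int], reference: List[int], max_distance: int = 4) -> float:
    """Fraction of query frames whose hash is close to the aligned reference frame."""
    matches = sum(1 for q, r in zip(query, reference) if hamming(q, r) <= max_distance)
    return matches / max(len(query), 1)

# Tiny synthetic example: an "uploaded" clip that is a slightly brighter copy of a reference.
reference_frames = [[[10, 20, 30, 40], [50, 40, 30, 20]] for _ in range(5)]
uploaded_frames = [[[12, 22, 32, 42], [52, 42, 32, 22]] for _ in range(5)]

score = similarity(fingerprint(uploaded_frames), fingerprint(reference_frames))
print(f"match score: {score:.0%}")  # 100% here, flagging a likely duplicate of the reference
```

Production systems work on actual decoded frames and are built to survive re-encoding, cropping, and time shifts, but the principle of matching compact fingerprints against a reference database is the same.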

Summary

Tensions between the European Commission and online platforms over DSA compliance are rising. The Commission is serious about enforcing the DSA, especially for very large platforms and search engines, with the goal of making the internet safer for all. Platforms need to be proactive and transparent in their cooperation, as the Commission isn’t afraid to take action. Fortunately, practical solutions exist to simplify DSA compliance, particularly for video content moderation on UGC platforms, social media, and hosting services.