
EU Digital Services Act: definition and changes in the world of UGC platforms

The Digital Services Act (DSA) is a legal framework for digital service providers in the European Union (EU), designed to ensure an open and safe online environment. The goal of the European DSA is to create a standard set of rules for EU member states governing the transparency and accountability of online platforms.

Background and Development of the Digital Services Act

The legislative journey of the DSA

Even though the law applies only in the EU, its consequences will reverberate globally as firms adjust their policies worldwide. The main goal of the EU DSA is to create a safer online environment. Platforms are required to moderate or remove posts that advertise illicit goods or services or contain unlawful content, and to provide users with the ability to report such content. The law prohibits targeted advertising based on a person's sexual orientation, religion, ethnicity, or political beliefs, and also limits advertising targeted at children. Online platforms must be transparent about how their recommendation algorithms work.

Additional rules apply to so-called "very large online platforms." These platforms are required to give users the option to opt out of recommendation and profiling systems, to share data with researchers and regulators, to cooperate in crisis response efforts, and to undergo external, independent audits.

Historical context

The European Parliament adopted the DSA in July 2022. While the EU does not require full compliance from small companies, the list of very large online platforms was approved in April 2023, and those services were given four months to adapt their policies. Very large online platforms are those with more than 45 million European users. Currently, 19 services fall into this category, including:

  • Facebook
  • Instagram
  • LinkedIn
  • Pinterest
  • Snap Inc.
  • TikTok
  • Twitter / X
  • YouTube

What is the EU Digital Services Act?

In this digital age, governments and regulators are actively working to bring order to our online lives and move the Internet into a more regulated environment.

Both the European Union Digital Services Act (DSA) and the UK Online Safety Act (OSA) aim to strike a balance between promoting innovation and protecting the Internet for future generations.

The UK's Online Safety Act has just cleared Parliament and is in the final stages of royal assent. The deadline for compliance is mid-2024.

While both the OSA and DSA aim to create a safer digital space, the two bills are not carbon copies of each other. They vary in scope, specificity, and obligations imposed on digital platforms.

"The Digital Services Act regulates the obligations of digital services that act as intermediaries in connecting consumers with goods, services, and content. This includes, but is not limited to, online marketplaces."
EU Digital Services Act

Key objectives and components of the Act

In particular, the European Digital Services Act must:

  • Provide better protection for online users' rights. This includes provisions allowing users to challenge decisions made by platforms about their content, data portability rights, and notice-and-action mechanisms for illegal content.

  • Harmonize regulations across the EU. The DSA introduces harmonized rules on content moderation, advertising transparency, algorithm transparency, online marketplaces, and online advertising.

  • Increase internet platform accountability and openness. By making social media, e-commerce, and internet intermediaries accountable for the services and material they offer, the DSA imposes tougher regulations. This includes taking appropriate measures to stop harmful activities, unlawful content, and false information from spreading online.

  • Promote collaboration among EU member states to combat disinformation, illegal content, and other cyber threats. To further strengthen this effort, stricter enforcement measures, such as fines and penalties for non-compliance, are being implemented.

  • Strengthen market surveillance. The EU DSA proposes the creation of a new European digital services coordinator and introduces new oversight measures for platforms with substantial market power.

How the Digital Services Act Works

Accountability for unlawful content: Online platforms must control the distribution of illegal content. This includes content that incites violence, hatred, or discrimination, infringes intellectual property rights, or violates privacy or consumer protection regulations. The law of the affected Member State determines illegality.

Increased transparency: Online platforms will be required to provide clear and transparent information about the advertisements they display on their platforms. This includes information about who paid for the ad, the targeting criteria, and performance metrics. There are also broader information requirements for service providers at all levels.
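To make the transparency requirement above concrete, the disclosure can be pictured as a simple structured record. This is a hypothetical sketch: the class and field names are illustrative and are not taken from the regulation itself.

```python
from dataclasses import dataclass

# Hypothetical sketch of an ad-transparency record of the kind the DSA
# requires platforms to disclose. Field names are illustrative placeholders,
# not terms defined in the regulation.

@dataclass
class AdDisclosure:
    advertiser: str             # who paid for the ad
    payer: str                  # entity on whose behalf the ad is shown
    targeting_criteria: list    # e.g. ["age 25-40", "interest: cycling"]
    impressions: int            # a basic performance metric

ad = AdDisclosure(
    advertiser="Example Bikes Ltd",       # made-up advertiser
    payer="Example Bikes Ltd",
    targeting_criteria=["age 25-40", "interest: cycling"],
    impressions=120_000,
)
print(ad.advertiser, ad.targeting_criteria)
```

A platform could expose such records through a public ad repository, which is the kind of interface the transparency obligations anticipate.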

New rules for large online platforms: Large online platforms (whose users comprise more than 10% of the EU population) will be subject to additional regulations, including transparency obligations, data sharing requirements, and audit requirements.

New powers of national authorities: National authorities will have new powers to enforce the rules set out in the DSA, including the power to impose fines and sanctions on non-compliant platforms.

Impact on tech companies and users

Now that the law has come into force, users in the EU will be able to see that content on the 19 listed digital platforms is moderated and understand how this happens.

"For the first time, users will be given complete information about why the content was moderated, removed, or banned, ensuring transparency," an EU official told reporters.

The official added that, by February next year, consumers and consumer rights groups will also be able to use various mechanisms to appeal moderation decisions.

But Renda explained that most changes would be invisible to users: "Those changes that are visible and rely too heavily on end-user notification are likely to be either a bit of a hassle or irrelevant. On some platforms, notification banners will be posted until the law is clarified."

Challenges and criticisms

Lawmakers worldwide are eagerly awaiting the adoption of their own platform regulations. We advise them to wait a few years before enacting rules similar to the DSA. There is much other regulatory work to be done. The US, for example, is in dire need of an actual national privacy law. We could also use important legal reforms to support "competitive compatibility" or "adversarial interoperability," permitting new technologies to interact with, build on, and attract users away from today's incumbents. There is also room for productive legal discussion and reform concerning more ambitious "middleware" or "protocols, not platforms" approaches to content moderation. Any "DSA 2.0" in other nations will be better served if it builds on the demonstrated successes and inevitable failures of individual DSA provisions once the law is up and running.

Comparison with global digital regulations

Since the bill was first presented, people across the political spectrum have frequently argued that the existing wording would undermine the usefulness of encryption in private communications, reduce internet safety for UK residents and businesses, and threaten freedom of speech. That's because the government added a new clause over the summer that requires tech companies to allow end-to-end encrypted messages to be scanned for child sexual abuse material (CSAM) so it can be reported to the authorities. However, the only way to guarantee that a message does not contain illegal material is to employ client-side scanning and review the contents of messages before they are encrypted.

DSA and similar legislation in other regions

Several lessons can be learned from the DSA that are worth considering in other countries.

To the credit of the DSA's drafters, many of its content moderation and transparency provisions reflect long-standing concerns of international civil society. The DSA also avoided rigid "turnaround time" requirements like those adopted in Germany, required under the EU Terrorist Content Regulation, and proposed in other countries, including Nigeria, which mandate removal within 24 hours of notice.

Lawmakers in other countries should consider the DSA's approach but also be aware of the potential harm from unnecessary global fragmentation in the details of such laws. Platforms of any size, especially smaller ones, will struggle with similar but not identical requirements across countries, wasting operational resources, harming competition, and risking further Internet balkanization. One solution to this problem could be the modular framework proposed by former FCC commissioner Susan Ness and Chris Riley. Following this approach, legislators could adopt standardized legal language or requirements to ensure international uniformity while adapting their regulations where there is room for national variation.

Future of the Digital Services Act

Online platforms operating in the EU will be required to publish the number of their active users by February 17, 2023. This information will be published in a public section of their online interface and must be updated at least once every six months.

If a platform or search engine has over 45 million users (10% of the European population), the European Commission will designate the service as a "very large online platform" or "very large online search engine." These services are given four months from their designation to comply with DSA obligations, including conducting and submitting their first annual risk assessment to the European Commission. Among other things, when such platforms recommend content, users must be able to change the criteria used and to opt out of personalized recommendations, and the platforms must publish their terms and conditions in the official languages of all Member States where they offer their services.
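As a back-of-the-envelope illustration of the designation threshold described above, the arithmetic can be sketched as follows. The population figure is approximate, and the service names and user counts are made up for illustration.

```python
# Sketch of the "very large online platform" designation threshold.
# The 45 million figure corresponds to roughly 10% of the EU's
# ~450 million residents; service names/counts below are invented.

EU_POPULATION = 450_000_000           # approximate EU population
VLOP_THRESHOLD = EU_POPULATION // 10  # 10% => 45,000,000 users

def is_very_large_platform(monthly_active_eu_users: int) -> bool:
    """Return True if a service would exceed the designation threshold."""
    return monthly_active_eu_users > VLOP_THRESHOLD

# Illustrative, made-up user counts:
services = {
    "example-video-site": 96_000_000,
    "example-forum": 12_500_000,
}
for name, users in services.items():
    status = "VLOP" if is_very_large_platform(users) else "below threshold"
    print(f"{name}: {status}")
```

The published active-user numbers mentioned earlier are what make this designation check possible in practice.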

Long-term impact of the DSA

EU Member States will have to appoint Digital Services Coordinators (DSCs) by February 17, 2024. The DSC will be the national body responsible for ensuring national coordination and promoting the practical and consistent application and enforcement of the DSA. February 17, 2024, is also the date by which all regulated entities must comply with all DSA rules.

As we have seen with GDPR and other laws, companies that violate these rules will likely face significant fines and penalties. Over time, affected companies will adapt their processes to achieve compliance. Data protection, user privacy, and consent-based marketing can be expected to become increasingly essential for companies that want to grow and maintain good relationships with their customers.

The role of the DSA in shaping future digital policies

It may take time, but changes in digital markets must be accompanied by increased transparency and by the encouragement of competition and innovation. This will benefit consumers and small companies and force platforms to work harder to provide the services people want, rather than simply relying on size, revenue, lobbying power, and market dominance to stay on top. These changes will likely have meaningful global implications as the scope of privacy law expands.

Anyone can upload videos to a variety of video services, sometimes at a rate of thousands of uploads per second, far more than can be reviewed manually. Platforms, however, are responsible for the material they host. WebKyte's ContentCore for video platforms helps identify copyrighted and criminal content among user-generated uploads.
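ContentCore's internals are not public, so as a rough illustration of the general technique, automated content identification typically compares a fingerprint of each upload against a catalogue of known reference material. The sketch below uses exact SHA-256 hashes purely to stay self-contained; real systems use perceptual video and audio fingerprints that survive re-encoding.

```python
import hashlib

# Minimal sketch of reference-catalogue matching, the general technique
# behind automated content identification. Exact hashing is used here only
# to keep the example runnable; production systems rely on perceptual
# fingerprints robust to compression and re-encoding.

def fingerprint(data: bytes) -> str:
    """Compute a stand-in fingerprint for a piece of media."""
    return hashlib.sha256(data).hexdigest()

# Reference catalogue: fingerprints of known copyrighted/illegal material.
reference_catalogue = {fingerprint(b"known-copyrighted-clip")}

def check_upload(data: bytes) -> str:
    """Flag an upload for review if it matches the reference catalogue."""
    return "flagged" if fingerprint(data) in reference_catalogue else "cleared"

print(check_upload(b"known-copyrighted-clip"))  # flagged
print(check_upload(b"original-user-video"))     # cleared
```

Flagged uploads would then go to human review or an automated policy action, which is the workflow the DSA's accountability rules expect platforms to maintain.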


The DSA is an essential regulator of the EU's digital market. It guarantees that online platforms are held accountable for the content they display, regardless of their location. Given the EU's growing influence, the need for a compliance strategy, and the DSA's potential to greatly shape the digital economy not just within the EU but globally, US companies operating in the EU must be prepared for the implementation of these new, comprehensive legal requirements.