A guide to the UK Online Safety Act: what it is and how video platforms can comply

The Online Safety Act is a new set of laws that protects children and adults online. It will make social media services and video-sharing platforms more accountable for the safety of their users on their platforms.

Background of the UK Online Safety Act

A sprawling and much-amended piece of legislation, the bill was dropped from the legislative agenda following Boris Johnson’s removal as prime minister in July. It has now reached its final report stage, meaning the House of Commons has one last opportunity to debate its contents and vote on whether to approve it.

Even then, the bill must pass unscathed through the House of Lords before receiving royal assent and becoming law. Although the final timetable has not yet been published, parliamentary rules mean that if the bill is not passed by April 2023 it will fall entirely, and the process would have to begin again in a new parliamentary session.

What is the UK Online Safety Act (Bill)?

The UK Online Safety Bill is designed to ensure that various types of online services are free from harmful content while also safeguarding freedom of expression. The bill seeks to protect internet users from potentially harmful material and to prevent children from accessing dangerous content. It does this by imposing conditions on how social media and other online platforms assess and remove unlawful material and content they deem dangerous. According to the government, the legislation reflects its commitment to making the UK the safest place in the world to be online.

Detailed explanation of the Act

Internet search engines and online platforms that let people generate and share content are covered by the legislation. This includes discussion forums, certain online games, and websites that distribute or showcase content.

Parts of the legislation mirror rules in the EU’s newly passed Digital Services Act (DSA), which prohibits targeting users online based on their religion, gender, or sexual orientation and requires large online platforms to disclose what steps they take to combat disinformation and propaganda.

Ofcom, the UK communications regulator, will be appointed as the regulator of the online safety regime and will be given a range of powers to gather information in support of its oversight and enforcement activity.

Differences from previous online safety laws

The EU Digital Services Act and the UK Online Safety Act share the same goal of regulating the digital world, but each has different characteristics.

The DSA takes a comprehensive approach, addressing a wide range of online user concerns, while the OSA focuses more narrowly on combating illegal content that causes serious harm. In addition, the OSA emphasizes proactive monitoring, in contrast to the DSA’s reactive notice-and-takedown procedures.


How the act protects online users

The bill would make social media companies legally accountable for keeping children and young people safe online.

It aims to protect children by requiring social media platforms to:

  • Quickly remove illegal content or prevent it from appearing at all, including content that promotes self-harm.
  • Prevent children from accessing harmful and age-inappropriate content.
  • Enforce age limits and age verification measures (a minimal sketch of what such checks might look like appears after this list).
  • Publish risk assessments, giving greater transparency about the threats and hazards children face on major social media platforms.
  • Provide clear and accessible ways for parents and children to report problems online when they occur.

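As an illustration of what the age-restriction and reporting duties might look like in practice, here is a minimal sketch in Python. The function names, the minimum-age threshold, and the Report structure are hypothetical and not drawn from the Act or from any real platform’s API; real deployments would rely on dedicated age-assurance providers and documented moderation workflows.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

MINIMUM_AGE = 18  # hypothetical threshold for age-restricted sections of a platform


def is_age_verified(claimed_birth_year: int, verified: bool) -> bool:
    """Very simplified age gate: only allow access to age-restricted content
    when the user's age claim has been checked by an age-assurance provider."""
    age = datetime.now(timezone.utc).year - claimed_birth_year
    return verified and age >= MINIMUM_AGE


@dataclass
class Report:
    """A minimal record of a user or parent reporting harmful content."""
    reporter_id: str
    content_id: str
    reason: str                    # e.g. "self-harm", "age-inappropriate"
    created_at: datetime


def submit_report(reporter_id: str, content_id: str, reason: str) -> Report:
    """Accept a report and timestamp it so the platform can show it acted promptly."""
    report = Report(reporter_id, content_id, reason, datetime.now(timezone.utc))
    # In a real system the report would be queued for human or automated review here.
    return report
```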

The UK Online Safety Act would protect adults in three ways through the “triple shield.”

All services in question will need to take steps to prevent their services from being used for illegal activities and to remove illegal content when it does appear.

Category 1 services (the largest services with the highest level of risk) must remove content that is prohibited by their own terms of service.

Category 1 services must also provide adult users with tools that give them greater control over the content they see and the people they interact with.

The bill now includes adult user empowerment duties, with a list of categories of content that will be designated as harmful and for which users must be given tools to control their exposure. This list includes content that encourages, promotes, or provides instruction in suicide, self-harm, or eating disorders, and content that is abusive or incites hatred against people with protected characteristics. Given recent events such as the removal, subsequently rescinded, of suicide prevention prompts on Twitter (now X) in December 2022, the LGA welcomes the specific inclusion of suicide and self-harm in the bill.
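To make the user-empowerment duty more concrete, here is a minimal sketch, in Python, of how a platform might let adult users opt out of the categories of content the bill lists. The category labels and the filtering logic are illustrative simplifications; the Act does not prescribe any particular implementation.

```python
# Categories of content named in the bill's user-empowerment provisions
# (labels here are illustrative, not statutory wording).
CONTROLLABLE_CATEGORIES = {"suicide", "self_harm", "eating_disorders", "hate"}


def visible_to_user(content_labels: set[str], user_opt_outs: set[str]) -> bool:
    """Hide a post when it carries a label the adult user has chosen to filter out."""
    blocked = content_labels & user_opt_outs & CONTROLLABLE_CATEGORIES
    return not blocked


# Example: a user who has opted out of self-harm content
print(visible_to_user({"self_harm"}, {"self_harm"}))   # False - filtered
print(visible_to_user({"sports"}, {"self_harm"}))      # True - shown
```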


Responsibilities of digital platforms

The UK Online Safety Bill is a thorough piece of legislation: more than 200 sections outline the duties of digital platforms regarding the content published on their services. Under the law, these platforms have a “duty of care” intended to make the internet a safer place for users, especially younger ones.


By establishing age restrictions and age verification processes, this law would shield children from age-inappropriate content. It would also hold internet service providers more accountable by requiring the prompt removal of illegal content.


The UK initially sought to be a pioneer in addressing digital safety issues, particularly children’s exposure to inappropriate content online. However, after repeated delays, the European Union took the lead by implementing the Digital Services Act in August.


First proposed more than four years ago, the bill shifts the focus from cracking down on “legal but harmful” content to prioritizing the protection of children and the eradication of illegal content online. Technology Minister Michelle Donelan touted the Online Safety Bill as “game-changing” legislation in line with the government’s ambition to make the UK the safest place online.

Penalties for non-compliance

Three years and four ministers since the UK first published the Online Harms white paper, the origin of the current Online Safety Bill, the Conservative Party’s ambitious attempt to regulate the internet has returned to Parliament after numerous revisions.

If the bill becomes law, it will apply to any service or site that has users in the UK or targets the UK as a market, even if it is not based in the country. Failure to comply with the proposed rules would expose companies to penalties of up to 10% of global annual turnover or £18 million ($22 million), whichever is greater.
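The “whichever is greater” rule is easy to show with a small worked example. The turnover figures below are invented for illustration; only the £18 million floor and the 10% rate come from the article.

```python
def maximum_fine_gbp(global_annual_turnover_gbp: float) -> float:
    """Greater of 10% of global annual turnover or a fixed £18 million."""
    return max(0.10 * global_annual_turnover_gbp, 18_000_000)


print(maximum_fine_gbp(50_000_000))     # 18,000,000 - the fixed floor applies
print(maximum_fine_gbp(1_000_000_000))  # 100,000,000 - 10% of turnover applies
```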

Critiques and controversies

Since the bill was first presented, people from across the political spectrum have repeatedly argued that the current drafting would undermine the usefulness of encryption in private communications, weaken internet security for UK residents and businesses, and threaten freedom of expression. That is because the government added a new clause over the summer requiring tech companies to make end-to-end encrypted messages available for scanning for child sexual abuse material (CSAM) so that it can be reported to the authorities. In practice, the only way to guarantee that a message contains no illegal material is to employ client-side scanning and inspect the contents of messages before they are encrypted.
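To illustrate why critics describe this as client-side scanning, here is a deliberately simplified Python sketch of the approach: the client checks an attachment against a list of known-bad hashes before the message is encrypted. Real proposals rely on perceptual hashing of imagery rather than exact SHA-256 matches, and the hash list and function names here are hypothetical.

```python
import hashlib

# Hypothetical list of hashes of known illegal material, distributed to the client.
KNOWN_BAD_HASHES: set[str] = set()


def scan_before_encrypting(attachment: bytes) -> bool:
    """Return True if the attachment may be sent.

    The check necessarily happens on the plaintext, before encryption,
    which is exactly what critics of the clause object to.
    """
    digest = hashlib.sha256(attachment).hexdigest()
    if digest in KNOWN_BAD_HASHES:
        # A real system would block the send and report the match.
        return False
    return True
```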

In an open letter signed by 70 organizations, cybersecurity experts, and elected officials after Prime Minister Rishi Sunak announced he was bringing the bill back to Parliament, the signatories argued that “encryption is critical to keeping internet users protected online, to build financial security through a business-friendly UK economy that can weather the cost of living crisis and ensure national security.”

“UK businesses will have less protection for their data than their peers in the United States or the European Union, making them more vulnerable to cyber-attacks and intellectual property theft,” the letter notes.

Balancing online safety with freedom of expression

Matthew Hodgson, co-founder of Element, a decentralized UK-based messaging app, said that while there is no doubt that platforms need to provide tools to protect users from unwanted content, whether it is offensive or simply something they do not want to see, the idea of effectively requiring backdoor access to private content, such as encrypted messages, in case it turns out to be harmful, is controversial.

“The second you put in any kind of backdoor that can be used to break the encryption, it will be exploited by attackers,” he said. “And by opening it up as a means for corrupt actors or villains of any stripe to be able to subvert encryption, you might as well have no encryption at all, and the whole thing would collapse.”

“The two statements are completely contradictory, and unfortunately, those in power do not always understand the contradiction,” he said, adding that the UK could end up in a situation similar to Australia, where the government passed legislation allowing law enforcement agencies to require businesses to hand over user information and data, even when they are protected by encryption.

Hodgson argues that rather than promoting privacy-destroying infrastructure, the UK government should prevent it from becoming a reality that more authoritarian regimes could adopt, citing the UK as their moral example.

Response from tech companies and civil liberties groups

There are also concerns about how some of the UK Online Safety Bill’s provisions will be enforced. Francesca Reason, a lawyer in the regulatory and corporate defense group at law firm Birketts LLP, said many tech companies are concerned about the more demanding requirements that could be imposed on them.

Reason said there were also issues of practicality and empathy that would need to be addressed. For example, is the government going to prosecute a vulnerable teenager for posting self-harm images online?

Comparative perspective

It is worth comparing the UK Online Safety Bill with its international equivalents, as legislators in several jurisdictions have sought to regulate content moderation on social media platforms. These proposed legislative measures provide a helpful set of criteria against which to evaluate the Online Safety Bill.

These comparators help identify the different degrees to which governments have chosen to intervene in monitoring and moderating the content on services. The US and EU models focus on design choices that improve the user experience by making processes and procedures transparent and accessible. The Indian and Brazilian models, by contrast, are much more explicitly focused on controlling what content is permitted on user-to-user services. The UK Government has stated its preference for the first approach, but this still needs to be developed in the Bill, as discussed in the following sections.

Implementation and enforcement

Platforms will be required to show that they have processes in place to meet the requirements set out in the bill. Ofcom will examine how effectively these processes protect internet users from harm.

Ofcom will have the power to take action against companies that fail to comply with their new responsibilities. Companies can be fined up to £18 million or 10 percent of their annual global turnover, whichever is greater. Criminal action can be taken against senior managers who fail to respond to Ofcom’s information requests. Ofcom will also be able to hold companies and senior managers (where they are at fault) criminally liable if a provider fails to comply with Ofcom’s enforcement notices in relation to specific child safety duties or to child sexual abuse and exploitation on its services.
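One practical consequence is that platforms need records they can hand over when an information request arrives. The Python sketch below, with hypothetical field names, shows the kind of minimal audit trail of moderation decisions a platform might keep; the Act does not mandate this exact structure.

```python
import csv
from dataclasses import dataclass, asdict
from datetime import datetime


@dataclass
class ModerationRecord:
    """One moderation decision, kept so it can be reported to the regulator."""
    content_id: str
    decision: str        # e.g. "removed", "age_restricted", "no_action"
    reason: str          # e.g. "illegal content", "terms of service breach"
    decided_at: datetime


def export_for_regulator(records: list[ModerationRecord], path: str) -> None:
    """Write the audit trail to CSV in response to an information request."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["content_id", "decision", "reason", "decided_at"]
        )
        writer.writeheader()
        for record in records:
            row = asdict(record)
            row["decided_at"] = record.decided_at.isoformat()
            writer.writerow(row)
```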

In the most severe cases, with the agreement of the courts, Ofcom will be able to require payment providers, advertisers, and internet service providers to stop working with a site, preventing it from generating money or being accessed from the UK.

Tips platforms can give users to stay safe online under the new regulations

  • Do not post personal information online, such as your address, email address, or mobile phone number.
  • Think carefully before posting photos or videos of yourself. Once a photo is online, other people can see and download it; it is no longer just yours.
  • Keep your privacy settings as high as possible.
  • Never give out your passwords.
  • Don’t accept friend requests from people you don’t know.

Conclusion

The new rules introduced by the Online Safety Act are significant, and businesses will have to spend a lot of extra time, money, and resources to ensure compliance, especially given the severe consequences of violating these laws.

Due to the stringent enforcement powers and consequences of violating these laws, it is critical that Internet service providers quickly take steps to understand their responsibilities under the Online Safety Act and modify their processes to comply with it.

There are many video platforms where anyone can upload videos, sometimes at a rate of thousands of uploads per second. It is impossible to track manually what is uploaded, yet platforms are responsible for the content they host. ContentCore by WebKyte helps video platforms identify copyrighted and criminal content among user-generated uploads.
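As a rough illustration of how automated matching of uploads works in general (this is not WebKyte’s actual API; the names and the exact-hash matching are simplifications of the fingerprinting techniques such tools use), a platform can compare each new upload against a reference database of known copyrighted or illegal files and flag matches for human review.

```python
import hashlib

# Hypothetical reference database: fingerprint -> label of the known work or illegal item.
REFERENCE_DB: dict[str, str] = {
    hashlib.sha256(b"known infringing video bytes").hexdigest(): "Example Movie (2023)",
}


def fingerprint(video_bytes: bytes) -> str:
    """Exact-match fingerprint; real tools use perceptual fingerprints that
    survive re-encoding, cropping, and other edits."""
    return hashlib.sha256(video_bytes).hexdigest()


def check_upload(video_bytes: bytes) -> str | None:
    """Return the matched label if the upload is known, otherwise None."""
    return REFERENCE_DB.get(fingerprint(video_bytes))


# Example: a matching upload is flagged for review instead of being published.
match = check_upload(b"known infringing video bytes")
if match is not None:
    print(f"Flagged for review: matches '{match}'")
```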

It is best to speak to IT and data protection professionals if you need advice on this topic and on how to prepare for the Online Safety Act coming into effect.