Navigating The Fine Balance: UK’s Online Safety Act Analysis


The United Kingdom’s contested online safety bill has officially been approved and passed into law, expanding the powers of the telecoms regulator Ofcom. The Online Safety Act, as it is now called, places responsibility on social media platforms and similar services to monitor and control potentially harmful content, particularly in the realms of terrorism, pornography, and the promotion of self-harm.

Unsurprisingly, this initiative has engendered extensive debate over the past couple of years. Supporters of the bill laud its potential benefits, especially where children’s online safety is concerned. Critics, however, voice concerns ranging from the ethics of censorship and privacy to technical complications. Tech companies affected by the act argue that it undermines end-to-end encryption in messaging.

Yet despite the dissenting views, the act is now law, handing Ofcom a new assignment as online safety regulator. Ofcom is now empowered to fine companies that stray from established guidelines, with punishment potentially extending to prison sentences for company executives, though it appears doubtful that enforcement will often go that far. Ofcom does, however, have the authority to impose penalties of up to £18 million or 10% of a firm’s worldwide annual revenue, whichever is greater. Government officials have been vocal in reminding stakeholders that penalties for the largest platforms could run into the billions of pounds.

Whether such penalties will ever be imposed remains to be seen. For now, the government is passionately asserting that the UK is the safest place in the world to be online.

While the bill is now law, there is much more work ahead. Most of the act’s provisions will come into force within the next two months, establishing Ofcom as the online safety regulator as soon as possible. The rollout will nonetheless be phased, with a consultation process starting in November, followed by staged implementation of the new rules.

According to Ofcom’s own release, this process could extend over a few years. The regulator plans to publish draft codes and guidance on the act’s illegal harms duties in late 2024. Subsequent phases will cover child safety, pornography, the protection of women and girls, and transparency, and are expected to be completed in the first half of 2025.

Ofcom Chief Executive Dame Melanie Dawes said: “We know a safer life online cannot be achieved overnight; but Ofcom is ready to meet the scale and urgency of the challenge. We’ve already trained and hired expert teams with experience across the online sector, and today we’re setting out a clear timeline for holding tech firms to account. Ofcom is not a censor, and our new powers are not about taking content down. Our job is to tackle the root causes of harm. We will set new standards online, making sure sites and apps are safer by design. Importantly, we’ll also take full account of people’s rights to privacy and freedom of expression.”

The road ahead is a tightrope walk, but Ofcom appears to have hit the ground running in embracing its new responsibilities.
