
Balancing Online Safety with Freedom: Examining Ofcom’s New Duties


The UK communications regulator Ofcom has taken up its new duties under the Online Safety Act and published its first consultation. Determined to protect younger users online and tackle potentially harmful content, the regulator has made clear that tech companies and social media platforms that fail to govern their content appropriately face severe penalties.

Recognising the scale of its task, Ofcom is approaching its role methodically, planning a series of consultations to cover the law's various aspects. As a starting point, the regulator has invited industry experts and other major stakeholders to give feedback on its draft illegal harms codes and guidance, which are expected to come into force late next year.

Businesses will then have a three-month window to carry out their risk assessments while the codes await parliamentary approval. Once the codes become law, Ofcom will begin exercising its enforcement powers from the start of 2025.

Simultaneously, the second phase will commence, covering child safety duties and pornography: first, clarifying the obligations of adult sites to prevent minors from accessing inappropriate content, and second, proposing additional protections against the promotion of harmful behaviours such as suicide, self-harm, and cyberbullying, expected in spring 2024.

To underline its commitment to protecting children, Ofcom highlighted some alarming statistics about children's online experiences. The findings reveal that 60% of youngsters between 11 and 18 have had unsettling online interactions, 30% have received unwelcome friend or follow requests, and 16% have been exposed to inappropriate images or solicitations.

The proposed regulations require large internet platforms, particularly social media services such as Instagram and TikTok, to follow guidelines on how children can be contacted online. The primary objective is to limit strangers' access to minors while strengthening user privacy and anti-fraud measures.

Ofcom’s CEO Dame Melanie Dawes stated, “Regulation is here, and we’re wasting no time in setting out how we expect tech firms to protect people from illegal harm online, while upholding freedom of expression. Children have told us about the dangers they face, and we’re determined to create a safer life online for young people in particular.”

Ofcom's rapid progress puts the tech giants further under the spotlight, as fulfilling their ethical responsibilities towards users becomes more crucial than ever. As these changes are implemented, the regulator's effectiveness in ensuring accountability and upholding user safety will become clear over the coming year or so.
