UK Online Safety Act: Tech firms face new rules to combat online harms

The Online Safety Act has come into force in the UK, placing strict legal obligations on technology companies to tackle illegal content and protect vulnerable users. With illegal harms regulation now legally enforceable, tech firms must take swift action to address harmful activities such as terrorism, child sexual abuse, fraud, and hate speech on their platforms. Ofcom, the UK’s communications regulator, has published its first set of codes of practice and safety guidance, marking the start of a new era for digital accountability.

What Does the Online Safety Act Mean for Tech Firms?

Tech platforms, including social media networks, search engines, messaging apps, and file-sharing services, are now required to assess and mitigate illegal content risks. Ofcom has set a clear timeline for compliance, with all platforms having until 16 March 2025 to complete risk assessments that identify the presence and spread of illegal harms.

The UK’s Online Safety Act holds tech firms accountable for tackling illegal harms online.

From 17 March 2025, firms must begin implementing safety measures, deploying more than 40 recommended actions outlined in Ofcom’s codes. These include improving content moderation systems, increasing transparency in handling user reports, and introducing safety-by-design features to make platforms inherently safer.

Senior-level accountability will also play a pivotal role. Platforms must appoint a high-level executive responsible for overseeing compliance with the illegal harms regulation, ensuring that user safety is prioritised at the highest level of governance.

How Will the New Rules Protect Children?

Child safety is central to the Online Safety Act, with specific measures designed to combat child sexual abuse and exploitation. Ofcom’s research highlights that children and teenagers increasingly experience inappropriate interactions online, such as receiving sexualised messages from strangers. These findings emphasise the need for urgent regulatory intervention.

Under the new regulations, platforms must adopt safeguards that prevent online grooming and limit interactions between children and unknown users. By default, children’s profiles, locations, and friend connections must remain private. Additionally, platforms are required to take the following steps (a short illustrative sketch follows the list):

Restrict non-connected accounts from sending direct messages to minors.

Remove children from algorithmic friend suggestions to reduce their exposure to potential predators.

Provide clear and accessible information to help children make informed decisions about sharing personal data online.
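To make these defaults concrete, here is a minimal sketch of how a platform might represent them as an account-settings object. It is illustrative only: the class and field names are hypothetical assumptions, not drawn from any real platform's API or from Ofcom's codes, which specify outcomes rather than a data model.

```python
from dataclasses import dataclass

# Hypothetical default settings for an account belonging to a minor.
# Field names are invented for this sketch.
@dataclass
class MinorAccountDefaults:
    profile_public: bool = False            # profile private by default
    location_visible: bool = False          # location hidden by default
    friend_list_visible: bool = False       # connections hidden by default
    dms_from_non_connections: bool = False  # strangers cannot message the child
    in_friend_suggestions: bool = False     # excluded from algorithmic suggestions
```

The design point is that safety is the default state: a child's account starts locked down, and any loosening of these settings would require a deliberate, informed choice.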

Tech firms must also deploy automated hash-matching tools to detect and remove child sexual abuse material (CSAM). Platforms at a higher risk of hosting such content, including file-sharing services, face additional scrutiny and are expected to adopt robust detection technologies.
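Hash matching works by fingerprinting each uploaded file and comparing that fingerprint against a database of hashes of known illegal material, such as the lists maintained by bodies like the Internet Watch Foundation. The Python sketch below shows the basic matching flow under a simplifying assumption: it uses exact SHA-256 digests and a placeholder hash list, whereas production systems typically rely on perceptual hashes (PhotoDNA, for example) that survive resizing and re-encoding.

```python
import hashlib
from pathlib import Path

# Placeholder set of SHA-256 digests of known illegal images. In practice
# such hash lists are supplied by trusted bodies; the value below is a
# dummy for illustration only.
KNOWN_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_content(path: Path) -> bool:
    """Flag an upload whose digest appears in the known-content list."""
    return sha256_of_file(path) in KNOWN_HASHES
```

An exact cryptographic hash only catches byte-identical copies, which is why deployed scanners use perceptual hashing that maps visually similar images to nearby fingerprints. The sketch above captures the matching logic, not the fingerprinting technique.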

Ensuring Safety for Women and Girls Online

The Online Safety Act introduces targeted measures to address online harms disproportionately impacting women and girls. Ofcom’s codes require platforms to take down non-consensual intimate images—commonly known as “revenge porn”—and implement user tools to block or mute accounts engaged in harassment and stalking.

Following feedback from consultations, Ofcom has strengthened its guidance to help platforms identify and remove posts shared by organised criminals coercing women into prostitution. Additional protections include tackling cyberflashing and enhancing systems to detect and remove illegal intimate imagery.

These measures aim to provide a safer and more empowering digital experience for women and girls, who remain among the most vulnerable online.

What Are Ofcom’s Enforcement Powers?

Ofcom has emphasised that it will not hesitate to use its full enforcement powers against platforms failing to comply with the new rules. Tech firms face fines of up to £18 million or 10% of global annual revenue, whichever is higher; for a firm with £1 billion in annual revenue, that cap would be £100 million. For severe breaches, Ofcom can apply for court orders to block platforms in the UK entirely.

Dame Melanie Dawes, Ofcom’s Chief Executive, stated that the implementation of these safety measures marks the end of unchecked digital harm. She reiterated that accountability for tech firms is now non-negotiable, with platforms facing rigorous oversight to ensure compliance.

Ofcom has already begun working closely with tech companies, including smaller platforms, to help them prepare for the upcoming deadlines. However, the regulator has made it clear that support will not shield firms from enforcement action if they fall short of their obligations.

What’s Next? Ofcom’s Roadmap for 2025

The introduction of the Online Safety Act marks only the beginning of Ofcom’s comprehensive regulatory framework. The regulator plans additional consultations in Spring 2025 to explore further measures, including:

Blocking the accounts of repeat offenders who share CSAM.

Using AI to detect child sexual abuse and terrorist material.

Implementing hash-matching tools to prevent the spread of non-consensual intimate images.

Establishing crisis response protocols for handling emergency events, such as riots or terror incidents.

More protections for children will also come into effect in April 2025, focusing on harmful content that promotes self-harm and eating disorders, as well as cyberbullying. These measures underscore the UK’s commitment to creating a safer digital environment for all users.

A Global Model for Online Safety Regulation

The Online Safety Act sets a new global standard for tech firm accountability and digital regulation. As governments worldwide monitor the UK’s approach, its emphasis on illegal harms regulation and child protection online could inspire similar measures in other jurisdictions.

Ofcom’s ability to impose significant fines and block non-compliant platforms sends a clear message: prioritising safety is no longer optional for technology companies. Platforms must now take proactive steps to implement the required safety measures and demonstrate a clear commitment to protecting users from harm.

2025 promises to be a transformative year, with Ofcom’s oversight ensuring that the digital landscape becomes safer, more accountable, and less hospitable to illegal activity.

