Social algorithms must change to protect children – Ofcom

It is Ofcom’s job to enforce new, stricter rules under the Online Safety Act, and its draft codes of practice set out what tech firms must do to comply with that law.

Ofcom says they contain more than 40 “practical measures.”

The centrepiece is the requirement around algorithms, which are used to decide what is shown in people’s social media feeds.

Ofcom says tech firms will need to configure their algorithms to filter out the most harmful content from children’s feeds, and reduce the visibility and prominence of other harmful content.

Other proposed measures include requiring companies to perform more rigorous age checks if they show harmful content, and to implement stronger content moderation, including a “safe search” function on search engines that restricts inappropriate material.

Speaking to BBC Radio 4’s Today programme, Ofcom’s chief executive, Dame Melanie Dawes, described the new rules as “a big moment”.

“Young people are fed harmful content on their feed again and again and this has become normalised but it needs to change,” she said.

According to Ofcom’s timeline, these new measures will come into force in the second half of 2025.

The regulator is seeking responses to its consultation on the draft codes until 17 July, after which it says it expects to publish final versions of them within a year.

Firms will then have three months to carry out risk assessments, taking Ofcom’s guidance into account, setting out how children could encounter harmful content on their platforms and how they will mitigate that risk.

Dame Melanie added: “We will be publishing league tables so that the public know which companies are implementing the changes and which ones are not.”