Websites operating in the UK will soon have to assess whether their services put users at risk of seeing illegal material, such as content relating to child abuse, suicide, fraud and terrorism, or face fines.
The warning comes from Ofcom in a new code of practice to help sites comply with the Online Safety Act, which became law in October 2023.
Under the rules – outlined at www.snipca.com/53249 – sites will need to state how illegal content might appear on their services and explain how they’ll stop it.
Ofcom has given companies until 16 March to comply. If they miss this deadline, they could be fined up to £18m or 10 per cent of their global turnover, whichever is greater.
Dame Melanie Dawes, head of Ofcom, told BBC News this was the tech industry’s “last chance” to make changes. She warned that if sites don’t act, “demands for things like bans for children on social media are going to get more and more vigorous”.