Dating app users to be protected against nude images in cyberflashing crackdown

Dating app users are to be better protected against the “vile crime” of cyberflashing as a new law “turning up the heat on tech firms” comes into force today.


One in three teenage girls has received unsolicited sexual images, the Government said, as it described the latest change as part of its wider commitment to tackle online abuse and halve violence against women and girls.

Social media and dating app platforms are, from Thursday (January 8), required by law to detect unsolicited nude images and prevent them reaching users.

The change, which makes what is known as cyberflashing a priority offence under the Online Safety Act, means that rather than reacting after such images have been received, platforms must take proactive steps to prevent this content appearing in the first place.

The Department for Science, Innovation and Technology warned that those platforms which fail to comply with the legislation could face fines of up to 10 per cent of their qualifying worldwide revenue or have their services blocked in the UK.

Communications watchdog Ofcom will consult on new codes of practice setting out what steps platforms must take to protect users. Methods are expected to include automated systems that pre-emptively detect and hide such images, stricter content policies and moderation tools.

An Ofcom spokesperson said: “We’ll consult on updates to our codes of practice soon to reflect this change to the law, and we’ll hold platforms to account for protecting people from this despicable crime.”

Technology Secretary Liz Kendall said: “We’ve cracked down on perpetrators of this vile crime – now we’re turning up the heat on tech firms. Platforms are now required by law to detect and prevent this material.

“The internet must be a space where women and girls feel safe, respected and able to thrive.”

'We will deploy the full power of the state to make this country safe for women and girls'

Safeguarding minister Jess Phillips said cyberflashing had, for too long, been “just another degrading abuse women and girls are expected to endure”.

She added: “By placing the responsibility on tech companies to block this vile content before users see it, we are preventing women and girls from being harmed in the first place.”

The Bumble dating app, the first to explicitly moderate cyberflashing, has welcomed the change in the law. Its AI-powered feature automatically detects and blurs nudity in images sent within chats, and lets the user choose to view, block or report them.

Elymae Cedeno, the app’s vice president of trust and safety, said: “Receiving unsolicited sexual images is a daily violation that disproportionately impacts women and undermines their sense of safety online.

“Strengthening the law to make cyberflashing a priority offence is an important step towards ensuring platforms proactively address this behaviour to better protect members.”

The change comes in the same week Ms Kendall called on Elon Musk’s X to urgently deal with its artificial intelligence chatbot Grok being used to create sexualised deepfake images of people, including children.

She backed the regulator Ofcom, which is looking into X and xAI, the firm founded by Mr Musk that created Grok, to take “any enforcement action” deemed necessary.

Users of social media platform X appear to have prompted Grok to generate images of children “in minimal clothing”.

Tech tycoon Mr Musk has previously insisted that “anyone using Grok to make illegal content will suffer the same consequences as if they uploaded illegal content”.

X has said it takes action against illegal content, including child sexual abuse material, “by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary”.