Sir Keir Starmer has said tech platforms will be forced to remove intimate images shared without consent within 48 hours under proposed new legislation, describing the measure as part of an “ongoing battle” with technology companies on behalf of victims.
The Prime Minister announced the government will introduce the requirement through an amendment to the Crime and Policing Bill currently progressing through the House of Lords.
Under the plans, companies that fail to comply could face fines of up to 10 percent of their global turnover or have their services blocked in the UK.
The proposal would require platforms to remove intimate images once flagged and prevent them from being re-uploaded. Victims would only need to report the content once, rather than approach multiple sites individually.
The government said intimate image abuse should be treated with the same severity as child sexual abuse material and terrorist content.
Speaking to the BBC, Sir Keir said the rule would mean a victim “doesn’t have to do a sort of whack-a-mole chasing wherever this image is next going up”.
He added that companies are “already under that duty when it comes to terrorist material so it can be done”, calling it “a known mechanism” and saying “we need to pursue this with the same vigour”.
Sir Keir said enforcement would involve fines and other measures “yet to be determined”, overseen by “a combination of oversight bodies in relation to what’s online and then it will be a criminal matter”.
He said he did not think this would include prison sentences for tech bosses.
Technology Secretary Liz Kendall said: “The days of tech firms having a free pass are over… no woman should have to chase platform after platform, waiting days for an image to come down”.
Janaya Walker, interim director of the End Violence Against Women Coalition, said the proposal “rightly places the responsibility on tech companies to act”.
The move follows legislation introduced earlier in February making non-consensual deepfake images illegal in the UK.
It also comes after a dispute between the government and X in January, when X’s AI tool Grok was used to generate images of real women wearing very little clothing.
The function was subsequently removed.
A government report published in July 2025 found young men and boys were the main targets of financially motivated sexual extortion, sometimes referred to as “sextortion”, in which victims are coerced into paying to prevent intimate images being shared.
A parliamentary report in May 2025 recorded a 20.9 percent rise in reports of intimate image abuse during 2024.
The government said internet service providers would also receive guidance on blocking access to sites hosting illegal content, targeting websites that fall outside the scope of the Online Safety Act.