The United Kingdom is implementing new legislation requiring tech companies to remove non-consensual intimate images within 48 hours of their being reported. Companies that fail to comply face fines of up to 10% of their global revenue and could have their services blocked.

LONDON – The United Kingdom announced Thursday it will mandate that technology platforms remove intimate images posted without permission within two days of being flagged, or face financial penalties of up to 10% of their worldwide revenue.
Officials say these measures aim to strengthen protections for women and girls amid growing concerns about digital abuse, where private photos can be rapidly distributed online and artificial intelligence tools can generate explicit content instantly.
The British government revealed plans to amend legislation currently before Parliament, establishing a mandatory requirement for major social media platforms to remove reported non-consensual intimate content within 48 hours.
While sharing such material without permission is already prohibited under British law, victims have struggled to get platforms to permanently delete these images from their services.
“The online world is the frontline of the 21st century battle against violence against women and girls,” Prime Minister Keir Starmer said in a statement.
The rise in unauthorized intimate imagery has intensified Britain’s broader discussions about internet safety regulations. Government officials are evaluating potential restrictions on social media access for teenagers under 16, similar to Australia’s recent prohibition.
British authorities indicated that the communications regulator, Ofcom, is exploring whether to treat illegal intimate-image sharing as seriously as child exploitation and terrorism-related material.
Under the proposed system, victims would need to file only one complaint, with platforms required to delete identical content across all their services and block future uploads of the same material.
Penalties for non-compliance would be calculated against a platform’s total qualifying global income, a metric Ofcom uses that encompasses revenue generated worldwide from regulated services.
Ofcom announced separately it will expedite decisions on new regulations requiring platforms to implement specialized blocking technology called “hash-matching” to prevent illegal intimate images from being uploaded initially. The agency expects to finalize these rules by May, with implementation potentially beginning this summer.
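In outline, hash-matching reduces each reported image to a compact digital fingerprint and checks every new upload against a blocklist of those fingerprints before it goes live. The sketch below is a minimal illustration of the idea, not any platform’s actual implementation: it uses an exact cryptographic hash for simplicity, whereas production systems rely on perceptual hashes such as PhotoDNA or PDQ that still match resized or re-encoded copies, and every name in it is hypothetical.

```python
import hashlib

# Hypothetical blocklist of fingerprints of known illegal images.
# In practice this would be a shared, persistent database.
BLOCKED_HASHES: set[str] = set()

def image_fingerprint(data: bytes) -> str:
    """Return a hex digest identifying this exact byte sequence.

    Real deployments use perceptual hashes (e.g. PhotoDNA or PDQ)
    so that resized or re-encoded copies still match; SHA-256 here
    only catches byte-identical files.
    """
    return hashlib.sha256(data).hexdigest()

def register_takedown(data: bytes) -> None:
    """After a verified report, record the image's fingerprint so the
    same material can be removed across services and blocked on
    future upload attempts."""
    BLOCKED_HASHES.add(image_fingerprint(data))

def should_block_upload(data: bytes) -> bool:
    """Check a new upload against the blocklist before it goes live."""
    return image_fingerprint(data) in BLOCKED_HASHES
```

Under the proposed one-complaint rule, a single verified report would trigger something like register_takedown once, after which should_block_upload would stop identical re-uploads automatically, rather than requiring the victim to report each copy.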