Over 4,000 celebrities subjected to deepfake pornography, investigation reveals
A recent investigation has revealed that more than 4,000 well-known individuals, including 250 British celebrities, have been targeted by deepfake pornography. The analysis comes from Channel 4 News, whose own presenter, Cathy Newman, was found to be among the victims. Those affected include female actors, TV personalities, musicians, and YouTubers. The investigation focused on the five most-visited deepfake websites, which use AI tools to superimpose victims' faces onto explicit content.
Deepfake websites attracted 100 million views in a quarter
The investigation found that these five deepfake websites collectively attracted a staggering 100 million views in just three months. Newman, herself among the victims, described the experience as invasive and unsettling. "It's incredibly disturbing to think that someone out there has created this false representation of me, and I have no idea who they are," she said.
Online Safety Act aims to take on deepfake pornography
The UK's Online Safety Act, whose relevant provisions came into force on January 31, has made it illegal to share deepfake pornography without consent. The act does not, however, make creating such content a criminal offense. The legislation was driven in large part by the rapid growth of AI-generated deepfake pornography: in the first three quarters of 2023 alone, 143,733 new deepfake pornographic videos were uploaded to the 40 most-visited deepfake pornography websites, more than in all previous years combined.
How regulators and tech giants are addressing deepfake concerns
The UK's communications regulator, Ofcom, is currently deliberating how best to implement the Online Safety Act. An Ofcom representative commented, "Deepfake material that is illegal is profoundly troubling and harmful." Tech giants Google and Meta (the owner of Facebook and Instagram) have also pledged to tackle the issue. Ryan Daniels of Meta stated, "Meta has a strict policy against child nudity, sexualized content involving children, and AI-generated non-consensual explicit images."
Victims condemn deepfake pornography
Sophie Parrish, a victim from Merseyside, found manipulated explicit images of herself online before the new legislation came into force. Speaking to Channel 4 News, she said, "It's extremely violent and degrading. It feels as though women are being reduced to mere objects." These personal accounts underscore the deeply distressing effects of deepfake pornography on victims and their lives.