Creating sexually explicit deepfake images to be criminalized in UK
The United Kingdom will criminalize the creation of sexually explicit deepfake images as part of a wider strategy to combat violence against women. The Ministry of Justice has announced that anyone who creates such content without consent could face prosecution and an unlimited fine, even if they have no intention of distributing the images. Sharing such images could lead to imprisonment.
Why does this story matter?
As artificial intelligence technology rapidly advances, the proliferation of deepfake images and videos has become a growing concern. In response to what it has recognized as a national crisis of violence against women and girls, the UK government has directed law enforcement agencies to treat the issue as a priority. The proposed law aims to curb this escalating practice, which is frequently used to humiliate victims or inflict emotional distress on them.
Government's stance on deepfake content production
"This new offence sends a crystal clear message that making this material is immoral, often misogynistic, and a crime," Laura Farris, minister for victims and safeguarding, said in a statement. The government is also introducing new criminal offenses for people who take or record real intimate images without consent, or install equipment to enable someone to do so. A new statutory aggravating factor will be brought in for offenders who cause death through abusive, degrading or dangerous sexual behavior.
'Police, prosecutors should be equipped to enforce these laws'
According to the Guardian, Yvette Cooper, the shadow home secretary, backed the announcement. She said, "Superimposing somebody's image on to sexually explicit photos and videos is a gross violation of their autonomy and privacy, which can cause enormous harm, and it must not be tolerated." "It's essential that the police and prosecutors are equipped with the training and tools required to rigorously enforce these laws in order to stop perpetrators from acting with impunity," she added.
91% say deepfake technology threat to women's safety: Magazine survey
Deborah Joseph, the editor-in-chief of Glamour UK, welcomed the proposed amendment. "In a recent survey conducted by Glamour, 91% of our readers identified deepfake technology as a threat to women's safety. Through firsthand accounts from victims, we've witnessed the profound impact," she stated. "While this is an important first step, there is still a long way to go before women will truly feel safe from this horrendous activity," she told the Guardian.