US Congress calls on Big Tech to address deepfake concerns
What's the story
The US Congress has stepped up its scrutiny of major tech firms, including Apple, Meta, Google, and Microsoft, over non-consensual intimate deepfake images.
Letters have been sent to top executives, including Apple CEO Tim Cook, raising concerns about apps that enable the creation of such harmful images.
The letters follow reports of apps that allow users to swap faces into explicit content.
Plan of action
Plans to curb explicit content questioned
The US Congress is now calling on these tech giants to detail their plans to prevent the generation and spread of such content.
Apple's App Store policies have come under particular fire in the letter addressed to Cook.
Despite Apple's App Review Guidelines, apps with potential for misuse have slipped through the cracks, raising questions about the company's ability to enforce its own rules.
Legislative reference
Congress cites 'TAKE IT DOWN' Act in letter to Apple
Congress's letter to Cook also cites the TAKE IT DOWN Act, legislation aimed at tackling non-consensual intimate images.
It asks a number of questions regarding Apple's approach to deepfake pornography, including its strategies and timeline for tackling the issue.
The letter also seeks information on who is involved in developing these strategies and how user reports are handled.
App removal
Information on app removal criteria
The letter further asks Apple about its criteria for removing problematic apps from the App Store and about remedies for victims whose images have been misused.
This underscores growing calls for accountability from Apple, given its control over the App Store.
Earlier reports have shown apps creating non-consensual deepfake images using videos from adult content sites like Pornhub.
Insufficient measures
Apple's measures to curb deepfake misuse deemed insufficient
Although Apple removed the apps in question, the incidents have highlighted gaps in its review process.
The company has taken measures like blocking "Sign in with Apple" on deepfake websites and preventing AI tools from generating explicit content.
However, critics see these steps as minimal and are demanding stricter oversight of dual-use apps.
The letter sent to Apple, along with those to other tech companies, stresses the need for stronger safeguards around AI-based image and video manipulation apps.