US considers legalizing AI jailbreaking to uncover biases, training data
The US government is considering an exemption to the Digital Millennium Copyright Act (DMCA) that would allow researchers to bypass the terms of service of AI tools without legal consequences. The proposal aims to protect "good faith research" that uncovers biases, inaccuracies, and training data in AI systems. The exemption would also let researchers probe these systems for harmful outputs and discrimination.
Department of Justice supports proposed exemption
The Department of Justice (DoJ) has expressed support for the proposed exemption, stating that "good faith research can help reveal unintended or undisclosed collection or exposure of sensitive personal data." It also stressed that such research helps identify systems whose operations are unsafe, inaccurate, or ineffective, and that this work is especially important when AI platforms are used for critical purposes, where unintended, inaccurate, or unpredictable output could result in serious harm to individuals.
Researchers' role in understanding AI tools highlighted
Researchers, academics, and hackers have played a crucial role in understanding how closed-source AI tools like ChatGPT and Midjourney operate. They often coax these systems into revealing their biases and weaknesses, as well as information about their training data, which frequently includes copyrighted material quietly scraped from the internet. However, this type of research often breaches the terms of service users agree to when signing up for a system.
MIT researcher advocates for proposal
Shayne Longpre, an MIT researcher advocating for the exemption, expressed concerns about the trustworthiness of these AI models and their potential use for discrimination. He noted that many researchers have had their accounts suspended for violating terms of service while conducting good-faith research. Longpre added that these terms have a chilling effect on research, as companies are not transparent about how they enforce them.
Opposition to proposed DMCA exemption emerges
The US government's proposal has also met with opposition. Morgan Reed of the App Association, a lobbying group representing AI companies, argued that researchers should obtain prior consent from AI companies before conducting such research. The DVD Copy Control Association, which represents major movie studios and was a pioneer of DRM, has likewise voiced opposition to the proposed exemption.
Hacking Policy Council supports proposed exemption
Harley Geiger of the Hacking Policy Council, which supports the exemption, stated in a filing with the Copyright Office that an exemption is "crucial to identifying and fixing algorithmic flaws to prevent harm or disruption." He added that a "lack of clear legal protection under DMCA Section 1201 adversely affects such research." The proposed exemption wouldn't prevent companies from trying to stop this type of research, but it would legally protect researchers who violate company terms of service to carry it out.