
Boffins want to make jailbreaking AI legal

19 July 2024


There is some government backing

A group of researchers, academics, and hackers is advocating for the right to break AI companies' terms of service in order to conduct "good faith research" without fear of legal repercussions.

The US government is considering an exemption to copyright law that would allow breaking technical protection measures and DRM on AI systems to investigate biases, inaccuracies, and training data. This exemption would support "good faith" security and academic research, even if it involves circumventing protective systems.

The Department of Justice supports this, stating that "good faith research can help reveal unintended or undisclosed collection or exposure of sensitive personal data." Such research is especially important where AI platforms are used in high-stakes settings and inaccurate outputs can cause serious harm.

Much of what is known about closed-source AI tools like ChatGPT and Midjourney comes from researchers and users who trick these systems into revealing their training data, biases, and weaknesses. Doing so often violates terms of service, such as OpenAI's, which prohibit reverse engineering and bypassing protective measures.

MIT researcher Shayne Longpre said, "There is a lot of apprehensiveness about these models and their design, their biases, being used for discrimination, and, broadly, their trustworthiness." He added that many researchers face account suspensions or legal risks for conducting good-faith research.

The exemption would amend Section 1201 of the Digital Millennium Copyright Act, which currently restricts such activities. Existing exemptions under the same section permit hacking devices for repair purposes and shield security researchers.

Harley Geiger of the Hacking Policy Council stated that an exemption is "crucial to identifying and fixing algorithmic flaws to prevent harm or disruption," and noted that a "lack of clear legal protection under DMCA Section 1201 adversely affects such research."
