Copyright Office Calls for Federal Law to Combat Unauthorized Deepfakes
A new Copyright Office report urges Congress to create federal rights protecting all individuals from unauthorized AI replicas
The United States Copyright Office has issued a warning about the pressing need for nationwide protections against unauthorized deepfakes.
A recent report addressing the intersection of copyright law and AI underscored the urgency for Congress to implement federal legislation shielding individuals from unsettling and unauthorized digital replicas, such as deepfakes and voice clones.
The report emphasized that such measures are crucial to safeguarding personal privacy and preventing the misuse of AI-generated content.
“It has become clear that the distribution of unauthorized digital replicas poses a serious threat not only in the entertainment and political arenas but also for private citizens,” said Shira Perlmutter, register of copyrights and director of the U.S. Copyright Office.
Deepfakes are becoming an increasingly concerning issue ahead of the November election.
Only last week, Tesla CEO and X owner Elon Musk shared a video featuring a manipulated version of Kamala Harris’ voice.
Earlier this year, an AI-generated version of President Biden’s voice told voters to stay home during the New Hampshire primary.
While authorities, including the FBI and Department of Justice, have publicly expressed concern about deepfakes in the run-up to November, no federal regulation currently covers them.
A spate of legislation is in progress to address unauthorized deepfakes, but these laws are fragmented and target specific applications. For instance, the Deepfakes Accountability Act aims to safeguard national security from deepfakes, while Tennessee’s ELVIS Act protects the vocal rights of musicians.
“The impact is not limited to a select group of individuals, a particular industry, or a geographic location,” the Copyright Office said in its report, urging the need for comprehensive legislation.
The office contended that current legal remedies for those harmed by unauthorized digital replicas are insufficient and that existing federal laws are “too narrowly drawn to fully address the harm from today’s sophisticated digital replicas.”
Among the recommendations for federal legislation on deepfakes, the Copyright Office suggested protecting all individuals, not just celebrities, from unauthorized digital replicas.
The proposed law would establish a federal right that protects all individuals during their lifetimes from the knowing distribution of unauthorized digital replicas.
The right would be licensable and also contain explicit First Amendment accommodations.
The office also suggested that liability should stem from the distribution of replicas, not their creation, and should not be limited to commercial uses.
The recommendations also call for both injunctive relief and monetary damages as remedies, with the possibility of criminal liability in certain cases.
The Copyright Office proposes that while federal law should provide a baseline of protection, states should be allowed to offer additional safeguards.
“We look forward to working with Congress as they consider our recommendations and evaluate future developments,” Perlmutter said.
"This suggestion by the Copyright Office might spur a more broadly crafted federal right to digital objects rendered using AI in general,” said Randy McCarthy, a shareholder at the law firm Hall Estill. “Which could be both helpful and problematic.”
“This is a good example of a specific case that it would be good to look at more generally at AI-generated objects and who owns those,” McCarthy said. “It does raise the question that knowing where it’s going — all these digital elements and objects floating around, who owns them? If I created an image and someone else modifies it, do they own that as well?”
“The proposal is significant because, if followed by Congress, it would establish a version of federal publicity rights that to this point have existed primarily at the state level,” said Evan Everist, a partner at law firm Dorsey & Whitney. “Additionally, the Copyright Office contends that Section 230 of the Communications Decency Act, which provides immunity to online service providers for many types of illegal third-party content, should not apply to content violating the new digital replica right in order to encourage prompt removal of such content.”