The U.S. Supreme Court on Monday declined to hear a case over whether artworks created by artificial intelligence without human authorship could receive copyright protection. The case, filed by computer scientist Stephen Thaler, had the potential to rewrite copyright protection in the age of A.I.
The Justice Department had urged the Supreme Court to deny hearing the case, saying lower courts correctly upheld the U.S. Copyright Office’s longstanding position that copyright protection requires human authorship and that the case is not an appropriate vehicle for Supreme Court intervention.
Last month, Thaler told the high court that the federal government was defending a copyright rule not found in copyright law, urging the justices to reject the government’s request to block review of his case.
The Supreme Court did not provide a reason for declining to hear the case.
The case, Thaler v. Perlmutter, stems from a 2018 application for copyright protections that Thaler filed with the U.S. Copyright Office for an artwork titled A Recent Entrance to Paradise, which was generated using an A.I. system he created.
In his application, Thaler listed his system, the “Creativity Machine,” as the author of the artwork and included a note stating that the work was autonomously created by his A.I. system. He sought to register it as a work made for hire as the owner of the algorithm.
A Copyright Office registration specialist refused to register the claim in August 2019, finding that it “lacks the human authorship necessary to support a copyright claim.” The Copyright Review Board then denied the application in February 2022.
Thaler then took the matter to a U.S. District Court, filing a lawsuit against the Copyright Office and its director, Shira Perlmutter. In August 2023, amid the boom in generative A.I., Judge Beryl A. Howell determined that the Copyright Office was “correct that human authorship is an essential part of a valid copyright claim.”
A three-judge panel of the U.S. Court of Appeals for the D.C. Circuit affirmed the lower court’s decision in March 2025, again determining that only works with human authorship can be granted copyright protection under U.S. copyright law.
On October 9, Thaler filed his petition for a writ of certiorari with the Supreme Court to challenge the statutory interpretations of authorship requirements under U.S. law.
“This case presents the question of whether a work outputted by an artificial-intelligence system without a direct, traditional authorial contribution by a natural person can be copyrighted. A straightforward reading of the Copyright Act leads to the conclusion that it can and should be,” lawyers for Thaler wrote in the application.
“The U.S. Copyright Office, however, imports words into the Act that Congress never drafted and requires vague elements of human authorship that arose from the Copyright Office itself—without statutory support.”
His case could have had widespread ramifications for copyright protections for A.I.-generated art. Thaler asserted that if the petition were denied, it would “be too late” even if the Copyright Office’s human authorship test were later overturned.
“The Copyright Office will have irreversibly and negatively impacted AI development and use in the creative industry during critically important years,” lawyers for Thaler wrote.
Separately, the Copyright Office has issued guidance stating that works generated entirely by artificial intelligence without human creative control are not eligible for copyright protection, though works incorporating sufficient human authorship may qualify. Recent registrations have reflected that distinction.