Computer scientist Stephen Thaler told the U.S. Supreme Court this month that the federal government is defending a copyright rule not found in copyright law.

Thaler made the arguments in a brief responding to the government’s contention that the court should refuse to hear the dispute, urging the justices to reject that request and take up his case.

The Justice Department had urged the Supreme Court to decline to hear the case, saying lower courts correctly upheld the U.S. Copyright Office’s longstanding position that copyright protection requires human authorship and that the dispute is not an appropriate vehicle for Supreme Court intervention.

Thaler’s new brief sharply reframes the fight. Rather than focusing primarily on whether artificial intelligence systems can be treated as authors, the filing casts the dispute as a question of agency power — whether the Copyright Office created a rule that does not appear in the text of federal copyright law.

“The Copyright Office has imposed a non-statutory Human Authorship Requirement for copyright registration,” Thaler wrote. He argued that the requirement was derived “not from the Constitution or the Copyright Act but from its own nonbinding agency materials.”

The case, Thaler v. Perlmutter, stems from a 2018 application for copyright protections that Thaler filed with the U.S. Copyright Office for an artwork titled A Recent Entrance to Paradise, which was generated using an A.I. system he created.

Thaler is the president of Imagination Engines, which has developed new artificial intelligence capabilities for institutions from defense contractor Raytheon to NASA.

In his application, Thaler listed his system, the “Creativity Machine,” as the author of the artwork and included a note stating that the work was autonomously created by his A.I. system. He sought to register it as a work made for hire as the owner of the algorithm.

A Copyright Office registration specialist first refused to register the claim in August 2019, finding that the work “lacks the human authorship necessary to support a copyright claim.” The Copyright Review Board upheld that refusal and denied the application in February 2022, records show.

A federal district court and the U.S. Court of Appeals for the D.C. Circuit upheld that decision.

In urging the Supreme Court to decline review, the government argued that copyright law has always understood “author” to mean a human being. It also said the Copyright Office does not categorically reject works that use A.I. tools, but instead examines whether a person exercised creative control over the expression in the work.

Thaler’s reply challenged both points.

First, Thaler argued that the Copyright Act does not explicitly limit authorship to natural persons. He said that while corporations do not physically create works, the statute sometimes labels them as “authors” for legal purposes.

That drafting choice, he said, shows Congress never wrote an explicit rule limiting authorship to human beings.

The brief also noted that the law sets copyright terms for certain works, including anonymous works and works made for hire, based on a fixed number of years rather than the lifespan of a human creator.

The government’s reading of the statute ignores that language, Thaler argued.

Second, Thaler disputed the government’s claim that the Copyright Office evaluates A.I.-assisted works on a case-by-case basis.

“In the time since Dr. Thaler’s registration was denied, the Copyright Office has either rejected numerous applications disclosing the use of A.I. or required applicants to disclaim content made using A.I. and thus prevented registration of that content,” Thaler argued.

He cited agency guidance that states that “if a work’s traditional elements of authorship were produced by a machine ... the Office will not register it.” He argued that, in practice, this amounts to a rule that bars protection for fully A.I.-generated works.

The reply also pushes back on the government’s argument that the case is a poor vehicle for review because Thaler purportedly conceded there was no human creative contribution.

The filing said he maintained only that the work lacked “traditional human authorship,” not that there was no human involvement at all. It noted that Thaler designed and programmed the A.I. system and argued in earlier stages that he should be considered the author under existing doctrines.

Thaler also invoked the Supreme Court’s recent decision in Loper Bright Enterprises v. Raimondo, which held that courts must exercise independent judgment in interpreting statutes rather than defer to federal agencies’ interpretations.

He argued that reading Congress’s silence as approval of the Copyright Office’s human authorship rule would conflict with that ruling.

Beyond domestic law, the filing pointed to developments overseas. Thaler noted that China and the United Kingdom allow copyright protections for art made by A.I.

“This is a vital time for A.I. development and its use in creative industries and for the international competitiveness of the United States, which is stifled by the Copyright Office’s policy,” Thaler argued.

The Supreme Court has now scheduled the case for consideration at its private February 27 conference, according to the court’s docket. That means the justices will meet behind closed doors to decide whether to grant Thaler’s request for review.

If the court declines to hear the case, the lower court ruling affirming the Copyright Office’s rejection of Thaler’s application will remain in place. If the justices agree to review it, the case would give the court an opportunity to address directly whether works generated entirely by A.I. systems qualify for copyright protection.

