In late 2022, with the public release of large language model (LLM)-based tools such as ChatGPT and DALL-E, anyone could instantly generate text and images from a simple, plain-English prompt. But IP ownership questions immediately surfaced. Who made those images? Who wrote those words? And, most importantly, who owns them? Although the law around AI and copyright is still developing, one question appears well settled: US copyright law requires a human author for any work to be eligible for copyright protection. But what makes someone an author under the Copyright Act?
The recent holding from the DC Circuit in Thaler v. Perlmutter confirmed that only human beings can be the authors of copyrightable works. Back in 2018, long before the public release of ChatGPT and DALL-E, a computer scientist named Dr. Stephen Thaler filed to register a copyright claim in an image depicting train tracks running through a flowered tunnel. On the application, Dr. Thaler listed the sole author as the “Creativity Machine”, a computer system that he had created. The Copyright Office denied the application, stating that the work “lacks the human authorship necessary to support a copyright claim”.
Dr. Thaler filed for reconsideration twice, but the office maintained its rejection on those same grounds. Dr. Thaler filed suit, and the District Court upheld the office’s decision. On appeal, the DC Circuit came to the same conclusion: “The Creativity Machine cannot be the recognised author of a copyrighted work because the Copyright Act of 1976 requires all eligible work to be authored in the first instance by a human being.”
Given the history of the human-authorship requirement, the holding in Thaler is unsurprising. Thaler was decided on the narrow issue of whether a machine could be an author under the Copyright Act, and did not reach other arguments. The court did not evaluate whether the Constitution itself requires human authorship, as the Copyright Office argued, finding instead that the Copyright Act itself contains the requirement.
Likewise, the court did not consider Dr. Thaler’s argument that he is the work’s author because he made and used the Creativity Machine, holding that Dr. Thaler had waived that argument by not raising it before the Copyright Office, where he instead listed only the Creativity Machine as the author on the application. So, Thaler leaves at least one question seemingly unanswered: can you claim authorship by directing an LLM?
Recent guidance from the Copyright Office suggests that, as the technology currently stands, no amount of prompting can confer authorship under the Copyright Act. The office recently released part two of its commentary on Copyright and Artificial Intelligence, in which it emphasised the need for human intervention in the creative process to make a work copyrightable. It took the position that, given the tools currently available in commercial LLMs, neither prompting nor selecting among alternative prompt results provides sufficient human control for AI users to be authors and obtain copyright. Effectively, the Copyright Office considers millions of LLM-generated texts and images to be legally authorless.
However, the Copyright Office did not entirely shut the door on generative artificial intelligence (GAI), speculating that “AI systems could someday allow users to exert so much control... that the system’s contribution would become rote or mechanical” and that the work could therefore qualify for copyright protection. For example, the office considered systems that accept “expressive inputs”, such as a user’s own illustration that is then modified based on a prompt. The office stated that the user would be the author of at least the portion of the output attributable to their expressive input.
Yet most publicly available GAI systems are moving in the opposite direction. Recent GAI systems are increasingly streamlined, producing more polished, detailed outputs from the same prompt than earlier systems did. Three years ago, creating an AI video involved substantial effort and post-processing. Today, there are dozens of models that can output a polished video from a simple prompt. Under the office’s reasoning, as GAI systems become more autonomous, their outputs may become less copyrightable.
The hard line the Copyright Office took on the insufficiency of prompt engineering may be softer and fuzzier than it seems. Courts generally respect guidance from the Copyright Office, but it is not law. Courts might reject the guidance and hold that prompt engineering and iterative output selection can qualify as “authorship” under the Copyright Act. Given how quickly GAI tools are developing, we may soon see cases where a court accepts the Copyright Office’s guidance but nonetheless holds that the sophisticated tools available in a particular system rendered the GAI’s contribution “rote or mechanical”.
In any case, as the courts and society grapple with what authorship means, we should keep in mind that the purpose of the Copyright Act is to foster creativity. As the Copyright Office remarked: “Society would be poorer if the sparks of human creativity become fewer or dimmer.” The AI copyright question is not only about property rights, but also what it means to be human and to make art.
Avery Williams is a principal at McKool Smith in Dallas, specialising in commercial litigation and IP matters. Joseph Micheli is an associate, specialising in patent litigation support.