A new policy report from the U.S. Copyright Office says that songs and other artistic works created with the assistance of artificial intelligence can sometimes be eligible for copyright registration, but only if the ultimate author remains a human being.
The report, released by the federal agency on Wednesday (March 15), comes amid growing interest in the role that so-called generative AI tools, similar to the much-discussed ChatGPT, could play in the future of music creation.
Copyright protection is strictly limited to content created by humans, leading to heated debate over the status of AI-generated works. In a closely watched case last month, the Copyright Office decided that a graphic novel featuring AI-generated images was eligible for protection, but that the individual images themselves could not be protected.
In Wednesday’s report, the agency said that the use of AI tools does not automatically bar copyright registration, but that such use would be closely scrutinized and could not play a dominant role in the creative process.
“If a work’s traditional elements of authorship were produced by a machine, the work lacks human authorship and the Office will not register it,” the agency wrote. “For example, when an AI technology receives solely a prompt from a human and produces complex written, visual, or musical works in response, the traditional elements of authorship are determined and executed by the technology — not the human user.”
The report listed examples of AI-aided works that might still be worthy of protection, such as a work that creatively combines AI-generated elements into something new, or an AI-generated work that an artist then heavily modified after the fact. And it stressed that other technological tools remain fair game.
“A visual artist who uses Adobe Photoshop to edit an image remains the author of the modified image, and a musical artist may use effects such as guitar pedals when creating a sound recording,” the report said. “In each case, what matters is the extent to which the human had creative control over the work’s expression and ‘actually formed’ the traditional elements of authorship.”
Under the rules laid out in the report, the Copyright Office said that anyone submitting such works must disclose which elements were created by AI and which were created by a human. The agency said that any work containing AI-generated material that was previously registered without such a disclosure must be updated, and that failure to do so could result in the cancellation of the copyright registration.
Though aimed at providing guidance, Wednesday’s report avoided hard-and-fast rules. It stressed that analyzing copyright protection for AI-assisted works would be “necessarily a case-by-case inquiry,” and that the final outcome would always depend on individual circumstances, including “how the AI tool operates” and “how it was used to create the final work.”
And the report didn’t even touch on a potentially thornier legal question: whether the creators of AI platforms infringe the copyrights of the vast number of earlier works that are used to “train” the platforms to spit out new works. In October, the Recording Industry Association of America (RIAA) warned that such providers were violating copyrights en masse by using existing music to train their machines.
“To the extent these services, or their partners, are training their AI models using our members’ music, that use is unauthorized and infringes our members’ rights by making unauthorized copies of our members’ works,” the RIAA said at the time.
Though Wednesday’s report did not offer guidance on that question, the Copyright Office said it had plans to weigh in soon.
“[The Office] has launched an agency-wide initiative to delve into a wide range of these issues,” the agency wrote. “Among other things, the Office intends to publish a notice of inquiry later this year seeking public input on additional legal and policy topics, including how the law should apply to the use of copyrighted works in AI training and the resulting treatment of outputs.”