AI Insight: Protecting Artists in the Age of AI
In April 2023, fans across the nation were excited to hear a seemingly new Drake and The Weeknd song titled “Heart on My Sleeve.” The song quickly gained popularity, amassing over 11 million plays within days on TikTok, Spotify, and other platforms. Yet neither Drake nor The Weeknd was involved in its creation. Instead, the song was made using artificial intelligence (AI) trained on the two artists’ works, mimicking their voices, writing styles, and musical styles. While “Heart on My Sleeve” has since been removed from streaming and social media platforms, its release sparked a wave of AI-generated music. Since then, AI has been used to create a multitude of songs mimicking artists’ voices, musical styles, and writing styles. Social media is now flooded with AI-generated music, a recent example being an AI-generated Taylor Swift song depicting a breakup with Travis Kelce.
Additionally, AI has been employed to have artists and celebrities “cover” songs they never recorded, for instance, Kanye West singing “Bubbly” by Colbie Caillat and Helen Keller singing “You Raise Me Up” by Josh Groban. While AI-generated music may seem amusing on the surface, it sends an alarming message about the potential dangers of AI misuse. Consequently, many legislators, artists, legal scholars, and music industry professionals are collaborating to find effective solutions to protect artists against such misuse.
The Right of Publicity and Proposed Legislation
Since AI-generated songs and covers generally do not directly copy any particular copyrighted work, copyright law does not offer adequate protection against AI-generated material. As a result, many are turning to the right of publicity as a potential source of redress for unauthorized AI-generated material. The right of publicity refers to an individual’s right to control the commercial use of their name, image, and likeness (NIL). Currently, the right of publicity is governed by state law, leading to a patchwork of regulations across jurisdictions. While most states recognize the right through statute or case law, the level of protection varies widely. Each state sets its own framework for protection and remedies, which produces inconsistent judicial interpretation, forum shopping, and uncertainty about the extent of protection and the remedies available to individuals.
Unsettled by recent advancements in AI and the shortcomings of the current right of publicity framework, the House Judiciary Subcommittee on Courts, Intellectual Property, and the Internet has convened multiple hearings, the latest held on April 10, 2024. During these hearings, music industry experts, artists, professors, and legislators discussed AI-related concerns and potential regulatory options. Three proposed acts related to the right of publicity are currently under consideration: the No AI Fraud Act, the Federal Anti-Impersonation Right Act (FAIR Act), and the NO FAKES Act. Among these, the No AI Fraud Act is emerging as the leading candidate for passage. Under the No AI Fraud Act, individuals would be granted a property right in their likeness and voice. The act would:
(1) Make these rights freely transferable and descendible.
(2) Give individuals the option to approve the use of a digital depiction or digital voice replica for a new performance.
(3) Provide various remedies for unauthorized use of an individual’s likeness and voice.
(4) Recognize a First Amendment defense against alleged violations.
The Pros and Cons of New Right of Publicity Legislation
Witnesses at the hearing included artist Lainey Wilson; Harvey Mason Jr., CEO of the Recording Academy; Christopher Mohr, President of the Software and Information Industry Association; and Jennifer Rothman, a law professor at the University of Pennsylvania. Lainey Wilson and Harvey Mason Jr. were proponents of new right of publicity legislation, citing current misuse of AI that threatens artists’ livelihoods and reputations. Wilson specifically described AI misuse as a “gut-punch” when used in ways an artist never imagined or would have allowed. The proponents also pointed to the inconsistency of current right of publicity laws and the absence of any existing legislation that clearly addresses AI. They favored passage of the No AI Fraud Act and, in particular, praised its creation of a personal property right in one’s image and voice; Harvey Mason Jr. called the creation of the right “just common sense” and “long overdue.” Proponents also favored the bill’s recognition of the First Amendment, as freedom of expression is crucial to the creation of music.
On the other hand, Christopher Mohr and Jennifer Rothman raised concerns about passing new legislation. They pointed to existing law, such as Lanham Act false endorsement claims and state right of publicity statutes, which already offer redress for unauthorized AI uses of an artist’s name, image, and likeness. They also highlighted Congress’s limited expertise in copyright law relative to state and federal courts, arguing that the courts are best suited to resolve any ambiguities regarding AI.
Most importantly, opponents pointed out a major pitfall in the proposed legislation: it allows NIL rights to be transferred indefinitely, with no limits on the scope or duration of the transfer. In the music industry, this creates the fear that a record label could buy an artist’s NIL rights early in their career and use AI technology to create music mimicking the artist’s voice without their permission indefinitely, even after the artist’s death. While the legislation contains purported safeguards, they are largely illusory. For example, one such safeguard is the opportunity for an artist to consult legal representation before transferring away their NIL rights or signing a recording contract containing a provision that allows such a transfer. On paper, this seems to address potential misuse; in reality, it offers little protection for artists.
The current music industry is notoriously lopsided in favor of record labels when it comes to power and control. Young artists are often desperate to sign any contract presented to them in the hope of breaking into the industry, and labels are known to take advantage of that ambition by including unfavorable terms in their contracts. Labels are therefore likely to insert NIL transfer provisions into the contracts they present to artists. Even artists who can afford legal representation before signing may disregard counsel’s advice in the hope of signing with a label. Accordingly, opponents such as Rothman advocate that any legislation passed permit only the licensing of NIL rights and make the limitations on such licensing clear and definite, so that a license cannot be interpreted as a transfer. The NO FAKES Act of 2024 illustrates these suggested limits by prohibiting assignment of the right during the individual’s lifetime and for a period post-mortem. The NO FAKES Act also places requirements on licenses to further avoid their interpretation as transfers: a license must be no more than 10 years in duration, be in writing, and include a reasonably specific description of the intended uses of the digital replica of the individual’s voice or visual likeness. Limitations such as these would allow artists to monetize their NIL rights in beneficial situations while preventing AI misuse.
The Path Forward
In addition to discussing proposed right of publicity legislation, the committee also addressed the pressing need to update existing laws to better recognize the importance of ‘personhood,’ given that human creation is no longer the sole avenue of innovation in today’s AI-centered society. Several significant legislative questions were raised: Should the DMCA notice-and-takedown process apply beyond copyright infringement to impersonations? Should artists be able to copyright works created through AI technology mimicking their voice? What regulations should apply to the licensing and transfer of minors’ NIL rights?
While there are many uncertainties concerning the future of AI regulation, here at Rockridge®, we pride ourselves on staying current with the ever-changing legal landscape. We are committed to ensuring our clients are protected from threats to their NIL and intellectual property.