Tennessee Governor Signs ELVIS Act To Bolster AI Protections For Artists’ Voices And Likenesses

Tennessee Governor Bill Lee signed a bill today that bolsters the protections for the use of a person’s image and likeness in areas like artificial intelligence.

The Ensuring Likeness Voice and Image Security Act, or ELVIS Act, expands on the state's right of publicity law. Where the previous statute covered unauthorized use of a person's name, photograph, or likeness, the new law extends those protections to a person's voice.

The bill also provides that a person is liable in a civil action "if the person distributes, transmits, or otherwise makes available an algorithm, software, tool, or other technology, service, or device" designed for such unauthorized uses.

The law includes an exemption for news, public affairs, or sports broadcasts or accounts, to the extent that such use is protected by the First Amendment. There is also a fair use exemption for purposes of comment, criticism, scholarship, satire, or parody.

Lee was joined by musicians and other advocates at a signing ceremony at Robert’s Western World in Nashville. “The leaders of this are showing artists who are moving here following their dreams that our state protects what we work so hard for, and I personally want to thank all of our legislators and people who made this bill happen,” said Luke Bryan.

A group of recording artists, songwriters, composers, publishers, and other figures is advocating for the additional protections as part of the Human Artistry Campaign.

The group is also advocating for federal legislation, the No AI Fraud Act, which would bar producing or distributing an AI-generated replica of an individual to perform in an audiovisual work or sound recording without that individual's consent. Essentially, the legislation would address the lack of a right of publicity law at the federal level.

The emergence of AI has raised concerns, and even alarm, over the proliferation of deepfakes. Lainey Wilson testified at a Los Angeles field hearing of a House Judiciary subcommittee in February, telling lawmakers, “I do not have to tell you how much of a gut punch it is to have your likeness or your voice ripped from you and used in ways that you could never imagine or would never allow. It is wrong, plain and simple.”

The Motion Picture Association, meanwhile, has expressed First Amendment concerns. A spokesperson said when the House bill was introduced earlier this year that “any legislation must protect the ability of the MPA’s members and other creators to use digital replicas in contexts that are fully protected by the First Amendment.” The studio trade association also noted that the recent SAG-AFTRA contract includes “rights to informed consent and compensation for use of their digital replicas.”