Did you ever glimpse an attractively grizzled former President of the United States, dancing barefoot on a beach with a lady who did not look like his famous first lady spouse, in an advertisement for an oceanside resort for seniors? Or a pitch for dental insurance that appeared to include an A-list actor who starred in a Cast Away movie featuring a DIY tooth-extraction scene?
In all likelihood, it wasn't your eyes that lied. It was AI.
AI can generate songs that were never sung in the style, sound, and voice of a music megastar; it can make politicians appear to say things that they never said; it can duplicate Old Masters or create new or lost art in the style of anyone; it can write essays or news copy, or take exams. It can probably duplicate and improve upon a rival's video advertisements. (The latter is an extrapolation; that case was not about how an envied ad was recreated.)
The genius folks who once stole books by the library-ful (and got away with it) are now stealing celebrity likenesses, because they can.
Maybe that is a bridge too far, to coin a phrase. Enter the bipartisan draft of a No F.A.K.E.S. Act to protect celebrities from having their voices or likenesses lifted for the purpose of making them appear to endorse ideas or products that they would never consent to endorse.
Legal blogger Laura Lamansky for the law firm Michael Best & Friedrich LLP discusses how rights of publicity are under siege by tech, and what a bipartisan group of Senators hope to do to rein in AI.
Longtime copyright-friendly Senators Amy Klobuchar of Minnesota, Thom Tillis of North Carolina, Marsha Blackburn of Tennessee, and Chris Coons of Delaware have released a draft of the Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2023 (known as the No F.A.K.E.S. Act).
One suspects that they started with the acronym and twisted a thesaurus to come up with the full name of the proposed legislation! After all, it is about a lot more than "Entertainment". Carping aside, it is a laudable effort. As Laura Lamansky explains (emphasis, hers):
"The bill would create a new right to control digital copies of a person's image, voice, and likeness and such protection would last 70 years after the individual's death. Violators of the Act could be liable for $5,000 per violation."
As with existing copyright law, there are some exceptions in the Act to cover legitimate ventures such as documentaries and parody.
Other exceptions in copyright law, under the fair use doctrine, cover news reporting and education. My personal view is that AI should not be making up the news.
For another take on the No Fakes Act, I recommend the analysis by legal bloggers Jennifer A. Kenedy and Jorden Rutledge of the law firm Locke Lord, published here:
https://www.lockelord.com/newsandevents/publications/2023/10/no-fakes-act
Under existing law, an individual can only sue for rights of publicity if their likeness has commercial value. The No Fakes Act would protect (as I interpret it) private persons from, for instance, deep-faked revenge porn. What the Locke Lord lawyers write is:
"...A student bullying another by creating a “digital replica” of his target and spreading it through school would likely be free from any right-of-publicity suit under the current law, as the victim’s “identity” is without commercial value, and, in any event, the damages would be negligible or difficult to prove. The NO FAKES Act, however, would prohibit this conduct and stack statutory damages in multiples of $5,000.00, not just for the creation of the unauthorized digital replica, but also for each unauthorized distribution. It would also make clear that anyone forwarding or hosting that content could also be liable...."
Emphasis is mine. There is a lot more in the Locke Lord article, and it is important. I can only scratch the surface.
As has been pointed out, one of the sticking points in the recent SAG-AFTRA strike was that movie studios appear to have attempted to require extras (or background actors) to sign away their rights to their likeness in perpetuity. One might be paid to be part of a crowd scene once, and one's likeness could be cut and pasted into literally myriad future crowd scenes, and one would never be paid again.
For authors, it's best to know this information, because if you use AI on your cover art, or if your webmistress uses it, the No FAKES Act might affect you one day.
All the best,