October 27, 2022
Digital simulations of Elon Musk, Tom Cruise, Leonardo DiCaprio, and others have shown up in ads, as the image-melding technology grows more popular—and presents the marketing industry with new legal and ethical questions, reports The Wall Street Journal.
Among the recent entries: Last year, Russian telecommunications company MegaFon released a commercial in which a simulacrum of Hollywood legend Bruce Willis helps defuse a bomb. And just last week, Elon Musk seemed to star in a marketing video from real-estate investment startup reAlpha Tech.
The videos were all created with so-called deepfake technology, which uses computer-generated imagery to make Hollywood and business notables appear to say and do things they never actually said or did.
Some of the ads are broad parodies, and even the best blends of digital and real footage might not fool an alert viewer. Still, the growing adoption of deepfake software could eventually reshape the industry in profound ways while creating new legal and ethical questions, experts said.
Authorized deepfakes could allow marketers to feature huge stars in ads without requiring them to actually appear on-set or before cameras—thereby bringing down costs and opening new creative possibilities.
Unauthorized deepfakes, however, occupy a legal gray area: Celebrities could struggle to contain a proliferation of digital reproductions of themselves and the resulting manipulation of their brand and reputation, experts said.
“We’re having a hard enough time with fake information. Now we have deepfakes, which look ever more convincing,” said Ari Lightman, professor of Digital Media and Marketing at Carnegie Mellon University’s Heinz College of Information Systems and Public Policy.
U.S. lawmakers have begun to address the deepfake phenomenon. In 2019, Virginia outlawed the use of deepfakes in so-called revenge porn; Texas outlawed them in political campaigns; and California banned them in both. Last year, the U.S. National Defense Authorization Act instructed the Department of Homeland Security to produce annual reports on threats posed by the technology.
But experts said they aren’t aware of laws specifically addressing the use of deepfakes in commercials.
Celebrities have had some success suing advertisers for the unauthorized use of their images under so-called right of publicity laws, said Aaron Moss, chair of the Litigation Department at law firm Greenberg Glusker. He cited Woody Allen’s $5 million settlement with American Apparel in 2009 over the director’s unapproved appearance on a billboard advertising the risqué clothing brand.
Both Paperspace, a cloud-computing company that produced a deepfake video of its own, and reAlpha had lawyers review the videos and took steps to ensure that viewers understood that the celebrities depicted didn’t actually endorse the companies’ products or participate in the making of the videos, the companies said.
The Paperspace video originally appeared on the company’s own website and was designed to educate users about deepfake technology, said Paperspace COO Daniel Kobran.
The Musk video by reAlpha included “robust disclaimers” establishing it as satire, said CMO Christie Currie. So did a similar video reAlpha released last year, in which an ersatz version of the Tesla chief sat in a bubble bath and explained the concept of Regulation A+ investing, or equity crowdfunding.
“There’s obviously always a little bit of risk with any parody type of content,” Currie said in an interview, “but generally as long as it’s meant to be educational, satirical, and you have disclaimers in place, there shouldn’t be a problem as long as you’re not pushing a transaction.”
The likelihood that someone of Mr. Musk’s stature would sue a startup over a deepfake video is low, and such companies might decide the legal risk is well worth the considerable publicity a lawsuit would generate, Moss said.
“A lot of these companies purposefully get as close to the line as possible in order to almost troll the celebrities they’re targeting,” he said.
Research contact: @WSJ