Digital Replicas, a Fear of Striking Actors, Already Fill Screens

Aug 18, 2023

The technology for morphing flesh-and-blood performers into virtual avatars has been improving for years. Now it has become an issue in the actors’ strike.

By Marc Tracy

To pack three seasons’ worth of English soccer stadiums with exasperated or exhilarated crowds, the Apple TV+ comedy “Ted Lasso” turned to dozens of background actors and powerful visual effects technology.

Using a technique known as crowd tiling, the company Barnstorm VFX helped film groups of extras in one alignment before rearranging them and filming them again, and then cutting and pasting the various groupings to fill all the seats. The show’s makers also used crowd sprites, in which actors were filmed individually on green screens and then arranged to appear as part of the crowd. There were even digital doubles: three-dimensional models whose movements were informed by a motion actor.
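
Conceptually, the sprite approach amounts to compositing a small pool of cutout extras onto many seat positions. The short Python sketch below illustrates that idea with the Pillow imaging library; the file names, seat grid, and compositing choices are illustrative assumptions, not a description of Barnstorm's actual pipeline.

```python
# Illustrative sprite-based crowd fill: paste a small pool of green-screen
# cutouts (RGBA images) onto many seat positions in an empty-stadium plate.
# File names and the seat grid are made up for this sketch.
import random
from PIL import Image

def fill_stands(plate_path, sprite_paths, seat_positions, out_path, seed=0):
    """Composite a handful of extra 'sprites' over every seat position."""
    random.seed(seed)
    plate = Image.open(plate_path).convert("RGBA")
    sprites = [Image.open(p).convert("RGBA") for p in sprite_paths]

    for x, y in seat_positions:
        sprite = random.choice(sprites)              # reuse the small pool
        if random.random() < 0.5:                    # mirror some for variety
            sprite = sprite.transpose(Image.Transpose.FLIP_LEFT_RIGHT)
        # Anchor each cutout so its feet land roughly on the seat position.
        dest = (max(0, x - sprite.width // 2), max(0, y - sprite.height))
        plate.alpha_composite(sprite, dest)

    plate.convert("RGB").save(out_path)

# Example: a grid of 300 seats filled from just three cutout images.
seats = [(60 + col * 40, 150 + row * 50) for row in range(10) for col in range(30)]
fill_stands("empty_stands.png", ["extra_a.png", "extra_b.png", "extra_c.png"],
            seats, "filled_stands.png")
```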

Innovations in digital technology and artificial intelligence have transformed the increasingly sophisticated world of visual effects, which can ever more convincingly draw from, replicate and morph flesh-and-blood performers into virtual avatars. Those advancements have thrust the issue toward the top of the grievances cited in the weekslong strike by the actors’ union.

SAG-AFTRA, the union representing more than 150,000 television and movie actors, fears that a proposal from Hollywood studios calling for performers to consent to use of their digital replicas at “initial employment” could result in its members’ voice intonations, likenesses and bodily movements being scanned and used in different contexts without extra compensation.

Duncan Crabtree-Ireland, SAG-AFTRA’s chief negotiator, said it would be impossible for actors to provide informed consent without knowing how their digital replicas would be used in a cinematic universe or, in some cases, unknown future projects.

“That’s really abusive,” he said, “and not an OK way for companies to deal with somebody’s image, likeness or persona. It’s like owning a person.”

In an explanation on its website, the union says its counterproposals include guarantees for “informed consent and fair compensation when a ‘digital replica’ is made or our performance is changed using A.I.”

A spokesman for the Alliance of Motion Picture and Television Producers, the organization that negotiates for the studios, disputed the union’s characterization of its proposal. The alliance’s position would “only permit a studio to use the digital replica of a background actor in the motion picture for which the background actor is employed,” the spokesman, Scott Rowe, said in an emailed statement. “Any other use requires the background actor’s consent and bargaining for the use, subject to a minimum payment.”

There were 17,000 active members of the union who performed background work in the last year, and more than 80,000 who have done it at some point in their careers, according to union figures. Background actors receive a daily rate of $187 for an eight-hour day.

Jennifer E. Rothman, a professor at the University of Pennsylvania’s law school who specializes in intellectual property, said that if limits on digital replicas were not hammered out at the bargaining table, lower-profile performers might not realistically be able to say no to studio demands.

“It’s the up-and-comers and the extras who won’t have any leverage,” she said.

Lawson Deming, a visual effects supervisor and a co-founder of Barnstorm, echoed that view. Famous actors will be able to negotiate into their contracts that they own their likenesses, he said, but a vast majority of the SAG-AFTRA membership will not be so lucky.

“It’s not a question of technology,” he said. “It’s a question of who has the power in the relationship.”

That is partly because the technology is already here.

Such scenarios can sound like science fiction, but “performances” by the past selves of aged or even deceased actors have helped carry movies like 2016’s “Rogue One: A Star Wars Story.” Aided by motion capture recorded on a different actor, Peter Cushing, who died in 1994, reprised his role as Grand Moff Tarkin from the original 1977 “Star Wars” film. (His estate gave permission.)

“Digital humans have been part of the visual effects process for quite a while now — about 20 years,” said Paul Franklin, a visual effects supervisor at DNEG.

Initially they were created for what Franklin called “digital stunt doubles,” replicas of actors for stunts that were so death-defying or impossible that real-life stunt doubles would not do them. In one case, he helped a digital replica of the actor Henry Cavill fly as Superman in 2013’s “Man of Steel.”

Doing such work often involves a practice known as photogrammetry, in which many photographs of something physically real — like an actor’s head — can be used to digitally reconstruct it in three dimensions.
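
At its core, photogrammetry relies on triangulation: if the same physical point is seen from two or more calibrated camera positions, its location in space can be solved for. The sketch below shows the standard linear (DLT) triangulation step in Python with NumPy, using made-up camera parameters; real pipelines match thousands of points across many photographs and refine the result with further optimization.

```python
# Minimal sketch of photogrammetric triangulation: recover a 3-D point from
# its 2-D projections in two calibrated views (linear DLT method).
# The camera parameters below are made up for illustration.
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Solve for the 3-D point whose projections are pt1 in view 1 and pt2 in view 2."""
    u1, v1 = pt1
    u2, v2 = pt2
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]                               # dehomogenize

# Two hypothetical cameras one unit apart, sharing the same intrinsics.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])

point = np.array([0.2, -0.1, 4.0, 1.0])               # a point on the "head"
proj1 = (P1 @ point)[:2] / (P1 @ point)[2]            # where view 1 sees it
proj2 = (P2 @ point)[:2] / (P2 @ point)[2]            # where view 2 sees it
print(triangulate(P1, P2, proj1, proj2))              # ~ [0.2, -0.1, 4.0]
```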

Technical wizardry is also used to build crowd scenes like the ones in “Ted Lasso.” For 2012’s “The Dark Knight Rises,” Franklin used digital techniques to fill the nearly 70,000 seats at Pittsburgh’s professional football stadium with just 11,000 extras.

Bob Wiatr, a visual effects compositor, was instrumental in filling out crowd scenes for “Daisy Jones & the Six,” a limited series on Amazon that had its debut this year. In one scene in which the camera, angled behind the titular rock band, looks onto the crowd, real background actors occupy the front rows, while computer-generated avatars fill up the rest.

“Sometimes there are people that are recognizable in the front,” Wiatr said, referring to other projects, “and they decide they want to do it from another angle later after they’ve already shot the scene, so they recreate the shot, and then usually you have 3-D-generated people — a lot of software can make it.”

That does not mean, however, that the task is simple or cheap. Deming of Barnstorm cautioned that concerns over reusing actors’ digital scans might be overblown.

“It is very complex to digitally take a scan of someone and make it animatable, make it look realistic, make it functional,” he said, though he allowed that “we’re making very big leaps.”

The biggest leaps are now being made in artificial intelligence. Last month the comedian Sarah Silverman joined class-action lawsuits against the companies OpenAI and Meta, accusing them of using her copyrighted written work to train their artificial intelligence models. SAG-AFTRA is concerned about something similar happening to actors’ performances.

Linsay Rousseau, a voice actor and performance capture artist, said performers were fearful of a future in which artificial intelligence reduces or eliminates roles for humans.

“We’re worried,” she said, “that we go in and record a session, they then take it, synthesize that voice, and don’t call us back. Or they process them to create new voices and thus do not call actors in to do that work.”

One visual effects company, Digital Domain, said in a statement that in the past five years it had used A.I. to “greatly accelerate” and “increase the accuracy” of digital avatars based on background actors.

“Machine learning is used to create a library of all possible facial shapes of a given actor, or the possible deformations of a piece of clothing or a set of muscles,” the statement said. “This library is then used to create a lifelike digital version of what was captured. We also have the technology now where we can create performances of historical figures based on existing footage.”
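
Digital Domain did not detail its methods, but a "library of facial shapes" is commonly represented as blendshapes: a neutral mesh plus a set of target shapes that are mixed by weights to pose an expression. The toy NumPy sketch below shows only that mixing arithmetic; the meshes and weights are made up, and it is not a description of Digital Domain's system.

```python
# Toy blendshape mixing: pose a face as neutral mesh + weighted shape deltas.
# The meshes, shapes, and weights here are fabricated for illustration.
import numpy as np

def pose_face(neutral, shape_deltas, weights):
    """Return vertex positions for neutral + weighted sum of shape deltas."""
    weights = np.asarray(weights)[:, None, None]       # (num_shapes, 1, 1)
    return neutral + (weights * shape_deltas).sum(axis=0)

num_vertices = 5
neutral = np.zeros((num_vertices, 3))                  # toy neutral mesh
library = np.stack([                                   # toy "smile" and "jaw open" shapes
    np.random.default_rng(0).normal(scale=0.01, size=(num_vertices, 3)),
    np.random.default_rng(1).normal(scale=0.01, size=(num_vertices, 3)),
])

# A performance then reduces to a weight curve over time, e.g. 60% smile, 20% jaw open.
frame = pose_face(neutral, library, [0.6, 0.2])
print(frame.shape)                                     # (5, 3): one posed position per vertex
```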

Franklin, a two-time Oscar winner for his visual effects work on “Inception” and “Interstellar,” said it was clear that technology had advanced beyond the scope of typical industry contracts.

“I think it is a valid concern,” he said. “People think, ‘Well, what’s going to happen to this data? How’s it going to be used in the future?’”

Marc Tracy is a reporter on the Culture desk.
