AI & Robotics

How AI is changing the film industry

Revolution in Hollywood: The use of AI could fundamentally change the way films are produced. Will stars and starlets eventually be replaced by digital avatars? Actors and actresses are worried.

900,000 dollars in annual salary - that's how much Netflix recently offered for a specialist to build an AI platform for the company. The AI would help with programme planning - the hint that it should also create great new content soon disappeared from the job advertisement. Perhaps Netflix didn't want to get caught in the middle of the heated debate that has been roiling Hollywood for some time.

In mid-July 2023, a strike by actors and actresses began there that attracted worldwide attention. One reason for the industrial action is the concern that stars could sooner or later be largely replaced by artificial intelligence. How much the work of actors and actresses is already shaped by AI comes as news to many observers.

Digital special effects - nothing new

Digital special effects have been around for decades, and growing computing power has opened up more and more applications for them. The sea was already digitally animated in the 1997 Hollywood classic Titanic, and even then actors' faces were scanned and digitally inserted. Today, this technology has reached a level of perfection undreamed of at the time. Film characters - and video game heroes too - are now often hybrid beings that owe their existence partly to an actor's performance and partly to the imaginative work of a digital effects team.

Digital doubles - the duplicated star

The process begins with the creation of a digital double of the actor or actress. This requires a special volumetric studio: a capsule-shaped enclosure fitted on the inside with dozens of remotely controlled cameras. The actor stands or sits at a predetermined position and works through a series of agreed movements, poses and facial expressions. Thousands of photos are taken from all angles while the face is scanned in various states, and characteristic body movements are recorded as well (motion capture). On the computer, the individual images are assembled into a three-dimensional digital figure. Because the positions of all the cameras are known to the system, the images can be aligned in correct perspective without difficulty.
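The final step - assembling calibrated photographs into one three-dimensional shape - rests on classic photogrammetry: because every camera's position is known, any point that appears in two or more images can be triangulated in space. The following minimal sketch illustrates the principle with OpenCV; the camera parameters and pixel coordinates are invented purely for illustration and have nothing to do with any real rig.

    # Minimal sketch: recovering one 3D point from two calibrated cameras.
    # Intrinsics, poses and pixel coordinates are invented for illustration;
    # a real volumetric studio calibrates dozens of cameras in advance.
    import numpy as np
    import cv2

    # Shared intrinsics: focal length in pixels, principal point at image centre.
    K = np.array([[1200.0, 0.0, 960.0],
                  [0.0, 1200.0, 540.0],
                  [0.0, 0.0, 1.0]])

    # Camera 1 sits at the origin, camera 2 is shifted 0.5 m to its right.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

    # The same facial feature as seen by each camera (pixel coordinates).
    pt1 = np.array([[1010.0], [620.0]])
    pt2 = np.array([[810.0], [620.0]])

    # Because both projection matrices are known, the 3D position follows
    # directly from the two 2D observations.
    point_h = cv2.triangulatePoints(P1, P2, pt1, pt2)   # homogeneous, 4x1
    point_3d = (point_h[:3] / point_h[3]).ravel()
    print("Reconstructed 3D point in metres:", point_3d)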

This virtual shell can then be put to work in various ways. Stunt performers, for example, can be digitally "dressed" in the actor's shell: their heads are swapped for the actor's head, and their movements serve merely as a template from which the digital twin is animated.

When creating avatars for computer games, the digital double of a real actor can be given fantastical body features, producing fantasy beings with realistic-looking facial expressions and body movements. Digital figures created in this way can be placed in any setting against any kind of background, and visual effects artists can make them act and speak as they wish without having to bring in the real actor or actress again.

So far, these techniques have been very costly, since both the shoot and the animation demand many hours of work from highly qualified specialists. They are therefore often used only for minor changes to plot or dialogue after the fact: if lines are rewritten and re-recorded after shooting, the mouth movements in the footage, for example, can be adjusted with limited effort.

Revolution through AI?

Artificial intelligence could radically simplify work with digital effects in the future - and thus usher in a completely new chapter in film history. The ability of AI to analyse large amounts of data and rearrange them independently is enormously attractive for working with digital doubles.

Artificial intelligence is not limited to the facial scans and images obtained with traditional motion-capture techniques. Its database is far broader: in principle, AI can analyse every piece of film footage that exists of a person and extract their typical facial expressions and movement patterns - for every kind of situation for which material is available. Digital twins built this way become many times more detailed and realistic.
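What such an analysis starts from can be pictured quite concretely: frame-by-frame facial landmark positions harvested from existing footage. The sketch below uses Google's MediaPipe FaceMesh model only as an example of such a tool; the file name is a placeholder, and a real pipeline would of course go far beyond collecting raw coordinates.

    # Minimal sketch: harvesting per-frame facial landmarks from existing
    # footage - the kind of raw data an expression model could be built on.
    # "archive_footage.mp4" is a placeholder file name.
    import cv2
    import mediapipe as mp

    face_mesh = mp.solutions.face_mesh.FaceMesh(
        static_image_mode=False,   # track across frames instead of re-detecting
        max_num_faces=1,
        refine_landmarks=True)     # adds extra landmarks around eyes and lips

    cap = cv2.VideoCapture("archive_footage.mp4")
    landmark_track = []            # one list of (x, y, z) tuples per frame

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input, OpenCV delivers BGR.
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            face = results.multi_face_landmarks[0]
            landmark_track.append([(p.x, p.y, p.z) for p in face.landmark])

    cap.release()
    print(f"Collected landmarks for {len(landmark_track)} frames")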

The makers of the latest Indiana Jones film, for example, faced the challenge of making the now 81-year-old Harrison Ford appear young again in the title role for many scenes. To do this, they had AI analyse the previous four Indiana Jones films, including unreleased footage. The software then produced a convincing digital twin of the young Harrison Ford's face for every conceivable situation and lighting condition, and the real Ford's face was replaced by this digital alter ego.

Following this pattern, actors and actresses could become virtually ageless and take on youthful roles well into old age - if the real stars are still needed in person at all. And not only actors: real people could also be digitised by AI on the basis of documentary footage. That opens up completely new possibilities for film biographies of historical figures, but it should also make the creation of deepfakes - fabricated, pseudo-documentary footage - much easier in the future.

The AI used for Indiana Jones was still quite limited in its capabilities. A large team of specialists was needed to carry out the face swap, and reworking was often required when the AI's results were not fully convincing. Many experts, however, believe it is only a matter of time before artificial intelligence largely automates animation processes as well.

Sven Bliedung, for example, CEO of the digitisation studio Volucap in Babelsberg, said in a Spiegel interview that he hopes not only for cost savings of up to 60 per cent but also for a democratisation of filmmaking. In the future, instead of a large special effects department, producers could work with an AI that creates photorealistic backgrounds and scenery on demand or drops effects such as explosions or fire into a shot at will. In this way, big visions could be realised on a small budget.
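Tools pointing in this direction already exist in the form of text-to-image models. The sketch below generates a background plate from a text prompt with Stable Diffusion via the Hugging Face diffusers library; the model choice and the prompt are examples only, not what any particular studio uses.

    # Minimal sketch: generating a photorealistic backdrop from a text prompt
    # with an off-the-shelf diffusion model. Model and prompt are examples only.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5",
        torch_dtype=torch.float16).to("cuda")   # assumes an NVIDIA GPU

    prompt = ("wide shot of a rain-soaked 1940s city street at night, "
              "neon reflections, photorealistic, film still")
    backdrop = pipe(prompt, num_inference_steps=30).images[0]
    backdrop.save("generated_backdrop.png")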

Producers already spend little effort on extras. Since extras are usually not shown in close-up, the demands on a simulation are not as high as for principal roles, and digital technology has long been able to multiply a small number of filmed extras into the impression of a large crowd.

Meanwhile, extras are being digitised in much the same way as actors - and sold in online shops as digital twins. Specialised suppliers such as Renderpeople produce them for stock, so to speak. Still need someone walking a dog for a scene? A digital copy might be available in the shop for as little as 150 euros ... For now, the new digital extra still has to be inserted into the scene by hand, but in the future AI might handle that too ("Insert the extra from the file into the camera perspective on the footpath in front of the green house and have him walk parallel to the street to the right ...").
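Inserting such a ready-made extra "by hand" essentially comes down to alpha compositing: a rendered figure with a transparency channel is blended over the background plate at the chosen position. A minimal sketch of that step, with placeholder file names and a hard-coded position instead of the AI-chosen one imagined above:

    # Minimal sketch: compositing a pre-rendered extra (image with an alpha
    # channel) onto a background plate. File names and placement are placeholders.
    import numpy as np
    import cv2

    plate = cv2.imread("street_plate.png")                      # background, BGR
    extra = cv2.imread("dog_walker.png", cv2.IMREAD_UNCHANGED)  # BGRA with alpha

    x, y = 400, 250                  # top-left corner of the insert in the plate
    h, w = extra.shape[:2]
    alpha = extra[:, :, 3:4].astype(np.float32) / 255.0         # 0..1 per pixel

    # Blend the extra over the matching region of the plate.
    region = plate[y:y + h, x:x + w].astype(np.float32)
    blended = alpha * extra[:, :, :3].astype(np.float32) + (1.0 - alpha) * region
    plate[y:y + h, x:x + w] = blended.astype(np.uint8)

    cv2.imwrite("street_plate_with_extra.png", plate)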

What AI cannot (yet) do

Some of these ideas remain dreams of the future. At the moment, AI cannot create longer photorealistic videos that look truly authentic; there is usually a hitch somewhere, be it a movement that looks unnatural or a face that appears too smooth. The makers of Indiana Jones 5 discovered how difficult it is to reproduce the human eye convincingly in digital form: we are all highly trained to read the expression in another person's eyes, and even the smallest nuances of movement change how we perceive them. The AI was overwhelmed by the analysis and reproduction of these subtleties, and the human specialists had to intervene constantly to correct it.

Such problems may eventually be solved as AI develops further. But another limitation will remain: the results can only ever be as good as the material the AI has been trained on. Without enough good footage of an actress, a historical figure, a natural phenomenon or a landscape, the AI cannot learn to reproduce them convincingly in digital form. Camera crews are therefore unlikely to become superfluous any time soon.

Are stars becoming digitally interchangeable?

Actors and actresses, however, could end up on the losing side of this progress. Hollywood will certainly not want to stop building stars - real people with whom the audience can identify. But the position of most actors vis-à-vis the production companies is likely to weaken as long as there is no non-transferable right to the digital copy of one's own face. An actor in a series could then be dismissed for demanding a higher fee - in the next season, the role is simply filled by a cheaper colleague onto whom the familiar face is digitally grafted. Sequels to successful productions would not need to hire the original actors again; they could be shot with unknowns onto whom AI maps the stars' faces.

The latest developments thus raise further questions that the film industry will have to answer. Perhaps the technical revolution will be followed by a revolution in rights and contract models.
