In 1975, George Lucas founded the visual effects company Industrial Light & Magic (ILM). Two years later, he used a computer-controlled camera system to shoot the space opera film Star Wars: A New Hope. ILM built the system and coupled it with custom-built processors to replicate camera movements precisely across takes. Lucas said the system had “a very powerful impact on the storytelling” and gave him “creative freedom”. However, compared to today’s sophisticated tools, Lucas’ system was fairly basic. Marvel villain Thanos is a good case in point: the effects team put tracking dots on Josh Brolin’s face to capture his expressions, the data was then used to train an AI model, and voila! Thanos was born.
Today, AI is used in all stages of film production. Let’s zoom in.
Scriptwriting
In 2016, an AI wrote the script for a 10-minute short film, Sunspring. The model, called Benjamin, was a recurrent neural network trained on science-fiction scripts from the 1980s and 1990s. The movie, starring Thomas Middleditch of Silicon Valley fame, made the top ten at the Sci-Fi London Film Festival. Since then, the same AI has written two more films.
In 2019, comedian and writer Keaton Patti used an AI bot to generate a Batman movie script. He posted the first page of the hilarious script, and it went viral.
I forced a bot to watch over 1,000 hours of Batman movies and then asked it to write a Batman movie of its own. Here is the first page. pic.twitter.com/xrgvgAyv1L
— Keaton Patti (@KeatonPatti) August 13, 2019
In 2020, Calamity AI used Shortly Read, an AI tool built on GPT-3, to write the screenplay for a three-and-a-half-minute short film.
Rapid advances in NLP and large language models have given filmmakers practical tools for drafting scripts. However, we are yet to see a full-length feature written entirely by AI.
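How Benjamin or Shortly Read work internally is not public, but the basic idea, sampling screenplay-formatted text from a language model, can be sketched in a few lines. The snippet below is an illustrative example using the open GPT-2 checkpoint via Hugging Face Transformers; the model, prompt, and sampling settings are assumptions for demonstration, not those used by any project mentioned here.

```python
# A minimal sketch of language-model-assisted scriptwriting.
# GPT-2 stands in for the GPT-3-based tools discussed above; the prompt and
# sampling parameters are purely illustrative.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = (
    "INT. SPACESHIP - NIGHT\n"
    "H looks out the window. The stars are wrong.\n"
    "H: "
)

# Sample a continuation of the scene in screenplay style.
scene = generator(prompt, max_new_tokens=120, do_sample=True, temperature=0.9)
print(scene[0]["generated_text"])
```

In practice, a human writer still curates and edits the output before it reaches the screen.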
Visual Effects (CGI)
Filmmakers today use CGI to bring dead actors back to life on screen. For example, two beloved Star Wars characters, Carrie Fisher (Princess Leia) and Peter Cushing (Grand Moff Tarkin), were recreated in Rogue One; the makers used CGI to make the actors look exactly as they did in the 1977 Star Wars: A New Hope. Carrie Fisher died before completing her scenes for Episode IX: The Rise of Skywalker, and CGI was used to complete her story. In the Fast & Furious franchise, the late Paul Walker was virtually recreated to finish his scenes.
Scheduling/Pre-production
Cinelytic is an AI-based startup that helps studios and independent film companies make faster, smarter decisions across the film’s value chain. Warner Bros has partnered with Cinelytic to implement an AI-based project management system. The platform provides analytics, scheduling, and financial modelling services, among others.
Belgium-based ScriptBook offers an AI-based script analysis and financial forecasting tool that analyses screenplays and recommends whether or not to greenlight them. ScriptBook has built an AI algorithm to standardise and automate the process; however, the company says the algorithm is not a replacement for a human decision-maker. ScriptBook gives a detailed breakdown of a script, including genre, age restrictions, the likely MPAA (Motion Picture Association of America) rating, and related films. The platform also provides scene analysis, characters’ likeability scores, the emotions evoked by scenes, and a gender equality measurement based on the Bechdel test.
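ScriptBook’s models are proprietary, but one of the simpler checks it reports, the Bechdel test, can be approximated with a rule-based pass over a parsed screenplay. The sketch below is a hypothetical heuristic: the scene structure, character labels, and keyword filter are all assumptions for illustration, not ScriptBook’s method.

```python
# Hypothetical Bechdel-style heuristic over a parsed screenplay.
# A scene is a list of (character, line) pairs; the set of female characters
# is assumed to come from an external character database.
from typing import List, Tuple

MALE_REFERENCES = {"he", "him", "his", "boyfriend", "husband"}

def passes_bechdel(scenes: List[List[Tuple[str, str]]],
                   female_characters: set) -> bool:
    for scene in scenes:
        speakers = {character for character, _ in scene}
        if len(speakers & female_characters) < 2:
            continue  # need at least two named women in the scene
        dialogue = " ".join(line.lower() for character, line in scene
                            if character in female_characters)
        # Crude keyword filter: the conversation should not be about a man.
        if not MALE_REFERENCES & set(dialogue.split()):
            return True
    return False

# Toy usage with invented characters and dialogue.
scenes = [[("ANA", "The reactor is failing."), ("MIRA", "Reroute the power.")]]
print(passes_bechdel(scenes, {"ANA", "MIRA"}))  # True
```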
Vault’s RealDemand AI platform analyses thousands of key elements of a film’s story, outline, script, casting, and trailer to forecast demand and help maximise ROI up to 18 months before release, factoring in release date, country, audience age, and more.
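Vault does not publish its models, but the general pattern, regressing a demand or box-office signal on features extracted from the script, cast, and release plan, is standard supervised learning. The sketch below uses scikit-learn with entirely invented features and numbers, purely to illustrate the shape of such a pipeline.

```python
# Illustrative demand-forecasting sketch with invented features and data.
# Real platforms use far richer inputs (script embeddings, trailer test
# reactions, comparable titles, territory-level data, etc.).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical features per film: [budget ($M), star power, genre popularity,
# release-month seasonality, marketing spend ($M)]
X_train = np.array([
    [150, 0.9, 0.8, 0.7, 80],
    [40, 0.4, 0.6, 0.5, 15],
    [90, 0.7, 0.9, 0.9, 50],
    [10, 0.2, 0.3, 0.4, 3],
])
y_train = np.array([620, 95, 410, 12])  # invented worldwide gross ($M)

model = GradientBoostingRegressor().fit(X_train, y_train)

upcoming_film = np.array([[120, 0.8, 0.7, 0.9, 60]])
print(f"Forecast gross: ${model.predict(upcoming_film)[0]:.0f}M")
```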
Subtitling
“Once you overcome the one-inch-tall barrier of subtitles, you will be introduced to so many more amazing films,” said Korean auteur Bong Joon Ho at the 2020 Golden Globes. The film industry is leveraging advances in machine translation to increase the reach of movies.
For example, Star Wars has been translated into more than 50 languages to date. However, you still need humans in the loop to ensure the subtitles are accurate.
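The machine-translation step at the core of a subtitling pipeline can be sketched with an open model. The example below translates two English subtitle lines into French using a public Helsinki-NLP checkpoint; the model choice and workflow are illustrative assumptions, not a description of any studio’s system, and a human reviewer would still check timing and phrasing.

```python
# Minimal machine-translation pass over subtitle lines, using a public
# English-to-French model as an example.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-fr")

subtitles = [
    "Once you overcome the one-inch-tall barrier of subtitles,",
    "you will be introduced to so many more amazing films.",
]

for line in subtitles:
    print(translator(line)[0]["translation_text"])
```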
Trailers
Researchers from the University of Edinburgh created an AI model based on a pair of neural networks that can generate engaging trailers. The team used the system to generate more than 40 trailers for existing films, and Amazon Mechanical Turk workers preferred the AI-generated trailers over the official ones.
“To create trailers automatically, we need to perform low-level tasks such as person identification, action recognition, and sentiment prediction, as well as higher-level tasks such as understanding connections between events and their causality, as well as drawing inferences about the characters and their actions,” according to the paper.
The researchers combined two neural networks. The first examines the film’s video and audio to identify scenes of interest. The second is essentially the judge of what is interesting: it processes a textual version of the film, similar to a screenplay, and uses natural language processing to pick out significant and emotional moments. Based on how the two networks process the input data, the completed model generates novel trailers using “movie understanding.”
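The paper’s full pipeline is considerably more involved, but the role of the second network, scoring screenplay text for emotionally charged moments, can be approximated with an off-the-shelf sentiment model. The sketch below is a simplified stand-in, not the Edinburgh method: it ranks invented scenes by sentiment strength and keeps the top few as trailer candidates, ignoring the narrative structure and causality the real system models.

```python
# Simplified stand-in for the "judge" network: score each scene's dialogue
# with an off-the-shelf sentiment model and keep the most emotionally
# charged ones as trailer candidates.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")

scenes = {  # invented scene names and dialogue snippets
    "opening": "A quiet morning in the village. Nothing stirs.",
    "betrayal": "You lied to me! Everything we built is gone!",
    "finale": "We end this tonight, together, or not at all.",
}

# Treat model confidence as emotional intensity, regardless of polarity.
scores = {name: sentiment(text)[0]["score"] for name, text in scenes.items()}
trailer_cut = sorted(scores, key=scores.get, reverse=True)[:2]
print("Candidate trailer scenes:", trailer_cut)
```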