Avatar Makers Venture Into Cloud Based VFX Workflows

Ang Lee’s Oscar-nominated 2012 movie Life of Pi needed 30 hours to render a single frame. Now imagine how many frames a two-hour movie contains. That’s a lot of hours. Visual effects artists use a wide range of animation software to integrate computer-generated imagery with live-action footage.
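Back-of-the-envelope, taking the 30-hour figure at face value and assuming a standard 24 fps frame rate, the arithmetic is sobering:

```python
# Rough estimate of total render work for a two-hour feature film,
# assuming 24 frames per second and 30 hours of render time per frame
# (the figure reported for Life of Pi's heaviest shots).
FPS = 24
RUNTIME_HOURS = 2
HOURS_PER_FRAME = 30

frames = FPS * RUNTIME_HOURS * 3600
total_render_hours = frames * HOURS_PER_FRAME

print(f"{frames:,} frames")                    # 172,800 frames
print(f"{total_render_hours:,} render-hours")  # 5,184,000 render-hours
print(f"~{total_render_hours / 24 / 365:.0f} years on a single machine")
```

In practice not every frame is this expensive, but even a fraction of that workload explains why studios run render farms with thousands of machines.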

Rendering these images into motion pictures is compute-intensive and time-consuming. During mega projects, VFX creators work round the clock, catching micro-naps on office couches in the warmth of processors that never stop. However, things have changed.

The last six months, especially, have been difficult for movie production teams across the globe. With uncertain lockdowns and the shift to working from home, companies have started to look for alternative solutions.



Today, Weta Digital, the studio behind groundbreaking works such as Avatar and The Lord of the Rings, announced that it would be leveraging Amazon Web Services (AWS) for its VFX workflows. Speaking about the collaboration, Weta Digital CEO Prem Akkaraju said that back in March, Weta established a remote collaborative workflow to keep work moving, especially on the Avatar sequels. With AWS, Weta wants to take unhindered cloud-based VFX workflows to a larger scale.

What Can AWS Offer?

Source: WETA Digital

Based in New Zealand, Weta Digital is the largest single-site VFX studio in the world, drawing artists from over 40 countries. So far, Weta has earned six visual effects Academy Awards.

With tight schedules and heavy budgets, production houses want to leave no stone unturned. Weta Digital is keen on leveraging AWS’s virtually unlimited compute capacity. In this way, it wants to ensure the safety of its artists while maintaining productivity.

Moving rendering workloads to AWS infrastructure brings real efficiencies to demanding production schedules, scaling rendering workloads to thousands, or even tens of thousands, of cores in minutes.

Source: AWS
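The scaling claim above rests on rendering being embarrassingly parallel: each frame (or tile of a frame) can render independently, so wall-clock time falls roughly linearly with core count. A minimal sketch under that assumption, using illustrative numbers rather than AWS figures:

```python
# Illustrative wall-clock scaling for an embarrassingly parallel render.
# Each frame (or tile) renders independently, so N workers cut wall-clock
# time roughly by a factor of N -- ignoring real-world overheads such as
# scheduling, asset transfer and imperfect load balancing.
def wall_clock_hours(total_render_hours: float, workers: int) -> float:
    """Ideal wall-clock time when render work divides evenly across workers."""
    return total_render_hours / workers

# A hypothetical 1,000-frame shot at 30 render-hours per frame.
shot_hours = 1_000 * 30
for workers in (100, 1_000, 10_000):
    print(f"{workers:>6} cores: ~{wall_clock_hours(shot_hours, workers):.1f} h")
```

This is why elastic capacity matters: a farm that bursts to tens of thousands of cores for a deadline, then scales back down, turns a multi-week render into an overnight one.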

For instance, using Amazon Elastic Compute Cloud (Amazon EC2), Weta Digital wants to access a broad range of specialised graphics processing units (GPUs) and integrate machine learning into the VFX creation process. The company believes that deep learning can help generate super-realistic characters.

AWS is the world’s first cloud provider to offer NVIDIA’s Tesla V100 GPUs with Amazon EC2 P3 instances. These workhorses are designed to optimise compute-intensive ML workloads. With 640 Tensor Cores, NVIDIA Tesla V100 GPUs break the 100 teraflops barrier of deep learning performance.

For example, with Amazon EC2 P3 instances, users can significantly reduce machine learning training times from days to hours. EC2 P3 instances offer up to one petaflop of mixed-precision floating-point performance, as well as a 300 GB/s second-generation NVIDIA NVLink interconnect that enables high-speed, low-latency GPU-to-GPU communication.

Source: Luma

It took a while for computer hardware to catch up with the creativity of filmmakers. Today, thanks to cloud services, filmmakers might not even need to own heavy machinery to edit their movies.

Traditionally, special teams are designated to run rendering alone. Now, thanks to intelligent cloud platforms, the whole VFX workflow might undergo tremendous change. Top studios like Technicolor, Paramount and DreamWorks have been using the cloud for their movies.

Marvel Studios introduced the Elementals in its latest Spider-Man outing. The VFX work for Spider-Man was awarded to Luma Pictures, which, in turn, picked Google Cloud for the rendering job. According to Google Cloud, a render that could have taken eight hours on a local render farm was cut down to just 90 minutes.

“It’s no longer a question of whether or not the transition to the cloud is happening, but rather how fast,” said Simon Vanesse, Head of Animation at Mikros, speaking about how the AWS cloud was used in the making of the animated movie Sherlock Gnomes.

Ram Sagar
I have a master's degree in Robotics and I write about machine learning advancements.
