Avatar Makers Venture Into Cloud-Based VFX Workflows

Ang Lee’s 2012 film Life of Pi, which won the Academy Award for Best Visual Effects, reportedly needed up to 30 hours to render a single frame. Now consider how many frames make up a two-hour feature: at 24 frames per second, that is roughly 172,800 images, which adds up to a staggering amount of render time. Visual effects artists use a wide range of animation software to integrate computer-generated imagery with live-action footage.

Rendering these images into motion pictures is compute-intensive and time-consuming. During major projects, VFX artists work round the clock, catching micro-naps on office couches next to processors that never stop. However, things have changed.

The last six months in particular have been difficult for movie production teams across the globe. With uncertain lockdowns and the shift to working from home, companies have started looking for alternative solutions.


Today, Weta Digital, the studio behind groundbreaking works such as Avatar and The Lord of the Rings, announced that it will leverage Amazon Web Services (AWS) for its VFX workflows. Speaking about the collaboration, Weta Digital CEO Prem Akkaraju said that back in March the studio established a remote collaborative workflow to keep work moving, especially on the Avatar sequels. With AWS, Weta Digital wants to take unhindered cloud-based VFX workflows to a larger scale.

What Can AWS Offer?


Based in New Zealand, Weta Digital is the largest single-site VFX studio in the world, drawing artists from over 40 countries. So far, the studio has earned six Academy Awards for visual effects.


With tight schedules and heavy budgets, production houses don’t want to leave any stone unturned. Weta Digital is keen on leveraging AWS’s virtually unlimited compute capacity. In this way, it wants to ensure the safety of its artists while keeping productivity up.

Moving rendering workloads onto AWS infrastructure brings real efficiencies to demanding production schedules, scaling render jobs to thousands, or even tens of thousands, of cores in minutes.

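To make that kind of scaling concrete, here is a minimal sketch of how a pipeline script might burst render workers onto EC2 using the boto3 SDK. The AMI ID, instance type, counts and tags are placeholder assumptions for illustration only, not details of Weta Digital’s actual setup.

```python
# A minimal sketch of bursting render nodes onto EC2 with boto3.
# The AMI ID, instance type and tags below are placeholders,
# not details of any real studio pipeline.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")

def launch_render_nodes(count: int):
    """Request `count` compute-optimised instances to act as render workers."""
    response = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder: pre-baked render-node image
        InstanceType="c5.24xlarge",        # 96 vCPUs per instance
        MinCount=count,
        MaxCount=count,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "render-worker"}],
        }],
    )
    return [i["InstanceId"] for i in response["Instances"]]

if __name__ == "__main__":
    # 100 instances x 96 vCPUs each is roughly 9,600 cores from a single API call.
    ids = launch_render_nodes(100)
    print(f"Launched {len(ids)} render nodes")
```

In practice a studio would more likely drive this through a render-farm manager or an EC2 fleet/spot request for cost control; the point is simply that thousands of cores are one API call away.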

For instance, using Amazon Elastic Compute Cloud (Amazon EC2), Weta Digital wants access to a broad range of specialised graphics processing units (GPUs) and to integrate machine learning into the VFX creation process. The company believes that deep learning can help generate strikingly realistic characters.

AWS was the world’s first cloud provider to offer NVIDIA Tesla V100 GPUs, through its Amazon EC2 P3 instances. These workhorses are designed for compute-intensive ML workloads: with 640 Tensor Cores, the Tesla V100 breaks the 100-teraflops barrier for deep learning performance.

With Amazon EC2 P3 instances, users can cut machine learning training times from days to hours. The instances offer up to one petaflop of mixed-precision floating-point performance, along with a 300 GB/s second-generation NVIDIA NVLink interconnect that enables high-speed, low-latency GPU-to-GPU communication.
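
As an illustration of the kind of workload those Tensor Cores accelerate, the sketch below shows a mixed-precision training loop in PyTorch. The model and data are toy placeholders, not anything from Weta Digital’s pipeline; the pattern is simply the standard automatic-mixed-precision recipe one would run on a V100.

```python
# Illustrative mixed-precision training loop in PyTorch; the model and data
# are toy placeholders, not part of any studio's actual ML pipeline.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Sequential(nn.Linear(1024, 2048), nn.ReLU(), nn.Linear(2048, 10)).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))

for step in range(100):
    # Random stand-in batch; a real pipeline would load rendered frames or scan data.
    x = torch.randn(256, 1024, device=device)
    y = torch.randint(0, 10, (256,), device=device)

    optimizer.zero_grad()
    # autocast runs eligible ops in FP16 so the V100's Tensor Cores are used.
    with torch.cuda.amp.autocast(enabled=(device == "cuda")):
        loss = loss_fn(model(x), y)

    # GradScaler guards against FP16 gradient underflow.
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```

The same loop runs unchanged on a single GPU or, with a distributed wrapper, across the NVLink-connected GPUs of a P3 instance.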


It took a while for computer hardware to catch up with the creativity of filmmakers. Today, thanks to cloud services, one might not even need to own heavy machinery to produce a movie’s effects.

Studios have traditionally designated special teams just to run rendering. Now, thanks to a growing number of cloud platforms, the whole VFX workflow might undergo tremendous change. Top studios such as Technicolor, Paramount and DreamWorks have been using the cloud for their movies.

Marvel Studios introduced the Elementals in its latest Spider-Man outing. The VFX work was awarded to Luma Pictures, which in turn picked Google Cloud for the rendering job. According to Google Cloud, renders that could have taken eight hours on a local render farm were cut down to just 90 minutes.

“It’s no longer a question of whether or not the transition to the cloud is happening, but rather how fast,” said Simon Vanesse, Head of Animation at Mikros, speaking about how the AWS cloud was used in the making of the animated movie Sherlock Gnomes.


