After Apple, Amazon, Google and Microsoft, Meta Now Builds Its Own AI Chips

Meta recently introduced its first in-house silicon chip designed for AI workloads, called MTIA (Meta Training and Inference Accelerator)

Meta recently introduced its first in-house silicon chip designed for AI workloads, called MTIA (Meta Training and Inference Accelerator). The AI chip is a custom-designed ASIC built with next-generation recommendation model requirements in mind. The accelerator is fabricated on TSMC’s 7nm process, runs at a 25W TDP, and delivers 102.4 TOPS at INT8 precision and 51.2 TFLOPS at FP16 precision.
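
For a rough sense of scale, those figures work out to about 4 TOPS per watt at INT8 and 2 TFLOPS per watt at FP16. A quick back-of-the-envelope check, using only the numbers quoted above (the real operating point will vary):

```python
# Back-of-the-envelope compute efficiency from the published MTIA figures.
tdp_watts = 25           # stated thermal design power
int8_tops = 102.4        # peak INT8 throughput (tera-operations/second)
fp16_tflops = 51.2       # peak FP16 throughput (tera-FLOPs/second)

print(f"INT8 efficiency: {int8_tops / tdp_watts:.1f} TOPS/W")     # ~4.1 TOPS/W
print(f"FP16 efficiency: {fp16_tflops / tdp_watts:.1f} TFLOPS/W")  # ~2.0 TFLOPS/W
```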

Meta believes that by bringing the design in-house, it can optimise every single nanometre of the chip so that no part of the architecture is wasted in terms of area, while also bringing down the chip’s power draw and thereby the cost of the ASIC.

The company said the benefit of building its own ASICs is access to the real workloads used by its ads team and other groups at Meta. It can run performance analysis on its design, then fine-tune and tweak all the parameters that go into a high-performance solution by integrating the silicon with its software environment.

With this, the team at Meta can speed up software development cycles, deploy models much faster, and improve the user experience.

Powered by PyTorch 

Meta said it has also developed compiler technology that runs under the PyTorch environment. “MTIA executes on those workloads with the highest performance and lowest power,” the team added, claiming efficiency gains over today’s GPUs.
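
Meta has not detailed that compiler stack publicly, but conceptually the developer-facing flow would resemble PyTorch’s standard custom-backend path, where a captured graph is handed off to a vendor compiler. The sketch below is only illustrative: the backend name mtia_backend is a hypothetical stand-in, not Meta’s actual API, and the lowering step is stubbed out.

```python
import torch

# Hypothetical custom torch.compile backend, standing in for the kind of hook
# an accelerator compiler (MTIA-style or otherwise) could plug into.
def mtia_backend(gm: torch.fx.GraphModule, example_inputs):
    # A real backend would lower `gm` to device kernels here; we simply return
    # the captured graph's forward, i.e. an eager-mode fallback.
    return gm.forward

class TinyRecModel(torch.nn.Module):
    """Toy stand-in for a recommendation-style dense tower."""
    def __init__(self):
        super().__init__()
        self.mlp = torch.nn.Sequential(
            torch.nn.Linear(64, 128), torch.nn.ReLU(), torch.nn.Linear(128, 1)
        )

    def forward(self, x):
        return torch.sigmoid(self.mlp(x))

model = TinyRecModel()
compiled = torch.compile(model, backend=mtia_backend)  # hand captured graphs to the backend
print(compiled(torch.randn(8, 64)).shape)  # torch.Size([8, 1])
```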

Further, Meta said the new chip was designed in collaboration with many cross-functional teams that care about the chip, the board, the system, the rack, the data center and their constraints and optimisations, as well as the software side: firmware, compiler, runtimes, PyTorch, and application-level models. “So, all of this has come together to put a system that is optimised and tailored for Meta’s workloads, and MTIA is just one piece of it,” said Olivia Wu, design lead at MTIA.

She believes that with an in-house design, the team can take control of its own destiny, specifying the architecture of the chip and matching it to the roadmap of workloads coming in the future.

Meta vs the world

Meta isn’t the only one working on in-house AI chips. Recently, reports emerged that Microsoft has been developing its own AI processor, called Athena, in partnership with chipmaker AMD. Read: After Google, Microsoft Targets NVIDIA

Besides Microsoft and Meta, Google, Apple and Amazon have also been building in-house AI chips. For instance, Google has built a supercomputer to train its models with its TPUs (Tensor Processing Units), Apple has been working on its M1 and M2 chips for quite some time now, and Amazon is working on its Trainium and Inferentia processor architectures.


Tasmia Ansari

Tasmia is a tech journalist at AIM, looking to bring a fresh perspective to emerging technologies and trends in data science, analytics, and artificial intelligence.
