
After Amazon, Apple And Google, Facebook To Develop Its Own ML Chip

Facebook is building a machine learning chip to manage content recommendations for its users.

US-based tech giants have been on a hiring spree, spending millions on developing in-house computer chips. The shift can be attributed to financial savings, reduced dependency on chip vendors, and better performance for their machine learning workloads. And when it comes to trends among tech giants, Facebook cannot be far behind.

A report by The Information reveals that Facebook Inc. has jumped on the chip-making bandwagon and is currently developing a machine learning chip. The chip is aimed at helping the social media giant manage tasks such as recommending content to users, The Information reported, citing two people familiar with Facebook's latest project.



If successful, Facebook would be able to develop cost-efficient yet powerful semiconductors, reducing the carbon footprint of the data centres that it plans to build in the future. The report also suggests that Facebook has already developed a video transcoding chip to improve its in-app recorded and live-streamed video experience. 

Facebook’s long-pending plan 

Google has its own TPU (Tensor Processing Unit) line of AI processors, along with a custom Argos video-transcoding chip that YouTube uses.

In 2018, Facebook poached Google's lead chip developer Shahriar Rabii as its Head of Silicon Engineering, a move suggestive of Facebook's plans to get into the silicon-design business. The same year, Facebook started building custom chips to power servers and consumer hardware. In 2019, Facebook Engineering's blog hinted at the company designing chips to handle AI inference and video transcoding, improving the efficiency of an infrastructure that then served 2.7 billion users across its platforms. According to The Information's report, Facebook has deployed more than 100 people to work on the ML chips.

Declining business for chip vendors

Other tech giants have already taken the chip-making route. Amazon, one of the world's largest purchasers of computer chips, got into chip designing. Apple took the same course and called off its partnership with Intel.

For the longest time, Intel's chip technology has powered personal computers and server systems. However, both Amazon and Apple are now inclining towards Arm, the British company whose chip designs are licensed for smartphones and consumer products. In June 2020, Amazon's cloud computing arm began marketing computing services based on its Arm-based chips, going as far as claiming that its service was faster and more cost-efficient than Intel's offerings, at one-fifth the price. In November, Apple introduced Mac computers with its own Arm-based chips.

More recently, Google announced its own microprocessor, Tensor, for Pixel smartphones. Samsung announced that it would deploy new software from Synopsys that uses AI to design chips; it is now planning to build a chip factory in Texas. Last month, IBM announced its new Telum chip for deep learning inference at scale.

Sources close to the project revealed that Facebook's machine learning chip would be built entirely in-house. Moves like this are steadily reducing chip users' (tech companies') dependency on chip vendors such as Intel, Advanced Micro Devices and NVIDIA.

The semiconductor industry, valued at $425.96 billion last year, is undergoing a transformational change. Chip users are leveraging in-house resources to make their own chip components. This reduced dependency will save tech companies money and force chip designers to become more competitive, ensuring higher efficiency and performance.

Whether Facebook's new ML chip will be on par with those of its chip vendors, only time will tell. The Information's source said they had no new updates to share on Facebook's future plans: "Facebook is always exploring ways to drive greater levels of compute performance and power efficiency with our silicon partner and through our own internal efforts."


Debolina Biswas
After diving deep into the Indian startup ecosystem, Debolina is now a technology journalist. When not writing, she is found reading or playing with paint brushes and palette knives.
