Why Did OpenAI Disband Its Robotics Team?

Last month, on a Weights & Biases podcast, OpenAI cofounder Wojciech Zaremba said the company has disbanded its robotics team. “I was actually working for several years on robotics. Recently, we changed the focus at OpenAI. I disbanded the robotics team. There are actually plenty of domains that are very rich with data. Ultimately that was holding us back, in the case of robotics,” said Zaremba.

Why now? 

Zaremba said the decision was hard on him, but he later realised it was the best move from the company’s perspective of achieving artificial general intelligence (AGI). “When we created robotics, we thought that we could go pretty far. We had self-generated data and reinforcement learning. But, at the moment, I believe that actually pre-training allows to give model 100x cheaper IQ points, and that might follow with other techniques,” said Zaremba.

Citing GPT-3, Zaremba said pre-training in language models involves training on unsupervised tasks such as ‘next word prediction’. “But, in the case of robotics, we do not have such data,” he added.
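
As a rough illustration of what next-word prediction means as a training objective, the sketch below fits a toy bigram counter to a made-up corpus. The corpus, names, and approach are illustrative assumptions, not anything OpenAI uses; real pre-training does this with a neural network over vast text corpora.

```python
# A minimal, illustrative sketch of the next-word-prediction objective,
# using a toy bigram counter instead of a neural network. The corpus is made up.
from collections import Counter, defaultdict

corpus = "the robot picks the cube and the robot solves the cube".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1  # how often `nxt` follows `prev` in the corpus

def predict_next(word):
    # return the continuation seen most often after `word`, if any
    return counts[word].most_common(1)[0][0] if counts[word] else None

print(predict_next("the"))  # prints "robot" (ties broken by first occurrence)
```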

Zaremba said that when OpenAI started its robotics projects, the team lacked clarity on what it wanted to build and how. Over time, it became clearer about what to focus on.

Robotics crisis

Besides the dearth of data, OpenAI’s move away from robotics reflects the economic realities and capital-intensive nature of such projects. Three years ago, Rethink Robotics was acquired by the HAHN Group. Last year, Starsky Robotics, a maker of driverless trucks, shut shop. The same year, SoftBank halted production of its famed robot Pepper.

SoftBank acquired Boston Dynamics from Google in 2017 and recently sold the robotics company to Hyundai for $1.1 billion. Boston Dynamics caught the popular imagination with viral videos of its humanoid and dog-like robots.

Zaremba said building robots requires significant computing power, and there are technical challenges in running machines in real time. “There are two possibilities to successfully deploy robots. One is to collect a lot of data. Another possibility is that we need powerful video models like that of powerful text models,” Zaremba said.

Notable large language models at present include Google’s Switch Transformer and GShard, OpenAI’s GPT-3, DistilBERT, BAAI’s Wu Dao 2.0, and EleutherAI’s GPT-J.

According to MIT’s ‘The state of industrial robotics: Emerging technologies, challenges and key research directions,’ the key challenges holding back the robotics industry include the high cost of integration, a lack of standards, inflexibility, the need to better balance speed and safety, data protocols, and investment in enabling technologies.

OpenAI’s robotics projects

In October 2019, OpenAI published research detailing Dactyl, a robotic hand guided by an AI model with 13,000 years of cumulative simulated experience, which replicated human hand movements to solve the Rubik’s Cube. ‘Our robot still hasn’t perfected its technique though, as it solves the Rubik’s Cube 60% of the time (and only 20% of the time for a maximally difficult scramble),’ according to OpenAI.

OpenAI said at the time that, as a result of its work on automatic domain randomisation (ADR), more developers would be able to shift from building task-specific robots to general-purpose machines. ‘Indubitably, we are decades away from having a robot that will make its own decisions without human intervention, but leveraging ADR, developers can successfully attain it,’ as per OpenAI.
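
The sketch below illustrates the general idea ADR automates: sampling simulator physics from ranges that widen as training progresses, so the policy never overfits to one fixed simulation. The class, parameter names, and numbers are assumptions for illustration, not OpenAI’s implementation.

```python
# Illustrative sketch of the idea behind automatic domain randomisation (ADR):
# each training episode runs in a simulator whose physics parameters are drawn
# from ranges that widen as the policy improves. All names and numbers here
# are made up for illustration.
import random

class RandomisedSim:
    def __init__(self):
        self.friction_range = [0.9, 1.1]     # start with narrow ranges
        self.cube_mass_range = [0.95, 1.05]

    def sample_episode_params(self):
        # draw one set of physics parameters for the next training episode
        return {
            "friction": random.uniform(*self.friction_range),
            "cube_mass": random.uniform(*self.cube_mass_range),
        }

    def widen(self, step=0.05):
        # ADR widens the ranges automatically once the policy clears a success
        # threshold; here we simply widen unconditionally to show the effect
        self.friction_range = [self.friction_range[0] - step, self.friction_range[1] + step]
        self.cube_mass_range = [self.cube_mass_range[0] - step, self.cube_mass_range[1] + step]

sim = RandomisedSim()
for episode in range(3):
    print(episode, sim.sample_episode_params())  # train the policy under these params
    sim.widen()                                  # pretend the success threshold was met
```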

In April 2019, OpenAI hosted its first Robotics Symposium. The event brought together a ‘diverse set of people’ from the robotics and machine learning communities, alongside academics and industry leaders, to create a platform to exchange ideas and address open questions in building complex robot systems.

Four years ago, OpenAI released Roboschool, open-source software for robot simulation integrated with OpenAI Gym. In February 2018, it released eight simulated robotics environments and a Baselines implementation of Hindsight Experience Replay. OpenAI has used these environments to train models that work on physical robots.
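
As a rough illustration, those released environments follow the standard OpenAI Gym interface. The snippet below drives one of them with random actions; it assumes gym’s robotics extras and a working MuJoCo installation, which these environments require.

```python
# Sketch: stepping one of the eight Gym robotics environments with random
# actions. Assumes `gym` with the robotics extras and MuJoCo installed;
# FetchPickAndPlace-v1 is one of the eight released environments.
import gym

env = gym.make("FetchPickAndPlace-v1")
obs = env.reset()                        # goal-based envs return dict observations
for _ in range(100):
    action = env.action_space.sample()   # random policy, for illustration only
    obs, reward, done, info = env.step(action)
    if done:
        obs = env.reset()
env.close()
```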
