Meta open sources OPT-66B

This completes a full release of logbooks detailing the development of all the OPT models.

On June 23, 2022, Meta announced the release of the Open Pretrained Transformer (OPT-66B), the largest unrestricted open-source model to date. The tech giant also released the logbooks used to train all of its baselines, from 125M through 66B parameters, marking the first time in the AI industry that such extensive notes have been released alongside the models and the associated paper.

Meta created a buzz when it launched OPT-175B, a large language model that could match OpenAI’s GPT-3 while requiring only one-seventh of the carbon footprint to develop. Meta was one of the first companies to release a language model of this scale under a non-commercial licence, granting access to academic researchers and to those affiliated with government, civil society, academia and industry research laboratories around the world.

During the launch of OPT-175B, Meta announced that it would also release the baseline models along with notes and logbooks covering the whole development and training process. As promised, Meta has now released the baselines – OPT-125M, OPT-350M, OPT-1.3B, OPT-2.7B, OPT-6.7B, OPT-13B, OPT-30B and OPT-66B – on GitHub.
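For readers who want to try the baselines directly, the released checkpoints are also mirrored on Hugging Face. Below is a minimal sketch, assuming the facebook/opt-125m checkpoint and the transformers library, of loading the smallest baseline and generating a short continuation; the prompt text is purely illustrative.

# Load one of the released OPT baselines via Hugging Face transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

# Tokenise an illustrative prompt and generate 20 new tokens
inputs = tokenizer("Large language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

The larger baselines follow the same interface; only the checkpoint name (and the hardware required to hold the weights) changes.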

