On June 23, 2022, Meta announced the release of the Open Pretrained Transformer (OPT-66B), the largest unrestricted open-source model released to date. The tech giant has also released the logbooks used for training all of its baselines, from 125M through 66B parameters. This completes a full release of logbooks detailing the development of all the OPT models and marks the first time in the AI industry that such extensive notes have been released alongside the models and the associated paper.
Meta created a buzz when it launched OPT-175B, a large language model that could match OpenAI’s GPT-3 while requiring only one-seventh of the carbon footprint to develop. Meta was one of the first companies to release a language model of this scale under a non-commercial licence, granting access to academic researchers and to those affiliated with government, civil society, academia and industry research laboratories around the world.
During the launch of OPT-175B, Meta announced that it would also release the baseline models, along with notes and logbooks covering the whole development and training process. As promised, Meta has now released the baselines – OPT-125M, OPT-350M, OPT-1.3B, OPT-2.7B, OPT-6.7B, OPT-13B, OPT-30B and OPT-66B – on GitHub.