EleutherAI announces 20 billion-parameter language model, GPT-NeoX-20B

The research collective claims it is the largest publicly accessible pretrained general-purpose autoregressive language model.

EleutherAI has announced GPT-NeoX-20B, a 20-billion-parameter model trained with the GPT-NeoX framework on GPUs provided by CoreWeave. The collective claims it is the largest publicly accessible pretrained general-purpose autoregressive language model.

Image: EleutherAI


The model is intended to help accelerate research towards the safe use of AI systems.


The full model weights will be freely downloadable from February 9 under the permissive Apache 2.0 license, hosted by The Eye. Until then, the model can be tried via GooseAI, the inference service from CoreWeave and Anlatan.

“GPT-NeoX and GPT-NeoX-20B are very much research artifacts and we do not recommend deploying either in a production setting without careful consideration,” EleutherAI said.

Sreejani Bhattacharyya
I am a technology journalist at AIM. What gets me excited is deep-diving into new-age technologies and analysing how they impact us for the greater good. Reach me at sreejani.bhattacharyya@analyticsindiamag.com
