What BloombergGPT Brings to the Finance Table

The latest LLM from Bloomberg, trained on roughly 700 billion tokens, is said to boost the Bloomberg Terminal service
Last week, Bloomberg released a research paper on its large language model, BloombergGPT. With 50 billion parameters, it is billed as a first-of-its-kind generative AI model catering to the finance industry. While the move may set a precedent for other companies, for now, the announcement sounds like a push by the data and news company to seem relevant in the AI space.

Interestingly, Bloomberg already has Bloomberg Terminal, which employs NLP and ML-trained models for offering financial data. So, naturally, the question that arises is: how much of a value-add is BloombergGPT, and where does it stand in comparison to other GPT models?

Training and Parameters

Bloomberg's vast repository of financial data, accumulated over the past forty years, has been used to train the GPT model. It is trained on a 363-billion-token proprietary dataset of financial documents available from Bloomberg. In addition, a 345-billion-token public dataset was also incorporated to result
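The two dataset sizes quoted above account for the roughly 700 billion tokens mentioned in the standfirst. A quick back-of-envelope check (the figures are the ones reported in the article; the roughly even proprietary/public split is derived from them):

```python
# Back-of-envelope check of BloombergGPT's training-data mix,
# using the figures quoted in the article (in billions of tokens).
proprietary_tokens = 363  # Bloomberg's proprietary financial documents
public_tokens = 345       # general-purpose public datasets

total = proprietary_tokens + public_tokens
print(f"Total training tokens: ~{total} billion")              # ~708 billion
print(f"Proprietary share: {proprietary_tokens / total:.0%}")  # 51%
```

This puts the corpus at just over 700 billion tokens, split almost evenly between financial and general-purpose data.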
Vandana Nair