While the craze for ChatGPT continues to grow with each passing day, the chatbot has also triggered a debate on the costs involved, both in running it and in sustaining it. The question is: for how long will Microsoft be able to support OpenAI’s ambitions?
Microsoft chief Satya Nadella, at the company’s Leadership Summit held today, said that the company is investing massively, backed by its infrastructure spread across more than 60 regions and over 200 data centres worldwide. About OpenAI, Nadella said that it has been able to achieve amazing results thanks to the training and inference infrastructure provided by Microsoft Azure.
However, Nadella did not update the stakeholders present at the summit on the sustainability side of things: how long this can go on, and what the end goal really looks like. The concerns came to light after a user on Twitter asked if ChatGPT would be free permanently. To this, OpenAI chief Sam Altman said that the average cost was single-digit cents per chat. Earlier, too, he had said that the compute costs (per API call) were eye-watering.
Tom Goldstein, associate professor at the University of Maryland, estimated in a tweet that running the chatbot costs about $100k per day, or $3 million per month.
He broke down his calculation by explaining that ChatGPT cannot fit on a single GPU. One would need five 80 GB A100 GPUs just to load the model and text. ChatGPT cranks out about 15–20 words per second. If it uses A100s, that could be done on an 8-GPU server (a likely choice on Azure cloud).
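Goldstein's memory figure can be sanity-checked with a rough sketch. The parameter count and fp16 precision here are assumptions based on public reporting about GPT-3-class models, not official OpenAI figures:

```python
# Back-of-envelope GPU memory estimate (assumed figures, not official ones):
# a GPT-3-class model with 175 billion parameters stored in fp16 (2 bytes each).
PARAMS = 175e9
BYTES_PER_PARAM = 2        # fp16 precision
A100_MEMORY_GB = 80        # memory of one A100 card

weights_gb = PARAMS * BYTES_PER_PARAM / 1e9       # 350 GB of weights alone
gpus_needed = -(-weights_gb // A100_MEMORY_GB)    # ceiling division -> 5 cards

print(f"~{weights_gb:.0f} GB of weights, needing {gpus_needed:.0f} x 80 GB A100s")
```

The weights alone come to roughly 350 GB, which is why a handful of 80 GB cards is the minimum just to hold the model, before accounting for activations and input text.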
So what would this cost the host? On Azure cloud, each A100 card costs about $3 an hour. With eight cards running at once, that works out to roughly $0.0003 per word generated. And the model generates a lot of words: it usually responds to a query with approximately 30 words, which adds up to about 1 cent per query.
Free, but for how long?
As per an article published in The Atlantic, making the first taste free has been a brilliant marketing strategy by OpenAI. In the weeks since its release, more than a million users have already used the chatbot for a range of tasks. If we assume that ChatGPT has 1 million active users who make an average of 30 queries per month, at 1 cent per query, the total monthly cost comes to $0.30 million.
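The monthly estimate above is a straightforward multiplication of the article's assumed figures:

```python
# Monthly cost estimate under the article's assumptions.
active_users = 1_000_000
queries_per_user_per_month = 30
cost_per_query = 0.01        # ~1 cent, from the per-query estimate above

monthly_cost = active_users * queries_per_user_per_month * cost_per_query

print(f"${monthly_cost:,.0f} per month")   # $300,000, i.e. $0.30 million
```

Note this is well below Goldstein's $3 million/month figure, because it assumes only 30 queries per user per month rather than continuous full utilisation of the serving hardware.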
Taking the above figure, it seems sustainable for OpenAI to offer the service. But the question is: for how long? Currently, there are users willing to pay on a subscription basis to use the chatbot, as reflected in these conversations on Twitter.
However, there is a possibility that users may drop out if a subscription or paywall is introduced, since the chatbot is currently free to use. On the other hand, paying users might bring higher-quality use cases, along with the better feedback needed for fine-tuning.
For businesses: Similar to how OpenAI charges developers for API usage of its other models (such as DALL-E 2), businesses that want to build applications on top of ChatGPT will be its potential customers. The question is how much OpenAI will charge them.
For customers and the general public: The tool can be used to build new applications end-to-end. For example, Replit used ChatGPT to build a website in real time. The tool could also be offered on a one-time, monthly or quarterly subscription basis to advertisers and content generators.
While Nadella highlighted that Microsoft has set up over 200 data centres, with more coming up globally to enhance its offerings, the consumption of water adds to the concerns. Data centres use enormous amounts of water for cooling, and while training and inference infrastructure is important, we also need a robust data centre policy to address the strain on natural resources.
To give you an idea, a 1 MW data centre can use up to 25.5 million litres of water per year, and big data centres, like Google’s, use more than a billion litres every year. However, Microsoft has announced an ambitious commitment to be water positive in its direct operations by 2030, meaning that by then it will replenish more water globally than it consumes.
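To see how those two figures relate, we can scale the article's per-MW number. The 40 MW capacity used below is a hypothetical facility size chosen for illustration, not a figure from the article:

```python
# Scaling the article's figure of 25.5 million litres per MW per year.
litres_per_mw_per_year = 25.5e6
facility_capacity_mw = 40          # hypothetical large facility, for illustration

annual_litres = litres_per_mw_per_year * facility_capacity_mw

print(f"{annual_litres / 1e9:.2f} billion litres per year")
```

In other words, at the article's per-MW rate, a facility of only around 40 MW would already cross the billion-litre-per-year mark attributed to Google's largest data centres.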
What experts think of ChatGPT
Yann LeCun, VP and chief AI scientist at Meta AI, said that “OpenAI has been able to deploy its systems in such a way that it has been able to use the feedback from the system to produce better outputs.”
Yoshua Bengio, a leading expert on AI, said, “Companies have pretty much exhausted the amount of data available on the internet. In other words, the current large language models are trained on everything available.” For instance, ChatGPT, which has managed to enthral the world by answering in a “human-adjacent” manner, is based on the GPT-3.5 architecture with 175 billion parameters.
While many accessed ChatGPT out of sheer curiosity, many developers started playing with it, and side projects were born even though an official API for ChatGPT is not available yet. Soon, they found ways to integrate ChatGPT with WhatsApp, Telegram and other messaging platforms, and even to embed it in the macOS menu bar.
“ChatGPT reflects the emergence of a new reasoning engine, and the ways it can be augmented,” said Nadella, pointing at knowledge workers using it to be more creative, expressive and productive. He also said that frontline workers will be able to do more knowledge work with the help of Copilot. “We also have to consider facets of its responsible use and what displacement it may cause,” added Nadella.