Sam Altman, CEO of OpenAI, took to Twitter to announce that ChatGPT had crossed a million users just a few days after its launch.
Based on the GPT-3.5 architecture, ChatGPT interacts with humans using natural language. Since its release, the chatbot has reinvigorated thinking about AI-human interaction. Compared with its predecessor, it produces higher-quality, longer outputs and follows instructions more faithfully.
Chatbots existed well before ChatGPT, Amazon's voice assistant Alexa among them, a business Amazon recently pulled the plug on in large part. OpenAI's success with ChatGPT is therefore something Amazon may wish it had achieved when it had the chance.
The model has been released as a free-to-use beta. However, Altman has said OpenAI will look to monetise it, at an average cost of single-digit cents per chat. He added that monetisation is particularly important because "the compute costs are eye-watering".
Microsoft Azure supports OpenAI, providing the computational power required to run models like ChatGPT. Altman's allusion to the "eye-watering" cost, however, perhaps didn't sit well with Microsoft. Altman later salvaged the situation by crediting Azure's contribution to OpenAI's launches, saying that "they have built by far the best AI infrastructure out there."
The latest model released by OpenAI, text-davinci-003, is built on the GPT-3.5 architecture. The company also offers the Ada, Babbage, and Curie models, which cost less than Davinci and can perform comparably on certain tasks. OpenAI recommends that users start with Davinci and later shift to the cheaper models where their requirements allow.
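The recommended workflow, prototype on Davinci and then step down to a cheaper tier once it proves sufficient, can be sketched as a small helper. The model names below match OpenAI's published tiers at the time; the tier ordering and the selection logic are illustrative assumptions, not OpenAI code.

```python
# Hypothetical helper illustrating the "start with Davinci, then downgrade"
# recommendation. Model names are OpenAI's published tiers at the time;
# the ordering and this selection logic are illustrative assumptions.

# Models ordered from most capable (and most expensive) to cheapest.
MODEL_TIERS = [
    "text-davinci-003",
    "text-curie-001",
    "text-babbage-001",
    "text-ada-001",
]

def pick_model(downgrade_steps: int = 0) -> str:
    """Start at Davinci; each downgrade step moves one tier cheaper."""
    index = min(downgrade_steps, len(MODEL_TIERS) - 1)
    return MODEL_TIERS[index]

# Prototype with the strongest model first...
print(pick_model())    # text-davinci-003
# ...then move down the tiers once a cheaper model proves good enough.
print(pick_model(2))   # text-babbage-001
```

In practice, the chosen name would be passed as the `model` parameter of an API request; the point of the sketch is only that switching tiers is a one-line change.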