When OpenAI released ChatGPT into the wild, most experts frowned upon the decision. Launching a flawed chatbot that “hallucinated” incorrect responses had, until then, been unacceptable to most AI research units. Big tech companies like Google, the distinct frontrunner in AI research, had not only refrained from opening up their technology to the public – they hadn’t even considered building products out of it.
OpenAI is also the company behind GPT-3 – a benchmark model among LLMs – and the text-to-image generator DALL·E.
Fail fast, adapt faster
In a four-year-old YouTube video, Sam Altman, then president of Y Combinator (YC), an American technology startup accelerator, discusses what it takes to build a successful startup. “You want people to have a bias towards action. Startups, especially in their early days, win by moving very quickly. Initially, you never get as much data as you would like and you never have as much time to deliberate as you would like. And yet, you need people who will act with much less data than they’d like to have and much less certainty. If they act and it doesn’t work, they adapt really quickly,” he explained.
At Y Combinator, pace was everything – it didn’t matter if the product itself was still ‘unfinished’. To Altman and his peers, companies simply had to launch products; if those products failed, the companies would re-adjust and make changes.
In a blog on YC, Gustaf Alstromer, a general partner at the incubator, reiterated a similar sentiment about the accelerator’s experience with the online marketplace Airbnb. In its nascent days, Alstromer notes, the startup didn’t really have the goods. “They didn’t really have, in the product, what would make Airbnb take off,” he said. But this didn’t deter the founders. Brian Chesky and his team went ahead and assessed their listings, gradually fine-tuning them.
In the same video, Altman stresses how important it is for founders to know their money-maker. “Businesses need to have a sensible business model. This doesn’t necessarily mean that you need to know right away what exactly to do. But if founders look like it’s the first time they’ve thought about making money, then that’s a bad sign,” he added.
Altman’s beginnings at YC
There’s something different about Altman. The 37-year-old co-founder and CEO of OpenAI is giving tech bigwigs a run for their money. Known more for his business acumen than for feats of scientific engineering, Altman has seen his startup disrupt the AI race.
In 2011, Altman began as a part-time partner at the prestigious tech accelerator Y Combinator. In February 2014, he was handpicked by co-founder Paul Graham as president of Y Combinator. The same year, a blog post by Altman noted that the total valuation of the incubator’s companies had exceeded USD 65 billion and included brands that went on to become household names (Airbnb, Dropbox and Stripe).
Altman had big plans and moved fast. In 2016, after he was appointed president of YC Group, Altman said that he hoped to expand the Y Combinator fund to back 1,000 new companies every year while also widening the categories of startups that YC would fund.
Altman’s tendency to move at a breakneck speed is essentially what has given OpenAI the upper hand.
AI research prior to OpenAI
The field of AI research before ChatGPT was stuck in a bizarre contest concerned solely with publishing papers. Tellingly, OpenAI had published the fewest papers among AI research startups while Google had published the most.
By 2015, Google had acquired DeepMind, a London-based startup that built neural networks. To industry insiders, it looked like DeepMind was poised to develop computer superintelligence, called artificial general intelligence or AGI. OpenAI’s inception was a move by a band of confederates, including Elon Musk and Altman, to prevent a corporate entity as massive as Google from monopolising technology as pre-eminent as artificial intelligence.
But as expenses from training and deploying large language models piled up, Altman’s deep-seated sensibilities from Y Combinator came to the fore.
However, the biggest irony was that the core architecture behind today’s generative AI models, including most deep learning models in NLP, is the Transformer – a concept introduced by Google Brain researchers. While Google had been too risk-averse to capitalise on its own research, OpenAI pounced on it.
For Altman, and probably most of the world, research isn’t meant to be carried out in a vacuum – it only counts when eventually commercialised.