US-based semiconductor technology company Qualcomm is working to help run AI models like GPT-4 and DALL-E 2 on smartphones through its new-generation processors, according to reports.
Currently, users are able to access these models through the web and through a mobile app (iOS only for now). However, the app still relies on cloud servers, and it’s a costly affair. Dylan Patel, chief analyst at semiconductor research firm SemiAnalysis, told The Information that running ChatGPT costs OpenAI around USD 700,000 a day.
Most of that cost goes towards running the servers. Patel believes the cost could be even higher for GPT-4, the most advanced large language model so far.
In a whitepaper released in May, Qualcomm said the cost per query of generative AI-based search is estimated to be 10 times that of traditional search methods. Hence, a hybrid AI approach lets generative AI developers and providers leverage the computing power of edge devices, reducing costs.
“A hybrid AI architecture (or running AI on device alone) offers the additional benefits of performance, personalisation, privacy, and security – at a global scale,” Qualcomm said.
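The whitepaper does not spell out an implementation, but the idea behind hybrid AI can be illustrated with a minimal sketch: a router that answers simple prompts with a small on-device model and falls back to a cloud API only when needed. Everything below (the local_generate helper, the length threshold, the cloud endpoint URL) is an illustrative assumption, not Qualcomm's design.

```python
# Illustrative sketch of a hybrid AI dispatcher (not Qualcomm's implementation).
# Assumption: a small quantised model runs locally and a cloud endpoint handles
# heavier requests; the routing rule here is a deliberately simple placeholder.

import requests  # hypothetical cloud fallback via a generic HTTP API

CLOUD_ENDPOINT = "https://example.com/v1/generate"  # placeholder URL
MAX_LOCAL_PROMPT_CHARS = 500  # placeholder threshold for "simple enough" prompts


def local_generate(prompt: str) -> str:
    """Placeholder for an on-device model call (e.g. a quantised LLM)."""
    return f"[on-device answer to: {prompt[:40]}...]"


def cloud_generate(prompt: str) -> str:
    """Placeholder for a cloud model call; costs scale with every request."""
    resp = requests.post(CLOUD_ENDPOINT, json={"prompt": prompt}, timeout=30)
    resp.raise_for_status()
    return resp.json()["text"]


def hybrid_generate(prompt: str) -> str:
    """Route short prompts to the edge device, everything else to the cloud."""
    if len(prompt) <= MAX_LOCAL_PROMPT_CHARS:
        return local_generate(prompt)
    return cloud_generate(prompt)


if __name__ == "__main__":
    print(hybrid_generate("What is hybrid AI?"))
```

The point of the sketch is simply that every query kept on the device is a query the provider does not pay cloud compute for.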
AI on every device
Earlier this year, a group of Qualcomm engineers managed to run the text-to-image AI model Stable Diffusion on an Android device. Interestingly, Qualcomm might not be the only company working on technology to run AI models on-device.
Apple too is developing and promoting the use of on-device AI models for a range of tasks such as speech recognition, natural language processing, and computer vision.
The company has been investing heavily in the development of AI chips and algorithms that can run on its devices, including iPhones, iPads, and Macs. Currently, it’s not known if Apple is building any GPT-like large language models (LLMs). But Apple could reveal similar technology that runs entirely on Apple devices, offering users faster, more responsive, and more private experiences.
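Apple has not detailed its plans, but third-party developers typically ship models to Apple hardware through Core ML. As a rough, hypothetical sketch (the tiny PyTorch model and input shape below are placeholders, not anything Apple has announced), a trained model can be converted with the coremltools package and bundled into an app for on-device inference:

```python
# Hypothetical sketch: converting a small PyTorch model to Core ML for
# on-device inference on iPhone/iPad/Mac. The model and input shape are
# placeholders; this is not Apple's internal workflow.

import torch
import coremltools as ct


class TinyClassifier(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(128, 64),
            torch.nn.ReLU(),
            torch.nn.Linear(64, 10),
        )

    def forward(self, x):
        return self.net(x)


model = TinyClassifier().eval()
example_input = torch.rand(1, 128)
traced = torch.jit.trace(model, example_input)

# Convert the traced model; Core ML can then schedule it on CPU, GPU,
# or the Neural Engine at runtime.
mlmodel = ct.convert(
    traced,
    inputs=[ct.TensorType(name="features", shape=example_input.shape)],
    convert_to="mlprogram",
)
mlmodel.save("TinyClassifier.mlpackage")
```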
Running AI models on-device could also be the approach for companies such as Google and Microsoft, which are battling it out for AI supremacy. Earlier this year, a developer showcased how he managed to run a ChatGPT 3.5 Turbo-style model entirely on a laptop, as shown in the tweet below.
You will own your own AI.

Final testing on a new massively smaller 100% locally running ChatGPT 3.5 turbo type of LLM AI in your hard drive on any 2015+ laptop.

I will have pre-configured downloads and it is massively smaller than most models I have, just 4gb.

Out soon! pic.twitter.com/KnZkICmGPV

— Brian Roemmele (@BrianRoemmele) April 5, 2023
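The tweet does not say which tooling was used, but running a small quantised chat model entirely from a local file is already possible with open-source bindings such as llama-cpp-python. The sketch below is one such approach under that assumption; the GGUF file path is a placeholder, not the 4 GB download the developer mentions.

```python
# Illustrative sketch of running a quantised LLM fully offline with the
# open-source llama-cpp-python bindings. The model file path is a placeholder;
# this is not the developer's actual setup from the tweet.

from llama_cpp import Llama

llm = Llama(
    model_path="models/small-chat-model.gguf",  # placeholder local model file
    n_ctx=2048,    # context window
    n_threads=4,   # CPU threads; no GPU or internet connection required
)

output = llm(
    "Q: What are the benefits of running an AI model on-device?\nA:",
    max_tokens=128,
    stop=["Q:"],
)

print(output["choices"][0]["text"].strip())
```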