Why Apple will Build the Best Chatbot

Apple is known for its aesthetics and design thinking principles, which might help the company beat ChatGPT.

Last year, after the release of ChatGPT, when every big tech company was frantically trying to adopt or build LLM-based chatbots, Apple banned the internal use of ChatGPT, citing privacy concerns. It even went as far as halting plans to build an LLM-based chatbot of its own.

However, according to recent reports, the tech giant had a change of heart and has developed an internal chatbot—nicknamed “Apple GPT” by its employees. Though the tech giant has not yet decided how to release it to the public, the company is planning to make a significant AI-related announcement next year.

To ensure that its chatbot is better than the others on the market, Apple created its own framework, called Ajax, to build an LLM-based chatbot similar to OpenAI’s ChatGPT and Google’s Bard. The framework runs on Google Cloud and was built using Google JAX, the search giant’s machine learning framework.
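Ajax’s foundation, Google JAX, is built around composable function transformations such as `jax.grad` (automatic differentiation) and `jax.jit` (XLA compilation). A minimal sketch of those two primitives on a toy linear model follows; the model and sizes here are purely illustrative and have nothing to do with Apple’s actual framework:

```python
# Minimal JAX sketch: automatic differentiation plus JIT compilation,
# the core primitives that frameworks built on JAX compose.
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Squared-error loss for a simple linear model (illustrative only).
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# jax.grad derives the gradient function; jax.jit compiles it via XLA.
grad_loss = jax.jit(jax.grad(loss))

w = jnp.zeros(3)          # weights
x = jnp.ones((4, 3))      # 4 samples, 3 features
y = jnp.ones(4)           # targets
g = grad_loss(w, x, y)
print(g.shape)            # (3,) -- each component equals -2.0 here
```

The same `loss` function can be differentiated, vectorised, or sharded across accelerators without rewriting it, which is what makes JAX attractive for training large models on cloud TPUs and GPUs.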

Apple will build a league of its own

Apple’s ecosystem and its dedicated consumers are probably the biggest advantages the company has over its competitors. Its tightly integrated ecosystem gives it a significant opportunity to leverage the capabilities of its M1 and M2 chips to develop private, personalised LLMs. It also has a massive developer ecosystem, for which the company recently released an implementation of the Transformer architecture optimised for Apple Silicon.
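To illustrate what running Transformer workloads on Apple Silicon can look like for developers, here is a hedged sketch using PyTorch’s MPS backend. This is not Apple’s optimised implementation, and the layer sizes are arbitrary; the code falls back to CPU where MPS hardware is unavailable:

```python
# Hedged sketch: running a tiny Transformer layer on Apple Silicon via
# PyTorch's MPS (Metal Performance Shaders) backend. Sizes are
# illustrative, not Apple's actual configuration.
import torch
import torch.nn as nn

# Use the M1/M2 GPU when available, otherwise fall back to CPU.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

layer = nn.TransformerEncoderLayer(
    d_model=64, nhead=4, batch_first=True
).to(device)

tokens = torch.randn(1, 16, 64, device=device)  # (batch, seq, dim)
with torch.no_grad():
    out = layer(tokens)
print(out.shape)  # torch.Size([1, 16, 64])
```

The same pattern lets developers prototype on a MacBook and deploy elsewhere, which is one reason on-device ML has become a selling point of the Apple Silicon ecosystem.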

Apart from technology, Apple is also known for its aesthetics and design thinking principles, which might help the company beat ChatGPT’s highly praised, simple UI design.

Whatever the mission is, it is clear that integrating LLM technology on individual devices is a difficult task. It becomes even more challenging when the company focuses on offering LLM technology while maintaining the privacy and security of users. Tim Cook has emphasised that the company wants to incorporate AI into its offerings thoughtfully and responsibly.

This is probably the reason why the chatbot is not yet able to produce the right outputs. According to anonymous Apple employees, the company has directed that the output from the new chatbot cannot yet be used to create features for end customers. It seems that Apple does not trust its own technology at the moment.

Apple has started caring about AI

The recent partnership between Meta and Microsoft to release Llama 2 is possibly aimed at integrating language models for edge use cases, which should be a wake-up call for Apple. To enable this even further, Meta has also partnered with Qualcomm, possibly to design chips for Android devices. Given the smaller size of the Llama 2 models, Microsoft might be able to achieve this, something that arguably wouldn’t have been possible with OpenAI’s huge models like GPT-4.

Even though the report said that Apple’s chatbot doesn’t offer any distinguishing features, it is safe to say that Apple sets itself apart from competitors in the market. At WWDC 2023, the team already announced a lot of improvements using machine learning, including Transformer-based on-device autocorrect in the iOS keyboard.

According to a report from March, Apple held an internal event focused on AI and LLMs. The participants included the Siri team, which reported that it is testing “language-generating concepts”. Moreover, 9to5Mac reported that Apple has introduced a framework for ‘Siri Natural Language Generation’ in tvOS 16.4. Recent reports also indicate that Apple has been actively seeking talent in generative AI, posting job openings for experts with a strong understanding of large language models and generative AI.

What can Apple do next?

Following all this, the best use case for integrating this GPT-like technology at Apple would be Siri. Even after so many announcements, Siri hasn’t really been upgraded since its launch. Developers have been trying to bolt ChatGPT capabilities onto Siri; now, Apple has the chance to do that natively.

Currently, OpenAI’s ChatGPT app is available only on iOS. If Apple develops a chatbot that runs natively across its ecosystem, OpenAI might need to start worrying about its next moves, as Apple could simply drop the app from the App Store. Maybe the ChatGPT app was also a wake-up call for Apple to get into LLMs.

Apple was shying away from LLMs, much like Meta once was. Now that Mark Zuckerberg’s company is regarded as one of the top players in the open-source market, nothing less can be expected from Apple, even though it is arriving last. Apple has always been very careful with user data and has maintained that it doesn’t want to take any unnecessary risks in this arena.


Mohit Pandey
Mohit dives deep into the AI world to bring out information in simple, explainable, and sometimes funny words. He also holds a keen interest in photography, filmmaking, and the gaming industry.
