Microsoft’s Phi-3 Outperforms Meta’s Llama 3 and Fits Perfectly on an iPhone

Microsoft shows who is the boss of tiny open source models.
Image by Raghavendra Rao
“One of the things that makes Phi-2 better than Meta’s Llama 2 7B and other models is that its 2.7-billion-parameter size is very well suited for fitting on a phone,” said Harkirat Behl, one of the creators of the model, who has now helped create Phi-3, Microsoft’s latest open source model.

Phi-3-Mini is a 3.8-billion-parameter language model trained on an extensive dataset of 3.3 trillion tokens. Despite its compact size, Phi-3-Mini delivers performance that not only exceeds recent models such as Mixtral 8x7B and GPT-3.5 but even surpasses Meta’s recently launched Llama 3 8B on MMLU benchmarks. Even so, Phi-3-Mini can run locally on a cell phone: its small size allows it to be quantised to 4 bits, at which point it occupies approximately 1.8GB of memory.
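As a quick sanity check (not from the article itself), the ~1.8GB figure follows directly from the parameter count: 3.8 billion weights at 4 bits each works out to just under 1.8 GiB.

```python
# Back-of-the-envelope check of the memory figure quoted above:
# a 3.8B-parameter model quantised to 4 bits per weight.
params = 3.8e9           # Phi-3-Mini parameter count
bits_per_weight = 4      # 4-bit quantisation
total_bytes = params * bits_per_weight / 8
gib = total_bytes / (1024 ** 3)
print(f"{gib:.2f} GiB")  # ~1.77 GiB, consistent with the ~1.8GB quoted
```

This counts only the quantised weights; a real deployment would also need some memory for activations, the KV cache, and quantisation metadata, so the on-device footprint is slightly higher.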

Mohit Pandey
Mohit writes about AI in simple, explainable, and often funny words. He's especially passionate about chatting with those building AI for Bharat, with the occasional detour into AGI.