For The Sake Of Privacy: Apple’s Federated Learning Approach

Last year, Apple filed for a new patent under ‘User behaviour model development with private federated learning.'
With rising privacy awareness among users, more device manufacturers are turning to on-device machine learning, federated learning, and other privacy-focused techniques that deliver machine intelligence at the edge without collecting raw data.

Apple, Google and other tech giants have been investing heavily in federated learning for the last few years. For those unaware, federated learning is a decentralised form of machine learning in which models are trained across many devices while the raw data stays on each device. Apple typically uses it to train the ML models that trigger suggestion features and rank the suggested items in the current context.
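The core algorithm behind federated learning is federated averaging (FedAvg): each device takes a few training steps on its own data, and the server only aggregates the resulting model weights. The sketch below is illustrative, not Apple's implementation; the linear model, learning rate, and toy client data are all assumptions for demonstration.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client: a few gradient steps on its local data (linear regression)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_averaging(global_w, client_data, rounds=10):
    """Server: average client weights each round; raw data never leaves clients."""
    for _ in range(rounds):
        client_ws = [local_update(global_w, X, y) for X, y in client_data]
        sizes = np.array([len(y) for _, y in client_data], dtype=float)
        # Weight each client's model by its local dataset size
        global_w = np.average(client_ws, axis=0, weights=sizes)
    return global_w

# Toy example: three clients whose data share the true weights [2.0, -1.0]
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (30, 50, 20):
    X = rng.normal(size=(n, 2))
    clients.append((X, X @ true_w))

w = federated_averaging(np.zeros(2), clients)
print(np.round(w, 2))  # recovers weights close to [2.0, -1.0]
```

Note that the server only ever sees model weights, never the `(X, y)` pairs held on each client, which is the property that makes the approach privacy-friendly.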

The term federated learning was first introduced in 2016 in a Google research paper titled ‘Communication-Efficient Learning of Deep Networks from Decentralized Data.’ Another foundational work from the same period is the heavily cited paper ‘Deep Learning with Differential Privacy,’ co-authored by Google and OpenAI researchers.

Incidentally, Ian Goodfellow, a co-author of the latter paper, joined Apple in 2019 as director of a machine learning special projects group. In 2018, Apple had also onboarded John Giannandrea, who previously headed AI and search at Google. Since then, Apple has been actively developing federated learning solutions across its devices and platforms, including the iPhone, iPad, Apple Watch, Apple Health, HomePod, and others.

In 2019, in collaboration with Stanford University, Apple released a research paper titled ‘Protection Against Reconstruction and Its Applications in Private Federated Learning,’ showcasing practicable approaches to large-scale locally private model training that were previously impossible. The researchers also demonstrated theoretical and empirical ways to fit large-scale image classification and language models with little degradation in utility.

Further, to address the limitations of federated learning, Apple researchers experimented with federated evaluation and tuning (FE&T) systems. Last year, the company also filed a new patent, ‘User behaviour model development with private federated learning.’ All in all, many of Apple’s on-device features today are extensions of federated learning.

Apple has been using differential privacy since 2017, but combined it with federated learning only as of iOS 13, which rolled out to the public in September 2019. In addition to personalising Siri, both techniques are used in a few other applications, such as QuickType (Apple’s personalised keyboard) and the Found In Apps feature, which scans your calendar and mail apps for the names of texters and callers whose numbers aren’t in your phone.
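Differential privacy complements federated learning by bounding and noising each model update before it leaves the device, so that no single user’s contribution can be reconstructed from what the server receives. A minimal sketch of the standard clip-and-noise (Gaussian mechanism) step is below; the clipping norm and noise multiplier are illustrative assumptions, not Apple’s actual parameters.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip an update's L2 norm, then add Gaussian noise calibrated to it.

    Clipping bounds any one user's influence (sensitivity = clip_norm), and
    the calibrated noise is what yields a differential-privacy guarantee
    for the aggregated result on the server.
    """
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise

# Each device privatises its update locally; the server only sees noisy updates
rng = np.random.default_rng(42)
raw_update = np.array([3.0, -4.0])       # L2 norm 5.0, exceeds clip_norm
private_update = privatize_update(raw_update, rng=rng)
print(np.linalg.norm(raw_update * min(1.0, 1.0 / 5.0)))  # clipped norm: 1.0
```

In a federated setting, this step runs on-device just before each update is sent, and the server averages the noisy updates as usual.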

Here’s a timeline of all the work done around federated learning by Apple. 

Privacy at Apple

While Apple claims to be synonymous with privacy, several past instances suggest otherwise. Earlier this year, ZecOps claimed that iPhones and iMacs have a software vulnerability, particularly in the Mail app, that makes them easy targets for unassisted attacks. In 2019, Google researchers discovered a data exploit that affected an unknown number of iPhones.

Despite the setbacks, Apple has been working on various ways to offer privacy-focused products and apps by leveraging federated learning and decentralised alternatives. It also claims that all its products protect user privacy, giving users control of their data.

At NeurIPS 2019, Julien Freudiger, while introducing private federated learning, said that Apple’s privacy principles are categorised into data minimisation, on-device intelligence, transparency and control, and security. 

Final Thought 

Tech evangelist Benedict Evans once said that federated learning is appealing to Apple (“We don’t have your data”), useful for Google (lots of good technical use cases), but perhaps a strategic imperative for Facebook (now Meta). “How else to do ads, track preferences and catch abuse if everything is encrypted when it leaves the phone?” he added.

Though federated learning is still in its early days, Apple’s bet on privacy-focused, on-device machine learning is commendable. But only time will tell whether it becomes the de facto leader in privacy-centric, on-device federated learning architecture.

Amit Raja Naik
Amit Raja Naik is a seasoned technology journalist who covers everything from data science to machine learning and artificial intelligence for Analytics India Magazine, where he examines the trends, challenges, ideas, and transformations across the industry.
