
For The Sake Of Privacy: Apple’s Federated Learning Approach

Last year, Apple filed for a new patent, ‘User behaviour model development with private federated learning’.

With rising privacy awareness and more device manufacturers turning to on-device machine learning, federated learning and other privacy-focused techniques that deliver machine intelligence at the edge, without collecting raw data, are gaining popularity.

Apple, Google and other tech giants have been investing heavily in federated learning for the past few years. For those unaware, federated learning is a decentralised form of machine learning: models are trained directly on users’ devices, and only model updates, not the raw data, are sent back to a central server for aggregation. Apple typically uses it to train ML models that trigger suggestions and rank the suggested items in the current context.
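At the core of this setup sits the federated averaging idea: each device trains on its own data and shares only weight updates, which a server averages into a global model. The following is a minimal, hypothetical NumPy sketch of that loop; the toy linear model, synthetic per-device data and hyperparameters are illustrative assumptions, not Apple’s implementation.

# Minimal sketch of federated averaging (FedAvg); illustrative only, not Apple's system.
import numpy as np

rng = np.random.default_rng(0)
NUM_DEVICES, FEATURES, LOCAL_STEPS, LR = 5, 3, 10, 0.1

# Synthetic, device-local datasets that never leave the "device".
true_w = np.array([1.0, -2.0, 0.5])
device_data = []
for _ in range(NUM_DEVICES):
    X = rng.normal(size=(20, FEATURES))
    y = X @ true_w + rng.normal(scale=0.1, size=20)
    device_data.append((X, y))

def local_update(w, X, y):
    """Run a few local SGD steps on one device; only the updated weights are shared."""
    w = w.copy()
    for _ in range(LOCAL_STEPS):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= LR * grad
    return w

# Server loop: broadcast the global model, collect local models, average them.
global_w = np.zeros(FEATURES)
for _round in range(20):
    local_models = [local_update(global_w, X, y) for X, y in device_data]
    global_w = np.mean(local_models, axis=0)

print("learned weights:", np.round(global_w, 2))

The point of the sketch is that the raw (X, y) pairs stay on the simulated devices; the server only ever sees and averages the locally updated weights.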

The term federated learning was first introduced in the 2016 paper ‘Communication-Efficient Learning of Deep Networks from Decentralized Data’ by Google researchers; the closely related, heavily cited paper ‘Deep Learning with Differential Privacy’, co-authored by Google and OpenAI researchers, appeared around the same time.

Incidentally, Ian Goodfellow, a co-author of the latter, joined Apple in 2019 as a director of machine learning in its special projects group. In 2018, Apple had also onboarded John Giannandrea, who previously headed AI and search at Google. Since then, Apple has been actively developing federated learning solutions across its devices and platforms, including iPhone, iPad, Apple Watch, Apple Healthcare, HomePod, and others.

In 2019, in collaboration with Stanford University, Apple released a research paper, ‘Protection Against Reconstruction and Its Applications in Private Federated Learning’, showcasing practicable approaches to large-scale locally private model training that were previously impossible. The researchers also emphasised theoretical and empirical ways to fit large-scale image classification and language models with little degradation in utility.

Further, to address the limitations of federated learning, Apple researchers experimented with federated evaluation and tuning (FE&T) systems. Last year, the company also filed for a new patent, ‘User behaviour model development with private federated learning’. All in all, many of Apple’s on-device intelligence features today build on federated learning.

Apple has used differential privacy since 2017, but combined it with federated learning only as of iOS 13, which rolled out to the public in September 2019. In addition to personalising Siri, the two techniques are used in a few other applications, such as QuickType (Apple’s personalised keyboard) and the Found In Apps feature, which scans your calendar and mail apps for the names of texters and callers whose numbers aren’t in your phone.
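In practice, combining the two typically means each device clips its model update to bound its influence and adds calibrated noise before anything leaves the phone. The sketch below illustrates that step with hypothetical clipping and noise parameters; it is a generic illustration of the technique, not Apple’s actual pipeline.

# Minimal sketch of a differentially private model update; illustrative only.
# Each device clips its update to bound sensitivity, then adds Gaussian noise
# on-device before sending it to the server (CLIP_NORM and NOISE_SCALE are assumed values).
import numpy as np

rng = np.random.default_rng(1)
CLIP_NORM = 1.0      # maximum L2 norm allowed for a single device's update
NOISE_SCALE = 0.5    # noise multiplier relative to the clipping norm

def privatize_update(update: np.ndarray) -> np.ndarray:
    """Clip the update's L2 norm and add Gaussian noise before it leaves the device."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, CLIP_NORM / (norm + 1e-12))
    noise = rng.normal(scale=NOISE_SCALE * CLIP_NORM, size=update.shape)
    return clipped + noise

# Example: a raw local update that is never shared unmodified.
raw_update = np.array([0.8, -1.5, 2.3])
print("shared update:", np.round(privatize_update(raw_update), 3))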

Here’s a timeline of all the work done around federated learning by Apple. 

Privacy at Apple

While Apple claims to be synonymous with privacy, there have been several instances in the past that suggest otherwise. Earlier this year, ZecOps claimed that iPhones and iMacs have a software vulnerability that makes them easy targets for unassisted attacks, particularly through the Mail app. In 2019, Google researchers discovered a data exploit that affected an unknown number of iPhones.

Despite the setbacks, Apple has been working on various innovative ways to offer privacy-focused products and apps by leveraging federated learning and other decentralised techniques. In addition, it claims that all its products protect user privacy, giving users control of their data.

At NeurIPS 2019, Julien Freudiger, while introducing private federated learning, said that Apple’s privacy principles are categorised into data minimisation, on-device intelligence, transparency and control, and security. 

Final Thought 

Tech evangelist Benedict Evans once said that federated learning is appealing to Apple (‘we don’t have your data’), useful for Google (lots of good technical use cases), but perhaps a strategic imperative for Facebook (now Meta). “How else to do ads, track preferences and catch abuse if everything is encrypted when it leaves the phone?” he added.

Though federated learning is still in its early days, Apple’s bet on privacy-focused, on-device machine learning and federated learning is commendable. Only time will tell whether it becomes the de facto leader in privacy-centric, on-device federated learning architecture.

PS: The story was written using a keyboard.
Amit Raja Naik

Amit Raja Naik is a seasoned technology journalist who covers everything from data science to machine learning and artificial intelligence for Analytics India Magazine, where he examines the trends, challenges, ideas, and transformations across the industry.