Apple has long been touted as the brand built on privacy. Nevertheless, it has been at the centre of repeated security and data controversies. In April this year, Pegasus, spyware built by Israeli firm NSO Group, was found to have compromised iPhones belonging to human rights defenders, lawyers and journalists. In the latest round of trouble for the company, a new report has brought fresh allegations against Apple.
Tommy Mysk and Talal Haj Bakry, iOS developers and security researchers, detailed that, contrary to what Apple wants us to believe, the company does in fact collect user information. The findings have landed Apple in a new lawsuit for violating user privacy by collecting data even when analytics sharing is switched off in settings.
In their report, Mysk and Bakry showed that Apple's analytics data includes an ID called "dsId", which upon verification was found to be a Directory Services Identifier. The ID uniquely identifies an iCloud account, so when an API call is made to iCloud, the dsId is sent to Apple along with the user's name, email, and any other data associated with that iCloud account.
Thus, all user activity on the App Store, including which apps were viewed and for how long, is sent to Apple even when personalised recommendations and usage-data sharing are switched off. Additionally, they confirmed that the same identifier is used for Apple Music and other company services.
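The researchers reached this conclusion by intercepting the device's analytics traffic and inspecting the JSON payloads for identifying fields. As a rough illustration, the fragment below is invented (only the "dsId" field name comes from the report; the rest is hypothetical), but it shows how such a check works in principle:

```python
import json

# Hypothetical fragment resembling the kind of analytics payload the
# researchers describe; every field name except "dsId" is invented here.
payload = json.loads("""
{
  "dsId": 1234567890,
  "event": "app_detail_view",
  "appId": "com.example.someapp",
  "viewDurationMs": 4200
}
""")

# Flag any field that could uniquely identify the account.
IDENTIFIER_KEYS = {"dsId"}
leaked = {k: v for k, v in payload.items() if k in IDENTIFIER_KEYS}
print(leaked)  # {'dsId': 1234567890}
```

Because the dsId maps directly to an iCloud account, any payload containing it is identifiable regardless of what the analytics toggle says.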
Apple’s privacy game
However, these findings, when viewed against the backdrop of Apple's recent privacy updates, present a different picture. In August 2022, Apple revealed that it would bring ads to pre-installed apps such as Books, Maps and Podcasts on its devices. It also tested inserting ads in Maps to recommend nearby stores, restaurants, or businesses.
The move showed that Apple intends to push hard on advertising and leverage its App Tracking Transparency (ATT) policy, which gives users the choice to prevent third-party applications from collecting their data. ATT was a major blow to companies like Meta and Google, which, in their recent quarterly results, reported major revenue declines amid a slump in advertising spending.
In fact, Search Ads, Apple's platform for advertisers to run campaigns in the App Store, has tripled in market share since the first half of 2020, according to AppsFlyer. Apple Search Ads works on a cost-per-tap (CPT) model, which means advertisers only pay when users engage with the ad. Apple promotes these apps whenever users arrive on the App Store or search for something specific. One can connect the dots here and argue that Apple has every incentive to track user activity so it can display only those ads a user is most likely to tap, making the CPT model a profitable bet.
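The economics behind that incentive are easy to sketch. Under cost-per-tap pricing, the advertiser pays only on a tap, so Apple's expected revenue per ad impression is the tap rate times the bid. The numbers below are invented purely for illustration:

```python
def revenue_per_impression(tap_rate: float, cpt_bid: float) -> float:
    """Expected revenue earned each time the ad is shown under CPT pricing."""
    return tap_rate * cpt_bid

# Invented figures: a $1.00 bid shown to untargeted users vs. to users
# whose tracked activity suggests interest in the advertised app.
untargeted = revenue_per_impression(0.01, 1.00)  # assumed 1% tap rate
targeted = revenue_per_impression(0.05, 1.00)    # assumed 5% tap rate
print(targeted / untargeted)  # roughly 5x more revenue per impression
```

Since Apple sells the impression either way, every lift in tap rate from better targeting flows straight into revenue, which is the financial dot the researchers' findings invite readers to connect.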
However, the rules applying to Device Analytics do not cover Apple's own services, which have a completely different policy. For instance, in the App Store & Privacy document published in September 2022, the guidelines are clear: Apple keeps a record of browsing history, searches, downloads, and purchases, stored with a unique identifier, IP address, and Apple ID, in order to enable personalised recommendations and to serve relevant ads on the App Store, Apple News, and Stocks. Additionally, Apple collects information about the number of phone calls and emails sent and received in order to identify and prevent fraud.
Device analytics vs services analytics
Nick Heer, writer at Pixel Envy, also points out that device analytics is distinct from services analytics. The device analytics policy only allows Apple to collect data on the performance of an iPhone and on how users use their devices and applications, without this information identifying them personally. However, it is not clear whether Apple collects personal data outside bug reports and crashes, which is said to fall under "privacy preserving techniques such as differential privacy".
The federated learning built on Apple's differential privacy is designed for specific use cases and, in general, to enhance user experience. In this model, the data collected from a user's input is randomised on the device before being sent to a central server. The randomised reports are then aggregated into batches and processed with privacy-preserving algorithms.
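The core idea of that local randomisation can be sketched with randomized response, one of the simplest local differential-privacy mechanisms. Apple's production system is more sophisticated, but the principle is the same: each device adds noise before reporting, and the server inverts the noise only in aggregate, never per user. The parameters below are illustrative:

```python
import random

def randomized_response(value: bool, p: float = 0.75) -> bool:
    """On-device step: report the true value with probability p, else flip it.

    No single report reveals the user's true value, since either answer
    could have been produced by either underlying truth.
    """
    return value if random.random() < p else not value

def estimate_true_rate(reports: list[bool], p: float = 0.75) -> float:
    """Server-side step: invert the noise in aggregate.

    observed = p * true_rate + (1 - p) * (1 - true_rate),
    so true_rate = (observed - (1 - p)) / (2p - 1).
    """
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

# Simulate 100,000 users, 30% of whom have some sensitive attribute.
random.seed(0)
true_values = [random.random() < 0.30 for _ in range(100_000)]
reports = [randomized_response(v) for v in true_values]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

The server recovers an accurate population estimate while each individual report stays deniable, which is what lets Apple claim to learn usage patterns without identifying users personally.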
In this way, Apple has been able to play it safe and avoid falling into privacy grey areas by drawing a thin line between its device-privacy and services-privacy narratives. That is where all the confusion starts, and all hell breaks loose.