
Apple Lets User Privacy Slide, Chases Ad Revenues

Apple only cares about its carefully-curated brand image and stance on privacy, and not the actual privacy of users


Today, Apple announced that it would no longer scan photos stored in users’ iCloud accounts. While this seems like a move to protect the privacy of its users, the real motivation appears to be self-preservation.

For context, Apple had announced a feature in August 2021 that aimed to scan users’ iCloud photos for child sexual abuse material (CSAM). While the company stated that this tool would preserve user privacy, many security researchers found that it could be repurposed as a surveillance capability. This led to a huge backlash, prompting the company to roll back the feature.

Instead of this controversial scanning feature, Apple has chosen to invest heavily in its ‘Communication Safety’ feature, which relies on parents and guardians opting into protective measures. The feature works across Apple services like Siri, Spotlight search, and Safari search to gauge whether someone is searching for CSAM. It also uses on-device machine learning to detect nudity in messages sent to or by minors.

The tech giant also made headlines recently over a controversial change to Apple devices in China. At the height of the country’s COVID-19 protests, protestors were using AirDrop to share material that the Chinese government was suppressing online. In response, Apple quietly restricted AirDrop’s ‘Everyone’ receiving option to a ten-minute window, after which devices revert to ‘Contacts Only’. While the change has since been rolled out to Apple devices worldwide, it initially applied only to devices sold in China.

All these moves and more show that Apple only cares about its carefully-curated brand image and stance on privacy, and not the actual privacy of users. 

Behind the iCloud Scanning Feature

Apple announced the feature in August 2021, alongside a host of other child-safety measures. The scanner was reported to check all images uploaded to iCloud through a combination of on-device cryptography and cloud servers. These images would be checked against a database of hashes of known child sexual abuse images, and matches would be reported to the authorities.
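To make the database-matching idea concrete, here is a minimal, purely illustrative sketch in Swift. It assumes a plain cryptographic hash and an exact-match lookup; Apple’s actual design reportedly used a perceptual hash (NeuralHash) combined with private set intersection, which is not reproduced here, and the names below are hypothetical.

```swift
import CryptoKit
import Foundation

// Illustrative sketch only: compares an image's hash against a set of
// known-image hashes. `knownImageHashes` is a hypothetical placeholder
// that would be populated from a vetted hash database.
let knownImageHashes: Set<String> = []

func matchesKnownDatabase(_ imageData: Data) -> Bool {
    // Hash the image bytes and look the digest up in the known-hash set.
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hex)
}
```

A real system would use a perceptual hash so that resized or re-encoded copies still match, which is precisely what made researchers worry the same machinery could be pointed at any database of images.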

While the feature was widely praised by children’s safety advocacy groups, many privacy advocates strongly opposed it, viewing the move as a concession to pressure from law enforcement authorities. Even though Apple quashed rumours that the feature would scan non-CSAM pictures, the backlash prompted the company to cancel it.

While adding image analysis to user devices may help curb abuse, it presents a conundrum because it also undermines user privacy. Apple has historically taken a very strong stance on user privacy, positioning itself against tech giants like Google and Microsoft that monetise user data. The company has routinely set the standard for privacy in consumer devices, offering personal data anonymisation and on-device processing of facial data, and it was among the first to ship a Secure Enclave in its mobile devices to store user passwords and financial information.

However, this stance is slowly eroding in the face of market pressure and a move towards monetising user data through advertising.

Apple vs Advertisers

In 2020, Apple announced a feature aimed at improving users’ awareness of how apps track them. Dubbed App Tracking Transparency, it stymied advertisers by requiring apps to ask users for permission before tracking them across other apps and websites. This, along with making the Identifier for Advertisers (IDFA) opt-in, seemed to signal that Apple was gearing up for the next generation of data-security practices. However, the opposite was true.
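For readers unfamiliar with how App Tracking Transparency surfaces to developers, the sketch below shows the standard permission-prompt flow using the public AppTrackingTransparency and AdSupport APIs. It is a minimal illustration, not Apple’s internal implementation; a real app would also need the NSUserTrackingUsageDescription key in its Info.plist.

```swift
import AppTrackingTransparency
import AdSupport

// Minimal ATT flow: ask the user for tracking permission, and only read the
// advertising identifier (IDFA) if they explicitly authorise tracking.
func requestTrackingPermission() {
    ATTrackingManager.requestTrackingAuthorization { status in
        switch status {
        case .authorized:
            // User allowed tracking; the IDFA is available to the app.
            let idfa = ASIdentifierManager.shared().advertisingIdentifier
            print("Tracking authorised, IDFA: \(idfa)")
        case .denied, .restricted, .notDetermined:
            // User declined or the prompt was never resolved; the IDFA
            // comes back as all zeros and cross-app tracking is off-limits.
            print("Tracking not authorised: \(status)")
        @unknown default:
            print("Unhandled authorisation status")
        }
    }
}
```

The prompt is shown once per install; subsequent calls simply return the stored status, which is why ATT cut so deeply into third-party ad measurement once most users declined.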

After curbing third-party advertisers in its ecosystem, Apple has since moved to consolidate its own advertising power. The tech giant previously sold ad space in its News and Stocks apps, but has since expanded its advertising efforts to include the front page of the App Store. This strategy has also drawn accusations of anticompetitive practices, as advertising giants like Meta lost an estimated $13 billion when Apple revamped its advertising policies.

To serve these targeted advertisements, Apple draws on data from users’ activity in its other services and from their Apple account details. This stands in stark contrast to its privacy-first positioning, under which Apple is supposed to know as little about the user as possible. Moreover, even if users opt out of this advertising system, Apple still collects information such as the device type and the identity of the user’s carrier.

These efforts paid off: Apple quadrupled its advertising revenue between 2021 and 2022. However, the company is looking to deflect attention away from these anti-privacy practices by introducing a slew of features designed to ‘protect’ the privacy of its users.

Apple is No Longer Privacy’s White Knight

Even beyond the purported features designed to increase the security of users’ devices, Apple still engages in tracking for advertising purposes. A study by software company Mysk found that Apple continues to track users even when their settings indicate otherwise.

Moreover, in 2021, a human rights lawyer found that Apple had been tracking her activity across applications in records tracing back to 2017. This is consistent with Apple’s ad monetisation model, which relies on cost-per-tap pricing: only highly targeted ads can deliver the conversion rates required to keep that model sustainable.

Apple might put on a show to maintain its carefully curated, privacy-focused brand image, but it has indulged in user tracking, device identification, anti-consumer practices, and antitrust violations to build up its advertising business.
