Apple Deletes All Trace of Its Controversial Scanning Plans

Apple has quietly shelved its controversial plan to scan all iCloud photo libraries.

Apple appears to have shelved its plans to scan iCloud libraries for CSAM. The world’s most valuable company quietly erased all references to the controversial plan from its website.

The program aimed to stamp out child sexual abuse material (CSAM). In August, the company said it would wade through users' pictures to detect such material.

It took extensive criticism from experts, rights groups, and even Apple's own employees for the plan to hit the rocks. After initially saying it would collect input and improve the feature before release, Apple has now gone quiet on whether those improvements will ever arrive.


However, it remains unclear whether the company will pursue CSAM photo scanning at all, or quietly abandon the idea altogether.

What Exactly Did Apple Intend to Do?


In August 2021, Apple announced a suite of new child safety features. Scanning users’ iCloud Photos library for CSAM was a core part of the program.

The suite also featured Communication Safety, designed to warn children and their parents when sexually explicit pictures are received or sent, along with expanded CSAM guidance in Siri and Search.


How Do CSAM Scanners Work?


The way a CSAM scanner works is interesting. It generates cryptographic “hashes” of known (already identified) abusive images. A hash is a kind of digital fingerprint.

After a hash has been assigned to an abusive image, the scanner then wades through massive databases of images to find matches. Several companies already do this; Apple itself has an implementation for iCloud Mail.
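To make the hash-and-match idea concrete, below is a minimal sketch in Python. It is an illustration of the general technique only: the KNOWN_HASHES set, the photos directory, and the use of plain SHA-256 digests are hypothetical, not Apple's actual implementation.

import hashlib
from pathlib import Path

# Hypothetical set of hashes of known abusive images, as supplied by a
# clearinghouse such as NCMEC (illustrative placeholder value only).
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def file_hash(path: Path) -> str:
    """Compute a SHA-256 digest of a file's raw bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Return the photos whose digests match the known-bad set."""
    return [p for p in photo_dir.glob("*.jpg") if file_hash(p) in KNOWN_HASHES]

if __name__ == "__main__":
    matches = scan_library(Path("photos"))
    print(f"{len(matches)} matching image(s) found")

One reason production scanners do not rely on plain cryptographic digests like the one above is that changing a single pixel produces a completely different digest. Apple's proposed system, NeuralHash, instead uses perceptual hashes designed to survive resizing and re-compression, and its matching protocol is far more involved.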

The iCloud Mail model may well have inspired the iCloud photo scanning plan. This time, however, the company went a step further and irked consumers in the process. Its error was in proposing to check those hashes on a user's device as well, for anyone with an iCloud account.


The National Center for Missing and Exploited Children provides the CSAM hashes. Introducing the ability to compare images on your device against known hashes is what many security practitioners worry about. They fear that the tool could eventually find a less noble use.

It's interesting to hear the thoughts of Riana Pfefferkorn, a research scholar at the Stanford Internet Observatory. She believes that having a CSAM-scanning feature in place could ultimately give governments a surveillance tool to search people's phones for other material.

Apple Has Faced Government Pressure Before

The US government has tried several times to get Apple to build a tool that would allow law enforcement to unlock and decrypt iOS devices. Meanwhile, China and other countries where customer data sits on state-owned servers have won significant concessions from Apple.


Matthew Green, a Johns Hopkins University cryptographer, weighs in on the matter by saying that if Apple feels the need to scan, it should scan unencrypted files on its servers, the way other companies such as Facebook already do. He also suggests that Apple make iCloud storage end-to-end encrypted, meaning the company could not view those files even if it decided to.

What the Critics Said About CSAM Scanning

Various individuals and groups were vocal in persuading Apple to discontinue all CSAM scanning plans. The Electronic Frontier Foundation, security researchers, university researchers, politicians, policy groups, and a few Apple employees were some notable groups that criticized the move.

On an individual level, Meta's (formerly Facebook's) former security chief and the famed privacy whistleblower Edward Snowden were among the notable names that lent their voices to keep Apple's plans from coming to fruition.


Most of the criticism centered on Apple's planned on-device CSAM detection. Researchers lambasted it, arguing that it relied on what was fundamentally surveillance technology and that it would not even be effective at identifying images of child sexual abuse.

Apple’s Feeble Response

As expected of a company of Apple's stature, it did what it assumed would be best: attempting to dispel what it saw as misunderstandings. It tried to reassure users by publishing detailed information, interviews with company executives, documents, FAQs, and anything else it felt would ease its path to implementing CSAM detection.

Unfortunately, rather than douse the fire, Apple’s response only seemed to fuel it.


Apple did proceed with rolling out the Communication Safety feature for Messages when it released iOS 15.2. However, it stalled the rollout of CSAM detection in the face of the barrage of unanticipated criticism the public had unleashed.

A Change of Heart by Apple

Apple once again showed its ability to listen to its customers when charting a path for its products. After weeks of sustained outcry, the Cupertino-based company said its decision was the result of feedback from customers, advocacy groups, researchers, and other important voices.

In a statement on its Child Safety page, it acknowledged that the child safety features remained critical. Unfortunately, that statement is no longer available, in what one might call a perfect display of corporate dithering.


However, Apple spokesperson Shane Bauer stated in an interview that the removal of any mention of CSAM detection from the company's website should not be interpreted as a change in plans for the feature. In other words, users should still expect it sometime in the future.

Privacy Advocates Remain Cautious

Clearly, Apple's desire to stop the spread of child sexual abuse material is a noble cause. Still, it needs to get its methods right. To be clear, Apple was not going to scan iCloud Photos alone for CSAM; it would also have rummaged through your iPhone or iPad to check for matches. Standing down was the best thing it could have done.

Facebook's former Chief Security Officer, Alex Stamos, is now cofounder of the Krebs Stamos Group, a cybersecurity consulting firm. He believes Apple made a pretty smart move, yet he is one of those privacy advocates choosing to maintain cautious optimism in the wake of the pause.


According to Stamos, the scenario involves complicated trade-offs, making it considerably challenging for Apple to come up with an optimal solution that satisfies as many competing interests as possible.

Conclusion

Apple's suspension of the plans is big news considering the size of the company, and it continues to earn applause for it.
