Apple’s CSAM-detection features: a privacy backdoor?

Apple has long positioned itself as the privacy-conscious alternative to the likes of Facebook and Google. Recently, however, the Cupertino-based giant has come under fire from privacy advocates over its new CSAM-prevention features.

The company was already smarting from the NSO Group revelations, which suggested that planting Pegasus spyware on an iPhone was easier than doing so on Android.

More recently, Apple’s new Child Sexual Abuse Material (CSAM) detection features have proved controversial. Critics accuse Apple of building a mechanism that could undermine the privacy of millions of users.

While the effort to limit CSAM is commendable, the way Apple has chosen to implement it has taken many by surprise.

What are Apple’s new child protection features?

On Monday, Apple stepped up its efforts to prevent the spread of CSAM on its platforms. The company has unveiled three new features that will be available with iOS 15.

  • First, Apple is adding new resources to Siri and Search to guide users who make CSAM-related queries and to help them report abuse. This feature is harmless and changes nothing in terms of privacy.
  • Second, Apple is adding a parental-control option to the Messages app that scans photos sent and received by users under the age of 18 for sexually explicit content. If such a photo is detected, it is blurred and a warning message is displayed alongside it.


  • The last and most controversial feature is scanning your iCloud Photo Library for potential CSAM.

How will the CSAM detection work?

Within the Messages feature, if a user aged 12 or younger tries to view a flagged photo, a notification is sent to their parents. Notably, the scanning happens entirely on the device, so Apple itself never learns what the content is.

In addition, parental notifications for when children view sensitive content are opt-in rather than enabled by default.
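To make that flow concrete, here is a minimal sketch in Python of how such an on-device check might behave. Apple has not published its implementation, so the classifier, the age cut-offs, and the notification helpers below are hypothetical stand-ins, not Apple’s actual code.

```python
from dataclasses import dataclass


@dataclass
class Photo:
    pixels: bytes
    blurred: bool = False


def looks_sexually_explicit(photo: Photo) -> bool:
    """Hypothetical stand-in for Apple's on-device ML classifier."""
    return False  # a real implementation would run a local model here


def show_warning(message: str) -> None:
    print(message)


def notify_parents() -> None:
    print("A notification was sent to the parents' devices.")


def handle_incoming_photo(photo: Photo, user_age: int,
                          parental_alerts_on: bool) -> None:
    """Sketch of the on-device flow: nothing leaves the phone."""
    if user_age < 18 and looks_sexually_explicit(photo):
        photo.blurred = True  # blur the photo before it is displayed
        show_warning("This photo may contain sensitive content.")
        if user_age <= 12 and parental_alerts_on:  # opt-in, youngest users only
            notify_parents()
```

The point of the design, as Apple describes it, is that every step above runs locally; the sketch reflects that by never sending the photo or the classification result anywhere.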

What’s more, for iCloud Photos, Apple will scan photos locally and match them against a database of known CSAM hashes provided by the National Center for Missing and Exploited Children (NCMEC). If a match is found, the device generates a “safety voucher” and uploads it to iCloud along with the photo.

Once the number of safety vouchers for an account crosses an undisclosed threshold, Apple moderators can decrypt the matched photos and review them for CSAM. If it is confirmed, Apple may report the account to law enforcement.
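Conceptually, the pipeline resembles the sketch below. This is not Apple’s code: the real system uses the NeuralHash perceptual hash plus cryptographic blinding so the server learns nothing until the threshold is crossed, whereas here a plain SHA-256 digest, an invented hash list, and a placeholder threshold stand in for all of that.

```python
import hashlib
from collections import defaultdict

# Hypothetical database of known CSAM hashes (the real list comes from NCMEC).
KNOWN_HASHES = {"<known-hash-1>", "<known-hash-2>"}

THRESHOLD = 30  # Apple has not disclosed the real threshold; placeholder only

vouchers = defaultdict(int)  # account ID -> number of safety vouchers so far


def photo_fingerprint(photo_bytes: bytes) -> str:
    """Stand-in for NeuralHash. A real perceptual hash survives resizing and
    recompression; SHA-256 is used here only to keep the sketch self-contained."""
    return hashlib.sha256(photo_bytes).hexdigest()


def queue_for_human_review(account: str) -> None:
    print(f"Account {account} crossed the voucher threshold; queued for review.")


def upload_to_icloud(account: str, photo_bytes: bytes) -> None:
    """On-device step: fingerprint the photo, attach a safety voucher on a match."""
    if photo_fingerprint(photo_bytes) in KNOWN_HASHES:
        # In Apple's design the voucher is cryptographically blinded, so the
        # server cannot read it until the account crosses the threshold.
        vouchers[account] += 1
    # ...the photo itself is uploaded to iCloud either way...
    if vouchers[account] >= THRESHOLD:
        queue_for_human_review(account)
```

The threshold is what Apple points to when it says a single false match cannot expose anyone; critics, as we will see, worry less about the math and more about who controls the hash list.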

What does Apple say?

While “scanning your iCloud Photo Library” sounds like a bold and aggressive move, Apple insists the feature will not affect users’ privacy. It only scans photos that are being uploaded to iCloud, using a system called NeuralHash.

NeuralHash is a tool that assigns each photo a unique fingerprint while revealing nothing about the photo’s content, even to Apple. The company also stressed that users can turn off iCloud Photos syncing to stop any scanning altogether.

What’s more, Apple’s official blog says,

Messages uses on-device machine learning to analyze image attachments and determine whether a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.

So why all the fuss? The critics’ side of the story

Apple has repeatedly emphasized that these features were designed with privacy in mind to protect children from sexual predators, but critics disagree. Even though the Messages feature scans on the device, they argue, Apple is in effect building a surveillance system that could prove disastrous.

Kendra Albert of Harvard’s Cyberlaw Clinic argues that:

These “child protection” features could come at a heavy cost for queer kids with unaccepting parents; they could be beaten or thrown out of their homes. Albert notes, for example, that the new feature could end up outing a queer teenager who sends their transition photos to friends.

Imagine the consequences such a system could have in countries where homosexuality is still illegal. Authoritarian governments could pressure Apple to add LGBTQ+ content to the list of flagged material, and this kind of arm-twisting of tech companies is nothing new.

However, Apple has a pretty good track record of resisting such requests. In 2016, for example, it vehemently refused the FBI’s demand to decrypt data on the San Bernardino shooter’s iPhone.

But in countries like China, where Apple already stores iCloud data on local servers, resisting government demands may prove much harder, and giving in would be a wholesale threat to users’ privacy.

The Electronic Frontier Foundation puts it bluntly, describing Apple’s effort as

“a fully built system just waiting for external pressure to make the slightest change.”

The last feature is even more significant from a privacy standpoint. Scanning iCloud photos is clearly a violation of users’ privacy: your iPhone will now ship with a capability that scans your photos and compares them against a database of illegal, sexually explicit material.

And all this when you just want to upload your photos to cloud storage!

Apple’s past versus present: growing concerns

When Apple challenged the FBI and refused to unlock the San Bernardino shooter’s iPhone, the company stated, “Your device belongs to you. It does not belong to us.”

Now it seems the device ultimately belongs to Apple after all, since you have no say over who gets to inspect its contents.

Users can reasonably insist that the photos they take are theirs and that Apple has no business scanning them. For many, I suspect, this is non-negotiable.

MSNBC has compared Apple’s iCloud photo-scanning feature to NSO’s Pegasus spyware to give a sense of the surveillance system being built in the name of the “greater good.” The report says:

Think of the spyware capabilities that the Israeli company NSO Group provided to governments, allegedly to track terrorists and criminals, which some countries then used to track activists and journalists. Now imagine those same capabilities hard-coded into every iPhone and Mac.

It’s not hard to imagine the disastrous consequences of Apple’s implementation of this feature.

Apple wants you to trust it!

Since the company announced these features, privacy advocates have fiercely opposed them. In response, Apple released an FAQ document (PDF) about its CSAM-prevention initiatives.

In the document, Apple states that it will refuse any government demand to add non-CSAM images to its hash list. The PDF says:

We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future.

In an interview with TechCrunch, Apple’s head of privacy, Erik Neuenschwander, attempted to address concerns about the features.

He said: “The device is still encrypted, we still don’t hold the key, and the system is designed to function on on-device data. What we’ve designed has a device-side component, and it has the device-side component, by the way, for privacy improvements. The alternative of just processing by going through and trying to evaluate users’ data on a server is actually more amenable to changes [without user knowledge] and less protective of user privacy.”

In short, Apple wants you to trust it with the personal contents of your phone. But who knows when the company might backtrack on these promises.

The very existence of a system that inspects the content you own before it is encrypted opens up a can of worms. Once the precedent is set and the system is in place, it will be difficult to keep it contained.

Do the benefits outweigh the risks of Apple’s new child safety features?

While Apple is working hard to combat child sexual abuse, that is no excuse for looking through your data. The very existence of a surveillance system creates the possibility of a security loophole.

I believe transparency is key for Apple here. Some independent or legal oversight could also help the company build credibility for its plan to curb CSAM on the internet.

As it stands, however, the implementation of this system undermines the whole purpose of encryption, and it could set off a wave of experiments that prove fatal to privacy in the future.

More troubling still, it is Apple doing this, the last company anyone expected. It is true what they say: you either die a hero or live long enough to see yourself become the villain.

by Abdullah Sam
