Apple has been scanning iCloud Mail for CSAM since 2019

Incoming and outgoing attachments sent using iCloud Mail are scanned for CSAM

As Apple prepares to begin scanning iPhones and iPads for child sexual abuse material (CSAM) with the release of iOS 15, new details have emerged revealing that the company already scans iCloud Mail for CSAM.

According to a new report from 9to5Mac, the iPhone maker has been scanning iCloud Mail for CSAM since 2019, though it has not yet begun scanning iCloud Photos or iCloud backups for such material.

The news outlet decided to investigate the matter further after The Verge spotted an iMessage thread, related to Epic’s lawsuit against Apple, in which the company’s anti-fraud chief Eric Friedman said that Apple is “the greatest platform for distributing child porn”.

While Friedman’s statement certainly wasn’t meant to be seen publicly, it does raise the question of how Apple could know this without scanning iCloud Photos.

Scanning iCloud Mail

9to5Mac’s Ben Lovejoy reached out to Apple for more details, and the company confirmed that it has been scanning both outgoing and incoming iCloud Mail attachments for CSAM since 2019. As emails sent using iCloud Mail aren’t encrypted, scanning attachments as mail passes through Apple’s servers is not a difficult thing to do.
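
Apple hasn’t said how its scanning works, but attachment screening of this kind is typically done by hashing each image and comparing the result against a database of hashes of known CSAM, such as the one maintained by the National Center for Missing & Exploited Children (NCMEC). The Python sketch below illustrates that basic idea only; every name in it is hypothetical, and it uses an ordinary cryptographic hash for simplicity, whereas production systems rely on perceptual hashes (PhotoDNA-style) that survive resizing and re-encoding.

    import hashlib

    # Hypothetical set of hex digests of known CSAM images. In practice this
    # would come from a vetted database such as NCMEC's, not be hard-coded.
    KNOWN_ILLEGAL_HASHES = {
        "9f2b5c...",  # placeholder entry
    }

    def attachment_matches(attachment: bytes) -> bool:
        """Return True if the attachment's hash appears in the known set."""
        return hashlib.sha256(attachment).hexdigest() in KNOWN_ILLEGAL_HASHES

    def screen_message(attachments: list[bytes]) -> bool:
        """Flag a message for human review if any attachment matches."""
        return any(attachment_matches(a) for a in attachments)

Because a cryptographic hash changes completely if a single byte of the file changes, real deployments favor perceptual hashes that map visually similar images to nearby values; the matching logic, however, follows the same pattern as above.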

In addition to scanning attachments, Apple also told 9to5Mac that it does some limited scanning of other data, though the company did not specify what this data is. It did, however, say that this “other data” does not include iCloud backups.

Back in January of last year, Apple’s chief privacy officer Jane Horvath said at a tech conference that the company uses screening technology to look for illegal images and that it disables accounts if evidence of CSAM is found.

While Friedman’s statement initially sounded as if it were based on hard data, it likely wasn’t. Instead, he appears to have inferred that because the company does not scan iCloud Photos or iCloud backups, more CSAM would likely exist on Apple’s platform than on other cloud computing services that actively scan photos for CSAM.

We’ll likely find out more about how Apple plans to combat CSAM on its platform once the company rolls out its Child Safety photo scanning with the release of iOS 15 this fall.

Via 9to5Mac

After working with the TechRadar Pro team for the last several years, Anthony is now the security and networking editor at Tom’s Guide where he covers everything from data breaches and ransomware gangs to the best way to cover your whole home or business with Wi-Fi. When not writing, you can find him tinkering with PCs and game consoles, managing cables and upgrading his smart home.
