Apple Has Already Been Scanning iCloud Mail for Child Abuse Material Since 2019

Published on August 23, 2021
Written by:
Bill Toulas
Cybersecurity Journalist

Apple’s image-scanning system for CSAM (child sexual abuse material), which is meant to help discover and stop the production and dissemination of child abuse material, has reportedly been in use since 2019, and 9to5Mac reports that it has received official confirmation of this from the company. The system sparked a lot of controversy when it was announced as a “new” image-scanning technology, but as these recent reports show, it was already in use, not for iCloud Photos but only for iCloud Mail attachments.

Apparently, the clues that this was taking place were always out there; Apple’s Eric Friedman even claimed last year that Apple is “the greatest platform for distributing child sexual abuse material.” This gave people a hint that Apple had a way of knowing that, and that it was monitoring the situation.

On Apple’s old official child safety page, which has been modified since last year, there is a mention of Apple using image-matching technology to find and report child exploitation. This works through a system of electronic signatures and match validation, similar to what Apple described recently when clarifying how the CSAM scanning system will work.
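To make the “electronic signatures and match validation” idea concrete, here is a minimal Python sketch of server-side hash matching. Everything in it is an assumption for illustration: the hash database, the function names, and the use of exact SHA-256 digests. Real deployments rely on perceptual hashes (such as Microsoft’s PhotoDNA or Apple’s NeuralHash) that still match after an image is resized or re-encoded, and a hit is validated by a human reviewer before any report is filed.

```python
import hashlib

# Hypothetical database of digests of known abuse images, as a provider
# might receive from a clearinghouse. The value below is a placeholder,
# not real data.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def matches_known_image(attachment_bytes: bytes) -> bool:
    """Return True if the attachment's digest appears in the database.

    Exact SHA-256 matching is used purely for illustration; production
    systems use perceptual hashing so that trivially altered copies of
    a known image still match.
    """
    digest = hashlib.sha256(attachment_bytes).hexdigest()
    return digest in KNOWN_HASHES

if __name__ == "__main__":
    sample = b"example attachment bytes"  # stand-in for a decoded mail attachment
    if matches_known_image(sample):
        print("Match found; escalate for human validation before reporting.")
    else:
        print("No match; attachment passes through untouched.")
```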

So, Apple has been scrutinizing iCloud Mail attachments for child abuse material, allegedly without compromising user privacy. Since iCloud Mail isn’t end-to-end encrypted, this kind of server-side scanning is technically feasible, which is another reminder of how easily a provider can look into client communications without the user ever realizing it.

As 9to5Mac further details, Apple admitted to performing some limited scanning of other data besides Mail, but it wouldn’t say where, clarifying only that it isn’t on iCloud backups. CSAM scanning also isn’t taking place on iCloud Photos yet. Still, the plans to eventually roll it out haven’t changed, no matter what people or privacy-advocating organizations think about it. If you agree with them, there is an ongoing petition you can sign to make your voice heard.
