Apple wants to scan user photos to hunt for child abuse (Updated: It’s official)
June 02, 2025
Update, August 06, 2025 (04:10 PM ET): Not long after we published the article below, Apple confirmed the existence of its software that hunts for child abuse. In a blog post titled “Expanded protections for children,” the company laid out plans to help curb child sexual abuse material (CSAM).
As part of these plans, Apple will roll out new technology in iOS and iPadOS that “will allow Apple to detect known CSAM images stored in iCloud Photos.” Essentially, on-device scanning will occur for all media stored in iCloud Photos. If the software finds that an image is suspect, it will be sent to Apple, which will decrypt the image and review it. If Apple determines the content is, in fact, illegal, it will notify the authorities.
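To make that flow concrete, here is a minimal sketch of how an on-device check against a database of known-image hashes could work in principle. This is an illustration only, assuming a generic perceptual-hash comparison; the hash function, threshold, and review step are hypothetical and do not represent Apple’s actual implementation.

```python
# Illustrative sketch only: a generic perceptual-hash check against a list of
# known flagged-image hashes. The hash values, threshold, and review queue are
# hypothetical and do not represent Apple's actual system.
from dataclasses import dataclass

HAMMING_THRESHOLD = 4  # hypothetical: max differing bits to count as a match


@dataclass
class ScanResult:
    photo_id: str
    matched: bool


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")


def scan_photo(photo_id: str, photo_hash: int, known_hashes: list[int]) -> ScanResult:
    """Compare one photo's perceptual hash against the known-hash database."""
    matched = any(
        hamming_distance(photo_hash, h) <= HAMMING_THRESHOLD for h in known_hashes
    )
    return ScanResult(photo_id, matched)


def scan_library(photos: dict[str, int], known_hashes: list[int]) -> list[ScanResult]:
    """Scan every photo; matches would be queued for human review, not auto-reported."""
    return [scan_photo(pid, h, known_hashes) for pid, h in photos.items()]


if __name__ == "__main__":
    known = [0xDEADBEEFCAFEBABE]                 # placeholder hash database
    library = {
        "IMG_0001": 0xDEADBEEFCAFEBABF,          # near-duplicate of a known hash
        "IMG_0002": 0x0123456789ABCDEF,          # unrelated photo
    }
    for result in scan_library(library, known):
        print(result.photo_id, "flagged for review" if result.matched else "clean")
```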

Apple claims there is a “one in one trillion chance per year of incorrectly flagging a given account.”
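Apple’s post does not show the math behind that figure, but as a purely hypothetical illustration of why a per-account rate can be far lower than a per-image rate, consider a system that only flags an account after several independent matches. The numbers below are invented for the example and are not Apple’s published parameters.

```python
# Purely hypothetical illustration: requiring multiple independent matches
# before flagging an account shrinks the per-account false-positive rate.
# These numbers are invented and are not Apple's parameters.
per_image_false_positive = 1e-3   # assumed chance a single benign image matches
required_matches = 4              # assumed number of matches before flagging

# Rough order-of-magnitude estimate, ignoring how many photos are scanned.
per_account_rate = per_image_false_positive ** required_matches
print(f"Per-account false-positive rate: {per_account_rate:.0e}")  # ~one in a trillion
```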
Original article, June 26, 2025 (03:55 PM ET): Over the past few years, Apple has pushed hard to solidify its reputation as a privacy-focused company. It frequently cites its “walled garden” approach as a boon for privacy and security.
However, a new report from Financial Times throws that reputation into question. According to the report, Apple is planning to roll out a new system that would rifle through user-created photos and videos on Apple products, including the iPhone. The reason Apple would sacrifice iPhone privacy in this way is to hunt for child abusers.
The system is allegedly known as “neuralMatch.” Essentially, the system would use software to scan user-created images on Apple products. If the software finds any media that could feature child abuse — including child pornography — a human employee would then be notified. The human would then assess the photo to decide what action should be taken.
Apple declined to comment on the allegations.
iPhone privacy coming to an end?
Obviously, the exploitation of children is a huge problem and one that any human with a heart knows should be dealt with swiftly and vigorously. However, the idea of someone at Apple viewing innocuous photos of your kids that neuralMatch accidentally flagged as illegal seems like an all-too-real problem waiting to happen.
There’s also the concern that software designed to spot child abuse now could be trained to spot something else later. What if, instead of child abuse, it were drug use, for example? How far is Apple willing to go to help governments and law enforcement catch criminals?
It’s possible Apple could make this system public in a matter of days. We’ll need to wait and see how the public reacts, if and when it does happen.