Apple sued for dropping CSAM detection features from services, devices

Announced in 2021, the plan called for Apple to scan images uploaded to iCloud for child sexual abuse material (CSAM) using on-device technology. While Apple retained a nudity-detection tool in its Messages app, the CSAM detection feature was dropped in 2022 amid privacy and security concerns.
