Apple sued for dropping CSAM detection features from services, devices

Announced in 2021, the plan was for Apple to scan images before they were uploaded to iCloud, using on-device technology to detect known child sexual abuse material (CSAM). While Apple retained a nudity-detection tool for its Messages app, the CSAM detection feature was dropped in 2022 amid privacy and security concerns.
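For context, detection systems of this kind generally work by comparing a perceptual hash of each photo against a database of hashes of known abuse imagery, flagging near-matches rather than exact file copies. The sketch below is a toy illustration of that matching idea in Python; it uses a simple "average hash" rather than Apple's actual NeuralHash algorithm, and the function names and similarity threshold are illustrative assumptions, not Apple's implementation.

```python
# Toy sketch of perceptual-hash matching, the general technique behind
# scanning for known imagery. Apple's real system used a neural perceptual
# hash ("NeuralHash") plus cryptographic protocols; this "average hash"
# version only illustrates the matching concept.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to a grayscale size-by-size grid, threshold each pixel
    on the mean brightness, and pack the resulting bits into an integer."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    h = 0
    for p in pixels:
        h = (h << 1) | (p >= mean)
    return h

def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(path: str, known_hashes: set[int], max_dist: int = 5) -> bool:
    """Flag an image if its hash is within max_dist bits of any known hash.
    The threshold of 5 bits is an arbitrary illustrative choice."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= max_dist for k in known_hashes)
```

The privacy debate centered on exactly this mechanism running on users' own devices: once a client-side matcher exists, critics argued, the hash database it checks against could in principle be expanded beyond CSAM.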

