The big story
90+ groups want Apple to drop CSAM plans
Open letter demands Apple drop plans to “build surveillance capabilities” into devices
A group of more than 90 international policy groups has banded together to deliver an open letter to Apple CEO Tim Cook demanding that Apple ditch its plans to check iPhones and iPads for known child sexual abuse material (CSAM) in iCloud Photos and for inappropriate photos sent to and from kids.
The letter is a response to Apple’s CSAM Detection feature, which checks images bound for iCloud Photos against a database of known CSAM image hashes on the device itself. Apple’s servers flag any account that exceeds a threshold number of matching images, allowing Apple to identify users who store known CSAM in their iCloud Photos accounts and pass relevant information to the National Center for Missing and Exploited Children (NCMEC). Apple maintains that the process is secure and expressly designed to preserve user privacy, but the 90+ international policy groups behind the letter have expressed their doubts.
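To make the threshold idea concrete, here is a minimal, hypothetical sketch in Swift: count how many of a user’s images hash into a set of known-CSAM hashes and compare that count to a reporting threshold. The function names, the plain SHA-256 hash, and the in-memory hash set are illustrative assumptions only; Apple’s actual system uses the NeuralHash perceptual hash, blinded matching via private set intersection, and threshold secret sharing, so individual match results below the threshold are not revealed to Apple.

```swift
import Foundation
import CryptoKit

// Hypothetical illustration only: not Apple's CSAM Detection implementation.

/// Stand-in for a perceptual hash; Apple's system uses NeuralHash, not SHA-256.
func imageHash(_ data: Data) -> String {
    SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

/// Counts how many images hash into the known-CSAM hash set and reports
/// whether that count meets the (hypothetical) reporting threshold.
func exceedsReportingThreshold(images: [Data],
                               knownHashes: Set<String>,
                               threshold: Int) -> Bool {
    let matchCount = images.filter { knownHashes.contains(imageHash($0)) }.count
    return matchCount >= threshold
}
```

The threshold is the part Apple points to as the privacy safeguard: no single matching image triggers a report, only an account whose match count crosses the limit.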