From Slate
Why Some Privacy Experts Are Worried About Apple’s New Plan to Fight Child Sexual Abuse Material
Jake Dean
In 2016, Apple famously refused to create backdoor software to give the FBI access to an iPhone used by a mass shooter in San Bernardino, California. It said doing so would undermine its commitment to protecting users’ devices from law enforcement. However, Apple’s steadfast dedication to device privacy may be waning.
On Thursday, Apple announced new measures to prevent minors from viewing sexually explicit content on iMessage. It will also begin scanning images uploaded to users’ iCloud accounts for child sexual abuse material, or CSAM. Eradicating CSAM is a noble goal—yet many technology experts are worried that Apple may be on the verge of undermining phone privacy forever...
Deirdre Mulligan, a professor at the University of California–Berkeley’s School of Information and faculty director for its Center for Law & Technology, notes that this iMessage system isn’t a tool for reporting or blocking—instead Apple is “keeping it ‘within the family.’ ”