Apple’s dangerous path

Hello friends, and welcome back to Week in Review. Last week, we dove into the truly bizarre machinations of the NFT market. This week, we're talking about something with a bit more impact on the current state of the web: Apple's NeuralHash kerfuffle. If you're reading this on the TechCrunch site, you can get this in your inbox from the newsletter page, and follow my tweets @lucasmtny.

the big thing

In the past month, Apple did something it has generally done an exceptional job of avoiding: the company made what seemed to be an entirely unforced error. In early August, seemingly out of nowhere, Apple announced that by the end of the year it would roll out a technology called NeuralHash that would actively scan the libraries of all iCloud Photos users, seeking out image hashes that matched known images of child sexual abuse material (CSAM). For obvious reasons, users could not opt out of the on-device scanning.
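For a concrete sense of the core technique, here is a minimal sketch of hash-based blocklist matching. This is emphatically not Apple's implementation: it substitutes the open-source imagehash library's perceptual hash for Apple's proprietary neural hash, and the blocklist entry and match threshold are invented for illustration.

```python
# Illustrative sketch only, NOT Apple's NeuralHash. The open-source
# `imagehash` library's pHash stands in for Apple's proprietary neural
# hash; the blocklist entry and threshold below are made-up examples.
from PIL import Image
import imagehash

# Hypothetical blocklist of known-bad perceptual hashes
# (64-bit pHashes serialized as hex strings).
KNOWN_HASHES = [
    imagehash.hex_to_hash("d1c48f0a3e5b7290"),  # placeholder value
]

# Perceptual hashes survive small edits (resizing, recompression),
# so matching uses Hamming distance, not equality; 5 bits is arbitrary.
MATCH_THRESHOLD = 5

def matches_blocklist(path: str) -> bool:
    """Return True if the image's perceptual hash is near a known hash."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)
```

Apple's announced design layered much more on top (on-device encoding, blinded matching against the hash database, and a multi-match threshold before human review), but the blocklist-matching core is what drew the fire.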
The announcement was not coordinated with the other major consumer tech giants; Apple pushed forward on it alone. Researchers and advocacy groups had almost uniformly negative feedback for the effort, raising concerns that it could create new channels for abuse, letting actors like governments compel the detection of on-device content they deemed objectionable. As my colleague…