Apple acknowledges 'confusion' over child safety updates

Apple is ready to acknowledge the controversy over its child safety updates, but it sees this as a matter of poor messaging, not bad policy. Senior software engineering VP Craig Federighi told the Wall Street Journal in an interview that announcing the iCloud scans for child sexual abuse material (CSAM) alongside opt-in, on-device monitoring of sexually explicit images in Messages was a "recipe for this kind of confusion." People conflated the two and assumed Apple might spy on their messages, Federighi said, adding that he wished the company had "come out a little more clearly."

The executive maintained that Apple was striking the right balance between child safety and privacy, and addressed some of the concerns that have surfaced since the company announced its new measures in early August. He stressed that the scans of iCloud-bound photos would only flag images matching known entries in a CSAM database, not arbitrary pictures in a user's library. The system also sends an alert only once an account crosses a threshold of roughly 30 matched images, making false positives unlikely.
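To make the threshold design concrete, here is a minimal, hypothetical sketch of per-account threshold matching. Every name in it is invented, and it substitutes an ordinary cryptographic hash where Apple's actual system uses a perceptual hash (NeuralHash) wrapped in cryptographic protocols, so the sketch captures only the counting logic, not the real matching or privacy machinery.

```python
import hashlib
from typing import Iterable, Set

MATCH_THRESHOLD = 30  # the figure Federighi cites in the interview

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash. A real system must match visually
    similar images, which an exact hash like SHA-256 cannot do."""
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(images: Iterable[bytes], known_hashes: Set[str]) -> int:
    """Count how many of a user's uploaded images match the known database."""
    return sum(1 for img in images if image_hash(img) in known_hashes)

def should_alert(images: Iterable[bytes], known_hashes: Set[str]) -> bool:
    """Fire an alert only once the per-account match count reaches the
    threshold, so an isolated false positive never triggers a report."""
    return count_matches(images, known_hashes) >= MATCH_THRESHOLD
```

In the real system, matching happens under encryption and Apple learns nothing about an account until the threshold is crossed; the sketch above illustrates only why a per-account threshold keeps stray false matches from mattering.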

The system also has "multiple levels of auditability," Federighi said, with on-device scanning reportedly making it easier for researchers to check whether Apple ever misused the technology. He likewise rejected the notion that the technique might be expanded to scan for other material, such as political content, noting that the hash database comes from multiple child safety groups, not just the agency that receives any red-flag reports (the National Center for Missing and Exploited Children).
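As a rough illustration of that safeguard, a database built as the intersection of independent groups' hash lists means no single organization can unilaterally add content to be flagged. The provider names and hash strings below are invented placeholders; the point is only the intersection construction.

```python
# Hypothetical sketch: the effective match database is the intersection
# of hash lists contributed by independent child safety groups. A hash
# becomes matchable only if every contributor lists it, so no single
# group can slip in unrelated (e.g. political) images on its own.
provider_lists = {
    "ncmec": {"hash_a", "hash_b", "hash_c"},
    "second_group": {"hash_b", "hash_c", "hash_d"},
}

effective_database = set.intersection(*provider_lists.values())
print(sorted(effective_database))  # ['hash_b', 'hash_c']
```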

The response won't satisfy those who object to the very idea of Apple scanning photos on their devices, even with privacy protections in place. It is, however, a recognition that privacy issues are hard to address properly: it doesn't take much to prompt an uproar.

Source: Engadget
