Apple's Child Safety measures draw flak from Big Tech, experts
Apple recently announced sweeping policy changes intended to keep children safer while using products in its ecosystem. While the intention is noble, the implementation has drawn criticism from prominent figures, including WhatsApp head Will Cathcart, privacy advocate Edward Snowden, and American politicians. An online petition asking Apple to "reconsider its technology rollout" had amassed 5,000+ signatures at the time of publishing.
Messages app will use on-device machine learning to detect explicit images
According to the Communication Safety section of Apple's Child Safety measures, new tools in the Messages app will use on-device machine learning to detect sexually explicit images, optionally alerting parents if their child under 13 years of age sends or views such content. Separately, Apple will flag Child Sexual Abuse Material (CSAM) through an on-device analysis that compares hashes of users' iCloud Photos images against hashes of known CSAM, thereby giving rise to privacy concerns.
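The matching step described above can be sketched in a few lines. Note the heavy simplification: Apple's real system reportedly uses a perceptual "NeuralHash" so that visually similar images produce the same hash, plus cryptographic blinding of the database; the cryptographic SHA-256 used here, along with the placeholder hash set, is purely illustrative of the set-membership check.

```python
import hashlib

# Hypothetical database of known-CSAM image hashes (placeholder values).
# In Apple's described design this database ships on-device in a blinded
# form; plain hex digests are used here only to illustrate the concept.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes-1").hexdigest(),
    hashlib.sha256(b"known-image-bytes-2").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if this image's hash appears in the known-hash set."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

# An unrelated photo does not match the database.
print(matches_known_hash(b"holiday-photo-bytes"))    # False
# A byte-for-byte copy of a database image does.
print(matches_known_hash(b"known-image-bytes-1"))    # True
```

A cryptographic hash only matches exact copies, which is why systems like this rely on perceptual hashing instead; critics note that perceptual matching is also what makes false positives and scope creep possible.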
Siri, Search will also restrict users' access to CSAM content
Apple claims that it won't be able to read users' encrypted communications. Additionally, the company's policy changes state that Apple would provide law enforcement agencies with information on collections of CSAM in iCloud Photos. Siri and Search will also prevent Apple users from searching for CSAM-related topics, and will offer children and parents help if they encounter such content online.
What if the tool starts looking for other material?
The Apple Privacy Letter petition argued that Apple's implementation would undo "decades of work by technologists, academics, and policy advocates." An internal Apple memo even acknowledged that people would be "worried about the implications" of the system. The concern isn't about Apple's intentions, but about false accusations and the pressure that governments or private entities could exert to turn the system into an overreaching surveillance tool.
Facebook-owned WhatsApp didn't spare the opportunity to bash Apple
WhatsApp head Will Cathcart tweeted that the messaging service won't be adopting Apple's safety measures, calling the approach "very concerning." He emphasized that WhatsApp's existing system helped it report 400,000+ cases of child exploitation in 2020. Notably, WhatsApp's parent Facebook has its own grievances with Apple, whose advertising-privacy policy has taken a significant toll on the social media company's advertising revenue.
NSA whistleblower Snowden's take on Apple's Child Safety measures
Snowden retweeted a post by Electronic Frontier Foundation (EFF) director of cybersecurity Eva Galperin. The post claimed to show an internal Apple memo from August 7 that called the backlash "the screeching voices of the minority." On Twitter, Galperin warned that while the system scans iCloud for CSAM images today, tomorrow it could be repurposed to scan for anything else, such as "memes critical of the Chinese government."
Apple will push these changes with iOS 15, macOS Monterey
If Apple chooses to proceed with these new policies, it will do so via upcoming software updates across its ecosystem: iOS 15, iPadOS 15, watchOS 8, and macOS Monterey. A precise timeline for the rollout remains unknown.