WhatsApp has condemned Apple’s new child safety tools as a “very concerning . . . surveillance system”, even as governments around the world have welcomed the decision to proactively search for illegal photos of child sexual abuse.
The stand-off sets the stage for a battle between other tech platforms and officials calling on them to adopt similar tools.
An Indian government official told The Washington City Times on Friday that it “welcomed” Apple’s new technology, which it said set “a benchmark for other tech companies”, while an EU official said the technology group had designed a “quite elegant solution”.
US Senator Richard Blumenthal called Apple’s new system an “innovative and bold step”.
“Time for others – especially Facebook – to follow suit,” tweeted Sajid Javid, the UK’s health secretary and former home secretary.
However, Apple’s Silicon Valley rivals are said to be incandescent with anger over its system to scan photos on US users’ iPhones before they are uploaded to iCloud, which will be launched as part of the next version of iOS.
“This approach introduces something very worrying to the world,” said Will Cathcart, head of WhatsApp. “This is an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides to monitor. It is troubling to see them act without engaging experts.”
“We will not adopt this system at WhatsApp,” he added.
The enthusiastic response from lawmakers will only add to the security and privacy community’s concerns that Apple has set a dangerous precedent that could be exploited by repressive regimes or overzealous law enforcement.
WhatsApp, which is owned by Facebook, as well as Telegram, Signal and Google with its Android operating system, are already being urged to replicate Apple’s model.
“To say we’re disappointed with Apple’s plans is an understatement,” India McKinney and Erica Portnoy of digital rights group the Electronic Frontier Foundation said in a blog post. “Apple’s compromise on end-to-end encryption may appease government agencies in the US and abroad, but it’s a shocking reversal for users who have relied on the company’s leadership in privacy and security.”
Jennifer Granick, surveillance and cybersecurity counsel for the American Civil Liberties Union’s Project on Speech, Privacy and Technology, added: “As altruistic as its motives may be, Apple has built an infrastructure that could be subverted for widespread surveillance of the conversations and information we keep on our phones.”
Political pressure on tech companies around the world to grant governments access to encrypted content, including messages, photos and videos, has increased in recent months.
The government of Indian Prime Minister Narendra Modi recently passed rules that force technology platforms such as WhatsApp to trace the originator of illegal messages, effectively breaking end-to-end encryption. WhatsApp is currently engaged in a legal battle with the government in an effort to block the new rules.
Last October, officials including UK Home Secretary Priti Patel and former US Attorney General William Barr signed an open letter on behalf of the “Five Eyes” countries plus Japan and India that urged the industry to “address our grave concerns where encryption is applied in a way that completely precludes any legal access to content”.
They noted that child abuse was one of the reasons they felt the tech companies should develop alternative methods of allowing authorities to access device content, and that there was “increasing consensus among governments and international institutions that action should be taken.”
Critics have expressed skepticism about Apple’s promise to limit itself to scanning for child abuse images. “I hate invoking a slippery slope, but I look at the slope, and governments around the world are covering it with oil, and Apple just pushed its customers over the edge,” said Sarah Jamie Lewis, a cryptography researcher and executive director of the Canadian NGO OpenPrivacy.
While there is no US legislation yet forcing Apple to look for this type of material, the move comes as the UK and EU prepare new legislation – the Online Safety Bill and the Digital Services Act – that would place heavier obligations on tech companies to police the distribution of child pornography, among other forms of harmful content.
Apple’s decision to go ahead with its own individual system, rather than engage in cross-industry negotiations with regulators around the world, has rankled its Silicon Valley neighbors, especially after they banded together to support its 2016 legal battle against the FBI over access to an iPhone belonging to a terrorist suspect.
“Part of the reaction I’ve heard from people at Apple’s competitors is that they’re incandescent,” said Matthew Green, a security professor at Johns Hopkins University, during an online video discussion with researchers at Stanford University on Thursday.
Alex Stamos, the former Facebook security chief who is now director of the Stanford Internet Observatory, said during the same discussion that Apple “doesn’t care at all” that everyone is trying to strike this delicate international balance. “Obviously there will be immediate pressure on WhatsApp,” he said.
In an internal memo on Thursday, an Apple executive acknowledged the furore the company’s announcement had caused. “We’ve seen a lot of positive response today,” wrote Sebastien Marineau in a note obtained by Apple blog 9to5Mac. “We know some people have misunderstandings, and more than a few are concerned about the implications, but we’ll continue to explain and detail the features so people understand what we’ve built.”
Facebook and Google have not yet publicly responded to Apple’s announcement.
Apple has previously been criticized in some quarters for not doing more to prevent abusive material from circulating, especially on iMessage. Because the iPhone’s messaging app is end-to-end encrypted, the company has been unable to see any photos or videos exchanged between its users.
Messages exchanged between two senior Apple engineers, which were produced as evidence in the iPhone maker’s recent legal battle with Epic Games, suggest that some within the company believed it could do more.
In the exchange, dated early last year and first discovered by the Tech Transparency Project, Eric Friedman, head of Apple’s Fraud Engineering Algorithms and Risk unit, suggested that compared to Facebook, “we are the best platform for distributing child pornography”.
“We’ve chosen not to know in enough places where we really can’t say” how much child sexual abuse material there might be, Friedman added.
Additional coverage from Stephanie Findlay in Delhi, Valentina Pop in Brussels and Hannah Murphy in San Francisco