Apple employees concerned over plan to monitor consumers’ phones for evidence of child abuse


Surveillance and censorship by foreign countries leads the list of concerns

Apple’s blueprint to scan iPhone users' photo libraries for child sexual abuse material (CSAM) has raised the ire of some of its own employees. According to a report from Reuters, Apple staff have begun using internal company Slack channels to post hundreds of messages that voice their concerns about the proposal.

Their biggest concern is that governments that have been known to use mobile phones to spy on people will employ the software for uses other than CSAM, like finding material they could use to censor or arrest people. 

Past security changes at Apple have also prompted concern among employees, but the volume and duration of the new debate are surprising, the workers said. Some posters worried that Apple is damaging its leading reputation for protecting privacy.

Employee opinions are mixed

The pushback by Apple employees may sound earth-shattering, but not everyone at the company is up in arms. Reuters said some employees have questioned their peers’ criticism in the Slack thread devoted to the photo-scanning feature. Others said Slack wasn't the proper place to hold discussions like this.

One integral workgroup at Apple -- the security team -- was largely staying out of the back-and-forth on Slack, though opinions among its members vary.

Some couched Apple’s effort as a rational response to pressure to get tough on illegal and illicit material. Others said they hoped the scanning tool would eventually lead to the development of better encryption tools for iCloud customers who want a more powerful layer of security.

Critical employees have outside support

Other privacy advocates have also expressed their concerns about Apple's proposed scanning tool. They claim Apple is softening its stance on privacy and that the company’s willingness to conduct these kinds of scans could set a precedent, opening the door for governments to demand broader access to user data in the future.

One group -- the Center for Democracy and Technology (CDT) -- says Apple’s proposed changes create new risks to children and all users while marking a significant departure from long-held privacy and security protocols.

"What Apple is showing with their announcement last week is that there are technical weaknesses that they are willing to build in," CDT project director Emma Llanso said in an interview. "It seems so out of step from everything that they had previously been saying and doing."

“Apple is replacing its industry-standard end-to-end encrypted messaging system with an infrastructure for surveillance and censorship, which will be vulnerable to abuse and scope-creep not only in the U.S., but around the world,” says Greg Nojeim, Co-Director of CDT’s Security & Surveillance Project. “Apple should abandon these changes and restore its users’ faith in the security and integrity of their data on Apple devices and services.”
