Nearly 60,000 people signed online petitions to oppose Apple’s controversial plan to scan iPhones for child sexual abuse material, digital rights groups said on Wednesday.
While Apple announced last week that it would press pause on its plans, the petitions demand that the tech giant go further and abandon them completely.
Apple’s plan, which drew immediate pushback from digital rights groups and privacy organizations, would scan users’ photos and messages for child sexual abuse material and sexually explicit material. Specifically, Apple said a user’s phone would scan photos synced with iCloud Photos for known child sexual abuse imagery. It also said it would scan the messages of users under 18 for sexually explicit content, and if a user under 13 tapped through a warning about the content, Apple would alert a parent.
Concerns about the implications of the technology—from how it could be used in the future to how it could impact children who don’t have good relationships with their parents—were raised almost immediately.
On Wednesday, digital rights groups including Fight for the Future and the Electronic Frontier Foundation (EFF) held a press conference where they unveiled the petitions they had collected from users opposed to the plan and explained their concerns further. Overall, 59,796 petitions were delivered to Apple, the groups said.
“If Apple moves forward with this plan, it will have massive consequences,” Caitlin Seeley George, a campaign director at Fight for the Future, said. “Not only on the phones of millions of people, but on everyone’s ability to communicate without being under surveillance. While Apple might have postponed the rollout of this software, we want to be perfectly clear: There is no safe way to do what they are proposing.”
Seeley George added that the groups are planning in-person protests at Apple stores on Monday.
Bruce Schneier, a security technologist, echoed concerns that have been raised about the implications the technology could have in the future.
“Once you embed this system into a phone, there’s absolutely nothing except the goodwill of people to stop the government of China from putting what they consider to be prohibited content into the system. The system exists. There’s nothing special about the images Apple wants to detect versus the ones they don’t,” Schneier said, adding: “You build the system, and it can be used for anything. You can target other content, you can target people.”
Joseph Mullin, a policy analyst at EFF, said the number of petitions the groups collected should make it clear that users “want the devices that they have bought and paid for to work for them.”
“It doesn’t mean ‘work for the user and also be compelled to help with a law enforcement project.’ It doesn’t mean ‘work for the user and also send reports back on content to Apple employees.’ It means just work for the user,” Mullin said. “Make no mistake, if Apple decides to move forward with this phone scanning plan, it will be a brand new mass surveillance system. At the end of the day, it’s not that different than what the FBI and other agencies have been asking for for decades now, it is not that different than the types of systems that foreign governments would like to access.”
In its announcement that it was pausing the rollout of the plan, Apple said it made its decision after hearing feedback from “customers, advocacy groups, researchers, and others.”