By AP/ABC News
Apple says it is planning to scan US iPhones for images of child abuse, an announcement that has the support of child protection groups.
But the move has raised concerns among security researchers that the system could be misused by governments looking to monitor their citizens.
Apple announced on Thursday that its messaging app would use on-device machine learning to warn about sensitive content without making private communications readable by the company.
The tool Apple calls neuralMatch will detect known images of child sexual abuse without decrypting people’s messages. If it finds a match, the image will be reviewed by a staff member who can notify law enforcement if necessary.
But researchers say the tool could be put to other purposes such as government surveillance of dissidents or protesters.
Matthew Green, a security professor at Johns Hopkins University who earlier posted his concerns on Twitter, told The Financial Times that Apple’s move will “break the dam — governments will demand it from everyone.”
Tech companies, including Microsoft, Google, and Facebook, have for years been sharing “hash lists” of known images of child sexual abuse.
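The hash-list sharing described above can be sketched as follows. This is a simplified illustration only: real systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, whereas the cryptographic SHA-256 digest used here (for brevity) matches only byte-identical files. The function names and sample byte strings are invented for the example.

```python
import hashlib

def file_digest(data: bytes) -> str:
    """Return a hex digest serving as a fingerprint of the file's bytes.

    Note: a cryptographic hash stands in for the perceptual hashes
    (e.g. PhotoDNA) that production systems actually share.
    """
    return hashlib.sha256(data).hexdigest()

def matches_known_list(data: bytes, known_hashes: set) -> bool:
    """Flag content whose digest appears on a shared hash list.

    The provider never needs to store or inspect the original known
    images, only their digests.
    """
    return file_digest(data) in known_hashes

# A provider compares an upload's digest against the shared list.
known = {file_digest(b"example-known-image-bytes")}
print(matches_known_list(b"example-known-image-bytes", known))  # True
print(matches_known_list(b"some-other-photo-bytes", known))     # False
```

Because only digests are exchanged, companies can cooperate on detection without circulating the underlying material itself.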
Apple has also been scanning iCloud, which, unlike its messages, is not end-to-end encrypted, for such images.
The company has been under pressure from governments and law enforcement to allow for surveillance of encrypted data.
Apple was one of the first major companies to embrace “end-to-end” encryption, in which messages are scrambled so that only their senders and recipients can read them.
But federal authorities have long pushed for access to that information in order to investigate crimes such as terrorism or child sexual exploitation.
“Apple’s expanded protection for children is a game-changer,” said John Clark, president and CEO of the National Center for Missing and Exploited Children.
“With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in child sexual abuse material.”