Survivors praise Apple’s new tool to detect child sexual abuse, but backlash intensifies


Apple unveiled new features last week aimed at tackling child sexual abuse. The changes have been celebrated by families of survivors of sexual abuse, but privacy advocates are fighting to stop them.


About 10 years ago, a member of Ann’s family was arrested for taking sexually abusive photos of her child and distributing them online.

“Imagine the worst thing that ever happened to you was recorded and then shared over and over again for the enjoyment of others,” Ann told NPR. She did not want to reveal her full name to preserve the privacy of her family.

As with so many internet crimes, the nightmare did not end with the arrest. Her child’s real name was attached to the photos as they circulated.

“Ten years later, we still have people trying to find my child, looking for pictures, looking for new pictures,” she said. “It’s a constant, constant battle.”

For years, child welfare groups have been lobbying Apple, the world’s largest tech maker, to help stop the spread of abusive images taken and shared on its devices. Now the company is about to act.

In the coming months, Apple will roll out an update to its iOS operating system. It will include a tool capable of scanning for and identifying known images of child sexual abuse on iPhones and other Apple devices. The announcement, made last week, encouraged Ann.

“I can’t think of a family that I know that isn’t a fan of companies like Apple who are stepping up and saying, ‘Let’s help prevent children from being abused,'” she said.

Apple, which has built its reputation on the security of its devices, lags behind other big tech companies in reporting such material. Last year, Facebook reported more than 20 million images of child sexual abuse found on its platforms. Apple reported fewer than 300.

But the photo-scanning tool, which is one of several changes Apple is making to better protect children, has sparked an uproar among privacy and security experts. Through open letters and newspaper op-eds, critics have argued that the technology creates a “backdoor” on Apple devices that could be exploited for more nefarious purposes, like government surveillance.

Privacy advocates fear tool may be abused

How Apple’s system works is complicated, but it boils down to this: A database of known child abuse images maintained by the National Center for Missing & Exploited Children has been distilled into strings of code, known as hashes, that will be stored on Apple devices.

Apple has created an automated process to compare those hashes against photos backed up to iCloud. The company says there must be 30 matches before it notifies the nonprofit, which works with law enforcement to investigate child sexual abuse.
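To make the matching-and-threshold idea concrete, here is a minimal sketch in Python. It is an illustration under simplified assumptions, not Apple’s implementation: the real system derives perceptual “NeuralHash” values and compares them with cryptographic protocols so that matching is far less direct, and the SHA-256 stand-in and every name below are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-ins -- Apple's actual system uses perceptual hashes and
# private set intersection, not plain file hashes held in a Python set.
KNOWN_HASHES: set[str] = set()  # would be loaded from the NCMEC-derived database
MATCH_THRESHOLD = 30            # Apple's stated cutoff before the nonprofit is notified

def hash_photo(path: Path) -> str:
    """Fingerprint a photo's bytes (a stand-in for a perceptual image hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(photo_dir: Path) -> int:
    """Count photos whose fingerprints appear in the known-image hash set."""
    return sum(1 for p in photo_dir.glob("*.jpg") if hash_photo(p) in KNOWN_HASHES)

def should_report(photo_dir: Path) -> bool:
    """A report is triggered only once at least MATCH_THRESHOLD photos match."""
    return count_matches(photo_dir) >= MATCH_THRESHOLD
```

The point of the threshold is that no single match, which could in principle be a rare false positive, can trigger a report on its own.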

Facebook, Google, Twitter, Reddit and other companies analyze images for possible abuse after they are uploaded to their platforms. Apple’s tool, by contrast, will scan photos on personal devices, and that is what has triggered intense resistance.

Computer programmer and writer John Gruber, who runs the Apple blog Daring Fireball, said he was surprised by the public reaction.

“Part of this may be because the processing takes place on your own device,” he said. “And that can violate a sense of personal ownership.”

Apple officials told reporters on Friday that, following criticism from privacy groups, the company will let human rights organizations audit how its photo-scanning system works to make sure the tool is not misused.

“It does not do an analysis of, did you have a picture of your child in the bathtub? Or, for that matter, did you have a picture of pornography of any other sort? It literally only matches the exact fingerprints of specific known child pornography images,” Apple software chief Craig Federighi told The Wall Street Journal.

Yet more than 7,000 developers and security and privacy experts have signed an online petition asking Apple to abandon the plan, calling it a backdoor that threatens the privacy of all users of Apple products.

The petition says the new feature sets “a precedent where our personal devices become a radical new tool for invasive surveillance, with little oversight to prevent eventual abuse and unreasonable expansion of the scope of surveillance.”

The opposition also includes the head of WhatsApp, Facebook’s encrypted messaging service, and Edward Snowden. Even some Apple employees have raised concerns, sparking debate inside the company.

India McKinney of the Electronic Frontier Foundation says the technology is like “putting a bunch of scanners in a black box on your phone.”

She said Apple has bowed to the demands of authoritarian governments before. For example, it sells iPhones without FaceTime in Saudi Arabia because local laws prohibit encrypted calls. The fear is that Apple will make similar concessions with its photo-scanning technology.

Apple said such a request would be refused.

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” Apple said in an FAQ about the feature.

The company says its tool is designed only to detect known images of child sexual abuse. Users can effectively opt out simply by not backing their photos up to iCloud. And the feature will be rolled out only in the United States for now.

The proliferation of images of child sexual abuse is “very overwhelming”

Gruber, who has studied Apple’s system for scanning photos saved to iCloud, concluded that if Apple keeps to the limits it has promised, the system will not compromise user privacy.

“I really believe Apple has carved out a very carefully planned position for itself that I think preserves the privacy that people expect from their Apple devices,” he said.

Still, that hasn’t stopped critics from questioning whether Apple’s “What happens on your iPhone stays on your iPhone” billboards are still accurate.

And if scanning technology is misused or abused, Gruber said, it could be disastrous for Apple.

Ann, meanwhile, watches the debate and thinks of her child, who is now a young adult.

Ann said Apple’s new measures won’t completely keep those images of her child off the internet. But because the images are part of the National Center for Missing & Exploited Children’s photo database, Apple’s system would make it much harder for people to share them.

“I know my child’s images have been identified hundreds of thousands of times, so there are quite a number of them,” she said.

Whenever the National Center for Missing and Exploited Children finds an image, it informs Ann.

And to this day, Ann has said, “It can be very overwhelming.”

Editor’s Note: Apple is one of the financial backers of NPR.


