
StartMag

Does Apple protect children or privacy more?


Apple is caught in a crossfire: on one side are those calling for greater safety for children, who claim that child pornography remains stored on iCloud; on the other are privacy experts, who want the company to maintain the level of confidentiality it has promised. The New York Times article.

In 2021, Apple was embroiled in controversy over a plan to scan iPhones for child pornography. Privacy experts warned that governments could abuse the system, and the backlash was so strong that Apple eventually abandoned the plan, The New York Times writes.

Two years later, Apple is facing criticism from child safety advocates and activist investors who are calling for the company to do more to protect children from online abuse.

A child advocacy group, the Heat Initiative, has raised $2 million for a new national advertising campaign calling on Apple to identify, report and remove child pornography from iCloud, its cloud storage platform.

Next week the group will run digital ads on websites popular with Washington politicians, such as Politico. It will also put up posters in San Francisco and New York that read: "Child pornography is stored on iCloud. Apple allows it."

The criticism touches on a problem that has plagued Apple for years. The company has made privacy protection central to its pitch to consumers. But that promise of security has also helped make its services and devices, two billion of which are in use, useful tools for sharing images of child sexual abuse.

The company is caught between child safety groups, who want it to do more to stop the spread of these materials, and privacy experts, who want it to deliver on its promise of safe devices.

A group of two dozen investors with nearly $1 trillion in assets under management has also called on Apple to publicly disclose the number of abusive images it detects across its devices and services.

Two investors – Degroof Petercam, a Belgian asset manager, and Christian Brothers Investment Services, a Catholic investment firm – will submit a shareholder proposal this month that would require Apple to provide a detailed report on how effective its safety tools are at protecting children.

“Apple seems stuck between privacy and action,” said Matthew Welch, investment specialist at Degroof Petercam. “We thought a proposal would wake up management and make them take the issue more seriously.”

Apple responded quickly to child safety advocates. In early August, its privacy executives met with the investor group, Welch said. Then, on Thursday, the company responded to an email from the Heat Initiative with a letter defending its decision not to scan iCloud. The company shared the correspondence with Wired.

In Apple's letter, Erik Neuenschwander, director of user privacy and child safety, said the company had concluded that it is "practically not possible" to scan iCloud photos without "jeopardizing the security and privacy of our users."

“Scanning one type of content, for example, opens the door to mass surveillance and could create a desire to search other encrypted messaging systems,” Neuenschwander said.

Apple, he added, has created a new default feature for all children's accounts that warns them if they receive or attempt to send nude images. The goal is to prevent the creation of new child pornography and to limit the risk of predators coercing and blackmailing children for money or nude images. It has also made these tools available to app developers.

In 2021, Apple said it would use a technology called image hashing to spot abusive material on iPhones and iCloud.

But the company had not communicated the plan widely to privacy experts, which intensified their skepticism and fueled concerns that governments could abuse the technology, said Alex Stamos, director of the Stanford Internet Observatory at the Cyber Policy Center, who opposed the idea.

Last year, the company quietly dropped its iCloud scanning plan, catching child safety groups by surprise.

Apple has won praise from child privacy and safety groups for its efforts to stem the creation of new nude images on iMessage and other services. But Stamos, who applauded the company's decision not to scan iPhones, said it could do more to prevent people from sharing problematic images in the cloud.

“You can have privacy if you store something for yourself, but if you share something with someone else, you don't have the same privacy,” Stamos said.

Governments around the world are putting pressure on Apple to act. Last year, Australia's eSafety Commissioner released a report criticizing Apple and Microsoft for not doing more to proactively monitor their services for abusive material.

In the United States, Apple made 160 reports in 2021 to the National Center for Missing and Exploited Children, the federally designated clearinghouse for abusive material. Google made 875,783 reports, while Facebook made 22 million. These reports do not always reflect genuinely abusive material; some parents have had their Google accounts suspended and been reported to the police over images of their children that were not criminal in nature.

The Heat Initiative timed its campaign ahead of Apple's annual iPhone reveal, scheduled for September 12. The campaign is led by Sarah Gardner, previously vice president of external affairs at Thorn, a nonprofit founded by Ashton Kutcher and Demi Moore to combat online child sexual abuse. Gardner has raised money from several child safety backers, including the Children's Investment Fund Foundation and the Oak Foundation.

The group has created a website documenting law enforcement cases in which iCloud has been named. The list will include child pornography charges brought against a 55-year-old New York man who had more than 200 images stored on iCloud.

Gardner said the Heat Initiative plans to run targeted advertising throughout the fall in places where Apple customers and employees will encounter it. "The goal is to keep up the tactic until Apple changes its policy," Gardner said.

(Excerpt from the foreign press review edited by eprcomunicazione)


This is a machine translation from Italian of a post published on Start Magazine at the URL https://www.startmag.it/innovazione/apple-protegge-piu-i-bambini-o-la-privacy/ on Sat, 09 Sep 2023 06:06:53 +0000.