
Lawsuit Accuses Apple of Ignoring Child Sexual Abuse Content on iCloud

The Epoch Times

The company stopped using a child sexual abuse material scanning tool, saying it posed a risk to user privacy, the complaint noted.

Apple isn’t doing enough to stop the spread of child sexual abuse material (CSAM) on its iCloud and iMessage services, a plaintiff alleged in a recently filed lawsuit.

The complaint, filed in the U.S. District Court for the Northern District of California on Tuesday, claimed Apple “knew that it had [a] dire CSAM problem but chose not to address it.”

The lawsuit was filed by an unnamed 9-year-old minor through her guardian. Between December 2023 and January 2024, the plaintiff received friend requests from two unknown Snapchat users.

The strangers asked for the plaintiff’s ID on iCloud, Apple’s storage service, and then sent her five CSAM videos through iMessage, the company’s instant messaging service. The videos depicted young children engaged in sexual intercourse. The strangers then asked the minor via iMessage to make explicit videos.

“As a result of this interaction, Plaintiff is severely harmed, mentally and physically. Plaintiff is currently seeking psychotherapy and mental health care,” said the suit.

The Epoch Times reached out to Apple for a comment but didn’t immediately receive a response.

The proposed class-action lawsuit accused Apple of using the pretext of protecting privacy to “look the other way” while CSAM proliferated on iCloud, pointing to the company’s abandonment of its NeuralHash scanning tool.

In 2021, Apple announced NeuralHash, a tool for detecting CSAM on iCloud, but later dropped the project, citing “serious unintended consequences for our users,” the lawsuit said, quoting an Apple email.

In the email, the tech firm warned that such scanning initiatives could lead to bulk surveillance and to concerns among users that they were being screened for political or religious viewpoints, potentially chilling free speech.

The complaint accused the tech company of “engaging in ‘privacy-washing,’ a deceptive marketing tactic where Apple touts its commitment to protect the privacy of its consumers but neglect to meaningfully implement its stated ideas to practice.”

Apple has “consistently” underreported CSAM to agencies like the National Center for Missing & Exploited Children (NCMEC), according to the complaint.

Last year, leading tech companies submitted more than 35 million CSAM reports to NCMEC, while Apple submitted 267.

“Despite Apple’s abundant resources, it opts not to adopt industry standards for CSAM detection and instead shifts the burden and cost of creating a safe user experience to children and their families, as evidenced by its decision to completely abandon its CSAM detection tool,” the complaint states.

The complaint asked Apple to “invest in and deploy means to comprehensively enhance user privacy and guarantee the safety of children users.”

CSAM on Apple Platforms

A September 2023 report by the nonprofit Heat Initiative identified 93 CSAM cases involving Apple products or services, with most victims under 13 years of age, the complaint said. Of the 93 cases, 34 involved the use of iCloud to store and distribute CSAM.

The lawsuit cited a Forbes interview with Jon Rouse, a former child abuse investigator from Australia, in which he claimed Apple does not do any “proactive scanning” of its products and services to assist law enforcement in countering child exploitation.

In Epic Games’ lawsuit against Apple, an Apple employee told a colleague that the company’s strict focus on privacy made it the “greatest platform for distributing child porn,” the lawsuit said, citing messages disclosed in that case.

The lawsuit also questioned Apple’s insistence on protecting user privacy, citing the company’s 2018 transfer of Chinese users’ iCloud operations to GCBD, a Chinese firm.

That move, the complaint argued, effectively negated the privacy of Chinese users in a nation internationally recognized for persecuting dissidents.

Under China’s Cybersecurity Law, GCBD is obligated to provide users’ personal data to Chinese authorities upon request, which contradicts Apple’s claims of prioritizing people’s privacy, the lawsuit said.

Proliferation of CSAM on Tech Platforms

In April, the National Center on Sexual Exploitation (NCOSE) released its “Dirty Dozen List,” detailing companies that allegedly are the “mainstream contributors to sexual exploitation.”

Topping the list were three of the world’s most influential firms—Apple, Microsoft, and Meta.

“Their products and policies are also undoubtedly fueling the child sexual abuse crisis and enabling the proliferation of image-based sexual abuse,” said NCOSE Vice President and Director of Corporate Advocacy Lina Nealon.

“Instead of dedicating the necessary resources to prevent exploitation of both children and adults, they are prioritizing profit and engaging in an AI arms race.”

To counter the spread of CSAM online, Sen. Dick Durbin (D-Ill.) introduced the Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act of 2023 (STOP CSAM Act) in May last year.

The Act seeks to hold tech firms accountable for CSAM by allowing victims of child sexual exploitation to bring a civil cause of action against companies that host, store, or make available such material.

The bill also provides restitution for victims of child exploitation and empowers them to ask tech firms to remove CSAM from their platforms, with administrative penalties for platforms that fail to comply with removal requests.
