The company abandoned a child sexual abuse material scanning tool, saying it posed a risk to user privacy, the complaint noted.
Apple isn’t doing enough to stop the spread of child sexual abuse material (CSAM) on its iCloud and iMessage offerings, a plaintiff alleged in a recently filed lawsuit.
The lawsuit was filed by a 9-year-old unnamed minor through her guardian. Between December 2023 and January 2024, the plaintiff received friend requests from two unknown Snapchat users.
The strangers asked for the plaintiff’s ID on iCloud, Apple’s storage service, and then sent her five CSAM videos through iMessage, the company’s instant messaging service. The videos depicted young children engaged in sexual intercourse. The strangers then asked the minor via iMessage to make explicit videos.
“As a result of this interaction, Plaintiff is severely harmed, mentally and physically. Plaintiff is currently seeking psychotherapy and mental health care,” said the suit.
The Epoch Times reached out to Apple for a comment but didn’t immediately receive a response.
The proposed class-action lawsuit accused Apple of using the pretext of protecting privacy to “look the other way” while child sexual abuse material proliferated on iCloud, pointing to the company’s abandonment of the NeuralHash CSAM scanning tool.
The tech firm has warned that such scanning initiatives could lead to bulk surveillance and to concerns among users that they are being screened for political or religious viewpoints, potentially chilling free speech.
The complaint accused the tech company of “engaging in ‘privacy-washing,’ a deceptive marketing tactic where Apple touts its commitment to protect the privacy of its consumers but neglects to meaningfully implement its stated ideals into practice.”
Apple has “consistently” underreported CSAM to agencies like the National Center for Missing & Exploited Children (NCMEC), according to the complaint.
“Despite Apple’s abundant resources, it opts not to adopt industry standards for CSAM detection and instead shifts the burden and cost of creating a safe user experience to children and their families, as evidenced by its decision to completely abandon its CSAM detection tool,” the complaint states.
CSAM on Apple
A September 2023 report by nonprofit Heat Initiative identified 93 CSAM cases involving Apple products or services, with most victims being under 13 years of age, the complaint said. Out of the 93 cases, 34 involved the use of iCloud to store and distribute CSAM.
The complaint also cited messages revealed during Epic Games’ lawsuit against Apple, in which an Apple employee told a colleague that the company’s strict focus on privacy had made it the “greatest platform for distributing child porn.”
Apple has also agreed to store Chinese users’ iCloud data on servers operated by a state-owned company, an arrangement that essentially negated the privacy of anonymous Chinese citizens in a nation internationally recognized for persecuting dissidents.
Proliferation of CSAM on Tech Platforms
In April, the National Center on Sexual Exploitation (NCOSE) released its “Dirty Dozen List,” naming companies it alleges are “mainstream contributors to sexual exploitation.”
Topping the list were three of the world’s most influential firms—Apple, Microsoft, and Meta.
“Their products and policies are also undoubtedly fueling the child sexual abuse crisis and enabling the proliferation of image-based sexual abuse,” said NCOSE Vice President and Director of Corporate Advocacy Lina Nealon.
“Instead of dedicating the necessary resources to prevent exploitation of both children and adults, they are prioritizing profit and engaging in an AI arms race.”
The Act seeks to hold tech firms accountable for CSAM by allowing victims of child sexual exploitation to bring a civil cause of action against companies that host, store, or make such material available.
The bill also allows restitution for victims of child exploitation and empowers them to ask tech firms to remove CSAM content from their platforms. There would be an administrative penalty for platforms that fail to comply with removal requests.