Apple isn’t doing enough to stop the spread of child sexual abuse material (CSAM) on its iCloud and iMessage services, a plaintiff alleges in a recently filed lawsuit.
The complaint, filed on Tuesday in the U.S. District Court for the Northern District of California, claimed Apple “knew that it had dire CSAM problem but chose not to address it.”
The lawsuit was filed by an unnamed 9-year-old minor through her guardian. Between December 2023 and January 2024, the plaintiff received friend requests from two unknown Snapchat users.
The strangers asked for the plaintiff’s iCloud ID, the account credential for Apple’s cloud storage service, and then sent her five CSAM videos through iMessage, the company’s instant messaging service. The videos depicted young children engaged in sexual intercourse. The strangers then asked the minor, via iMessage, to make explicit videos herself.
“As a result of this interaction, Plaintiff is severely harmed, mentally and physically. Plaintiff is currently seeking psychotherapy and mental health care,” the suit said.