Lawsuit: Apple Fails to Prevent Spread of Child Pornography on iCloud


A 27-year-old woman who was a victim of child sexual abuse is suing Apple for over $1.2 billion, claiming the company failed to implement its own system to detect and remove child pornography from its iCloud service.

The New York Times reports that Apple is facing a major lawsuit that could cost the tech giant over $1.2 billion in damages. The suit, filed over the weekend in the U.S. District Court in Northern California, alleges that Apple failed to protect victims of child sexual abuse by not implementing its own system to identify, remove, and report child pornography, also known by the technical term child sexual abuse material (CSAM), stored on its iCloud service, the company’s popular cloud storage offering.

The plaintiff, a 27-year-old woman from the Northeast who is using a pseudonym due to the sensitive nature of the case, was a victim of child sexual abuse from infancy. The abuse was perpetrated by a relative who took photographs of the acts and shared them online with others. The woman continues to receive law enforcement notices, sometimes dozens per day, informing her that the illegal images have been found in the possession of individuals charged with crimes related to child pornography.

In 2021, Apple unveiled a tool called NeuralHash that would allow the company to scan for known child sexual abuse images by comparing the distinct digital signatures, or hashes, of those images against the hashes of photos stored in a user’s iCloud account. However, the company quickly abandoned the system after facing criticism from cybersecurity experts who warned that it could create a backdoor into iPhones and enable government surveillance.
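For readers unfamiliar with how hash matching works, the minimal Python sketch below illustrates only the general concept: compute a digest of each stored photo and check it against a database of digests of known images. It is not Apple's NeuralHash, which uses a proprietary, neural-network-based perceptual hash designed to survive resizing and re-encoding; the hash values, file paths, and function names here are placeholders for illustration.

```python
import hashlib
from pathlib import Path

# Illustrative sketch only: a plain SHA-256 digest stands in for a perceptual
# hash. Real systems like NeuralHash match images even after cropping or
# re-compression, which an exact cryptographic hash cannot do.

# Hypothetical database of hashes of known illegal images (placeholder value).
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents as a hex string."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def flag_matches(photo_dir: Path) -> list[Path]:
    """Return photos whose digests appear in the known-hash database."""
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if file_hash(photo) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    for match in flag_matches(Path("photos")):
        print(f"Match found: {match}")
```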

The lawsuit argues that by introducing and then abandoning the NeuralHash system, Apple broke its promise to protect victims of child sexual abuse and allowed the illegal material to proliferate on its platform. The suit seeks to change Apple’s practices and compensate a potential group of 2,680 victims who are eligible to be part of the case. Under the law, victims of child sexual abuse are entitled to a minimum of $150,000 in damages, and with those damages typically tripled, a jury finding Apple liable could produce a total award exceeding $1.2 billion.
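A quick back-of-the-envelope check, treating the tripling of statutory damages as an assumption, shows how the reported figures line up:

```python
# Rough check of the damages figure cited in the suit (not a legal calculation).
victims = 2_680            # potential class size reported in the filing
minimum_damages = 150_000  # statutory minimum per victim
multiplier = 3             # assumes damages are tripled, as reported

total = victims * minimum_damages * multiplier
print(f"${total:,}")  # $1,206,000,000 -- just over the $1.2 billion cited
```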

This case is the second of its kind against Apple, following a lawsuit filed in August by a 9-year-old girl in North Carolina who was sent child sexual abuse videos through iCloud links and encouraged to film and upload her own nude videos. Apple has filed a motion to dismiss the North Carolina case, citing Section 230 of the Communications Decency Act, which provides tech companies with legal protection for content posted on their platforms by third parties.

The outcome of these lawsuits could have significant implications for the tech industry, as recent rulings by the U.S. Court of Appeals for the Ninth Circuit have determined that Section 230 shields may only apply to content moderation and do not provide blanket liability protection. This has raised hopes among plaintiffs’ attorneys that tech companies could be challenged in court over their handling of illegal content on their platforms.

Apple has defended its practices, stating that it is committed to fighting the ways predators put children at risk while maintaining the security and privacy of its users. The company has introduced safety tools to curtail the spread of newly created illegal images, such as features in its Messages app that warn children of adult content and allow people to report harmful material to Apple.

However, critics argue that Apple has prioritized privacy and profit over the safety of victims of child sexual abuse. For years, the company has reported significantly less abusive material than its peers, capturing and reporting only a small fraction of what is caught by Google and Facebook.

The plaintiff in the current lawsuit decided to sue Apple because she believes the company gave victims of child sexual abuse false hope by introducing and then abandoning the NeuralHash system. As an iPhone user herself, she feels that Apple chose privacy and profit over the well-being of people like her who have suffered immensely from the spread of illegal images.

Read more at the New York Times here.

Lucas Nolan is a reporter for Breitbart News covering issues of free speech and online censorship.
