US State Sues Apple For ‘Allowing Child Sexual Abuse Materials On iCloud’ For Years


Apple said it offers a feature called Communication Safety that warns children and blurs the image when nudity is detected while receiving or attempting to send content.


The US state of West Virginia on Thursday sued Apple, alleging that the tech giant had allowed child sexual abuse materials to be stored and distributed on its iCloud service.

The lawsuit, filed by the state attorney general's office, claims that Apple prioritised user privacy over child safety for years.

CNN quoted the attorney general's office as arguing that the company has tight control over its hardware, software and cloud infrastructure, which means it cannot claim to be blind to the issue.

The office said that US-based tech companies are federally required to report such detected content to the National Center for Missing and Exploited Children. While Google filed 1.47 million reports in 2023, Apple allegedly filed only 267, CNN reported.


“These images are a permanent record of a child’s trauma, and that child is revictimized every time the material is shared or viewed,” West Virginia Attorney General JB McCuskey said in a news release. “This conduct is despicable, and Apple’s inaction is inexcusable.”

“At Apple, protecting the safety and privacy of our users, especially children, is central to what we do. We are innovating every day to combat ever-evolving threats and maintain the safest, most trusted platform for kids,” an Apple spokesperson said in a comment to CNN.

The company also pointed out that Communication Safety works in apps like Messages and FaceTime, as well as over AirDrop and in the iPhone’s Contact Posters feature and the Photos app image selection tool. The spokesperson added that Apple’s parental controls and features “are designed with the safety, security, and privacy of our users at their core.”

The West Virginia attorney general’s office has sought statutory and punitive damages and injunctive relief, as well as requirements for Apple to implement effective detection measures.

This comes amid increased scrutiny of Big Tech’s influence on children.


Location: Washington D.C., United States of America (USA)

First Published: February 19, 2026, 23:13 IST
