Apple sued over abandoning CSAM detection for iCloud

Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). The lawsuit argues that by not doing more to prevent the spread of this material, Apple is forcing victims to relive their trauma, according to The New York Times. The suit describes […]

© 2024 TechCrunch. All rights reserved. For personal use only.
