Apple keeps quiet about plans to detect known CSAM stored in iCloud Photos

It’s now been more than a year since Apple announced plans for three new child safety features: a system for detecting known Child Sexual Abuse Material (CSAM) images stored in iCloud Photos, an option to blur sexually explicit photos in the Messages app, and child exploitation resources for Siri. The latter two features are now available, but Apple remains silent about its plans for the CSAM detection feature.

Apple initially said that CSAM detection would be implemented in an update to iOS 15 and iPadOS 15 by the end of 2021, but the company ultimately postponed the feature based on “feedback from customers, advocacy groups, researchers, and others.”

In September 2021, Apple posted the following update to its Child Safety page:

Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.

In December 2021, Apple removed the above update and all references to its CSAM detection plans from its Child Safety page, but an Apple spokesperson told The Verge that Apple’s plans for the feature had not changed. To the best of our knowledge, however, Apple has not publicly commented on the plans since that time.

We’ve reached out to Apple to ask if this feature is still planned. Apple did not immediately respond to a request for comment.

Apple did move forward with its child safety features for the Messages app and Siri with the release of iOS 15.2 and other software updates in December 2021, and it expanded the Messages app feature to Australia, Canada, New Zealand, and the UK with iOS 15.5 and other software releases in May 2022.

Apple said its CSAM detection system was “designed with user privacy in mind.” The system would perform “on-device matching using a database of known CSAM image hashes” from child protection organizations, which Apple would “convert to an unreadable set of hashes that is stored securely on users’ devices.”
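To make that description more concrete, here is a minimal sketch in Swift of on-device matching against a set of known image hashes. It is an illustration only, not Apple’s implementation: it substitutes an ordinary cryptographic hash (SHA-256 from CryptoKit) for the perceptual NeuralHash Apple described, and a plain, readable set of hex strings for the blinded database. The function names and database loader are assumptions.

```swift
import Foundation
import CryptoKit

// Simplified, hypothetical sketch only. Apple's described design uses a
// perceptual hash (NeuralHash) plus a blinded database and a private set
// intersection protocol; none of that is reproduced here.

/// Stand-in for a perceptual image hash. A real system would need a hash
/// that survives resizing and re-encoding, which SHA-256 does not.
func imageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Hypothetical loader for the on-device database of known hashes
/// (in Apple's description these are stored in an unreadable, blinded form).
func loadKnownHashDatabase() -> Set<String> {
    // Placeholder: a real database would ship with the operating system.
    []
}

let knownHashes = loadKnownHashDatabase()

/// Returns true if the image's hash appears in the known-hash database.
func matchesKnownImage(_ imageData: Data) -> Bool {
    knownHashes.contains(imageHash(imageData))
}
```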

Apple planned to report iCloud accounts containing known CSAM image hashes to the National Center for Missing and Exploited Children (NCMEC), a non-profit organization that works in collaboration with US law enforcement agencies. Apple said there would be a “threshold” to ensure a “less than one in a trillion chance per year” of an account being incorrectly flagged by the system, along with human review of flagged accounts.
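As a rough illustration of the threshold idea, the sketch below counts hash matches per account and only surfaces an account for human review once the count passes a cutoff. The type names, function, and threshold value are hypothetical; Apple never published an implementation, and the “one in a trillion” figure came from its own statistical analysis, which is not modeled here.

```swift
// Minimal sketch of threshold-based flagging; all names and the threshold
// value are hypothetical, not Apple's implementation.
struct AccountMatchState {
    var matchCount = 0
}

/// Hypothetical cutoff; Apple did not publish a final value.
let reportThreshold = 30

/// An account is surfaced for human review only after enough matches
/// accumulate, which keeps a single false positive from triggering a report.
func shouldFlagForReview(_ state: AccountMatchState) -> Bool {
    state.matchCount >= reportThreshold
}
```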

Apple’s plans were criticized by a wide range of individuals and organizations, including security researchers, the Electronic Frontier Foundation (EFF), politicians, policy groups, university researchers, and even some Apple employees.

Some critics argued that Apple’s child protection features could create a “backdoor” into devices that governments or law enforcement agencies could use to surveil users. Another concern was false positives, including the possibility of someone intentionally adding CSAM imagery to another person’s iCloud account to get their account flagged.

Note: Due to the political or social nature of the discussion on this topic, the discussion thread is located in our Political News forum. All forum members and site visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.
