Apple Commits to Mitigating Child Abuse Imagery

Apple Inc. has announced that new technology will be deployed in its operating systems to detect images related to child abuse. The company said the features will be introduced within iOS, macOS, watchOS, and iMessage to protect children from potential digital exposure. With them, Apple aims to support law enforcement in child abuse cases while protecting users’ privacy.

The new child-safety features target CSAM, or Child Sexual Abuse Material. One adds filters to iMessage that block potentially sexually explicit photos sent to or received by a user’s account. Another intervenes when a user tries to search for CSAM-related keywords with Siri and Search.

A spokesperson for Apple Inc. said that the new CSAM detection technology operates on a user’s device. It flags users who upload child abuse imagery to iCloud without decrypting those images. To preserve data privacy, Apple can verify the content only after an account crosses a threshold of matching uploads.
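The threshold mechanism can be sketched roughly as follows. This is a simplified illustration, not Apple’s actual implementation: the class name and the threshold value are placeholder assumptions, and Apple’s real design reportedly uses cryptographic threshold secret sharing so that matches below the threshold cannot even be decrypted.

```python
MATCH_THRESHOLD = 30  # placeholder value; the real threshold is Apple's

class AccountMatchCounter:
    """Toy model: counts hash matches per account; content becomes
    reviewable only after the match count crosses the threshold."""

    def __init__(self, threshold: int = MATCH_THRESHOLD):
        self.threshold = threshold
        self.matches = 0

    def record_match(self) -> bool:
        """Record one matching upload; return True once reviewable."""
        self.matches += 1
        return self.matches >= self.threshold

counter = AccountMatchCounter(threshold=3)
print([counter.record_match() for _ in range(3)])  # [False, False, True]
```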

To get an early start on the project, Apple Inc. has already confirmed that new versions of iOS and iPadOS will roll out the latest cryptography-based features to help limit the spread of CSAM online.

Like Gmail, Dropbox, and other renowned cloud and email providers, Apple already uses hash systems to scan for child abuse imagery sent over email. The announced CSAM feature will apply similar scanning techniques to images stored in iCloud Photos, even when the photos are not shared between users.
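In simplified terms, hash-based scanning compares a fingerprint of each file against a list of known fingerprints. The sketch below uses a plain SHA-256 digest and a made-up hash set purely for illustration; real providers match against curated databases (such as NCMEC’s) and typically use perceptual rather than cryptographic hashes:

```python
import hashlib

# Hypothetical set of known-bad hashes, built here from placeholder bytes.
KNOWN_HASHES = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def matches_known_hash(image_bytes: bytes) -> bool:
    """Return True if the file's SHA-256 digest appears in the known set."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_hash(b"known-bad-image-bytes"))  # True
print(matches_known_hash(b"harmless-photo-bytes"))   # False
```

A cryptographic hash like SHA-256 changes completely if even one byte of the image changes, which is exactly the limitation perceptual hashes are designed to overcome.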

Matthew Green, a cryptography professor at Johns Hopkins University, had already hinted at the new feature in his Twitter threads. Some users are excited, while others are skeptical about their data privacy. Apple is trying to calm those concerns by explaining the layers of encryption involved, clarifying that users’ data remains safe and secure.

Additionally, Apple clarified on its “Child Safety” webpage that no scanning occurs before an image is backed up to iCloud, so users need not have second thoughts about data privacy and security. According to those statements, scanning takes place only once an image is stored in iCloud.

Apple also explained the NeuralHash technology that works behind the CSAM feature. Introduced in iOS 15 and macOS Monterey, it converts images on users’ Apple devices into unique alphanumeric strings, known as hashes. With an ordinary cryptographic hash, modifying an image stored in the cloud would change the hash and prevent matching; NeuralHash is therefore designed to produce the same hash for an image that has been cropped or edited, or for a visually similar image.
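NeuralHash itself is a proprietary, neural-network-based perceptual hash, but the general idea, that visually similar images should produce the same fingerprint, can be illustrated with a toy “average hash” over raw grayscale pixel values:

```python
def average_hash(pixels):
    """Toy perceptual hash: each pixel becomes 1 if brighter than the
    image mean, else 0. Visually similar images yield similar bit
    strings, unlike cryptographic hashes. (NeuralHash uses a neural
    network; this simple aHash only illustrates the general idea.)"""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [220, 15]]
# Slightly brightened copy: the pattern of light and dark is unchanged,
# so the perceptual hash is identical.
edited = [[14, 205], [225, 20]]

print(average_hash(original))                     # 0110
print(average_hash(original) == average_hash(edited))  # True
```

In a real system, candidate hashes are compared by distance (as in `hamming_distance`) rather than exact equality, so small edits still match.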

CSAM detection is planned to roll out first in the US; a timeline for introducing it across the globe has not yet been announced. Apple said the feature is technically optional for users who do not use iCloud for photo storage, but it is required when they do. The tech giant presents this as part of its corporate social responsibility and an effort to make the world a better place to live.

