
On Thursday, tech giant Apple announced that it will implement a new system to scan iPhones in the United States for child abuse images, examining photos before they are uploaded to its iCloud storage service to ensure they do not match known child abuse imagery. The feature is part of a set of new technologies Apple is introducing to limit the spread of Child Sexual Abuse Material (CSAM) across its platforms and services.

The company will be able to identify known CSAM images on its mobile devices, such as the iPhone and iPad, and in photos uploaded to iCloud, while, it says, still protecting user privacy. However, some privacy advocates warned that the system could open the door to monitoring of political speech or other sensitive material on iPhones. A separate new Messages feature is intended to help parents play a more active and informed role in supporting their children as they learn to navigate online communication.

Apple will not invade users’ privacy

Through a software update arriving later this year, Messages will use on-device machine learning to examine image attachments and determine whether a photo being shared is sexually explicit. The technology does not require Apple to access or view the child’s private communications, as all processing happens on the device; nothing is sent back to Apple’s servers in the cloud.
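To make the idea concrete, here is a minimal sketch of how an on-device image check of this kind might look, using Apple’s public Core ML and Vision frameworks. Everything beyond those APIs is an assumption: Apple has not published its model, and the model name, label, and confidence threshold below are hypothetical.

```swift
import CoreML
import Vision

// Illustrative sketch only. "SensitiveImageClassifier" and the
// "sensitive" label are hypothetical stand-ins, not Apple's model.
func isImageSensitive(_ image: CGImage, modelURL: URL) throws -> Bool {
    // Load a compiled Core ML model bundled with the app (on-device).
    let coreMLModel = try MLModel(contentsOf: modelURL)
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    var flagged = false
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        // The confidence threshold is illustrative.
        flagged = (top.identifier == "sensitive" && top.confidence > 0.9)
    }

    // Vision performs the request locally; the image never leaves
    // the device and nothing is sent to a server.
    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
    return flagged
}
```

The key design point this illustrates is the one Apple emphasized: the classification runs entirely through the local request handler, so no image data has to reach the cloud for the check to happen.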


“Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes,” Apple said. The company claimed the system had an “extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account”. 
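Apple’s actual pipeline pairs a perceptual hash (“NeuralHash”) with cryptographic techniques such as private set intersection, so no party learns the outcome of a single match prematurely. The sketch below strips all of that away and shows only the basic shape of the before-upload check; the type names and the threshold logic are illustrative assumptions, not Apple’s implementation.

```swift
import Foundation

// Simplified sketch of the on-device matching step. A real perceptual
// hash function would stand in for the plain Data values used here.
struct CSAMMatcher {
    /// Hashes of known CSAM, provided by child-safety organizations
    /// and shipped to the device (Apple ships these in blinded form).
    let knownHashes: Set<Data>

    /// Runs before a photo is uploaded to iCloud Photos.
    func matches(imageHash: Data) -> Bool {
        knownHashes.contains(imageHash)
    }

    /// Apple described flagging an account only after a threshold
    /// number of matches; the threshold value itself is illustrative.
    func shouldFlagAccount(matchCount: Int, threshold: Int) -> Bool {
        matchCount >= threshold
    }
}
```

The threshold step reflects the accuracy claim in Apple’s statement: a single match does not flag an account; only an accumulation of matches does.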

Using Siri to curb the spread of child abuse images

In addition, Apple is updating how Siri and Search respond to queries concerning child abuse images. Under the new system, these tools “will explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue.”

Source: Reuters
