Apple has released technical details of a system that will scan iPhones for child sexual abuse images, a step the company describes as a safety measure for children.
Apple said in an announcement,
“It is launching new software later this year that will scan iPhone and iPad photos for sexually explicit images of children and report any relevant findings to authorities. Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.”
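Stripped of the cryptography, the matching step Apple describes amounts to computing a perceptual hash of each photo on the device and checking it against a locally stored set of known hashes. The Swift sketch below is a minimal illustration of that flow only; every name in it (PerceptualHash, knownHashes, perceptualHash) is hypothetical, and the byte-level hash is a stand-in for NeuralHash, which Apple has not published.

```swift
import Foundation

// Hypothetical 64-bit stand-in for a NeuralHash value.
typealias PerceptualHash = UInt64

// Stand-in for the database of known CSAM hashes that Apple says is derived
// from NCMEC data and stored on the device in an unreadable (blinded) form.
let knownHashes: Set<PerceptualHash> = [0x1F3A9C0277B4D5E8, 0x0042AB19EE3051C7]

// Stand-in hash function (FNV-1a over the raw bytes). The real NeuralHash is a
// neural-network-based perceptual hash, not a byte-level checksum.
func perceptualHash(of imageData: Data) -> PerceptualHash {
    var hash: UInt64 = 0xcbf29ce484222325
    for byte in imageData {
        hash = (hash ^ UInt64(byte)) &* 0x100000001b3
    }
    return hash
}

// Simplified membership check. In Apple's design the comparison happens
// cryptographically, so neither the device nor Apple learns the result for
// photos that do not match the database.
func matchesKnownCSAM(_ imageData: Data) -> Bool {
    knownHashes.contains(perceptualHash(of: imageData))
}
```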
Apple has provided additional details of the CSAM detection system in a technical summary. The system uses a matching threshold that is set to provide an extremely high level of accuracy, ensuring less than a one in one trillion chance per year of incorrectly flagging a given account.
The changes will roll out later this year in updates to iOS and Apple’s other operating systems. Apple will also deploy software that can analyze images in the Messages app, a new system that will “warn children and their parents when receiving or sending sexually explicit photos.”
For years, Apple has resisted pressure from the United States government to install a backdoor in its encryption systems. Apple has been lauded by security experts for this stance.
Apple further said,
“Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.”
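As a rough sketch of the flow that statement describes, the snippet below assumes a hypothetical local classifier (isSexuallyExplicit) and simply shows where the on-device decision and warning would sit. It is not Apple’s implementation, and nothing in it leaves the device.

```swift
import Foundation

struct MessageAttachment {
    let imageData: Data
}

// Hypothetical stand-in for the on-device machine-learning classifier; Apple
// has not published the model or an API for it.
func isSexuallyExplicit(_ attachment: MessageAttachment) -> Bool {
    // A trained model would be evaluated locally here.
    return false
}

// The key property described above: the decision and the warning both happen
// on the device, and no message content is sent to Apple.
func screenIncomingAttachment(_ attachment: MessageAttachment, childAccount: Bool) {
    guard childAccount, isSexuallyExplicit(attachment) else { return }
    print("Image blurred; warning shown to the child, with an option to notify parents.")
}
```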
Apple’s technical summary on CSAM detection includes a few privacy promises in the introduction. Apple does not learn anything about images that do not match the known CSAM database.
Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.
Apple’s hashing technology, called NeuralHash, analyzes an image and converts it to a unique number specific to that image.
Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value.
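To make the “nearly identical images, same number” idea concrete, the toy “average hash” below computes a 64-bit value from an 8x8 grid of grayscale intensities; mild changes such as resizing or re-encoding barely move the per-cell averages, so the bits, and therefore the hash, tend to stay the same. This is a classic perceptual-hash illustration, not NeuralHash, which is neural-network based and unpublished.

```swift
// Toy perceptual hash over an 8x8 grid of grayscale values in [0, 1].
// Each bit records whether a cell is brighter than the image's mean, so
// small pixel-level changes rarely flip bits or alter the final hash.
func averageHash(_ grid: [[Double]]) -> UInt64 {
    let values = Array(grid.joined())
    let mean = values.reduce(0, +) / Double(values.count)
    var hash: UInt64 = 0
    for (index, value) in values.prefix(64).enumerated() where value > mean {
        hash |= UInt64(1) << index
    }
    return hash
}
```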
As part of new safeguards involving children, the company also announced a feature that will analyze photos sent to or received by children in the Messages app and determine whether they are explicit.
Apple also is adding features in its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.
If Apple detects a threshold of child abuse images in a user’s account, the instances will be manually reviewed by the company and reported to the National Center for Missing and Exploited Children, or NCMEC, which works with law enforcement agencies.
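The thresholding step can be pictured as nothing more than a per-account counter that gates human review, as in the sketch below. The threshold value is made up for illustration, and the real system uses cryptographic safeguards so that Apple cannot inspect any match details until the threshold is crossed.

```swift
// Illustrative per-account state; the threshold value here is hypothetical.
struct AccountMatchState {
    static let reviewThreshold = 25
    private(set) var matchCount = 0

    // Returns true once enough matches have accumulated to justify escalation.
    mutating func recordMatch() -> Bool {
        matchCount += 1
        return matchCount >= Self.reviewThreshold
    }
}

var state = AccountMatchState()
for _ in 0..<30 {
    if state.recordMatch() {
        // Per the article, only at this point would Apple manually review the
        // flagged instances and, if confirmed, report them to NCMEC.
        print("Threshold exceeded after \(state.matchCount) matches: escalate to human review.")
        break
    }
}
```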
Apple said it will detect abusive images by comparing photos with a database of known Child Sexual Abuse Material, or CSAM, provided by the NCMEC. The company is using a technology called NeuralHash that analyzes images and converts them to a hash key, or unique set of numbers. That key is then compared with the database using cryptography, and Apple said the process ensures it cannot learn about images that don’t match the database.
Apple says the system has an error rate of less than one in one trillion per year and is designed to protect user privacy: the company only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account, and even in those cases it only learns about images that match known CSAM.
The system will scan iPhones for child abuse images as part of Apple’s broader effort to implement safeguards for children’s safety.
Sources
https://www.kron4.com/news/apple-to-scan-u-s-iphones-for-images-of-child-sex-abuse/
https://www.bbc.com/news/technology-58109748
https://www.aljazeera.com/economy/2021/8/5/apple-to-scan-iphones-ipads-for-images-of-child-sex-abuse