Technology Company Apple to test iPhones for child abuse images

December 19, 2022

Apple has released technical details about a system that will scan iPhones for child abuse images. The step is intended as a safety measure for children.

Apple said in an announcement:

“It is launching new software later this year that will scan iPhone and iPad photos for sexually explicit images of children and report any relevant findings to authorities. Apple’s method of detecting known CSAM (child sexual abuse material) is designed with user privacy in mind. Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC (National Center for Missing and Exploited Children) and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that are securely stored on users’ devices.”

Apple has provided additional details of the CSAM detection system in a technical summary. The system uses a threshold that is set to provide an extremely high level of accuracy, ensuring less than a one-in-one-trillion chance per year of incorrectly flagging a given account.

The changes will roll out later this year in updates to iOS. Apple will also deploy software that can analyze images in the Messages application as part of a new system that will “warn children and their parents when receiving or sending sexually explicit photos.”

For years, Apple has resisted pressure from the United States government to install a backdoor in its encryption systems. Apple has been lauded by security experts for this stance.

Apple further said,

“Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages.”

Apple’s technical summary on CSAM detection includes a few privacy promises in the introduction. Apple does not learn anything about images that do not match the known CSAM database.

Apple can’t access metadata or visual derivatives for matched CSAM images until a threshold of matches is exceeded for an iCloud Photos account.

Apple’s hashing technology is called NeuralHash and it analyzes an image and converts it to a unique number specific to that image.

Only another image that appears nearly identical can produce the same number; for example, images that differ in size or transcoded quality will still have the same NeuralHash value.
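The general idea behind such perceptual hashing can be sketched with a toy "average hash": each bit records whether a pixel is brighter than the image's mean, so small re-encoding noise leaves the fingerprint unchanged while different content produces a different one. This is only an illustration of the concept; NeuralHash itself uses a neural network, and none of the names below are Apple's.

```python
def average_hash(pixels):
    """Hash an 8x8 grayscale image: each bit is 1 if the pixel is
    brighter than the image's mean, giving a 64-bit fingerprint."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Original "image": bright top half, dark bottom half.
img = [[200] * 8 for _ in range(4)] + [[30] * 8 for _ in range(4)]

# A lightly re-encoded copy: small uniform shift, as from compression.
noisy = [[p + 3 for p in row] for row in img]

# A genuinely different image: dark top, bright bottom.
other = [[30] * 8 for _ in range(4)] + [[200] * 8 for _ in range(4)]

assert average_hash(img) == average_hash(noisy)   # survives small changes
assert average_hash(img) != average_hash(other)   # distinguishes content
```

Unlike a cryptographic hash, where any changed pixel flips the output completely, a perceptual hash is deliberately stable under resizing and recompression, which is what the article's "nearly identical images produce the same number" property refers to.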

As part of new safeguards involving children, the company also announced a feature that will analyze photos sent and received in the Messages app to or from children to see if they are explicit.

Apple also is adding features in its Siri digital voice assistant to intervene when users search for related abusive material. The Cupertino, California-based technology giant previewed the three new features on Thursday and said they would be put into use later in 2021.

If Apple detects a threshold of child abuse images in a user’s account, the instances will be manually reviewed by the company and reported to the National Center for Missing and Exploited Children, or NCMEC, which works with law enforcement agencies.

Apple said:

“We will detect abusive images by comparing photos with a database of known Child Sexual Abuse Material, or CSAM, provided by the NCMEC. The company is using a technology called NeuralHash that analyzes images and converts them to a hash key or unique set of numbers. That key is then compared with the database using cryptography. Apple said the process ensures it can’t learn about images that don’t match the database.”

The system has an error rate of less than one in one trillion per year, and it protects user privacy: Apple only learns about users’ photos if they have a collection of known CSAM in their iCloud Photos account.
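The reporting logic described above, matching photo hashes against a known database and flagging an account only past a match threshold, can be sketched as follows. This is a simplified illustration; the real system uses private set intersection and threshold secret sharing so that the device never learns which images matched, and the hash values and threshold here are stand-ins, not Apple's.

```python
# Stand-in database of known CSAM hashes (illustrative values only).
KNOWN_HASHES = {0xDEADBEEF, 0xCAFEF00D, 0x12345678}

# Number of matches required before an account is flagged for review.
THRESHOLD = 3

def matches_for_account(photo_hashes):
    """Count how many of an account's photo hashes appear in the database."""
    return sum(1 for h in photo_hashes if h in KNOWN_HASHES)

def should_flag(photo_hashes):
    # Nothing is reported, and no match metadata is accessible,
    # until the match count reaches the threshold.
    return matches_for_account(photo_hashes) >= THRESHOLD
```

The threshold is what backs the one-in-one-trillion claim: a single false match on one photo is not enough to expose anything about an account, since multiple independent matches must accumulate first.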

Even in these cases, Apple only learns about images that match known CSAM.

The system will scan iPhones for child abuse images as part of Apple’s broader measures to safeguard children’s security.

 

Sources

https://www.kron4.com/news/apple-to-scan-u-s-iphones-for-images-of-child-sex-abuse/

https://www.bbc.com/news/technology-58109748

https://www.aljazeera.com/economy/2021/8/5/apple-to-scan-iphones-ipads-for-images-of-child-sex-abuse

https://www.forbes.com/sites/kimberleespeakman/2021/08/05/apple-will-scan-iphones-for-child-sexual-abuse-images/
