Apple on Thursday said that iPhones and iPads will soon start detecting images containing child sexual abuse and reporting them as they are uploaded to iCloud.
The software tweak to Apple's operating systems will monitor pictures, allowing Apple to report findings to the National Center for Missing and Exploited Children, according to a statement by the Silicon Valley-based tech giant.
"We want to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of child sexual abuse material (CSAM)," Apple said.
The new technology will allow the phones' operating systems to match abusive photos on a user's phone against a database of known CSAM images provided by child safety organizations, then flag the images as they are uploaded to iCloud, Apple said.
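In essence, the matching step described above checks whether an image's fingerprint appears in a database of fingerprints of known abusive images. The sketch below is a deliberately simplified illustration using a plain SHA-256 digest and an in-memory set; Apple's actual system is reported to use a perceptual hash ("NeuralHash") and cryptographic private set intersection, neither of which is shown here, and the database values are placeholders.

```python
import hashlib

# Placeholder database of known-image hashes (hypothetical values).
# The real system would hold perceptual hashes supplied by child
# safety organizations, not plain SHA-256 digests.
KNOWN_IMAGE_HASHES = {
    "5891b5b522d5df086d0ff0b110fbd9d21bb4fc7163af34d08286a2e846f6be03",
}

def should_flag(image_bytes: bytes) -> bool:
    """Return True if this image's digest matches the known-image database."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES
```

A matching digest would flag the image at upload time; a non-matching digest would leave it untouched. Note that an exact cryptographic hash, unlike the perceptual hash Apple reportedly uses, fails to match if even one byte of the image changes.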
The feature is part of a series of tools heading to Apple mobile devices, according to the company.
Apple's iPhone messaging app will additionally use machine learning to recognize sexually explicit photos and warn children and their parents when such photos are received or sent, the company said in the statement.
And the personal assistant Siri will be taught to "intervene" when users try to search for topics related to child sexual abuse, according to the company.