Kids Can Now Report Unwanted Photos To Apple. Is It A Good Idea?


Washington Post

Apple is trying a new approach to protecting children, with a stepped-up feature starting in Australia and eventually available around the world.

Apple will now let kids notify the company about inappropriate images they receive in the built-in iPhone messaging app. And to identify potential abusers, Apple will peer at those chat messages - a new step for a company that stresses the privacy of messages.

Technology companies with many millions of young users have a long history of introducing children's safety and parental oversight features that shift responsibility from the companies to parents, and that often prove ineffective or little used.

Now the question is whether Apple, which has been criticized for looking the other way as its technologies are used for child abuse, can satisfy kids and parents who feel overwhelmed and alarmed by the online risks to young people.

What Apple's new child protection feature will do

Right now, if a child with an iPhone receives images that the phone's on-device detection system believes are sexually explicit, the photo or video can be automatically blurred. A pop-up notice on screen encourages the child to tell an adult they trust or to find other ways to get help.

What Apple is starting to do in Australia, to comply with regulations in that country, goes one step further. If a child receives an unwanted nude image from another iPhone user in Apple's Messages app, they can also click to notify Apple about it.

This reporting feature isn't available if the message was sent from an Android phone, Apple said.

Apple would then review the chat message, the sender's details and other information, and notify law enforcement if warranted. (Companies have mistakenly flagged images to law enforcement before.)

This new option, which Apple said it plans to expand globally, is largely catching up to other apps like Instagram and Meta Messenger, which have options for kids (and adults) to report unsolicited images from private chats. Unlike Apple, other apps also let you report more types of unwanted interactions, such as bullying or violent threats.

Apple's new reporting for unwanted nudes is “an important step forward for Apple, but this really should have been done years ago,” said Sarah Gardner, CEO of Heat Initiative, an organization that pressures Apple for stronger child safety measures.

Gardner acknowledged that Apple is putting a lot of responsibility on children to flag unwanted images. But she also said that children are more likely to report uncomfortable interactions to an app than to a person in their life.

Kate Sim, who leads a children's online safety and privacy program at the University of Western Australia's Tech & Policy Lab, said features that let young people block, mute or report unwanted interactions can be helpful and empowering, but those options tend to be hard to find or poorly designed for children's needs. Sim is concerned that Apple's child safety features have some of those problems.

Apple said it designed a feature that's easy to use and appropriate for children.

Apple's track record on child safety

The abhorrent reality is that most technologies are overrun with child sexual abuse imagery, sometimes called child pornography. And children are regularly the targets of online sexual solicitations and of extortionists who trick them into sending nudes for blackmail.

Advocates for children say that technology companies aren't doing enough to stop any of it. But Apple has been a particular target among both children's advocacy groups and Silicon Valley technologists, who say that the company has largely evaded responsibility for children's safety.

Other companies, for example, have methods to scan people's encrypted digital photos and files for child abuse images and report them to the National Center for Missing and Exploited Children, as legally required. Apple doesn't do that, citing privacy and security concerns.

Law enforcement officials have said Apple's lack of action makes illegal child abuse material harder to find. Apple is “purposefully not looking,” Gardner said.

Apple didn't have a comment.

With this new nude reporting feature, Apple risks both not going far enough to satisfy children's advocates like Gardner, and going too far for security and privacy watchdogs. They fear a repeat of Apple's 2021 plan to scan everyone's iPhones to hunt for illegal child abuse material.

Apple scrapped that idea after privacy and security experts said it went too far in invading people's privacy.

Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project, said that Apple's new child safety feature seems fine, although he doubts it will successfully help kids.

But he worries that once a company like Apple starts to monitor secure messages and private files with good intentions, it can open the door to more aggressively monitoring everyone's private messages, photos and files.

Apple has previously expressed those concerns to explain that it can't identify inappropriate or illegal child abuse material in its iCloud file storage technology without undermining everyone's privacy and security.

One tiny win

The feature that Apple calls “communication safety” is turned on automatically on newer Apple devices for children under 13 who are signed into an Apple account and are part of a family sharing group.

You can also turn it on for older children.

On an iPhone, go to the Settings app, which looks like a mechanical gear. Then select Screen Time and tap the name of the child in your family group. Select the Communication Safety option and tap the toggle so it turns green.

Right now, the communication safety features will detect images that appear to contain nudity and blur them. Kids can block the sender, message a trusted adult and find other resources to help them.

If a teen chooses to view or send an inappropriate image, Apple shows messages to encourage them to think twice and suggests alternatives. Younger children are prompted to enter a passcode.
