Tuesday, 02 January 2024 12:17 GMT

Australian Online Safety Commissioner Says Tech Giants Failing To Tackle Child Abuse


(MENAFN - IANS) Canberra, Aug 6 (IANS) The Australian government's Online Safety Commissioner has criticised technology giants for failing to prevent child sexual abuse.

In a report published on Wednesday, the federal government's eSafety Commissioner said that global technology giants have failed to prioritise the protection of children by leaving "significant gaps" in their efforts to combat the spread of abuse content on their platforms, Xinhua News Agency reported.

The report singled out Apple's services and YouTube for failing to track the number of user reports of child sexual abuse content they received on their platforms, and for not disclosing how long they took to respond to those reports.

The eSafety Commissioner in July 2024 issued transparency notices to Apple, Google, Meta, Microsoft, Discord, WhatsApp, Snap and Skype, requiring each company to report every six months for two years on how they are tackling Child Sexual Exploitation or Abuse (CSEA) material.

Wednesday's report said that none of the eight companies were using tools to detect CSEA livestreaming on all of their services.

It said that Apple, Google, Microsoft and Discord do not use hash matching to detect known CSEA material on all parts of their services and that Apple, Google and WhatsApp do not block URL links to known CSEA material on any part of their services.

The report said that minimal progress had been made by "some of the most well-resourced companies in the world", despite previous reports published in 2022 and 2023 showing that not enough was being done to protect children on their services.

"It shows that when left to their own devices, these companies aren't prioritising the protection of children and are seemingly turning a blind eye to crimes occurring on their services," eSafety Commissioner Julie Inman Grant said in a statement.

"No other consumer-facing industry would be given the licence to operate by enabling such heinous crimes against children on their premises, or services."

The report noted some improvements, including Discord and Snap deploying language analysis tools to detect grooming.



