(MENAFN - Khaleej Times) Australia's online safety watchdog said on Monday it had fined X - the social media platform formerly known as Twitter - 610,500 Australian dollars ($385,000) for failing to fully explain how it tackled child sexual exploitation content.
Australia's eSafety Commission describes itself as the world's first government agency dedicated to keeping people safe online.
The commission issued legal transparency notices early this year to X and other platforms questioning what they were doing to tackle a proliferation of child sexual exploitation, sexual extortion and the livestreaming of child sexual abuse.
eSafety Commissioner Julie Inman Grant said X and Google had not complied with the notices because both companies had failed to adequately respond to a number of questions.
The platform, renamed X by its new owner Elon Musk, was the worst offender, providing no answers to some questions, including how many staff had remained on the trust and safety team responsible for preventing harmful and illegal content since Musk took over, Inman Grant said.
“I think there's a degree of defiance there,” Inman Grant said.
“If you've got a basic HR (human resources) system or payroll, you'll know how many people are on each team,” she added.
X did not immediately respond to a request for comment.
After Musk completed his acquisition of the company in October last year, he drastically cut costs and shed thousands of jobs.
X could challenge the fine in the Australian Federal Court. But the court could impose a fine of up to AU$780,000 ($493,402) per day, dating back to March, when the commission first found the platform had not complied with the transparency notice.
The commission would continue to pressure X through notices to become more transparent, Inman Grant said.
“They can keep stonewalling and we'll keep fining them,” she said.
The commission issued Google with a formal warning for providing “generic responses to specific questions,” a statement said.
Google regional director Lucinda Longcroft said the company had developed a range of technologies to proactively detect, remove and report child sexual abuse material.
“Protecting children on our platforms is the most important work we do,” Longcroft said in a statement. “Since our earliest days we have invested heavily in the industry-wide fight to stop the spread of child sexual abuse material,” she added.