Elon Musk's X Fined By Australia's Regulator Over Child Abuse Content

Elon Musk’s social media platform X, formerly known as Twitter, was fined A$610,500 (over US$380,000) by Australia’s eSafety Commissioner for failing to adequately respond to key questions about how it is tackling child abuse content.

Australia’s independent regulator, in its second report, highlighted serious shortfalls in how companies including Twitter, Google, TikTok, Twitch, and Discord detect, remove, and prevent child sexual abuse material. These major companies are not taking responsibility for tackling the proliferation of child sexual exploitation, sexual extortion, and the livestreaming of child sexual abuse, the regulator said.

In February, eSafety issued legal notices under Australia’s Online Safety Act to Twitter, which was subsequently rebranded as X, along with Google, TikTok, Twitch, and Discord.

Two of the providers, Twitter/X and Google, did not comply with their notices, with both companies failing to adequately respond to a number of questions.

The Commissioner found Twitter/X’s non-compliance to be the more serious, with the company providing no response to certain questions and leaving some sections entirely blank. For other questions, it provided incomplete and/or inaccurate responses.

Key questions went unanswered, including how long the company takes to respond to reports of child sexual exploitation; what measures it has in place to detect child sexual exploitation in livestreams; and which tools and technologies it uses to detect child sexual exploitation material.

The company also failed to properly answer questions about the number of safety and public policy staff still employed at Twitter/X following the October 2022 acquisition and subsequent job cuts.

Twitter/X now has 28 days to request the withdrawal of the infringement notice or to pay the penalty.

The report noted that in the three months after Twitter/X’s change in ownership in October 2022, the proactive detection of child sexual exploitation material fell from 90% to 75%, though the company said the rate had subsequently improved in 2023.

Google, meanwhile, was issued a formal warning notifying it of its failure to comply, after the company provided a number of generic responses to specific questions and aggregated information when asked about specific services.

eSafety Commissioner Julie Inman Grant noted that Twitter/X has publicly stated that tackling child sexual exploitation is the company’s number one priority, but said this cannot be empty talk and must be matched by tangible action.

Inman Grant said, “If Twitter/X and Google can’t come up with answers to key questions about how they are tackling child sexual exploitation they either don’t want to answer for how it might be perceived publicly or they need better systems to scrutinize their own operations. Both scenarios are concerning to us and suggest they are not living up to their responsibilities and the expectations of the Australian community.”

eSafety’s first report featured Apple, Meta, Microsoft, Skype, Snap, WhatsApp, and Omegle, and uncovered serious shortfalls in how those companies were tackling the issue.