Ahead of congressional hearing on child safety, X announces plans to hire 100 moderators in Austin

X, formerly Twitter, is trying to placate lawmakers about the app's safety measures ahead of a Big Tech congressional hearing on Wednesday, which will focus on how companies like X, Meta, TikTok and others are protecting kids online. Over the weekend, the social media company announced via Bloomberg that it would staff a new "Trust and Safety" center in Austin, Texas, which will include 100 full-time content moderators. The move comes more than a year after Elon Musk acquired the company and drastically reduced headcount, including trust and safety teams, moderators, engineers and other staff.

In addition, Axios earlier reported that X CEO Linda Yaccarino met last week with bipartisan members of the Senate, including Sen. Marsha Blackburn, in advance of the coming hearing. The executive was said to have discussed with lawmakers how X was battling child sexual exploitation (CSE) on its platform.

As Twitter, the company had a difficult history with properly moderating for CSE — something that was the subject of a child safety lawsuit in 2021. Although Musk inherited the problem from Twitter’s former management, along with many other struggles, there has been concern that the CSE problem has worsened under his leadership — particularly given the layoffs of the trust and safety team members.

After taking the reins at Twitter, Musk promised that addressing CSE content was his No. 1 priority, but a 2022 report by Business Insider indicated that there were still posts where people were requesting the content. That year, the company also added a new feature for reporting CSE material. However, in 2023, Musk welcomed back an account that had previously been banned for posting CSE imagery, raising questions about X's enforcement of its policies. Last year, an investigation by The New York Times found that CSE imagery continued to spread on X's platform even after the company was notified, and that widely circulated material that's easier for companies to identify also remained online. This report stood in stark contrast to X's own statements, which claimed the company had aggressively approached the issue with increased account suspensions and changes to search.

Bloomberg's report on X's plan to add moderators was light on key details, such as when the new center would open. However, it did note that the moderators would be employed full-time by the company.

“X does not have a line of business focused on children, but it’s important that we make these investments to keep stopping offenders from using our platform for any distribution or engagement with CSE content,” an executive at X, Joe Benarroch, told the outlet.

X also published a blog post on Friday detailing its progress in combating CSE, noting that it suspended 12.4 million accounts in 2023 for CSE, up from 2.3 million in 2022. It also sent 850,000 reports to the National Center for Missing and Exploited Children (NCMEC) last year, more than eight times the number sent in 2022. While these metrics are meant to show an increased response to the problem, they could also indicate that those seeking to share CSE content are increasingly using X to do so.

Source @TechCrunch