Elon Musk has only been in control of Twitter for a short time, but he’s already making big moves. Musk fired a number of key executives on day one, including Twitter CEO Parag Agrawal, but in a new tweet he claims he’ll be moving more slowly when it comes to making content moderation decisions.
Musk hasn’t said much since taking over at Twitter, but he will apparently form some kind of policy advisory body to oversee content moderation decisions. Musk said the group will reflect “diverse viewpoints,” though we’ll certainly have to wait and see on that one. “To be super clear, we have not yet made any changes to Twitter’s content moderation policies,” Musk tweeted later Friday evening.
Importantly, Musk says he won't be making any major decisions or account reinstatements, such as restoring former President Donald Trump, before the council is put in place. Because it's Musk, it's hard to say whether that will happen within hours or not at all. Hours later, Musk undermined his own claims about a formalized policy decision-making system, flexing his power to make big content moderation calls.
In a reply to the daughter of Jordan Peterson, the controversial right-wing academic and self-help author, Musk made the sweeping claim that "anyone suspended for minor & dubious reasons will be freed from Twitter jail." Peterson's Twitter account was limited after he went on a transphobic tirade about actor Elliot Page earlier this year, so in this context, tweeting hate about a trans person and his doctor apparently counts as "minor and dubious" in Musk's book.
On Thursday, Musk also let go of Vijaya Gadde, a well-respected top policy executive who helped the company navigate complex legal and moderation issues for more than 11 years. Getting rid of Gadde signaled that a new era with different decision-making is beginning, for better or worse.
The tweet is likely more balm for skittish advertisers wary of Musk immediately turning the platform into an anything-goes mess of harassment, hate and misinformation. While Twitter arguably already meets that description with existing levels of moderation in place, advertisers are watching for any major shifts in the kind of content allowed on the platform and how it might adversely affect their brands.
Musk might think this is an original idea, but Twitter already consults a Trust and Safety Council to advise its product and policy decisions. The council (it's already called a council) initially consisted of 40 organizations and experts that advised the company in challenging policy areas. The group serves in more of an advisory capacity and, unlike Meta's Oversight Board, wasn't designed to issue binding decisions.
First announced in 2016, the council was expanded by Twitter in 2020 into groups dedicated to specific difficult topics, including safety and online harassment, digital rights, child sexual exploitation and suicide prevention. "A lot of what we currently do, such as ongoing meetings with NGOs, activists and other organizations is always part of our process, but we haven't done enough to share that externally," Twitter wrote at the time.
It's possible Musk has something more like the Oversight Board in mind for content moderation decision-making, but everything from who ends up serving on a hypothetical council to the nature of the group's impact is likely to be controversial.
Source: @TechCrunch