Just 2,294 content moderators work on Elon Musk's X platform, thousands fewer than the 16,974 on Google's YouTube, 7,319 on Google Play, and 6,125 on TikTok, according to the European Union.
According to Reuters, these figures come from reports the social media platforms submitted in September of last year, in compliance with the EU's Digital Services Act (DSA), which requires companies to do more to combat unlawful and harmful content.
According to a source who wished to remain anonymous, regulators anticipate that X will feel pressured to raise its number of content moderators in order to keep up with its competitors, since "there is an important aspect of the DSA, and that is peer pressure."
According to the report, this follows concerns raised about X after Musk laid off multiple workers responsible for monitoring and regulating content, despite the surge in disinformation found on the platform.
X Filled With Misinformation
According to a previous article published by Tech Times, an “unprecedented” number of bogus news stories, dated films, video game footage, and phony links that appeared to be legitimate made it difficult for X users, including journalists and fact-checkers, to verify material regarding the ongoing conflict between Israel and Hamas.
At the beginning of the conflict between Israel and Hamas, a large amount of false information was disseminated as graphic footage of military operations and kidnappings went viral on social networking sites like X.
On the other hand, back in August, X announced that it would be increasing the size of its content moderation team in preparation for the 2024 elections.
According to a report in the New York Post, X, which laid off roughly 80% of its staff when Elon Musk took the company private, has already hired a handful of new employees and is in the process of hiring dozens more to improve the policing of false statements on the site.
According to those with knowledge of the situation, the objective is to grow X’s so-called “Trust and Safety” team, which keeps an eye out for false information. This is a major cause for concern for the election in 2024, given the growing sophistication of artificial intelligence and its ability to create deepfakes.
Rehiring content moderators is an essential part of the approach, and there are currently openings for full-time posts, including a director of civic integrity. Insiders claim that X will also turn to contractors for support in the upcoming election.
X Former Employees’ Fears, Now a Reality
The layoff process proved less than ideal, according to the Associated Press, with former employees having earlier warned of a now fully realized consequence of shrinking X's content moderation team. Elon Musk's sweeping layoffs of X's content moderators began in November 2022.
Melissa Ingle, who had worked as a contractor for Twitter for more than a year, was one of several contractors who reported being fired on a Saturday. She voiced her concern that the large number of departing staff members could result in an upsurge in abuse on Twitter.
"I love the platform, and I had a great time working at the company and trying to improve it," she remarked on Sunday. "And I'm just really afraid of what will fall through the cracks."
According to the report, Sarah Roberts, a content-moderation specialist and associate professor at the University of California, Los Angeles, who worked as a staff researcher at Twitter in 2022, said she believes cutting contracted content moderation employees will have a "tangible impact on the experience of the platform."
Read More: Best Ways to Save Twitter Threads in 2023