Photo illustration: the image of Elon Musk displayed on a computer screen and the Twitter logo on a mobile phone in Ankara, Turkiye, on October 6, 2022. (Muhammed Selim Korkutata | Anadolu Agency | Getty Images)
The move affects most employees in Twitter’s Trust and Safety organization, Bloomberg reported on Tuesday, citing unnamed sources. Those staffers can no longer address or discipline user accounts that violate Twitter’s rules around hate speech and misinformation unless the violations involve harm, the report said.
Twitter is still using automated content moderation tools and third-party contractors to prevent the spread of misinformation and inflammatory posts while Twitter employees review high-profile violations, Bloomberg said.
Twitter didn’t immediately respond to CNBC’s request for comment. Yoel Roth, Twitter’s head of safety, responded to the Bloomberg News report in a tweet.
“This is exactly what we (or any company) should be doing in the midst of a corporate transition to reduce opportunities for insider risk,” he wrote. “We’re still enforcing our rules at scale.”
On Friday, after closing his acquisition of Twitter, Musk said he plans to form a “content moderation council,” without disclosing specifics such as who would be part of it or what it would do. The Tesla CEO added that he would not make any “major content decisions” or reinstate previously banned accounts before the council begins its work.