
Twitter today disbanded the Trust & Safety Council, an advisory group of roughly 100 independent researchers and human rights activists. The group, formed in 2016, gave the social network input on content and human rights-related issues such as the removal of Child Sexual Abuse Material (CSAM), suicide prevention, and online safety. Because the council drew on experts from around the world, its dissolution could have implications for Twitter’s global content moderation.

According to multiple reports, council members received an email from Twitter on Monday saying that the council is “not the best structure” to bring external insights into the company’s product and policy strategy. While the company said it will “continue to welcome” ideas from council members, it gave no assurance that those ideas will be taken into consideration. Disbanding an advisory group that existed to provide ideas reads, in effect, as “thanks, but no thanks.”

A report from the Wall Street Journal notes that the email was sent an hour before the council had a scheduled meeting with Twitter staff, including the new head of trust and safety, Ella Irwin, and senior public policy director Nick Pickles.

This development comes after three key members of the Trust & Safety Council resigned last week. In a letter, the members said Elon Musk had ignored the group despite his claims of prioritizing user safety on the platform.

“The establishment of the Council represented Twitter’s commitment to move away from a US-centric approach to user safety, stronger collaboration across regions, and the importance of having deeply experienced people on the safety team. That last commitment is no longer evident, given Twitter’s recent statement that it will rely more heavily on automated content moderation. Algorithmic systems can only go so far in protecting users from ever-evolving abuse and hate speech before detectable patterns have developed,” the letter said.

After taking over Twitter, Musk said he would form a new content moderation council with a “diverse set of views,” but there has been no development on that front. As my colleague Taylor Hatmaker noted in her story in August, the lack of robust content filtering systems can harm underrepresented groups such as the LGBTQ community.


