
Black social media profiles more likely to be taken down than white users'

Black people's profiles and content are more likely to be taken down or suppressed than those of white people. This disparity is attributed to biases in both automated moderation systems and human review processes. source


Black social media profiles and content are disproportionately more likely to be taken down or suppressed than those of white users. This disparity is attributed to a combination of biased algorithms and human moderation practices that reflect societal biases. source


Instagram users whose online activity suggested they were Black were about 50% more likely to be subjected to automatic account suspension by the moderation system, compared to their White counterparts. source


Black Facebook users have reported being silenced when discussing racism on the platform, resulting in account suspension for weeks or even months. Criticism has also been directed at TikTok algorithms for disadvantaging and banning Black creators’ work. source


We center our attention on racial discrimination disclosures: instances where individuals from historically marginalized racial groups share their own or their close others’ experiences with discrimination and inequality. source


While there is variation in flagging rates across algorithms, all five models are more likely to flag racial discrimination disclosures as toxic than negative interpersonal experience disclosures. source
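The disparity described above can be expressed as a relative flagging rate: how much more often a model flags one kind of disclosure than another. A minimal sketch, using entirely made-up per-post flags (not the study's data) and a hypothetical `flag_rate` helper:

```python
# Illustrative sketch: compare a moderation model's flag rates across two
# sets of posts. The 0/1 flag lists below are hypothetical, for illustration
# only -- they are not figures from the study quoted above.

def flag_rate(flags):
    """Fraction of posts the model flagged as toxic (1 = flagged)."""
    return sum(flags) / len(flags)

# Hypothetical model outputs for ten posts in each category
discrimination_disclosures = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]  # 7 of 10 flagged
negative_experiences       = [0, 1, 0, 0, 1, 0, 0, 1, 0, 0]  # 3 of 10 flagged

rate_a = flag_rate(discrimination_disclosures)
rate_b = flag_rate(negative_experiences)
print(f"discrimination-disclosure flag rate: {rate_a:.0%}")
print(f"negative-experience flag rate:       {rate_b:.0%}")
print(f"relative flag rate:                  {rate_a / rate_b:.2f}x")
```

A ratio above 1.0x means the model disproportionately flags racial discrimination disclosures, which is the pattern the study reports across all five models it tested.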


Despite significant recent improvements in language processing technology, our work shows that even the latest content moderation systems misclassify personal narratives by victims of racism as toxic. These systematic flaws can lead to disproportionate removal of online content produced by historically marginalized groups. source


Across Studies 1a to 2b, we found that both prominent algorithms and humans alike disparately suppress discussions about racial discrimination. source


Algorithmic Bias: Automated content moderation tools often disproportionately flag content from Black users. For example, one internal Facebook study from 2019 found that an automated tool on Instagram was 50% more likely to automatically ban Black users than white users.


Impact on Creators: Black content creators have reported that their work is often disadvantaged by algorithms, experiencing shadow banning (hiding content from followers) and a lack of support, which can impact their livelihoods. source




© 2023 by Talking Business.  Proudly created with Wix.com
