In a new study published in the Journal of Quantitative Description: Digital Media, researchers examined the impact of deplatforming on online discourse and on alternative social media platforms. The study focused on the aftermath of the January 6th Capitol riot and analyzed how users who anticipated being deplatformed from mainstream social media sites such as Twitter and Facebook reacted.
“My previous research on content moderation and these kinds of interventions suggests that heavy-handed bans may improve certain platforms, but how these interventions affect the larger ecosystem remains to be seen. It is an open question, and we need answers if we are to improve the online information space,” said study author Cody Buntain, an assistant professor at the University of Maryland.
“Suppressing certain audience segments could improve Twitter for the rest of the population. While this would be a good result, large-scale deplatforming may instead lead to further polarization by pushing audiences into more extreme spaces where these audiences and their messages are more welcome.”
“If the latter is true, and deplatforming worsens the information space as a whole, particularly given that these extreme spaces tend to be highly hateful, racist, and toxic platforms, then knowing that has important implications for how we manage the wider information space.”
“Previous work in this space by Eshwar Chandrasekharan and others on Reddit shows that such bans are useful on certain platforms, but more recent research by Ribeiro et al. found that similar deplatforming did not suppress negative behaviors in the larger space.”
The study analyzed data from multiple sources to find out how users who expected to be deplatformed moved their communications elsewhere. The researchers used Brandwatch, Google Trends, and Meta’s CrowdTangle tool to describe temporal trends in social media users’ interest in alternative platforms.
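For illustration, here is a minimal sketch of how this kind of temporal-interest query could be run against Google Trends. It assumes the pytrends Python library and an illustrative keyword list; the paper does not specify its tooling.

```python
from pytrends.request import TrendReq

# Query Google Trends for relative search interest in alternative
# platforms around the January 6th period. The keyword list, timeframe,
# and geography are illustrative assumptions, not the study's code.
pytrends = TrendReq(hl="en-US", tz=0)
pytrends.build_payload(
    kw_list=["Parler", "Gab", "Telegram"],
    timeframe="2020-10-01 2021-03-01",
    geo="US",
)

# Returns a DataFrame of weekly relative interest on a 0-100 scale.
interest = pytrends.interest_over_time()
print(interest[["Parler", "Gab", "Telegram"]].head())
```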
They found that interest in Parler increased around the time of the US presidential election in November 2020 and experienced a dramatic surge in January. But when Parler’s website went offline following the attack on the US Capitol, link sharing effectively stopped.
The research also found that the communities directly involved in discussions favorable to the January 6th event responded to the platforms’ actions in a strategic way, providing signposts to where the conversation could continue.
“One thing that is important for the average person is that, if they use a mainstream platform like Twitter or Facebook and see someone sharing a link to their profile on another platform, they should be hesitant to join that space if they are not familiar with it,” Buntain told PsyPost.
“In a political context, these marginal spaces are often worse places than mainstream platforms. While that is not always the case (as with the migration to Mastodon), we should be careful about the platforms we move to after content moderation interventions.”
Many users who expected to be deplatformed turned to Gab, which saw greater engagement across multiple spaces after January 6th than other alternative platforms. But as new users flooded in, Gab became more toxic, with much higher levels of hate speech than in previous months.
“We were surprised to find that the increase in engagement with Gab as a platform was consistent across the three sources we measured (Twitter, Reddit, and Google Trends),” said Buntain. “The results suggest a broader push toward the platform after the more mainstream spaces made it clear that voter-fraud-style discussions were unwelcome.”
The study also found a dramatic spike in hate speech, especially anti-black hate speech, on Twitter during the week of January 6, 2021 compared to the previous month. A similar pattern was also observed on Reddit. Overall hate speech on these platforms eventually returned to baseline levels, but many specific categories remained elevated on Twitter.
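As an illustration of what such a comparison involves (this is not the authors’ code), a minimal pandas sketch for computing weekly hate-speech prevalence and comparing the week of January 6 against the prior month, assuming a hypothetical dataset of posts already labeled by a hate-speech classifier:

```python
import pandas as pd

# Hypothetical input: one row per post, with a timestamp and a binary
# hate-speech label produced by some classifier (not the study's data).
posts = pd.read_csv("labeled_posts.csv", parse_dates=["created_at"])

# Weekly prevalence: share of posts flagged as hate speech per week.
weekly = (
    posts.set_index("created_at")["is_hate"]
    .resample("W")
    .mean()
)

# Compare the week containing January 6, 2021 against the prior month.
spike_week = weekly.loc["2021-01-03":"2021-01-10"].mean()
baseline = weekly.loc["2020-12-01":"2020-12-31"].mean()
print(f"Week of Jan 6: {spike_week:.3%}, December baseline: {baseline:.3%}")
```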
These findings suggest that deplatforming can have complex effects. In a previous study, one of the authors argued that deplatforming could increase the dissemination and distribution of content by deplatformed individuals. The results of the current research show that deplatforming drives a shift to niche platforms, producing negative changes in the tone of content on those platforms, but no major shifts on mainstream platforms, consistent with the dynamics of displacement and diffusion.
“My research here should not be taken to say that content moderation is wrong,” Buntain explained. “Rather, we need to have a better understanding of how these interventions work and what their unintended consequences can be.”
“Relatedly, when YouTube announced that it was removing potentially harmful content from recommendations but allowing it to remain on the platform, engagement with these videos declined on Twitter and Reddit (see my paper on this from 2021), which is probably a good result. That said, such subtle ‘shadow bans’ can come across as deceptive, which is problematic.”
The study, “Cross-platform reactions to de-platforming after January 6th,” was authored by Cody Buntain, Martin Innes, Tamar Mitts, and Jacob Shapiro.