A study of YouTube comments suggests how it’s turning people on to the alt-right – MIT Technology Review

Posted on January 29, 2020

A new study suggests what we've suspected for years: YouTube is a pipeline for extremism and hate.

How do we know that? Researchers analyzed more than 330,000 videos on nearly 350 YouTube channels and manually classified them according to a system designed by the Anti-Defamation League. They were labeled as either media (what we think of as factual news), alt-lite, intellectual dark web, or alt-right.

The groups: The alt-right is what's traditionally associated with white supremacy, pushing for a white ethnostate. Those who affiliate with the intellectual dark web justify white supremacy on the basis of eugenics and race science. Members of the alt-lite purport not to support white supremacy, though they believe in conspiracy theories about replacement by minority groups.

Gateway: The study's authors hypothesized that the alt-lite and intellectual dark web often serve as a gateway to more extreme, far-right ideologies. They tested that by tracing the authors of 72 million comments on about two million videos between May and July of last year. The results were worrying: more than 26% of people who commented on alt-lite videos tended to drift over to alt-right videos and subsequently comment there.
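To make that migration measurement concrete, here is a minimal, illustrative sketch (not the authors' code) of how such an analysis could be set up with pandas. The table structure, column names, and sample rows are all assumptions invented for the example; the study itself worked with 72 million real comments.

```python
import pandas as pd

# Hypothetical comments table: one row per comment, with the commenter's ID,
# the comment timestamp, and the category of the channel the video belongs to
# ("media", "alt-lite", "intellectual dark web", or "alt-right").
comments = pd.DataFrame({
    "user_id":   ["u1", "u1", "u2", "u3", "u3", "u4"],
    "timestamp": pd.to_datetime([
        "2019-05-03", "2019-06-20", "2019-05-10",
        "2019-05-15", "2019-07-01", "2019-06-05",
    ]),
    "category":  ["alt-lite", "alt-right", "alt-lite",
                  "alt-lite", "alt-right", "media"],
})

# For each user, find the first time they commented in each category.
first_seen = (
    comments.groupby(["user_id", "category"])["timestamp"]
    .min()
    .unstack()
)

# A user "drifts" if their first alt-lite comment precedes their first
# alt-right comment.
alt_lite_users = first_seen["alt-lite"].dropna().index
drifted = first_seen.loc[alt_lite_users].apply(
    lambda row: pd.notna(row.get("alt-right")) and row["alt-right"] > row["alt-lite"],
    axis=1,
)

print(f"{drifted.mean():.0%} of alt-lite commenters later commented on alt-right videos")
```

On the toy data above this prints 67%; the real analysis would run the same kind of comparison across millions of commenters and track how the share changes over time.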

Blame game: It's fairly easy to get to alt-lite and intellectual dark web content with a simple search, but alt-right videos tend to be harder for first-time users to find. Yet the researchers found that YouTube's algorithm often directed users who searched for specific keywords toward increasingly violent, extreme content.

And it's getting worse: The team, from the Swiss Federal Institute of Technology Lausanne, also found evidence that the overlap between alt-righters and others who dabble in intellectual dark web and alt-lite material is growing. The authors estimate that about 60,000 people who commented on alt-lite or intellectual dark web content were exposed to alt-right videos over a period of about 18 months. The work was presented at the 2020 Conference on Fairness, Accountability, and Transparency in Barcelona this week.

We still don't know a lot about YouTube radicalization: For one thing, we aren't quite sure exactly what makes people move from alt-lite material to the far-right stuff. That's partially because YouTube restricts access to recommendation data. It's also possible that some people come to YouTube having already been radicalized by an external, non-YouTube source. But this research suggests that YouTube's recommendation algorithms may play a significant role.

The background: YouTube has long struggled with the balance between maintaining free speech and addressing hate speech. The company has taken some initial steps by banning some channels, most notably Alex Jones's Infowars. But critics argue that YouTube hasn't done enough.

In a statement, YouTube said it's working through these issues: "Over the past few years ... We changed our search and discovery algorithms to ensure more authoritative content is surfaced and labeled prominently in search results and recommendations and begun reducing recommendations of borderline content and videos that could misinform users in harmful ways."

A spokesperson added that YouTube disputes the methodology, arguing that it doesn't take into account more recent updates to its hate speech policy or recommendations. "We strongly disagree with the methodology, data and, most importantly, the conclusions made in this new research," the spokesperson said.

Editor's note: This story has been edited to include YouTube's comment and dispute.

