YouTube will start displaying Wikipedia articles next to conspiracy theory videos

YouTube says it will soon use Wikipedia to help fight videos that promote conspiracy theories and misinformation.

In the coming months, conspiracy videos posted on YouTube will begin to display text boxes called “information cues,” which link to Wikipedia and other third-party sources to discredit a hoax. YouTube CEO Susan Wojcicki announced the effort at South by Southwest (SXSW) earlier this week.

However, Wikimedia, the foundation that hosts Wikipedia, released a statement on Wednesday saying it wasn't given advance notice of YouTube's announcement.

“Neither Wikipedia nor the Wikimedia Foundation are part of a formal partnership with YouTube,” it said on Wednesday.

YouTube told CNN the announcement was not a partnership with Wikipedia and that the feature is part of a broader effort to tackle misinformation.

The move comes amid criticism that YouTube and other tech companies have allowed misinformation and conspiracy theories to spread on their platforms. Most recently, tech companies came under fire for promoting conspiracy theories about David Hogg, a student who survived a mass shooting at a Florida high school.

The top trending video on YouTube at the time indicated that Hogg was actually an “actor.” Similar theories about Hogg were trending on Facebook. These types of conspiracies often emerge after mass shootings, alleging the tragedies are a hoax and victims are paid crisis actors.

This isn’t the first time the company said it would use Wikipedia articles to fact-check videos. Last month, Google-owned YouTube said it would add a label to videos from state-funded media outlets, and include a link to the Wikipedia article about that news source to give viewers more information.

“Finding ways to counter conspiracy theories and media manipulation efforts is critical and I applaud YouTube’s acknowledgment of the problem,” said Whitney Phillips, a professor at Mercer University who studies online trolling and digital culture.

However, Phillips noted that YouTube's approach could lead to a number of issues, including conspiracy theorists planting misinformation in Wikipedia pages.

“The push to address the issue is a good one. Whether or not it will work is a totally different question, with many more variables than we currently are able to assess,” Phillips said.

Some experts also expressed concern over YouTube using Wikipedia as a fact-based source because its content can be edited by the general public.

“Wikipedia articles are group sourced and in my experience many of them are either ignorant of the subject at hand or for ideological reasons, subtly supportive,” said Jacob Cohen, a professor emeritus at Brandeis University who studies conspiracy in American history and culture.

Mercer's Phillips also wondered whether the Wikipedia pages linked from YouTube would now be put through a more rigorous moderation process, or whether the company would deploy paid editors to review those pages.

“Pages linked for the purposes of fact checking are likely the first thing manipulators will try to weaponize,” she said. “I am not sure how well Wikipedia’s current editing and moderation models would stand up to targeted networked manipulation efforts.”

It's unclear if Wikipedia will make any changes to its process. It has not responded to a request for comment.

But the foundation stressed in a statement that it relies on the participation of hundreds of thousands of users to write and update its articles in real time, and that its content is freely licensed for reuse by everybody.

“Anyone can edit Wikipedia,” the statement said. “While Wikipedia will always be a work in progress, our open, distributed model is one of our greatest strengths.”

It also said its users work to ensure information is neutral and supported by reliable sources.

“Research shows that as more people contribute, articles become more accurate and more balanced,” Wikimedia said.

Rob Brotherton, a psychology professor at Barnard College who is an expert in modern conspiracy theories, said that because anyone can contribute to Wikipedia, it’s in theory a more neutral source.

He also said YouTube is taking a more subtle approach to the issue than Facebook, which has partnered with fact-checking sites to tag posts as false.

“[Facebook’s] approach invites speculation about smear-campaigns against people who see themselves as merely questioning mainstream narratives, and the fact-checkers can always be accused of being biased,” Brotherton said. “Taking a relatively light touch like [YouTube] might be a good way to reach people on the fence.”

Facebook has also rolled out “Related Articles” that give additional perspectives and information on stories shared on users’ News Feeds.
