YouTube will start showing links to fact-check articles to viewers in the U.S., part of its effort to tame the spread of misinformation on the world’s largest video site.
The feature will display snippets from select news publishers and organizations that follow standards from fact-checking bodies, the company said Tuesday. These links will appear above videos when people search for certain topical claims, such as “covid and ibuprofen.”
Two years ago, YouTube began posting links to Wikipedia below videos about known conspiracy theories. But the whirlwind speed of events such as political elections and the novel coronavirus prompted the company to add more safeguards, said Neal Mohan, YouTube’s chief product officer.
“We’re literally seeing science happen hour-by-hour,” he said. “That’s where fact-checking comes in.”
Google’s YouTube already introduced this tool in India and Brazil. For the U.S. version, more than a dozen publishers are participating at the start. YouTube didn’t say how many search terms would trigger the fact-check articles, but Mohan said the feature would “roll out pretty narrowly and expand from there.”
The fact-check feature appears in search results on YouTube, but the company has said in the past that the bulk of its traffic comes from video recommendations, not searches. Mohan noted that YouTube has trained its software system to detect videos with misinformation and then demote them in recommendations.
© 2020 Bloomberg L.P. All rights reserved. Used with permission.