English-language version of popular video site removes channels posting links to ‘dangerous’ misinformation.
YouTube bans all anti-vaccine misinformation.
YouTube has become the latest platform to crack down on misinformation surrounding vaccination, with its English-language version banning all videos that spread “dangerous” anti-vaccine misinformation.
YouTube announced the action on Tuesday, warning that false information had contributed to a rise in measles cases in the US, and said it would continue to block similar videos.
The social media site recently joined Facebook, Apple, and Amazon in clamping down on the proliferation of anti-vaccine videos.
The US Centers for Disease Control and Prevention has found that parents who watch anti-vaccine videos are five times more likely to avoid vaccinating their own children than those who don’t.
“We’ve learned that misinformation around vaccination on YouTube is a leading driver of unvaccinated children,” the company said in a blog post.
YouTube said it had created a range of tools for users who encounter anti-vaccine videos they find inappropriate, including warning notifications and “quality filters” that make such content more difficult to find.
Searches on the English-language version of YouTube found that such videos – which frequently promote the debunked claim that vaccines cause autism – had been replaced in the top 10 results by content about “unvaccinated children” and “vaccine health resources”.
The videos had been removed after an algorithm automatically detected that they contained misleading or inaccurate information, the company said.
The original videos had been removed from the site and replaced with a warning message, according to the Google-owned company.
One such video, published on Tuesday by the popular prank video channel PrankVsPrank, had more than 70,000 views before it was removed from the site.
YouTube said anti-vaccine videos violated its policy on misinformation, which states that video creators must not misinform their audiences.
“When we detect posts that violate this policy, we warn users before displaying them to our community and our partners,” the company said.
“If the post isn’t clearly in violation of our policy, we don’t remove it, but the creator has the option to make the post compliant with our policies or share their findings with us,” it added.
Some “anti-vaxxers” continued to live-stream, earning money through the Google-controlled advertising system attached to their videos.
The host of one “Anti-Vax” livestream had “846 likes, and they were making just $1 a few hours ago”, YouTube said on Twitter.
“[It] could get cut off if they don’t take this warning more seriously,” the company said.
Video blogs, known as vlogs, launched a new wave of online entertainment on YouTube in 2006, harnessing the availability of cheap editing tools to allow viewers to follow their favorite personalities across a range of genres.
Vloggers began by sharing home-made movies, but over the following decade the format expanded to include video diaries of a person’s daily life and streams of clips from events.
Social broadcasting tools, such as those used by YouTube’s “multi-channel networks”, or MCNs, allowed viewers to comment regularly on vlogs, helping the content spread virally and win wider attention.
Videos claiming to help viewers lose weight by eating less meat and more vegetables proved especially popular, as did vloggers who shared everyday home videos or footage from the holiday seasons.
The popularity of these videos encouraged brands to buy “sponsorship” slots from these creators to benefit from the viral power of their videos.