Google-owned YouTube has removed ads from videos promoting anti-vaccination rhetoric that violated its ad policies, and Pinterest has blocked all searches related to vaccination, as both platforms work to stem the flow of misinformation amid a measles outbreak in Washington state.
The tech giant said the anti-vax videos fall under its policy prohibiting the monetization of videos with "dangerous and harmful" content.
"We have strict policies that govern what videos we allow ads to appear on, and videos that promote anti-vaccination content have been and remain a violation of our longstanding harmful or dangerous advertising policy. We enforce these policies vigorously, and if we find a video that violates them we immediately take action and remove ads," a YouTube spokesperson told Fox News via email.
YouTube has enforced this policy on videos promoting anti-vax content in the past.
YouTube is waging war on anti-vax content in a number of ways. Besides demonetizing the videos, the platform has introduced information panels on anti-vaccine videos that link to the Wikipedia page for "vaccine hesitancy," which the World Health Organization has named one of the top 10 threats to global health in 2019.
YouTube also announced in January that it was tweaking its algorithms to recommend fewer conspiracy videos. The ubiquitous video platform, to which users upload 400 hours of content every minute, has been slammed by critics for providing a haven for bad actors and conspiracy theorists. In January 2018, YouTube set more stringent criteria for monetization, requiring channels to clear a higher bar for subscribers and watch time.
A recent YouTube search for the phrase "are vaccines safe" pulled up results from Mayo Clinic, Johns Hopkins Medicine, Cleveland Clinic, UCLA Health, the Children's Hospital of Philadelphia, the CDC and several mainstream news outlets.
Last week, BuzzFeed News found that while YouTube tends to return authoritative sources for queries such as "are vaccines safe," the platform's Up Next algorithm would frequently suggest anti-vaccination videos as follow-up recommendations. In response to inquiries from BuzzFeed News, several advertisers pulled their ads from the videos.
Social media platforms have taken a range of steps to slow the spread of vaccine disinformation.
Pinterest, which has 250 million monthly active users, blocked all searches for vaccines or vaccinations last year, applying its health misinformation policy to both pins from community members and advertisements.
"We want Pinterest to be an inspiring place for people, and there's nothing inspiring about misinformation. That's why we continue to work on new ways of keeping misleading content off our platform and out of our recommendations engine," a spokesperson told Fox News via email.
Facebook and Google came under fire last week for not doing enough to stop anti-vax content. Rep. Adam Schiff (D-Calif.) sent a letter to both companies, urging them to take action against misinformation regarding vaccines.
"The algorithms which power these services are not designed to distinguish quality information from misinformation or misleading information, and the consequences of that are particularly troubling for public health issues," Schiff wrote in his letter.