Can YouTube’s vaccine misinformation ban work?

YouTube has announced it will no longer allow content containing misinformation about any vaccines that health authorities have approved and confirmed to be safe and effective.


The new guidelines include some notable exceptions, allowing for publishers to post “content about vaccine policies, new vaccine trials, and historical vaccine successes or failures,” and “personal testimonies relating to vaccines.”


Just as the COVID-19 pandemic spread like wildfire in early 2020, so did misinformation about how the virus spreads, who is most susceptible, and more.


In response, YouTube updated its community guidelines to prohibit content creators from publishing videos containing COVID-19 misinformation. Now, the social media titan is once again revamping its community guidelines, this time taking aim at content creators who push anti-vaccine propaganda.


One of the experts, an assistant professor of computer and electrical engineering and a research fellow at Boston University's Rafik B. Hariri Institute for Computing and Computational Science & Engineering, researches the spread of malicious activity on the internet. The other, a postdoctoral associate at the Boston University School of Public Health, uses machine learning to study digital data and how it impacts society.


Here, the two experts in cybersecurity and online communities explain what YouTube's latest decision to remove vaccine misinformation means, and whether it could backfire:

