Abstract
For this memo, we identified all Covid-related videos that circulated on social media but that YouTube eventually removed because they contained false information. Between October 2019 and June 2020 there were 8,105 such videos, less than 1% of all YouTube videos about the coronavirus. We find that:
• It took YouTube on average 41 days to remove videos containing false information, based on a subset of videos for which this data was available.
• Surprisingly, Covid-related misinformation videos do not find their audience through YouTube itself, but largely by being shared on Facebook.
• Facebook placed warning labels about false information only on 55 videos, less than 1% of the misinformation videos shared on the platform.
• Misinformation videos were shared almost 20 million times on social media, which is more than the shares gathered by the five largest English-language news sources on YouTube combined (CNN, ABC News, BBC, Fox News and Al Jazeera).
| Original language | English |
|---|---|
| Place of Publication | Oxford |
| Publisher | Oxford Internet Institute |
| Edition | 2020.5 |
| Volume | Comprop Data Memos |
| Number of pages | 7 |
| Publication status | Published - 2020 |
| MoE publication type | D4 Published development or research report or study |
Fields of Science
- 5141 Sociology