
YouTube’s war against Covid disinformation and vaccines

Here's how Google's video platform is fighting fake news on pandemics, vaccines and more

Since the pandemic began, more than 850,000 videos have been removed from YouTube for violating its Covid disinformation policy, 30,000 of them in the last quarter of 2020 alone for violating the vaccine policy.

Google's video platform is open, but the information on it must be useful and of good quality, especially when it comes to the pandemic and the health crisis. That is why responsibility is a priority for YouTube.

With 500 hours of content uploaded every minute, a worldwide audience and direct opportunities for interaction between creators and users, the only way to strike the right balance between an open platform and the possible consequences for content creators is to have clear policies, explained Marco Pancini, Director of Public Policy EMEA at YouTube.

That is, firm rules of conduct that establish what can and cannot be uploaded.

The disinformation policies that existed before Covid-19 now follow the guidance of global and local authorities.

In Europe, moreover, all of this takes place in collaboration with, and at the urging of, the European institutions that work alongside the platform. The goal is to ensure users a safe experience.

YouTube's job, in light of these policies, is to ensure that every report of content that breaks these rules is taken into consideration and carefully analyzed.

All the details.

THE FOUR PILLARS OF RESPONSIBILITY FOR YOUTUBE

The four main actions that YouTube has taken in relation to content can be summed up in the "four Rs" of responsibility: remove, raise, reduce and reward. Remove, to take down content that violates YouTube's policies. Raise, to give priority and prominence to authoritative information, all the more so in times of a pandemic. Reduce, to limit the spread of content that, while not crossing the policy line, comes very close to the threshold set by the guidelines. And finally reward, to reward creators who make a positive contribution to the community.

According to Pancini, “if we imagine the policy line as the boundary between what is accepted and what is not, the threshold when it comes to monetization is even higher. We want to make sure that content can be monetized only if it brings positive value to both the community and advertising investors”.

AT THE TIME OF COVID

With the emergence of Covid, YouTube introduced additional features, such as information panels, to give people accurate and authoritative information.

This information is provided by institutional sources such as the WHO, the Ministry of Health in Italy and the various health authorities in the countries where the platform operates.

In this way the authorities have had an additional channel for communicating directly with the YouTube audience.

Thanks to these information panels, Google's video platform has delivered 400 billion impressions of health-authority information worldwide. The coronavirus panel was the most viewed ever.

Since 2018, YouTube has published a quarterly Transparency Report with data on content removed for violating the platform's policies or the law.

In addition to this quantitative dimension, the platform has added a qualitative one: the YouTube team has created a new metric, the violative view rate (VVR).

This metric helps to understand not only how much content is removed, but also how often YouTube users have been exposed to content that breaks the policies.

Beyond removing content that violates the platform's policies, it is important to understand how many people that content reached. Since monitoring of the VVR began, exposure to content that breaks the rules of conduct has fallen by 70%.
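To make the metric concrete, here is a minimal sketch of how a violative view rate could be estimated from a sample of views; the data layout, field names and plain random sampling are assumptions for illustration only, not YouTube's actual methodology.

```python
# Minimal sketch: estimating a "violative view rate" (VVR) from a sample of views.
# The data layout and the plain random sampling are illustrative assumptions,
# not YouTube's actual methodology.
import random

def estimate_vvr(view_log, sample_size=1000, seed=42):
    """Estimate the share of sampled views that landed on violative videos.

    view_log: list of dicts like {"video_id": ..., "violative": bool}
    Returns the estimated VVR as a fraction (e.g. 0.002 means 0.2%).
    """
    rng = random.Random(seed)
    sample = rng.sample(view_log, min(sample_size, len(view_log)))
    violative_views = sum(1 for view in sample if view["violative"])
    return violative_views / len(sample)

# Synthetic example: a log of 100,000 views where roughly 0.2% hit violative content.
rng = random.Random(0)
log = [{"video_id": i, "violative": rng.random() < 0.002} for i in range(100_000)]
print(f"Estimated VVR: {estimate_vvr(log):.2%}")
```

In this sketch, tracking the estimate over time would show whether exposure to violative content is falling, which is how the 70% reduction cited above is expressed.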

THREE SOURCES OF REPORTING POTENTIALLY UNLAWFUL CONTENT

But how is content that potentially violates YouTube's policies monitored? Reports arrive in three ways. In the first case, a user flags content deemed in violation through a reporting tool. In the second, the report comes through trusted flaggers and authorities that collaborate with YouTube; these reports are of particularly high quality because they come from experts. The third source is machine learning: algorithms trained on the first two sources (reports from users and trusted flaggers).

In the most recent period, YouTube identified 94% of policy-violating content through its automated flagging systems, and 75% of those videos were removed before they exceeded 10 views.
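As a rough illustration of how the three reporting channels described above could feed a single review queue, here is a minimal sketch; the source priorities, confidence scores and threshold are invented for illustration and do not reflect YouTube's actual systems.

```python
# Minimal sketch: merging reports from the three sources described above
# (user reports, trusted flaggers, automated classifiers) into one review queue.
# Source priorities, scores and the threshold are invented for illustration only.
from dataclasses import dataclass

SOURCE_PRIORITY = {"trusted_flagger": 3, "classifier": 2, "user": 1}

@dataclass
class Report:
    video_id: str
    source: str    # "user", "trusted_flagger" or "classifier"
    score: float   # confidence that the video violates policy, between 0 and 1

def build_review_queue(reports, min_score=0.5):
    """Drop low-confidence reports and order the rest so that trusted-flagger
    reports and high-confidence hits are reviewed first."""
    eligible = [r for r in reports if r.score >= min_score]
    return sorted(eligible, key=lambda r: (SOURCE_PRIORITY[r.source], r.score), reverse=True)

if __name__ == "__main__":
    queue = build_review_queue([
        Report("vid_a", "user", 0.6),
        Report("vid_b", "classifier", 0.9),
        Report("vid_c", "trusted_flagger", 0.7),
        Report("vid_d", "user", 0.3),   # below threshold, filtered out
    ])
    for r in queue:
        print(r.video_id, r.source, r.score)
```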

RULES ON DISINFORMATION IN THE MEDICAL SECTOR

Even before Covid, YouTube applied this kind of policy to vaccine-related content. The platform has since aligned its rules with the guidelines set by the health authorities: it removes content promoting claims about Covid-19 and vaccines that contradict the view of experts such as local health authorities or the WHO.

For example, Pancini stressed how essential it was to follow the debate that arose around Covid, in particular the alleged link between 5G and the pandemic, a connection not supported by any scientific evidence.

ABOUT THE ADS

The platform does not allow ads promoting anti-vax content, including ads discouraging users from getting a Covid vaccine.

Furthermore, content that contradicts the opinion of experts from local health authorities cannot be monetized on the platform.

But there is one exception: videos are allowed if their intent is clearly educational, documentary, scientific or artistic (EDSA). This policy applies to videos, video descriptions, comments, live streams and any other YouTube product or feature.

THE EU CODE OF CONDUCT: "TACKLING COVID-19 DISINFORMATION"

YouTube's work against disinformation is also part of its collaboration with the European institutions.

The EU code of conduct on disinformation was born at the time of the European elections, the result of a collaboration between the European Commission, online platforms such as Facebook, Google, Microsoft, Twitter and TikTok, and civil society, with the aim of creating guidelines so that political communication on online platforms meets specific criteria.

With the outbreak of the pandemic, the Brussels institutions encouraged online platforms to contribute to the fight against fake news and other disinformation attempts by removing illegal or fake content. Efforts to counter misinformation about Covid-19 vaccines intensified with the launch of vaccination campaigns across the EU. YouTube is therefore one of the online platforms that have signed the code of good practices on disinformation and that periodically report to the European Commission on their actions and measures to limit disinformation.


This is a machine translation from Italian language of a post published on Start Magazine at the URL https://www.startmag.it/innovazione/la-guerra-di-youtube-contro-la-disinformazione-su-covid-e-vaccini/ on Wed, 21 Apr 2021 05:00:05 +0000.