It’s an important time to talk about “fake news.” The European Parliament will hold elections in May 2019, and there are legitimate concerns that online disinformation and propaganda could impact the outcome of the elections.
With this in mind, the European Commission set up a working group on fake news and online disinformation. Organizations, businesses and policy-makers were invited to give input to the working group. Human rights organizations, however, were completely shut out.
Unsurprisingly, the working group’s report on “fake news” overlooks the human rights angle.
Because of this, Liberties, together with Access Now and EDRi, responded with a report of our own, highlighting the threats to human rights and urging businesses and politicians alike to uphold their obligations under human rights law.
Download the full report here
What is fake news?
One of our main arguments is that fake news is not a homogeneous category of content. Policy-makers should focus only on disinformation: content that is false, inaccurate or misleading, and designed to harm society by deceiving people.
Information that is merely false doesn’t automatically fall under this definition. Just consider this video of a talking rabbit, or this conspiracy theorist’s take on chemtrails.
Despite the first video’s title, that rabbit isn’t talking. Nice try, little girl. And it takes only a minute on Google and some common sense to know that chemtrail lady needs a new hobby.
But even though both of these videos contain false information, they are still examples of free speech. Why aren't they disinformation, or "fake news"? Because they are not intentionally trying to mislead you.
We should not live in a world where conversations about certain topics cannot happen online because of over-strict, vague or unpredictable regulation, or the terms of service of internet service providers.
Meaningful solutions
Liberties, Access Now and EDRi believe that benchmarking is needed to define when public or private action against disinformation would be necessary and proportionate.
Our report also stresses the importance of conducting more research to understand the impact of disinformation on the public in order to develop and implement effective and proportionate responses to the problems.
Measures to foster online accountability must not undermine data protection, privacy and freedom of expression, and they must respect the right to anonymity.
With regard to the upcoming elections, we recommend transparency and limiting the use of behavioral advertising for political purposes. We also support sanctions or penalties for the use of illegally acquired data.
We agree with some aspects of the Commission's approach, such as the idea of better media education.
The Commission’s plan will not solve the problem of online disinformation and propaganda, nor will it change the need for media pluralism and more reliable public broadcasting services. But its ill-considered approach very well could undermine our fundamental rights.