Since the controversial Facebook data scandal uncovered at the beginning of 2018, disinformation on the internet has been a focus of many discussions about the US election campaign. Even before the German federal election in September 2017, there were warnings to stay vigilant against possible disinformation campaigns and propaganda.
Recently, three experts discussed the topic of “disinformation before the European elections – what is the danger?” during a press conference of the Science Media Center Germany. They focussed on what science can do in this context. This blog post reports the main findings of that discussion.
Taking on personal responsibility
Dr. Christian Grimme, a lecturer at the Institute for Information Systems at the University of Münster, Germany, and director of the “PropStop” project for the detection of online propaganda, gave insight into the field of manipulation via social media: such channels are used deliberately to sow confusion on the web, to keep people from voting, to spread false information, or to stir up mistrust. Preventive measures are difficult because neither a technical system nor intelligent self-protection can offer full protection. Grimme nevertheless offered one approach: “Develop your own strategies on how to critically question the truth of social media and non-editorially checked online media.” Everyone must ask themselves whom they can trust and from whom they can expect protection.
Dr. Axel Bruns of the Digital Media Research Centre at Queensland University of Technology in Brisbane, Australia, confirmed this and went on to ask who commits such attacks and manipulations. “Of course, lately, suspicion has fallen especially on Russia and Russian players,” said the media and communications research professor. “But certainly other actors will participate there.” Bots, advertising campaigns, social media platforms and human-controlled accounts posting memes are all possible attack vectors. The focus is on the roles of European institutions and on issues such as migration, cultural change or finance. Sowing confusion or disagreement is not only a preliminary stage but also a goal of the manipulators in itself.
For researchers it is very difficult to gauge the reach of the information disseminated; it can only be estimated. “We know from some recent studies that the majority of fake news is usually forwarded only by highly active fringe groups,” said Bruns. “The forwarding itself is therefore a niche phenomenon.” Only after such spreading does the result become visible to the “society of the middle.” Often the content is revealed as false information; sometimes it isn’t. Sensitizing people to spotting false information, however, is a long-term process. Countermeasures such as EU fact-checking services or the platform “EU vs Disinfo” are already at work and contribute to building media literacy – which is of critical importance for contemporary issues.
“Have we lost the will to seek the truth?”
According to Dr. Oliver Zöllner, Professor of Media Research and Digital Ethics at Stuttgart Media University (Hochschule der Medien), social media have become the primary source of information for many people in recent years. “Nevertheless, one must also hold the companies to account,” he said, as they serve a great human need and take advantage of people’s herd mentality. Here, according to Zöllner, a societal debate is needed. However, no system can provide full protection from disinformation. Especially in connection with elections, one has to make the effort to compare sources, check claims, and also engage with things that may not correspond to one’s own world view.
What should not be overlooked, Bruns said, is that while the debate on disinformation is useful, it must not turn into blaming the media as a whole. A blanket suspicion that all media lie to us anyway would be dangerous.
But what kinds of attacks are conceivable in the EU election campaign? As an example, Zöllner mentioned television channels such as “Russia Today,” which are financed by the Kremlin and broadcast in Spanish, Arabic or German. Here it is easy for the Russian government or a “fan” to spread misinformation at the European level. “Of course, Twitter and Facebook are the biggest players,” said Grimme. But platforms like “Gab,” on which extremist groups gather, should not be left out either: on these so-called “alternative networks,” radical and often false information is posted and then disseminated in a coordinated way to the “big players.” These alternative platforms have the advantage of going largely unnoticed: “We must not limit ourselves too much to the mainstream platforms,” warned Bruns.
Fringe groups to power?
According to Bruns, the main danger in Germany is that potential voters are told not to vote because their vote supposedly makes no difference. If this strategy succeeded, voters of the political centre would no longer be heard and the votes of marginalized groups would carry more weight: “The fringe groups, who may be more susceptible to propaganda, are then particularly encouraged to vote because they have the chance of changing the composition of Parliament.” Stoking displeasure with the EU is another means of frustrating voters.
With respect to social bots, Grimme was quite confident: they do play a role, but not a central one. Even though it is easy to post and disseminate information through social media, the development of these bots is not yet advanced enough for them to post in a genuinely intelligent way, so the dissemination of false content is easily spotted. Grimme argued, however, that academics should have better access to the networks’ data and statistics.
In addition to the societal debate, Zöllner called for individual responsibility to track things down and stick to the facts. He advocated codes of conduct such as the “Code of Practice on Disinformation,” an initiative by the European Commission to combat fake news, which companies and product developers should follow to ensure maximum protection.
Leave your comments, thoughts and suggestions in the box below. Please note: responses are moderated.
RESEARCH | ARTICLE © Zita Hille, Hochschule der Medien Stuttgart, DE