Roman Mayka

RIAC Expert

In March 2019, under the aegis of the United States Department of State, a group of researchers released a report called "Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age." The report focuses mostly on foreign states’ propaganda, disinformation and fake news. Given the upcoming US elections, it offers practical recommendations for policymakers and stakeholders.

The report begins with a horrific story broadcast on the Russian state-owned “Channel One” in 2014. The story described how Ukrainian soldiers had crucified a child in front of his mother's eyes. The story was later proven fake: there was neither a killed child nor a shocked mother. Still, it went viral, reaching a much broader audience on social media than it ever did on television.

The authors refer to that story as "an example of Kremlin-backed disinformation campaign," noting that "in subsequent years, similar tactics would again be unleashed by the Kremlin on other foreign adversaries, including the United States during the lead-up to the 2016 presidential election."

Undoubtedly, the fake story did a lot of damage to the reputation of Channel One and other state-funded media. It is clear why the authors begin with that story: it was poorly done, obviously faked and quickly exposed. Yet it showed how effective and powerful social media can be, despite all of the reputational risks. The report also highlights an important point, namely that "the use of modern-day disinformation does not start and end with Russia. A growing number of states, in the pursuit of geopolitical ends, are leveraging digital tools and social media networks to spread narratives, distortions, and falsehoods to shape public perceptions and undermine trust in the truth." We are used to research on propaganda and fake news that holds Russia solely responsible for disinformation. This report, on the other hand, addresses propaganda and disinformation as a comprehensive problem.

In the introduction, the authors argue that the disinformation problem rests on two major factors: the impact of the technology giants and the psychology of how people consume information on the Internet. Technology giants have transformed the way disinformation and propaganda spread, and the proliferation of social media platforms has made the information ecosystem vulnerable to foreign, state-sponsored actors. "The intent [of bad foreign actors] is to manipulate popular opinion to sway policy or inhibit action by creating division and blurring the truth among the target population."

Another important aspect of disinformation highlighted in the report is the abuse of fundamental human biases and behaviour. The report states that "people are not rational consumers of information. They seek swift, reassuring answers and messages that give them a sense of identity and belonging." The statement is backed by research showing that, on average, a false story reaches 1,500 people six times more quickly than a factual account. Indeed, conspiracy stories have become commonplace these days. They have grown even more widespread during the current pandemic: 5G towers, Bill Gates and "evil Chinese scientists" who supposedly invented the coronavirus have all become scapegoats, and many more paranoid conspiracy stories are spreading on the Internet.

What is the solution? The authors do not blame any single country, the tech giants or human behaviour. On the contrary, they argue that the solution must be complex: "the problem of disinformation is therefore not one that can be solved through any single solution, whether psychological or technological. An effective response to this challenge requires understanding the converging factors of technology, media, and human behaviours."

Define the Problem First

What is the difference between fake news and disinformation? How does disinformation differ from misinformation? It is rather rare for a report to devote a whole chapter to terminology, and "Weapons of Mass Distraction" definitely provides readers with a thorough theoretical background. The authors admit that there are many competing definitions and that it is difficult to ascribe exact parameters to disinformation. Nevertheless, the report states that "misinformation is generally understood as the inadvertent sharing of false information that is not intended to cause harm, just as disinformation is widely defined as the purposeful dissemination of false information."

Psychological Factors

As mentioned at the beginning, the authors do not attach labels or focus on only one side of the problem. A considerable part of the report is dedicated to the psychological factors of disinformation. The section helps readers understand the behavioural patterns of how humans consume information, why it is easy to fall for a conspiracy theory, and how this knowledge can be used to prevent the spread of disinformation.

The findings are surprising. Several cognitive biases allow disinformation to flourish, and the bad news is that there is little we can do about them.

First of all, confirmation bias and selective exposure lead people to prefer information that confirms their preexisting beliefs and make information consistent with those beliefs more persuasive. Moreover, confirmation bias and selective exposure work in tandem with naïve realism, which "leads individuals to believe that their perception of reality is the only accurate view and that those who disagree are simply uninformed or irrational."

In practice, these cognitive biases are widely exploited by tech giants. This does not mean there is a conspiracy behind it; it simply means that it is easy for big tech companies to sell their products using so-called “filter bubbles.” Such a bubble is produced by an algorithm that selectively guesses what information a user would like to see based on data about that user, such as location, past click behaviour and search history. Filter bubbles work especially well on websites such as YouTube. A Wall Street Journal investigation found that YouTube's recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven't shown interest in such content.
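
To illustrate the mechanism, here is a minimal, purely hypothetical sketch (not taken from the report or from any real platform's code) of how such a recommendation loop narrows into a filter bubble: items are ranked by how closely their topics overlap with what the user has already clicked, so each click makes the next recommendation more of the same. All item names, tags and the scoring rule are invented for illustration.

```python
from collections import Counter

# Hypothetical catalogue: each item is tagged with a set of topics.
CATALOGUE = {
    "video_a": {"politics", "conspiracy"},
    "video_b": {"politics", "analysis"},
    "video_c": {"sports"},
    "video_d": {"conspiracy", "health"},
}


def recommend(click_history, n=1):
    """Rank unseen items by how many of their topics the user has already clicked on."""
    interests = Counter(tag for item in click_history for tag in CATALOGUE[item])
    candidates = [item for item in CATALOGUE if item not in click_history]
    return sorted(
        candidates,
        key=lambda item: sum(interests[tag] for tag in CATALOGUE[item]),
        reverse=True,
    )[:n]


# One click on conspiracy-tagged content tilts every subsequent recommendation.
history = ["video_a"]
for _ in range(3):
    suggestions = recommend(history)
    if not suggestions:
        break
    history += suggestions

print(history)  # past clicks keep pulling the feed toward similar topics
```

Real recommendation systems are vastly more sophisticated, but the feedback loop, in which past engagement shapes future exposure, is the same basic dynamic the report describes.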

These days, the most popular way to counter misinformation is fact-checking and debunking false information. In the report, the researchers present evidence that the methods we are used to employing may not be that effective: "Their analysis determined that users are more active in sharing unverified rumours than they are in later sharing that these rumours were either debunked or verified. The veracity of information, therefore, appears to matter little. A related study found that even after individuals were informed that a story had been misrepresented, more than a third still shared the story."

Another research finding is that "participants who perceived the media and the word "news" negatively were less likely than others to identify a fake headline and less able to distinguish news from opinion or advertising." There is an obvious reason for that: a lack of trust. According to recent research, the public has little trust in journalists as a source of information about the coronavirus. Additionally, according to the American Press Institute, only 43 per cent of people said they could easily distinguish factual news from opinion in online-only news or social media. Thus, the majority of people can hardly tell news from opinion at a time when trust in journalism is at a historic low. It is therefore no surprise that people perceive the news so negatively.

This has implications for news validation, which, as the report notes, can work differently from country to country: "Tagging social media posts as "verified" may work well in environments where trust in news media is relatively high (such as Spain or Germany), but this approach may be counterproductive in countries where trust in news media is much lower (like Greece)."

The extensive body of research behind the report also reveals several essential findings. First, increasing online communities' exposure to different viewpoints is rather counterproductive: the research cited in the report found that conservatives become more conservative and liberals become more liberal.

Second, the phenomenon called belief perseverance, which is the inability of people to change their minds even after being shown new information, means that facts can matter little in the face of strong social and emotional dynamics.

Third, developing critical thinking skills and increasing media literacy may also be counterproductive or of minimal use. Research shows that "many consumers of disinformation already perceive themselves as critical thinkers who are challenging the status quo." Moreover, even debunking false messages may not be that effective: showing corrective information did not always reduce participants' belief in misinformation. Besides, when "consumers of fake news were presented with a fact-check, they almost never read it."

What can be done here? The authors provide the reader with a roadmap for countering misleading information, although, according to the report, the roadmap, which is likewise grounded in research, may be of very limited use.

The main idea is to be proactive. While debunking false messages, developing critical thinking and other familiar tools have minimal potential, some psychological interventions can help build resilience against disinformation. The authors compare disinformation and misinformation to a disease and propose a kind of vaccine that builds resilience to the virus. This strategy means warning people "that they may be exposed to information that challenges their beliefs, before presenting a weakened example of the (mis)information and refuting it."

Another aspect of the roadmap is showing different perspectives, "which allows people to understand and overcome the cognitive biases that may render them adversarial toward opposing ideas." According to the authors, this approach should focus less on the content of one's thoughts and more on their structure. In other words, an understanding of the very factors that make humans susceptible to disinformation can itself become part of the solution.

What About the Tech Giants?

The authors admit that social media platforms should play a central role in neutralizing online disinformation. Although the tech giants have demonstrated a willingness to address disinformation, limiting it is not always in their interest. On the contrary, their business model aligns their incentives with spreading more of it: "Users are more likely to click on or share sensational and inaccurate content; increasing clicks and shares translates into greater advertising revenue. The short-term incentives, therefore, are for the platforms to increase, rather than decrease, the amount of disinformation their users see."

The technological section of the report is split into three parts dedicated to three tech companies — Facebook, Twitter and Google. While the report focuses on what companies have already done to counter disinformation, we will highlight only the recommendations and challenges that still remain.

Despite all the measures Facebook has implemented in recent years, the platform remains vulnerable to disinformation. The main vulnerability lies in its messaging apps: WhatsApp was a major source of disinformation during the Rohingya crisis in 2018 and during the Brazilian presidential election the same year. The second vulnerability lies in third-party fact-checking services staffed by human operators, who struggle to handle the volume of content: "fake news can easily go viral in the time between its creation and when fact-checkers are able to manually dispute the content and adjust its news feed ranking."

Despite all its vulnerabilities, including a colossal bot network, Twitter has become more effective at countering the threat with technologies such as AI. The question of how proactive the company will be in countering the threat remains open. Still, according to the report, Twitter now follows best practices.

With its video-sharing platform YouTube and its ad platform, Google might be the most vulnerable of the three. YouTube, with its personalized recommendation algorithm (the filter bubble), has faced strong criticism for reinforcing viewers' belief that conspiracy theories are, in fact, real. In 2019, however, YouTube announced that it would adjust its algorithms to reduce recommendations of misleading content.

However, it is not just the tech giants that should take responsibility for disinformation. According to the report, it is states that bear the ultimate responsibility for "defending their nations against this kind of disinformation." Yet, since the platforms remain in private hands, what can governments do here?

For example, they could play a more significant role in regulating social media companies. According to the report, this does not mean total control over them. However, the authors admit that such a solution risks restricting freedom of speech or sliding into outright censorship, and there is no easy and straightforward way to solve this complex problem.

****

What can we do about all this? According to the report, technology will change, but the problem will not be solved within the next decade, and we will have to learn to live with disinformation. At the same time, public policies should focus on mitigating its most damaging consequences while maintaining civil liberties, freedom of expression and privacy.

The report offers readers quite a balanced approach to the problem. While other research projects attach labels to countries or technologies, the authors of "Weapons of Mass Distraction" admit that the solution will not be easy. It is a complex problem that will require a complex solution.

