Review – The Misinformation Age

The Misinformation Age: How False Beliefs Spread
By Cailin O’Connor and James Owen Weatherall
Yale University Press, 2020

In 2017, the Collins Dictionary declared “Fake News” to be the word of the year — a much-deserved honor. But this was only the beginning. Discussions of misinformation, conspiracy theories, and rumors are seemingly everywhere — in academic research as well as popular discourse. Over the last decade, scores of articles and books have been written on the topic of false beliefs and how they spread. In such an environment, it is difficult for authors to shed new light on this widespread problem. But in The Misinformation Age: How False Beliefs Spread, Cailin O’Connor and James Owen Weatherall do just that.

O’Connor and Weatherall make the argument that to truly understand the spread of false beliefs (in their words, “beliefs that are inconsistent with the available evidence, and which are even widely known to be inconsistent with that evidence” (p. 7)), we must consider both the nature of those beliefs and the social system in which they spread. That is, we must turn our attention from a sole focus on the content of particular beliefs to a focus on the social dynamics by which all beliefs — both true and false — spread.

This move is an important one for the social sciences. The determinants of false beliefs at the individual level have been the subject of a great amount of work in psychology, communications, and political science. From this work, we know that people can arrive at false beliefs through both cognitive biases and a lack of expertise or education. But a focus on the individual can only take us so far. To appreciate how such beliefs can gain hold in a larger society, O’Connor and Weatherall convincingly make the case that the very same social systems that enable societies to collectively learn the truth from observed experiences can be corrupted by malicious actors to spread false information.

O’Connor and Weatherall craft their argument by first introducing simple models of communication networks in which a set of individuals or agents (nodes) are connected to each other via social ties (edges). Agents in these networks try to determine which of two contrary courses of action is better — which option reflects the “truth” — and, through a sequence of testing and communication with other agents in the system, come to a consensus over future action. Here, “true” and “false” are defined by the relation of evidence to belief. The key insight is that there is always some uncertainty about the state of the world; we don’t ever “know” the truth for certain. But that doesn’t mean that we are without guidance. Uncertainty can range on a continuum from low to high based on the strength of the available evidence. Thus, the presence of uncertainty does not preclude us from gathering evidence and making the best-informed decisions we can. As O’Connor and Weatherall argue, “we make our beliefs as good as we can on the basis of the evidence we have, and often enough, things work out” (p. 30).
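
To make the setup concrete, here is a minimal sketch of one common version of such a model: a two-option (“two-armed bandit”) problem on a fully connected network, with agents who update their credences by Bayes’ rule on the evidence their neighbors share. The specific numbers (success rates, community size, trial counts) are illustrative assumptions of mine, not parameters taken from the book.

```python
import random

N_AGENTS = 10            # a small, fully connected community of agents
P_OLD, P_NEW = 0.5, 0.6  # true success rates: the "new" action really is better
N_TRIALS = 10            # experiments per round for agents who favor the new action
N_ROUNDS = 50

# Each agent's credence (probability) that the new action is the better one.
credences = [random.random() for _ in range(N_AGENTS)]

def bayes_update(credence, successes, trials):
    """Update the credence that the new action succeeds at rate P_NEW
    rather than P_OLD, given one observed run of trials."""
    def likelihood(p):
        return (p ** successes) * ((1 - p) ** (trials - successes))
    numerator = likelihood(P_NEW) * credence
    denominator = numerator + likelihood(P_OLD) * (1 - credence)
    return numerator / denominator if denominator > 0 else credence

for _ in range(N_ROUNDS):
    # Agents who currently favor the new action test it; the rest sit out this round.
    reports = []
    for c in credences:
        if c > 0.5:
            successes = sum(random.random() < P_NEW for _ in range(N_TRIALS))
            reports.append((successes, N_TRIALS))
    # Everyone sees everyone else's evidence (a complete network) and updates on it.
    for successes, trials in reports:
        credences = [bayes_update(c, successes, trials) for c in credences]

print("Final credences that the new action is better:",
      [round(c, 2) for c in credences])
```

Run repeatedly, a sketch like this usually converges on the better action, but it can also lock the whole group into the worse option when every agent’s credence dips below one half and experimentation stops, which is exactly the kind of collective failure the authors are interested in.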

Over the course of the book, further modifications are added to the system to reflect real-world characteristics of social systems, such as varying levels of trust between agents and the addition of propagandists who are more interested in pushing one particular point of view than they are in determining which course of action leads to the best results. O’Connor and Weatherall then explore the dynamics of these social systems through computer models to see the conditions under which bad information can overwhelm good information, thereby fostering the spread of false beliefs through society.
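
As one illustration of how a propagandist might be grafted onto the sketch above (my own simplified rendering, not necessarily the authors’ exact mechanism), consider an agent who runs genuine experiments but shares only the runs that make the better option look bad:

```python
import random

P_NEW = 0.6      # true success rate of the better ("new") action, as above
N_TRIALS = 10    # trials per experimental run

def propagandist_reports(n_runs=20):
    """Run honest experiments, but share only the runs in which the better
    action happens to underperform. Fed into the updating loop above, these
    reports drag the community's credences back toward the worse action."""
    reports = []
    for _ in range(n_runs):
        successes = sum(random.random() < P_NEW for _ in range(N_TRIALS))
        if successes / N_TRIALS < 0.5:   # keep only the unflattering runs
            reports.append((successes, N_TRIALS))
    return reports
```

Mixed in with honest reports, this kind of selective sharing can hold the community’s credences below one half even when the evidence, taken as a whole, favors the better action.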

I found much to admire in this book. The authors do an excellent job of presenting the logic of the computer models in a clear and accessible way. And the intuitions are illuminated with lively illustrative stories. This is a book that is appropriate for a wide audience. 

The lessons of The Misinformation Age are manifold. First, the spread of seemingly irrational beliefs is not necessarily the result of irrational actors. Perfectly rational agents who learn from others in their social network can fail to form true beliefs about the world, even in the face of adequate evidence. As O’Connor and Weatherall aptly note, “Individually rational agents can form groups that are not rational at all” (p. 14).

The collective can fail in a number of ways. In the basic model O’Connor and Weatherall begin with, connections between actors lead to convergence on the truth over time. But the world is not so simple, and they add a number of potential complications to the model. For one, there could be polarization of experts in the system. The agents (scientists in the presentation in Chapter 2) might stop listening to each other and balkanize into different factions. As a result, the collection of experts could split into polarized camps holding different beliefs, with each side trusting the evidence of only those who already agree with them. Polarization over the best course of action therefore results not from bias, but from mistrusting people with different beliefs. Suboptimal outcomes arise via the normal flow of information and evidence, not the wilful rejection of messages from the other side. Here, reputation and trust play a key role. How actors change their beliefs in light of evidence depends on the reputation of the evidence’s source. The “truth” does not necessarily speak for itself, thereby opening the possibility of manipulation.
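
One simple way to render this kind of mistrust in the earlier sketch, offered as an illustrative assumption on my part rather than as the authors’ precise update rule, is to discount shared evidence in proportion to how far the sharer’s beliefs sit from one’s own:

```python
def trusted_update(my_credence, their_credence, bayes_posterior, distrust=2.0):
    """Blend the full Bayesian posterior with the agent's prior, weighted by
    how much the agent trusts the source. Trust falls off linearly with the
    distance between the two agents' credences; at zero trust the shared
    evidence is ignored entirely."""
    trust = max(0.0, 1.0 - distrust * abs(my_credence - their_credence))
    return trust * bayes_posterior + (1.0 - trust) * my_credence
```

With a steep enough discount, agents effectively stop learning from anyone on the other side, and the community can settle into stable camps rather than converging.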

Perhaps more important for the dynamics of the spread of misinformation is the role played by various bad actors in an information environment. O’Connor and Weatherall note that not everyone is interested in arriving at the truth. There are also individuals and groups whose “interests depend on public opinion and who manipulate the social mechanisms we have just described to further their own agendas” (p. 92). These people with a vested interest in a specific outcome can take actions to slow and even stop correct information from becoming the received wisdom. In this way, a few bad actors can subvert the functioning of the entire information system.

All told, The Misinformation Age provides an outstanding framework for understanding misinformation. The book is not an empirical study of the spread of misinformation, and it is not meant to be. The cases of scientific misinformation around smoking and climate change serve as illustrations that demonstrate important points. But though The Misinformation Age is not an empirical work of social science, it is an excellent theoretical launching point for more systematic study of the spread of misinformation in the current day.

That said, I do have a few quibbles with the book. The Misinformation Age provides a stylized model of the world. In this case, the model is useful because it greatly clarifies the issues at stake and provides critical insights into the dynamics of information transmission. But the model also greatly simplifies reality. In the real world, the nature of the “truth” might be a little more muddled, even accounting for the presence of uncertainty. Consider, for example, political rumors and misinformation. In certain cases, the truth is clear. Pope Francis never endorsed Donald Trump for President in 2016. But this clarity may be the exception rather than the rule. How can we evaluate ongoing information flows in cases where the normative underpinning of factual claims is less clear? This is a difficult question.

My second quibble actually speaks to the strength of the argument in The Misinformation Age. I am more troubled by the implications of this book than the authors seem to be. O’Connor and Weatherall try to end their book on a hopeful note with some suggestions about how to move social systems to better arrive at the truth. For example, they suggest a strategy of building trust in particular actors across a variety of issues, in essence seeking to leverage credibility in one area across others. While some of these ideas are intriguing, I am left not as hopeful as they are. O’Connor and Weatherall did such a good job identifying the features of the current information system that lead to suboptimal outcomes that I am skeptical any other outcome is possible. Given the incentives for bad actors to hijack the social system to spread bad information, and their prevalence, what hope is there for us as a society? All told, the lessons of The Misinformation Age about the perverse consequences of the social system made me, a scholar of individual political cognition, even more concerned about the future of democracy. As a scholar, I am grateful for these insights. But as a citizen, I am troubled by their implications.

Editorial Credit(s)

Vaishnav Rajkumar
