By Gordon Pennycook and David Rand

What makes people susceptible to fake news and other forms of strategic misinformation? And what, if anything, can be done about it?

These questions have become more urgent in recent years, not least because of revelations about the Russian campaign to influence the 2016 United States presidential election by disseminating propaganda through social media platforms. In general, our political culture seems to be increasingly populated by people who espouse outlandish or demonstrably false claims that often align with their political ideology.

The good news is that psychologists and other social scientists are working hard to understand what prevents people from seeing through propaganda. The bad news is that there is not yet a consensus on the answer. Much of the debate among researchers falls into two opposing camps. One group claims that our ability to reason is hijacked by our partisan convictions: that is, we’re prone to rationalization. The other group — to which the two of us belong — claims that the problem is that we often fail to exercise our critical faculties: that is, we’re mentally lazy.

However, recent research suggests a silver lining to the dispute: Both camps appear to be capturing an aspect of the problem. Once we understand how much of the problem is a result of rationalization and how much a result of laziness, and as we learn more about which factor plays a role in what types of situations, we’ll be better able to design policy solutions to help combat the problem.

The rationalization camp, which has gained considerable prominence in recent years, is built around a set of theories contending that when it comes to politically charged issues, people use their intellectual abilities to persuade themselves to believe what they want to be true rather than attempting to actually discover the truth. According to this view, political passions essentially make people unreasonable, even — indeed, especially — if they tend to be good at reasoning in other contexts. (Roughly: The smarter you are, the better you are at rationalizing.)

Some of the most striking evidence used to support this position comes from an influential 2012 study in which the law professor Dan Kahan and his colleagues found that the degree of political polarization on the issue of climate change was greater among people who scored higher on measures of science literacy and numerical ability than it was among those who scored lower on these tests. Apparently, more “analytical” Democrats were better able to convince themselves that climate change was a problem, while more “analytical” Republicans were better able to convince themselves that climate change was not a problem. Professor Kahan has found similar results in, for example, studies about gun control in which he experimentally manipulated the partisan slant of information that participants were asked to assess.

The implications here are profound: Reasoning can exacerbate the problem, not provide the solution, when it comes to partisan disputes over facts. Further evidence cited in support of this argument comes from a 2010 study by the political scientists Brendan Nyhan and Jason Reifler, who found that appending corrections to misleading claims in news articles can sometimes backfire: Not only did corrections fail to reduce misperceptions, but they also sometimes increased them. It seemed as if people who were ideologically inclined to believe a given falsehood worked so hard to come up with reasons that the correction was wrong that they came to believe the falsehood even more strongly.

But this “rationalization” account, though compelling in some contexts, does not strike us as the most natural or most common explanation of the human weakness for misinformation. We believe that people often just don’t think critically enough about the information they encounter.

A great deal of research in cognitive psychology has shown that a little bit of reasoning goes a long way toward forming accurate beliefs. For example, people who think more analytically (those who are more likely to exercise their analytic skills and not just trust their “gut” response) are less superstitious, less likely to believe in conspiracy theories and less receptive to seemingly profound but actually empty assertions (like “Wholeness quiets infinite phenomena”). This body of evidence suggests that the main factor explaining the acceptance of fake news could be cognitive laziness, especially in the context of social media, where news items are often skimmed or merely glanced at.

Gordon Pennycook and David Rand are psychologists.