
Misinformation Has Got You By . . . the Gut

There’s a science to misinformation.

Keith Raymond Harris is a doctoral research fellow in Philosophy at Ruhr University Bochum. He specializes in applied epistemology and metaphysics.

(A quick reminder: “epistemology” is the study of how we know what we know, and why we think we know it. “Metaphysics,” in philosophy, concerns the fundamental nature of reality: what exists and what it is like.)

Dr. Harris’s research currently centers on conspiracy theories. For example, a recent article of his is titled “Real Fakes: The Epistemology of Online Misinformation.”

Harris points out that the gut is more convincing than the head. Consequently, even if someone doesn’t buy into a conspiracy theory, the mere fact that they know about it can influence behavior. Even an idea rejected by the conscious mind lodges in the gut, if you will.

One example Harris presents is the 2020 US election. Most Americans don’t believe the conspiracy theories, yet the mere existence of those theories has led many to doubt the integrity of the US voting system. As a result, efforts to “fix” voting strike some as a good thing, even though those efforts are largely politically motivated attempts to gain an advantage for one side.

In other words, misinformation often operates in much the same way as advertising — hinting that products will increase status or make us sexy. Even when consciously rejected, conspiracy theories hint that something isn’t right.

Consider the anti-vaccination movement. Some people believe the conspiracy theories; most don’t. But the theories have created enough confusion to affect the vast middle of people making decisions about vaccination. Dr. Harris writes,

Even one who is confident that there are means to distinguish between real and fake science may regard the work that would be required to do so as unacceptably costly. In this way, awareness of the threat of fakes may subtly discourage would-be knowers.

We already knew about confirmation bias: a person already inclined toward a set of propositions is predisposed to accept information that supports them. Even if that information is false, its mere existence reinforces the pre-existing belief.

Now it is becoming clear that even those who do not buy into a conspiracy often have a nagging, gut-level feeling of . . . queasiness, if you will.

The research also points to another problem with the information deluge we live with nowadays: it’s hard to avoid getting tired and giving up. “Let the firehose spew! I’m outta here!”

I don’t think it is an exaggeration to say that telling the true from the false has never in human history been more difficult or more important than it is right now. (One possible exception: the Gutenberg revolution.)

It’s time to sharpen those philosophical skills and get epistemological about it: What do I really know and why do I think I really know that?

Thinking for yourself . . . it’s hard-won and it’s difficult.

It begins in the courage to doubt.

Everything.
