"Tell me about your bullshit study," I asked Gord Pennycook, a PhD candidate in psychology at the University of Waterloo in Canada.
Pennycook is the lead author of a new study wonderfully titled "On the reception and detection of pseudo-profound bullshit." In it, he and his colleagues asked questions no psychologist has touched before. Such as: What makes a person a good bullshit detector? Why are some people more susceptible to bullshit than others?
Pennycook's inquiry into bullshit started with a visit to WisdomofChopra.com, which lampoons Deepak Chopra, the writer and spiritualist known for abstruse sayings like "attention & intention are the mechanics of manifestation." The site randomizes Chopraisms to create nonsense sentences.
There's something uncanny and compelling about these randomly generated nothing statements. They feel substantive at first glance. "I wondered if people actually thought these were profound," Pennycook tells me. So as part of his study, he had people rate these randomly generated tweets. Surprisingly, the average participant rated the gibberish as being between "somewhat profound" and "fairly profound." (Responses to these fake tweets were nearly indistinguishable from the responses to Chopra's real tweets.)
Digging deeper, Pennycook found that acceptance of bullshit statements appears to be related to personal traits such as lower intelligence, belief in the paranormal, and the likelihood of believing in conspiracies.
Up until now, this hasn't been a major focus of psychological research. Pennycook thinks it should be. "Bullshit is everywhere, so I think we need to know more about it," he says.
Here's an edited transcript of our conversation.
Brian Resnick: What is bullshit?
Gord Pennycook: Bullshit is different from nonsense. It’s not just random words put together. The words we use have a syntactic structure, which implies they should mean something.
The difference between bullshit and lying is that bullshit is constructed without any concern for the truth. It’s designed to impress rather than inform. And then lying, of course, is very concerned with the truth — but subverting it.
BR: What makes people willing to consider bullshit?
GP: There seem to be two factors that underlie why someone might be particularly receptive to bullshit.
The first is response bias. What that means is some people are just more open to anything they come across. From the outset, they are less skeptical. This isn’t specific to bullshit. It’s kind of like gullibility, although gullibility has more to do with social settings.
[The second:] The anterior cingulate cortex triggers when there’s something off about a problem, and that allows us to detect when there might be bullshit. With bullshit, [this trigger] may only sometimes be pulled, and perhaps only for certain types of people, particularly analytic ones.
BR: Are there certain people who are more receptive to bullshit than others?
GP: Those who give higher profundity ratings to the bullshit are less analytic, less intelligent, higher in religious belief, and higher in what’s called ontological confusion.
BR: Ontological confusion?
GP: An ontological confusion is when you mistake two ontological [existential] categories. An example: ESP [extrasensory perception] is when you think you can control something with your mind. So you’re confusing the mental and the physical.
BR: Some people would take offense to the idea that more religious people are more susceptible to bullshit. How would you respond to that?
GP: I’d say, "That’s what the data says." It doesn’t imply that religion is bullshit.
In cognitive terms, believing in angels is not that different from believing in ghosts. If people are okay with saying that people who believe in the paranormal are more prone to bullshit, they should be okay with saying the same thing with respect to religious beliefs.
BR: Let's say I'm interested in becoming a better bullshitter. What are some principles I should follow?
GP: A good way to do it is insert a lot of buzzwords and be vague.
If you say something direct, the people who agree with you will like it and the people who don’t won’t like it. But if you say something vague, people will bring what they think it means to it. And then everyone will like it — if you hit the perfect spot.
BR: Did anything about your results surprise you?
GP: I was surprised that for the [real] Chopra tweets — I selected ones that I thought were particularly vague — the correlation between the profundity ratings for those and the [fake] ones we got from the websites was .88. [A perfect correlation is 1. Study participants couldn’t tell the difference between fake Chopra tweets and real ones.]
I’m not so cynical as to have thought that would happen.
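[For readers unfamiliar with the .88 figure above: it is a Pearson correlation coefficient, which ranges from -1 to 1. The sketch below shows how such a coefficient is computed from paired ratings. The numbers are hypothetical, invented for illustration — they are not the study's data.]

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    # Covariance-like sum of co-deviations from each mean
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    # Scale by the product of the two standard-deviation-like terms
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical mean profundity ratings (1-5 scale) for a handful of items:
real_chopra = [3.1, 2.4, 4.0, 1.8, 3.6, 2.9]  # real tweets (invented values)
fake_chopra = [3.0, 2.6, 3.8, 2.0, 3.4, 3.1]  # generated tweets (invented values)
print(round(pearson_r(real_chopra, fake_chopra), 2))
```

A value near 1 means the two sets of ratings rise and fall together, which is what a .88 correlation indicates: items rated profound in one set tended to be rated profound in the other.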
BR: From the outset, did you have anything against Deepak Chopra?
GP: I think he was a casualty. ... I wasn’t sitting here brooding about Chopra.
BR: Is there a wrong conclusion that people might mistakenly take away from your research?
GP: One would be that everything Chopra says is bullshit. Another would be that, because some factor [like religion] correlates with bullshit receptivity, everything those people believe is bullshit. This is still preliminary [research]. The data is pretty clear, but there’s still a lot we don’t know about bullshit.
BR: Like what?
GP: We might be able to detect when there's bullshit in a sentence [like when a buddy is telling a bullshit story at the bar], but we have no idea how that would actually happen. We don't know what features of a sentence would cause you to detect it.