
Have you ever had one of those bizarre conversations with people who simply won't "understand your factual point of view"? Hehe, me too. How can these imbeciles be so wrong? Especially when you come armed with facts. Hard. Concrete. Immutable facts... Hold your horses... or elephants (!) and you may be surprised.
In the wake of a previous post about the public's perception of science, I'd like to share some thoughts from Jonathan Haidt's The Righteous Mind. Haidt's main thesis is that rationalism (loosely: understanding the world through facts and logic) is not the central drive of people's morality. We are hard-wired to be guided by intuition, rather than to weigh up moral conundrums solely on the basis of reason.
He calls this the "rationalist delusion" (certainly a poke at Dawkins) and likens human moral decision-making to an elephant (intuition) and its rider (our rational self). The elephant is running the show; although the rider can guide the elephant to some extent, the elephant is really in control. If the elephant leans, even our rational self leans. Throw in some built-in instincts of disgust and motor reflexes and you've got yourself one complicated situation!
Haidt quotes many scientific studies showing that people's decision-making can change markedly with external stimuli. A funny but illuminating example: "those told to stand near a sanitizer became temporarily more conservative [in their moral decision making]", haha!
Regarding science, we can also be swayed. Humans (and animals) are very good at falling into the trap of 'confirmation bias', which is "the tendency to interpret new evidence as confirmation of one's existing beliefs or theories"; in other words, people will see what they want to see.
On this, Haidt writes: "If people can literally see what they want to see - given a bit of ambiguity - is it any wonder that scientific studies often fail to persuade the general public? Scientists are really good at finding flaws in studies that contradict their own views, but it sometimes happens that evidence accumulates across many studies to the point where scientists must change their minds. I've seen this happen in my colleagues (and myself) many times, and it's part of the accountability system of science - you'd look foolish clinging to discredited theories. But for non-scientists, there is no such thing as a study you must believe. It's always possible to question the methods, find an alternative interpretation of the data, or, if all else fails, question the honesty or ideology of the researchers." He goes on to say that nowadays people can simply go online and select from a myriad of 'facts' that show them exactly what they wanted to see in the first place, so long as it aligns with their beliefs. So how are your facts working out for you now?!
That is to say... 'scientific facts' often aren't enough to convince people if they go against a person's basic intuition. So next time you're trying to explain something scientific to someone, about how electrons don't occupy any space, or that dark matter is invisible, or that the cat is both dead and alive, or that vaccinations are of utmost importance, and people "just won't believe your facts"... take a step back and breathe, crack open a can of Coke (or Lilt, for the old-timers), and relax. It's OK! Be patient. People will be interested in hearing interesting things in any case. They might learn something, and so might you. And Haidt gives a few reasons why there is still hope for the "elephant" within us all to learn something new and shift our moral compass.
I'll leave you with this. When the scientific facts at stake really matter, that conversation will look very different. What if someone won't believe a fact that may get them hurt? How would you approach that situation?
In sum: people will see what they want to see. Warning: So will you!
You can find Jonathan Haidt's book The Righteous Mind here.