People are "consistently inconsistent" in how they reason about controversial scientific topics

https://digest.bps.org.uk/2018/11/13/people-are-consistently-inconsistent-in-how-they-reason-about-controversial-scientific-topics/

By Christian Jarrett

There are various issues on which there is a scientific consensus but great public controversy, such as anthropogenic climate change and the safety of vaccines. One previously popular explanation for this mismatch was that an information deficit among the public was to blame: give people all the facts, according to this perspective, and they will catch up with the scientists. Yet time and again, that simply hasn’t happened.

A new paper in Thinking and Reasoning explores the roots of this problem further. Emilio Lobato and Corinne Zimmerman asked 244 American university students and staff whether they agreed with the scientific consensus on climate change, vaccines, genetically modified (GMO) foods and evolution; to give their reasons; and to say what would convince them to change their position.

Past research has already done a good job of identifying the individual characteristics – such as having an analytical thinking style and being non-religious – that tend to correlate with accepting the scientific consensus, but this is the first time that researchers have systematically studied people’s open-ended reasoning about controversial scientific topics. The results show that for many people, there are certain issues for which the truth is less about facts and more about faith and identity.

Lobato and Zimmerman found that the most common response people gave for their positions on the controversial issues was simply to re-state or qualify their belief, which the researchers called a non-justification; these made up 34 per cent of all responses. Among the actual justifications, the most common kind, making up 33 per cent of all responses, was to cite evidence – a promising result. However, 20 per cent of justifications were subjective, involving reference to one’s cultural identity, personal experience or fallacious reasoning.

The specific kinds of subjective justification given tended to vary according to topic. For instance, sceptical attitudes toward the scientific consensus on genetically modified foods (as safe) tended to involve fallacious reasoning of a conspiratorial bent (such as “I am hesitant to believe there are NO concerns because the multinational agricultural corporations such as Monsanto have profits as the basis of their existence so any information they put out is suspect”), or they invoked the naturalistic fallacy (such as, “GMO foods are unnatural and therefore not safe for our bodies”). In contrast, subjective justifications for disbelieving evolutionary theory were more likely to make reference to the importance of one’s cultural or religious identity.

Probably the most significant finding, though, was people’s inconsistency in how they reasoned about the four different scientific topics. For example, while many participants did cite data and evidence to justify their stance on some occasions, only 27 participants (11 per cent) did this consistently across all the topics. “It seems that people are consistently inconsistent in how they reason about scientific topics,” the researchers said.

It was a similar story when it came to the kinds of reasons participants said would cause them to change their minds. While it was promising that new evidence or data was mentioned more often than any other reason, 45 per cent of participants still explicitly denied, at least once, that anything could change their mind on a particular topic; 17 per cent said this of more than one topic. And while 80 per cent of participants indicated that new evidence or data would change their mind on a particular topic, not a single participant took this position on all four controversial topics. Again, the main message is people’s inconsistency in how they reason about different scientific topics.

Confirming prior research on individual characteristics that correlate with scientific reasoning, participants with more of an analytical reasoning style (as measured by agreement with questionnaire items like “I enjoy problems that require hard thinking”) and a stronger liberal political orientation were more likely to agree with the scientific consensus on the four topics, and to make reference to evidence when reasoning about their position.

Lobato and Zimmerman cautioned that it remains to be seen whether their findings would generalise to scientific topics that have not (yet) been politicised. Of course, the current results are also based on a narrow US sample and may be different elsewhere. Notwithstanding these limitations, the researchers said their findings could have implications for scientific advocates: they suggest it may be beneficial to discuss and present topics in a way that reduces any association with an audience’s socio-political identity; and they highlight the importance of better training and education in an analytical thinking style.

“Being able to tailor education about science to the manner in which people think about science may improve scientific literacy,” the researchers concluded, “but doing so requires more research into why people hold the beliefs they do about science.”

Examining how people reason about controversial scientific topics

Christian Jarrett (@Psych_Writer) is Editor of BPS Research Digest
