Key concepts for making informed choices

https://www.nature.com/articles/d41586-019-02407-9

Teach people to think critically about claims and comparisons using these concepts, urge Andrew D. Oxman and an alliance of 24 researchers — they will make better decisions.
 
A child holds a sign protesting against genetically modified crops during a demonstration in Sofia, Bulgaria. Credit: Vassil Donev/EPA/Shutterstock

Everyone makes claims about what works. Politicians claim that stop-and-search policing will reduce violent crime; friends might assert that vaccines cause autism; advertisers declare that natural food is healthy. A group of scientists describes giving deworming pills to all schoolchildren in some areas as one of the most potent anti-poverty interventions of our time. Another group counters that it does not improve children’s health or performance at school.

Unfortunately, people often fail to think critically about the trustworthiness of claims, including policymakers who weigh up those made by scientists. Schools do not do enough to prepare young people to think critically [1], so many people struggle to assess evidence and, as a consequence, might make poor choices.

To address this deficit, we present here a set of principles for assessing the trustworthiness of claims about what works, and for making informed choices (see ‘Key Concepts for Informed Choices’). We hope that scientists and professionals in all fields will evaluate, use and comment on it. Drawing on the expertise of two dozen researchers, we adapted these resources from a framework developed for health care [2] (see ‘Randomized trial’).

Ideally, these concepts should be embedded in education for citizens of all ages. This should be done using learning resources and teaching strategies that have been evaluated and shown to be effective.

KEY CONCEPTS FOR INFORMED CHOICES

CLAIMS

Claims about effects should be supported by evidence from fair comparisons. Other claims are not necessarily wrong, but there is an insufficient basis for believing them.

Claims should not assume that interventions are safe, effective or certain.

• Interventions can cause harm as well as benefits.

• Large, dramatic effects are rare.

• We can rarely, if ever, be certain about the effects of interventions.

Seemingly logical assumptions are not a sufficient basis for claims.

• Beliefs alone about how interventions work are not reliable predictors of the presence or size of effects.

• An outcome may be associated with an intervention but not caused by it.

• More data are not necessarily better data.

• The results of one study considered in isolation can be misleading.

• Widely used interventions or those that have been used for decades are not necessarily beneficial or safe.

• Interventions that are new or technologically impressive might not be better than available alternatives.

• Increasing the amount of an intervention does not necessarily increase its benefits and might cause harm.

Trust in a source alone is not a sufficient basis for believing a claim.

• Competing interests can result in misleading claims.

• Personal experiences or anecdotes alone are an unreliable basis for most claims.

• Opinions of experts, authorities, celebrities or other respected individuals are not, on their own, a reliable basis for claims.

• Peer review and publication by a journal do not guarantee that comparisons have been fair.

COMPARISONS

Studies should make fair comparisons, designed to minimize the risk of systematic errors (biases) and random errors (the play of chance).

Comparisons of interventions should be fair.

• Comparison groups and conditions should be as similar as possible.

• Indirect comparisons of interventions across different studies can be misleading.

• The people, groups or conditions being compared should be treated similarly, apart from the interventions being studied.

• Outcomes should be assessed in the same way in the groups or conditions being compared.

• Outcomes should be assessed using methods that have been shown to be reliable.

• It is important to assess outcomes in all (or nearly all) the people or subjects in a study.

• When random allocation is used, people’s or subjects’ outcomes should be counted in the group to which they were allocated.

Syntheses of studies should be reliable.

• Reviews of studies comparing interventions should use systematic methods.

• Failure to consider unpublished results of fair comparisons can bias estimates of effects.

• Comparisons of interventions might be sensitive to underlying assumptions.

Descriptions should reflect the size of effects and the risk of being misled by chance.

• Verbal descriptions of the size of effects alone can be misleading.

• Small studies might be misleading.

• Confidence intervals should be reported for estimates of effects.

• Deeming results to be ‘statistically significant’ or ‘non-significant’ can be misleading.

• Lack of evidence for a difference is not the same as evidence of no difference.

CHOICES

What to do depends on judgements about the problem, the relevance (applicability or transferability) of the available evidence, and the balance of expected benefits, harm and costs.

Problems, goals and options should be defined.

• The problem should be diagnosed or described correctly.

• The goals and options should be acceptable and feasible.

Available evidence should be relevant.

• Attention should focus on important, not surrogate, outcomes of interventions.

• There should not be important differences between the people in studies and those to whom the study results will be applied.

• The interventions compared should be similar to those of interest.

• The circumstances in which the interventions were compared should be similar to those of interest.

Expected pros should outweigh cons.

• Weigh the benefits and savings against the harm and costs of acting or not.

• Consider how these are valued, their certainty and how they are distributed.

• Important uncertainties about the effects of interventions should be reduced by further fair comparisons.

 

Trustworthy evidence

People are flooded with information. Simply giving them more is unlikely to be helpful, unless its value is understood. A 2016 survey in the United Kingdom showed that only about one-third of the public trusts evidence from medical research; about two-thirds trust the experiences of friends and family [3].

Not all evidence is created equal. Yet people often don’t appreciate which claims are more trustworthy than others; what sort of comparisons are needed to evaluate different proposals fairly; or what other information needs to be considered to inform good choices.

For example, many people don’t grasp that two things can be associated without one necessarily causing the other. The media sometimes perpetuates this problem by using language suggesting that cause and effect has been established when it has not [4] — for instance, statements such as ‘coffee can kill you’ or ‘drinking one glass of beer a day can make you live longer’. Worse, exaggerated causal claims often pepper press releases from universities and journals [5].
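To make the distinction concrete, the following is a minimal, hypothetical simulation (not taken from the article; all numbers are invented) in which a third factor, say smoking, makes coffee drinking and death rates move together even though coffee has no causal effect in the simulated model.

```python
# Hypothetical simulation of association without causation via a confounder.
# All numbers are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# A confounder (smoking) influences both the 'exposure' and the 'outcome'.
smoker = rng.random(n) < 0.3
coffee = rng.random(n) < np.where(smoker, 0.7, 0.4)    # smokers drink more coffee
dies   = rng.random(n) < np.where(smoker, 0.20, 0.10)  # smoking raises risk; coffee has no effect

# Naive comparison: coffee drinkers die more often, despite zero causal effect.
print(f"Risk among coffee drinkers:     {dies[coffee].mean():.3f}")
print(f"Risk among non-coffee drinkers: {dies[~coffee].mean():.3f}")

# Stratifying by the confounder removes the spurious difference.
for s in (True, False):
    grp = smoker == s
    print(f"smoker={s}: coffee {dies[grp & coffee].mean():.3f} "
          f"vs no coffee {dies[grp & ~coffee].mean():.3f}")
```

Comparing coffee drinkers with non-drinkers overall suggests coffee is harmful; comparing them within smokers and within non-smokers shows no difference, which is exactly the kind of distortion a fair comparison is designed to expose.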

RANDOMIZED TRIAL

Pupils at a school in Uganda hold up their hands in class. Credit: Mikkel Ostergaard/Panos

The Informed Health Choices (IHC) Project was initially developed between 2012 and 2017 by a collaboration including some of the co-authors of this article (A.D.O., A.D., I.C. and M.O.). The project includes its own set of key concepts [2], learning resources and a database of multiple-choice questions to assess how well users can apply the concepts.

In 2016, a randomized trial involving 120 schools and more than 10,000 schoolchildren in Uganda showed that these resources improved the ability of 10–12-year-old children to apply 12 of the key concepts [7]. These concepts included, for example, recognizing that personal experiences alone are an insufficient basis for claims about effects, and that small studies can be misleading.

In this trial, 69% of schoolchildren who were taught the key concepts passed a multiple-choice test of their ability to think critically about health claims. By comparison, just 27% of children who were not told about the concepts passed the same test.
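As a rough illustration of the concept that estimates of effects should come with confidence intervals, the sketch below computes an approximate risk difference and 95% confidence interval from the reported pass rates. The group sizes are assumed (roughly 5,000 children per arm) and school-level clustering is ignored, so the true interval would be wider; this is a sketch of the idea, not a re-analysis of the trial.

```python
# Approximate risk difference and 95% CI from the reported pass rates (69% vs 27%).
# Group sizes are assumed and clustering by school is ignored (illustration only).
from math import sqrt

p1, n1 = 0.69, 5000   # taught the key concepts (assumed group size)
p0, n0 = 0.27, 5000   # not taught (assumed group size)

risk_difference = p1 - p0
standard_error = sqrt(p1 * (1 - p1) / n1 + p0 * (1 - p0) / n0)
low = risk_difference - 1.96 * standard_error
high = risk_difference + 1.96 * standard_error

print(f"Risk difference: {risk_difference:.1%} (95% CI {low:.1%} to {high:.1%})")
```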

 

Studies that make fair comparisons are crucial, yet people often don’t know how to appraise the validity of research. Systematic reviews that synthesize well-designed studies that are relevant to clearly defined questions are more trustworthy than haphazard observations. This is because they are less susceptible to biases (systematic distortions) and the play of chance (random errors). Yet results from single studies are often reported in isolation, as facts. Hence the familiar flip-flopping headlines such as ‘chocolate is good for you’, followed the next week by ‘chocolate is bad for you’.
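To show what a synthesis adds, here is a minimal sketch of fixed-effect, inverse-variance pooling using invented effect estimates from three hypothetical studies. It illustrates the principle only; it is not the method of any particular review organization.

```python
# Minimal fixed-effect (inverse-variance) pooling of hypothetical study results.
# Effect estimates and standard errors below are invented for illustration.
from math import sqrt

# (effect estimate, standard error) for three made-up studies of the same question
studies = [(0.30, 0.20), (0.10, 0.15), (0.22, 0.25)]

weights = [1 / se ** 2 for _, se in studies]          # more precise studies get more weight
pooled = sum(w * est for (est, _), w in zip(studies, weights)) / sum(weights)
pooled_se = sqrt(1 / sum(weights))

low = pooled - 1.96 * pooled_se
high = pooled + 1.96 * pooled_se
print(f"Pooled effect: {pooled:.2f} (95% CI {low:.2f} to {high:.2f})")
```

The pooled estimate is more precise than any single study’s, which is one reason a careful synthesis is less at the mercy of chance than an isolated result.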

To make good choices, other types of information are needed too — for example, about costs and feasibility. Judgements must also be made about the relevance of information from research (how applicable or transferable it is), and about the balance between the likely desirable and undesirable effects of a drug, therapy or regulation.

When it comes to carbon taxes, for example, policymakers need to consider evidence about the environmental and economic effects of such taxes, judge how comparable their context is with that of the studies and weigh how onerous the administrative difficulties are. They also need to model how tax burdens will be distributed across socio-economic groups and think about whether the taxes will be accepted in their jurisdictions.

Critical thinking

Individuals and organizations across many fields are working to enable people to make informed decisions. These efforts include synthesizing the best available evidence in systematic reviews; making that information more accessible, such as through plain-language summaries or open access; and teaching people how to use such resources. Examples of such review organizations are Cochrane (previously called the Cochrane Collaboration), which focuses on health care; the Campbell Collaboration, which looks at the effects of social policies; the Collaboration for Environmental Evidence; and the International Society for Evidence-Based Health Care. Others include the Center for Evidence-Based Management, the Africa Centre for Evidence, the International Initiative for Impact Evaluation (known as 3ie) and Britain’s What Works Centres.

Unfortunately, academics tend to work in silos and can miss opportunities to learn from others. The expertise of the authors of this article spans 14 fields: agriculture, economics, education, environmental management, international development, health care, informal learning, management, nutrition, planetary health, policing, speech and language therapy, social welfare, and veterinary medicine.

We have identified many concepts that apply across these fields (see ‘Key Concepts for Informed Choices’ and ‘Key concepts in action’). Some further concepts are more relevant in some fields than in others. For example, it is often important to consider potential placebo effects when assessing claims about medical treatments and nutrition; these are rarely relevant to interventions in the environment.

KEY CONCEPTS IN ACTION

A maternity ward in Dar es Salaam, Tanzania. Credit: Gary Carlton/Alamy

Claims

Beliefs alone about how interventions work are not reliable predictors of the presence or size of effects.

Most people feel that it is hard to influence parents’ engagement with their children’s education. The assumption is therefore that more intensive (and more costly) interventions would be more likely to be effective. However, studies of intensive interventions have often failed to show effects on pupils’ attainment, as measured using standard tests (see go.nature.com/2gfy8io).

Meanwhile, a recent evaluation found that simply text-messaging parents weekly with updates about their child’s schooling had positive effects on children’s attendance, homework submission and mathematics attainment (see go.nature.com/2t7ormy). These effects were small, but the cost was very low. This illustrates that — contrary to our hunches — inexpensive interventions can be helpful, and expensive ones can fail.

Comparisons

Conditions should be as similar as possible.

‘Scared Straight’ programmes take young offenders on prison visits on the assumption that this experience and listening to inmates’ descriptions of life inside will deter juvenile delinquency. Some studies have found that such prison visits were followed by large reductions in delinquent behaviour. But a lot can change in a group of youngsters over time, including their becoming older and more mature. How can anyone know that the prison visits caused the reduction?

Fairer experiments were done in which youths were randomly assigned to visit prison or not, creating groups that were more comparable. Comparisons between these groups showed more delinquency among the youngsters who had been exposed to prisons than among those who had not [8,9].
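As a small illustration of why random allocation creates comparable groups, the sketch below randomly splits a made-up cohort of youths and checks that a baseline characteristic (a hypothetical count of prior offences) is similar in both arms. The data and the variable are invented for illustration.

```python
# Sketch of why random allocation creates comparable groups: with enough
# participants, baseline characteristics balance out between arms on average,
# so later differences in outcomes can more plausibly be attributed to the
# intervention itself. All data here are made up.
import random

random.seed(1)
youths = [{"prior_offences": random.randint(0, 5)} for _ in range(1000)]

random.shuffle(youths)                                # random allocation
visit_group, control_group = youths[:500], youths[500:]

def mean_priors(group):
    return sum(y["prior_offences"] for y in group) / len(group)

print(f"Prison-visit group mean prior offences: {mean_priors(visit_group):.2f}")
print(f"Control group mean prior offences:      {mean_priors(control_group):.2f}")
```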

Choices

When there are important uncertainties about the effects of interventions, those uncertainties should be reduced by fair comparisons.

In the health sector, financing schemes in which funds are released only if a specific action is taken or performance target is met have become popular. Billions of dollars have been invested in promoting these schemes in low- and middle-income countries, with the aim of achieving international development goals [10]. For example, health providers have been offered cash rewards for increasing the percentage of births in clinics (rather than at home), with the intention of improving maternal and newborn health and survival.

But performance-based financing schemes can have unintended adverse effects, such as encouraging health-care workers to falsify records or to neglect other activities. In Tanzania, some health facilities threatened new mothers with fines or denial of vaccinations for their children [10]. For interventions in which there is much uncertainty about the pros and cons, further fair comparisons should be done before or while rolling out such schemes.

 

Our collaboration has already prompted many of us to develop frameworks for specific fields and to suggest improvements to the original Informed Health Choices framework [2]. There is power in identifying an issue that resonates across different domains; it provides momentum to align efforts.

The Key Concepts for Informed Choices is not a checklist. It is a starting point. Although we have organized the ideas into three groups (claims, comparisons and choices), they can be used to develop learning resources that include any combination of these, presented in any order. We hope that the concepts will prove useful to people who help others to think critically about what evidence to trust and what to do, including those who teach critical thinking and those responsible for communicating research findings.

Next steps

Evidence-informed practice is now taught to professionals in many different fields, and these efforts must grow. It is also crucial that schoolchildren learn these key concepts, rather than delaying acquisition of these skills until adulthood. Young people who have been explicitly taught critical thinking make better judgements than those who have not [6]. Educating people about such concepts at a young age sets an important foundation for future learning.

An important part of the work of encouraging critical thinking is learning and sharing strategies that promote healthy scepticism while avoiding unintended adverse consequences. These include inducing nihilism (extreme scepticism), lending cover to disingenuous claims that uncertainty is a defensible argument against action (on climate change, for example) and encouraging false beliefs — such as that all research is untrustworthy because of competing interests among those who promote particular interventions.

Competing interests take various forms in different fields, but the challenges and remedies are similar: recognition of potential conflicts, transparency and independent evaluations. Achieving these depends on improved public understanding of the need for independent evaluation, and public demand for investment in it, as well as unbiased communication of findings.

Further development and specialization of the Key Concepts for Informed Choices is needed, and we welcome suggestions. For example, more consideration needs to be given to how these concepts can be applied to actions to address system-wide changes, taking into account complex, dynamic interactions and feedback loops, such as in climate-change mitigation or adaptation strategies.

We have therefore created a website (www.thatsaclaim.org) on which our key concepts can be adapted to different fields and target users, translated into other languages and linked to learning resources.

Nature 572, 303-306 (2019)

doi: 10.1038/d41586-019-02407-9

References

1. Bouygues, H. L. The State of Critical Thinking: A New Look at Reasoning at Home, School, and Work (Reboot Foundation, 2018).

2. Oxman, A. D., Chalmers, I., Austvoll-Dahlgren, A. & Informed Health Choices group. F1000Research 7, 1784 (2018).

3. Academy of Medical Sciences. Enhancing the Use of Scientific Evidence to Judge the Potential Benefits and Harms of Medicines (Academy of Medical Sciences, 2017).

4. Haber, N. et al. PLoS ONE 13, e0196346 (2018).

5. Sumner, P. et al. PLoS ONE 11, e0168217 (2016).

6. Abrami, P. C. et al. Rev. Educ. Res. 85, 275–314 (2015).

7. Nsangi, A. et al. Lancet 390, 374–388 (2017).

8. Petrosino, A., Turpin-Petrosino, C. & Finckenauer, J. O. Crime Delinq. 46, 354–379 (2000).

9. Petrosino, A., Turpin-Petrosino, C., Hollis-Peel, M. E. & Lavenberg, J. G. Cochrane Database Syst. Rev. CD002796 (2013).

10. Renmans, D., Holvoet, N., Orach, C. G. & Criel, B. Health Pol. Plan. 31, 1297–1309 (2016).
