Show me the evidence!

Sometimes, evidence-based decision-making isn’t based on much evidence. There are lots of reasons why, but when things go wrong, we shouldn’t be afraid to say so and make a change.

What’s best for bees . . . a case of trial and error? Photograph: Cyril Byrne

One of the biggest motivations for me as a scientific researcher is to provide the evidence for other people to make decisions. It’s even on my webpage and up front on my CV. But what does this really mean? And how much “evidence” do we need before we can make a decision, and from whom should this evidence come? And what happens when the evidence seems conflicting? These questions have been debated repeatedly.

Recently, I was asked to comment on the evidence basis for the bee conservation measures in Ireland’s Green, Low-carbon, Agri-environmental Scheme, known as Glas. Why were these conservation measures chosen, and based on what evidence?

Evidence-based decision-making is supposed to be a process for making decisions based on the best research data available, informed by “experiential evidence from the field” and “relevant contextual evidence”. In other words, the results of scientific studies, coupled with expert opinion and practical advice. While that all sounds reasonable, the devil is in the detail. Ecosystems are notoriously variable: different things can happen in different places at different times, and it’s hard to make generalised predictions. Also, who is an expert? If the experts disagree, who do you believe? And if the experts all agree on one thing, but it’s utterly impractical so you do something else, can it still be classed as “evidence-based”?

I’ve been telling anyone who will listen that the guidelines that we produce for pollinator conservation in Ireland (via the All-Ireland Pollinator Plan, www.pollinators.ie) are all evidence-based. This means that we scoured the peer-reviewed scientific literature for evidence, borrowed it from overseas when we were short of it at home, and weighed up the studies that agreed with each other and those that didn’t. For example, if a bunch of published scientific studies show that there are more pollinators on farmland that has more flowers, then this is evidence that having more flowers on farmland must be a good thing for pollinator populations. And so we recommend it as a measure.


But what do we do when there’s a gap in the scientific literature? Take one of the Glas conservation measures: bee boxes, or bee hotels. There are only a handful of Irish bee species that will actually nest in them, and we have no evidence that they will boost pollinator populations. Yet they are recommended by Glas for bee conservation. Why? Because expert opinion suggests that they could be used by that handful of species that nest in cavities (and better to provide a home to a few species than none, right?), because they are used widely overseas (albeit in places with different species of bees), and because they are easy to implement. In this case, the lack of scientific evidence doesn’t stop us from deciding to give the bee boxes a try. By implementing them across Irish farmland, we can check whether they work and change the decision later on if they don’t.

For the other Glas bee-conservation measure, sand piles, there is no scientific evidence that bees will actually nest in them, they have virtually no support from bee experts, and they are not used anywhere else in the world for bee conservation. But they are easy to implement. So here, the weight of evidence is not in their favour; only the ease of implementation is. I’m keen to find out whether they work over the coming years, and – if my gut scepticism about this measure is correct – how long it will take to change the policy.

And this is the key point: if we realise that particular measures don’t work and still don’t have the evidence for what does, what do we do? What about all the farmers who have installed sand piles in good faith? Do they get disheartened and give up, because it seems like no-one knows what they are doing? No, we need to make sure our decisions are flexible and responsive to new evidence, that we are honest when we don’t have all the answers, and that we acknowledge the need to continually update the evidence base. Happily, this keeps me in a job. Maybe I should add that to my CV?

Jane Stout is professor of botany at TCD School of Natural Sciences