kirstyevidence

Musings on research, international development and other stuff

Evidence synthesis – what has it ever done for us?


I have talked before about the danger of using results from single research studies to push for policy change. A more balanced view of the whole body of evidence can be gained by carrying out evidence synthesis. Systematic reviews (and other rigorous synthesis approaches) attempt to gather, appraise and summarise bodies of evidence in a transparent way. By examining a whole body of evidence, and appraising the rigour of each study within it, you can be more certain about what is really going on.

Systematic reviews have long been used in the medical field and have been shown to provide more accurate results than relying on clinical ‘expertise’. In (non-medical) international development topics, rigorous synthesis is much less established: there are relatively few people with expertise in synthesis, and the methodologies for synthesising social science research results, in particular qualitative data, are still being developed.

Nevertheless, synthesised evidence is starting to reveal new and important information about international development topics. Here I summarise three important roles that synthesised evidence can play in improving development interventions.

1. It can tell us that something is true, which we didn’t realise was true

*OK so maybe everyone else knew about the existence of narwhals but it came as a bit of a surprise to me when I discovered them in a David Attenborough documentary last year…

Evidence synthesis can sometimes reveal something to be true which an ‘unweighted’ or non-systematic view of the literature would not have revealed. A good example is this paper about decentralisation of services in developing countries. The authors conclude the following: “Many influential surveys have found that the empirical evidence of decentralization’s effects on service delivery is weak, incomplete and often contradictory. Our own unweighted reading of the literature concurs. But when we organize the evidence first by substantive theme, and then – crucially – by empirical quality and the credibility of its identification strategy, clear patterns emerge. Higher quality evidence indicates that decentralization increases technical efficiency across a variety of public services, from student test scores to infant mortality rates.” In other words, only by taking the evidence together and organising it by quality were the authors able to reveal the real role that decentralisation is playing.


Back in the 80s, we thought we knew it all…

2. It can tell us that something that we all thought was true is actually not true

A classic example of this was a systematic review published last year which showed that, contrary to popular belief in the development community, routine deworming has little impact on school attendance or school performance. Unsurprisingly, this finding was pretty controversial – development ‘experts’ had been waxing lyrical about deworming as a means for educational improvement for years. However, by looking at the evidence together and, crucially, looking at the quality of the evidence, the authors revealed a different story: much as we liked the idea of being able to improve educational outcomes with an inexpensive pill, the evidence showed that it’s just not that simple.

This is me in high school – looking smug and not suspecting that quantum physics would make my head explode.

3. It can tell us that something we thought we fully understood, we actually don’t have a clue about

The Justice and Security Research Consortium recently carried out a synthesis of the evidence on the media and conflict (it is not quite published yet but a summary can be found here). They found a lot of papers which make big claims about the media’s role in either promoting or preventing conflict. The large body of literature making these claims could easily fool a busy policy maker into assuming that the links between the media and conflict were well established. However, when the evidence was assessed for rigour, it was found that many of these papers were based only on opinion or theory and that the number of high quality research papers in this area was low. They concluded that, at present, it is not possible to confirm or refute the claims about the media’s role in conflict based on the available evidence. Now some might say that this is the problem with synthesis – it often just tells us that we don’t know anything much! But in fact for a policy maker it is very important to know whether an intervention is tried and tested in multiple contexts or whether it is an innovative strategy which may have impact but which it would be sensible to monitor closely.

So, synthesised evidence – it might not sound exciting, but it is actually revealing lots of exciting new things. To find out more about what synthesised evidence can tell us check out this database of international development systematic reviews. And watch out for a follow-up post on how synthesised evidence can be communicated effectively.


5 thoughts on “Evidence synthesis – what has it ever done for us?”

  1. Hello Kirstie,

    Interesting, but how does this differ from research using literature review as the methodology?

    • Thanks for that question. A literature review can mean many things to many people. The type of synthesis I am talking about uses a rigorous methodology (to ensure that as much of the relevant literature as possible is included), is transparent about that methodology, and makes an attempt to consider the quality of the studies found. Some literature reviews meet all these criteria – in some cases people refer to these as ‘rigorous literature reviews’. However, some other outputs which are also called literature reviews might be very subjective and/or incomplete. This doesn’t mean that less rigorous literature reviews are bad – I think there is a role for them and indeed I have authored some. But it is important to be aware of what it is you are reading so you know how to interpret the messages found in it.

  2. Pingback: Community Driven Development – would it work in Vauxhall? | kirstyevidence

  3. Pingback: The art and science of presenting synthesised evidence | kirstyevidence

  4. Pingback: Unintended consequences: When research impact is bad for development | kirstyevidence
