Guest post: Louise Shaxson on advising governments… and ugly babies

I have known Louise Shaxson for many years and have always valued her advice and insight. However, when she wrote to me recently to tell me that she had written a blog about how to talk to new parents about their ugly babies… I was seriously concerned that we might be heading for a fall-out. Turns out I had no need to worry. For a start, the article is actually about giving advice to governments (although I think it is relevant to people giving advice to any organisation). But also, on reflection, I remembered that MY baby is totally ADORABLE. So it’s all good.

Right then, here’s the blog – and I couldn’t resist adding some pics. Hope you like it!

Being nice about an ugly baby… three tips for presenting research to governments

Presenting research results to government can be like talking to a new parent whose baby isn’t, perhaps, the best looking on the planet (read on to find out why).

Even if a government department has commissioned your research, it can be hard to write a final report that is well received and acted on. I’ve heard numerous researchers say that their report was politely received and then put on a shelf, or that it was badly received because it exposed some home truths.

A long time ago, I submitted the first draft of a report that the client didn’t like. He told me it was too confrontational. But he recognised the importance of the message and spent time explaining how to change its presentation to make the message more helpful.

I was grateful for this guidance and redrafted the report.  Consequently, it was not just well received; it helped instigate a series of changes over the next two years and was widely referenced in policy documents.

It’s not easy—I still don’t always get it right—but here are my three tips for crafting your research report so that it is more likely to be read and used:

1. Be gentle – government departments are sensitive to criticism.

All parents are proud of their baby, even if he doesn’t look like HRH Prince George, and no parent wants to be told in public that their baby is ugly. You can still appreciate chubby cheeks, a button nose or a wicked grin.

The media is constantly on the lookout for policy ‘failures’ – both real and perceived.  Even if there’s no intention to publish, things can leak.  If the media picks up your research and the coverage is unflattering, your client will have to defend your findings to senior managers, maybe even to the Minister, and spend a considerable amount of effort devising a communication strategy in response. 

Begin by recognising what they have achieved, so that you can put what they haven’t yet achieved into context.

2. Observations might work better than recommendations.
[Image: tired mum]

Don’t comment on how badly the baby’s dressed without recognising how difficult it was for an exhausted mother just to get herself and the baby out of the house.

No matter how much subject knowledge you have, you don’t fully understand the department’s internal workings, processes and pressures.  Your client will probably be well aware of major blunders that have been made and won’t thank you for pointing them out yet again.

Framing recommendations as observations and constructive critiques will give your client something to work with.

3. Explain why, not just what should be done differently.
[Image: messy baby]

If you are telling a parent that the way their baby is dressed could be improved, they may have to sell the idea to other family members – even if they themselves agree with you. Make their life easier by explaining why the suggested new approach will work better.

Your client will have to ‘sell’ your conclusions to his/her colleagues.  No matter how valid your criticisms, it’s difficult for them to tell people they’re doing it wrong.

Try not to say that something should be done differently without explaining why. Explaining the reasoning allows your clients to work out for themselves how to incorporate your findings.

Taking a hypothetical situation in the agriculture sector, here are some examples of how to put these tips into practice:

More likely to cause problems:

Recommendation 1: If the agricultural division requires relevant evidence, it needs to clearly define what ‘relevant’ means in the agricultural context before collecting the evidence.

Implication: you haven’t really got a clue what sort of evidence you want.

More likely to be well received:

Observation 1: Improving our understanding of what constitutes ‘relevant evidence’ means clarifying and communicating the strategic goals of the agricultural division and showing how the evidence will help achieve them.

Implication: there are some weaknesses in specific areas, but here are some things you can do about it. Using ‘our understanding’ rather than ‘the division’ is less confrontational.

More likely to cause problems:

Recommendation 2: Relationships with the livestock division have been poor. More should be done to ensure that the objectives of the two divisions are aligned so the collection of evidence can be more efficient.

Implication: you haven’t sorted out the fundamentals. ‘Should’ is used in quite a threatening way here.

More likely to be well received:

Observation 2: Better alignment between the objectives of the agricultural and livestock divisions will help identify where the costs of collecting evidence could be shared and the size of the resulting savings. The current exercise to refresh the agricultural strategy provides an opportunity to begin this process.

Implication: we understand that your real problem is to keep costs down. Here is a concrete opportunity to address the issue (the strategy) and a way of doing it (aligning objectives). Everyone knows the relationship is poor; you don’t need to rub it in.

More likely to cause problems:

Recommendation 3: The division has a poor understanding of what is contained in the agricultural evidence base.

Recommendation 4: More work needs to be done to set the strategic direction of the agricultural evidence base.

Implication: wow, you really don’t have a clue about what evidence you’ve got or why you need it.

More likely to be well received:

Observation 3: An up-to-date understanding of what is contained in the agricultural evidence base will strengthen the type of strategic analysis outlined in this report.

Implication: having current records of what is in the evidence base would have improved the analysis we have done in this report (i.e. not just that it’s poor, but why it’s poor). Recommendation 4 is captured in the rewritten Observation 1.

This guest post is written by Louise Shaxson, a Research Fellow from the Research and Policy in Development (RAPID) programme at ODI.


Implementation science: what is it and why should we care?

[Pie chart: informal survey of colleagues’ familiarity with the term ‘implementation science’]

The 30 participants were mostly members of DFID’s Evidence into Action team plus a few people who follow me on twitter – admittedly not a very rigorous sampling strategy but a useful quick and dirty view!

Last week I attended a day-long symposium on ‘implementation science’ organised by FHI 360. I had been asked by the organisers to give a presentation, and it was only after agreeing that it occurred to me that I really had no idea what implementation science was. It turns out I was not alone – I did a quick survey of colleagues engaged in the evidence-informed policy world and discovered that the majority of them were also unfamiliar with the term (see pie chart). And even when I arrived at the conference, which was full of experts in the field, the first couple of hours were devoted to discussions about what implementation science does and does not include.

To summarise some very in-depth discussions, it seems that there are basically two ways to understand the term.

The definitions that seem most sensible to me describe implementation science as the study of how evidence-informed interventions are put into practice (or not) in real world settings. These definitions indicate that implementation science can only be done after efficacy and effectiveness studies have demonstrated that the intervention can have a positive impact. As @bjweiner (one of the conference speakers) said, implementation science aims to discover ‘evidence-informed implementation strategies for evidence-informed interventions’.

A second category of definitions takes a much broader view of implementation science. These definitions bring a wide variety of additional types of research – including impact evaluations, behaviour change research and process evaluations – within the category of implementation science. To be honest, I found this latter category of definitions rather unhelpful; they seemed to be so broad that almost anything could be labelled implementation science. So, I am going to choose to just go with the narrower understanding of the term.

Now I have to mention here that I thoroughly enjoyed the symposium and found implementation scientists to be a really fascinating group to talk with. And so, as a little gift back to them, and in recognition of the difficulties they are having in agreeing on a common definition, I have taken the liberty of creating a little pictorial definition of implementation science for them (below). I am sure they will be delighted with it and trust it will shortly become the new international standard ;-).
[Image: a pictorial definition of implementation science]

So what else do you need to know about implementation science?

Well, it tends to be done in the health sector (although there are examples from other sectors) and it seems to focus on uptake by practitioners (i.e. health care providers) more than uptake by policy makers. In addition it is, almost by definition, quite ‘supply’-driven – i.e. it tends to focus on a particular evidence-informed intervention and then study how that can be implemented or scaled up. I am sure that this is often a very useful thing – however, I suspect that the dangers of supply-driven approaches that I have mentioned before will apply; in particular, there is a risk that the particular evidence-informed intervention chosen for scale-up may not represent the best overall use of funds in a given context. It is also worth noting that promoting and studying the uptake of one intervention may not have long-term impacts on how capable and motivated policy makers and practitioners are to take up and use research in general.

A key take-home message for me was that implementation science is ALL about context. One of my favourite talks was given by @pierrembarker, who described a study of the scale-up of HIV prevention care in South Africa. At first the study was designed as a cluster randomised controlled trial; however, as the study progressed, the researchers realised that, for successful implementation, they would need to vary the approach to scale-up depending on local-level conditions, and thus an RCT, which would require standardised procedures across study sites, would not be practical. Luckily, the researchers (and the funders) were smart enough to recognise that a change of plan was needed, and the researchers came up with a new approach which enabled them to tailor the intervention to differing contexts while still generating evidence on outcomes that was as robust as feasible. Another great talk was given by Theresa Hoke of @FHI360, who described two programmes to scale up interventions that almost completely failed (paper about one of them here). The great thing about the implementation science studies was that they were able to demonstrate clearly that the scale-up had failed and to generate important clues as to why this might be the case.

One final cool thing about implementation science is how multi-disciplinary it is; at the symposium I met clinicians, epidemiologists, qualitative social scientists and – perhaps most intriguingly – organisational psychologists. I was particularly interested in the latter because I think it would be really great if we could get some of these types involved in evaluating and investigating ‘demand-side’ evidence-informed policy work funded by organisations including DFID, (the department formerly known as) AusAID, and AHSPR. These programmes are really all about driving organisational change, and it would be very useful to get an expert’s view on what approaches (if any!) can be taken by outside actors to catalyse and support this.

Anyway, sorry for such a long post but as you can tell I am really excited about my new discovery of implementation science! If you are too, I would strongly recommend checking out the (fully open access) Implementation Science Journal. I found the ‘most viewed’ articles a good place to start. You will also soon be able to check out the presentations from the symposium (including my talk in which I call for more unity between ‘evidence geeks’ like me and implementation scientists) here.



Can an outsider help solve your problems?

My sister, who knows about these things, tells me that most great innovations happen when someone from one sector/area of expertise moves to a new sector/area of expertise and introduces a new way of dealing with a problem.

[Image: face-palm moment]

This kind of surprises me – my experience is that when new people arrive in my sector, they quite often make lots of the same mistakes that those of us who have been around for a while tried and discarded long ago. But my sister’s revelation made me wonder: is this slightly negative attitude towards newbies doing me harm? Is my snootiness depriving me of lots of valuable opportunities to learn?

The answer is probably yes, but I think ‘outsider’ input into problem solving does need to be well managed. It is possible that someone with a new perspective will identify a fabulous and innovative new way to solve a problem – but there is also a high risk that they will jump to the same naive assumptions that you used to hold before you became so jaded… I mean, experienced.

So here are my top tips for both sides of the equation – and, as usual, my advice is gathered from my own experience of messing this type of thing up!

If you are the highly experienced expert who is getting some ‘outsider’ perspective….

1. Stop being so bloomin’ grumpy! Yes of course you know lots about this and of course the outsider will appear ignorant – but if you can attempt to engage with them enthusiastically – even gratefully – and provide evidence for why certain ideas might not work (rather than rolling your eyes!) you might well get a useful new perspective.

2. Build your credibility as an expert by summarising important bodies of knowledge that you have learnt from – including your own experiences, books, experts, research evidence etc. This will be more helpful and more persuasive than just expecting people to realise that you know best (even if you do!).

3. Don’t be afraid to pinpoint parts of the problem which you already feel well-placed to solve – and other parts where you would welcome some input.

If you are the bright-eyed, bushy-tailed outsider who has been brought in to advise…

1. Make sure it is clear that you want to listen – this usually reduces people’s resistance. And try to spend as much time as possible understanding the problem people are trying to solve before jumping in with solutions. I find the ‘Action Learning’ approach really useful for forcing you to stop trying to solve a problem before you actually understand it.

2. Be respectful of people’s knowledge and experience, and take the time to listen to how they think the problem should be solved (even if they do seem grumpy!). You may eventually decide to provide constructive challenge to their proposed solutions, but this will never be effective unless you really understand why they are proposing them.

3. Repeatedly invite the experts to challenge any new ideas you have – and develop a thick skin!


And, just in case none of this works, you may also want to check out this post on dealing with disagreements…!



Adapt!

Over the holidays I read Tim Harford‘s book ‘Adapt‘. In it, he discusses how various tricky problems (from running a successful business to solving world poverty) are best tackled using an evolutionary approach. He describes three steps to successful ‘evolution’: first, you need a variety of possible solutions; second, you need to make sure that if any solution fails (and many will) you can survive it; and finally, you need to identify which of the many solutions works best in the context you are dealing with.

I sometimes find books which attempt to explain all sorts of things using a central model or metaphor a bit annoying and contrived – but Adapt really rang true for me. When I was pondering this, I realised that I liked it so much because I already have a tendency to view life through the lens of evolution. I used to research immunology, and the processes that go on in our bodies as we fight diseases are remarkably good metaphors for understanding the world. By looking at the immune system you can see the results of ‘classical’ evolution (i.e. the selection of successful genes over long time periods). But the immune system also has a special adaptive arm that displays an accelerated form of adaptation – kind of like evolution on speed.

So, my dear readers, I thought that I would go ahead and give you a wee immunology lesson (you’re welcome) because I think it can help us all to understand the world!

In your blood there are lots of cells including some special ones which form the adaptive immune system. These cells are called T-cells and B-cells and they are like adaptation Jedi masters. There are a few different sub-types of both T and B cells but essentially they all work in a similar way. To start with, your body produces a whole load of different cells with sticky ‘receptors’ on them. Each cell has only one kind of receptor but the receptors on different cells are slightly different.

[Image: immune cells, each with a different sticky receptor]

Now, when a germ comes along, because there are so many different immune cells with so many different sticky receptors on them, it will eventually stick to one of them. Even better, the cells have a built-in feedback system so they recognise ‘danger signals’ which are like little red flags that indicate that whatever the cell has stuck to is a ‘baddie’. When a B-cell or T-cell sticks to a dangerous germ, the feedback mechanism kicks in, and very quickly that one lucky cell multiplies into an immune system army – each member of which is specific for the particular germ that the first one encountered.

[Image: the one matching cell multiplying into an army]

And the army of cells sets off to kill the germs in a variety of interesting ways.

[Image: the cell army killing the germs]

Once all the germs have been killed, the immune system more or less goes back to normal but with one key difference – a small sleeper ‘cell’ (no pun intended) is maintained with specificity for that particular germ. This means that if you encounter the same germ again, the body can get rid of it much faster. This explains why vaccines work – the vaccine promotes an initial reaction against a germ (or a part of a germ) so that when you encounter the actual germ you have the ability to get rid of it quickly.
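Since the adaptive loop just described – generate variety, select on a feedback signal, amplify what sticks, keep a memory of it – is essentially an algorithm, here is a tiny toy simulation of it for the programmers among you. This is only an illustrative sketch: the string ‘receptors’, the matching rule and all the numbers are invented for this post, and nothing here models real immunology.

```python
import random
import string

random.seed(42)

def make_receptor(length=4):
    """Each naive cell carries one random 'receptor' (a short string here)."""
    return "".join(random.choice(string.ascii_lowercase[:6]) for _ in range(length))

def binds(receptor, germ, threshold=3):
    """A cell 'sticks' if enough receptor positions match the germ's surface."""
    return sum(r == g for r, g in zip(receptor, germ)) >= threshold

# Variety: the body maintains a large, diverse pool of naive cells.
naive_pool = [make_receptor() for _ in range(5000)]
memory_pool = []  # long-lived 'sleeper' cells kept from past infections

def respond(germ, danger_signal=True):
    """Select the cells that stick, expand them into an army, keep a memory."""
    if not danger_signal:
        return []  # no red flag, no response
    # Memory cells are checked first: a repeat germ is dealt with much faster.
    seeds = [c for c in memory_pool if binds(c, germ)]
    source = "memory" if seeds else "naive pool"
    if not seeds:
        seeds = [c for c in naive_pool if binds(c, germ)]
    # Clonal expansion: each lucky matching cell multiplies into many copies.
    army = seeds * 200
    # Retain a few specific cells so the next encounter starts from memory.
    memory_pool.extend(seeds[:3])
    print(f"germ '{germ}': {len(seeds)} matching cells from {source}, army of {len(army)}")
    return army

respond("abca")  # first encounter: slow selection from the naive pool
respond("abca")  # second encounter: seeded straight from memory, like a vaccine
```

Run it twice with the same ‘germ’, as above, and the second response starts from the memory pool rather than the naive pool – which is exactly the vaccine mechanism described in the previous paragraph.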

I think the adaptation of immune cells is a good metaphor for many of the stories that Tim Harford describes in ‘Adapt’. Whether you are running a business or trying to solve world poverty, it is a good idea to try out a whole variety of solutions. In fact, if you want to be like the immune system, you might want to be designing and trying out new ways of doing things before you even think you have a problem! You also need to make sure that when one of your solutions ‘works’, you have a good feedback system in place so that this is recognised and that there is capacity to rapidly scale it up. And crucially, if the particular ‘problem’ disappears, you want to make sure that you don’t forget about the solution you came up with. You should make sure that the capacity to deal with a similar problem in the future is maintained.

So, what did I learn from ‘Adapt’? Although it focussed on some rather ‘macro’ issues, it actually made me think a lot about organisations and the qualities of a good leader or manager. The book makes it clear that taking an evolutionary approach to problem solving is not an alternative to good leadership. In fact, enabling an organisation (whatever it may be) to adapt and deal with problems in this evolutionary manner could be seen as the hallmark of truly great leaders. Without such leadership, the natural tendency of people to innovate and experiment is crushed by mindless adherence to rules and a lack of delegated authority. This point is illustrated brilliantly in the book by examining different approaches to leading the US military in recent years, but I am sure we can all also think of organisations where we have encountered the same problem on a smaller scale.

Luckily, Harford is not the only person to be thinking along these lines. I had the opportunity to do a diploma in management a couple of years ago and discovered that there is a huge body of work on organisational development examining how organisations can become more adaptive and healthy. Essentially, much of this work promotes a similar approach to that described by Harford: enable local-level innovation, make sure you can recognise success, and then have systems which allow successful solutions to be scaled up efficiently.

In conclusion, I highly recommend reading Adapt. I liked it so much that I have nominated it as book of the month on this business book club that I am a member of – it’s free to join, so feel free to pop in and join the discussion.