kirstyevidence

Musings on research, international development and other stuff



Guest post: Louise Shaxson on advising governments… and ugly babies

I have known Louise Shaxson for many years and have always valued her advice and insight. However, when she wrote to me recently to tell me that she had written a blog about how to talk to new parents about their ugly babies… I was seriously concerned that we might be heading for a fall-out. Turns out I had no need to worry. For a start, the article is actually about giving advice to governments (although I think it is relevant to people giving advice to any organisation). But also, on reflection, I remembered that MY baby is totally ADORABLE. So it’s all good.

Right then, here’s the blog – and I couldn’t resist adding some pics. Hope you like it!

Being nice about an ugly baby… three tips for presenting research to governments

Presenting research results to government can be like talking to a new parent whose baby isn’t, perhaps, the best looking on the planet (read on to find out why).

Even if a government department has commissioned your research, it can be hard to write a final report that is well received and acted on.  I’ve heard numerous researchers say that their report was politely received and then put on a shelf. Or, that it was badly received because it exposed some home truths.

A long time ago, I submitted the first draft of a report that the client didn’t like. He told me it was too confrontational. But he recognised the importance of the message and spent time explaining how to change its presentation to make the message more helpful.

I was grateful for this guidance and redrafted the report.  Consequently, it was not just well received; it helped instigate a series of changes over the next two years and was widely referenced in policy documents.

It’s not easy—I still don’t always get it right—but here are my three tips for crafting your research report, so that it is more likely to be read and used:

  1. Be gentle – government departments are sensitive to criticism.

All parents are proud of their baby, even if he doesn’t look like HRH Prince George, and no parent wants to be told in public that their baby is ugly. You can still appreciate chubby cheeks, a button nose or a wicked grin.

The media is constantly on the lookout for policy ‘failures’ – both real and perceived.  Even if there’s no intention to publish, things can leak.  If the media picks up your research and the coverage is unflattering, your client will have to defend your findings to senior managers, maybe even to the Minister, and spend a considerable amount of effort devising a communication strategy in response. 

Begin by recognising what they have achieved, so that you can put what they haven’t yet achieved into context.

  2. Observations might work better than recommendations.

Don’t comment on how badly the baby’s dressed without recognising how difficult it was for an exhausted mother just to get herself and the baby out of the house.

No matter how much subject knowledge you have, you don’t fully understand the department’s internal workings, processes and pressures.  Your client will probably be well aware of major blunders that have been made and won’t thank you for pointing them out yet again.
Framing recommendations as observations and constructive critiques will give your client something to work with.

  3. Explain why, not just what should be done differently.

If you are telling a parent that the way their baby is dressed could be improved, they may have to sell the idea to other family members – even if they themselves agree with you. Make their life easier by explaining why the suggested new approach will work better.

Your client will have to ‘sell’ your conclusions to his/her colleagues.  No matter how valid your criticisms, it’s difficult for them to tell people they’re doing it wrong.

Try not to say that something should be done differently without explaining why: the explanation allows your clients to work out for themselves how to incorporate your findings.

Taking a hypothetical situation in the agriculture sector, here are some examples of how to put these tips into practice:

More likely to cause problems vs. more likely to be well received:

Recommendation 1: If the agricultural division requires relevant evidence, it needs to clearly define what ‘relevant’ means in the agricultural context before collecting the evidence.

Implication: you haven’t really got a clue what sort of evidence you want.

Observation 1: Improving our understanding of what constitutes ‘relevant evidence’ means clarifying and communicating the strategic goals of the agricultural division and showing how the evidence will help achieve them.

Implication: there are some weaknesses in specific areas, but here are some things you can do about it. Using ‘our understanding’ rather than ‘the division’ is less confrontational.

Recommendation 2: Relationships with the livestock division have been poor. More should be done to ensure that the objectives of the two divisions are aligned so the collection of evidence can be more efficient.

Implication: you haven’t sorted out the fundamentals. ‘Should’ is used in quite a threatening way here.

Observation 2: Better alignment between the objectives of the agricultural and livestock divisions will help identify where the costs of collecting evidence could be shared and the size of the resulting savings. The current exercise to refresh the agricultural strategy provides an opportunity to begin this process.

Implication: we understand that your real problem is keeping costs down. Here is a concrete opportunity to address the issue (the strategy) and a way of doing it (aligning objectives). Everyone knows the relationship is poor; you don’t need to rub it in.

Recommendation 3: The division has a poor understanding of what is contained in the agricultural evidence base.

Recommendation 4: More work needs to be done to set the strategic direction of the agricultural evidence base.

Implication: wow, you really don’t have a clue about what evidence you’ve got or why you need it.

Observation 3: An up-to-date understanding of what is contained in the agricultural evidence base will strengthen the type of strategic analysis outlined in this report.

Implication: having current records of what is in the evidence base would have improved the analysis in this report (i.e. not just saying the understanding is poor, but showing why that matters). Recommendation 4 is captured in the rewritten Observation 1.

This guest post is written by Louise Shaxson, a Research Fellow from the Research and Policy in Development (RAPID) programme at ODI.




Unintended consequences: When research impact is bad for development

Development research donors are obsessed with achieving research impact and researchers themselves are feeling increasingly pressurised to prioritise communication and influence over academic quality.

To understand how we have arrived at this situation, let’s consider a little story…

Let’s imagine around 20 years ago an advisor in an (entirely hypothetical) international development agency. He is feeling rather depressed – and the reason for this is that despite the massive amount of money that they are putting into international development efforts, it still feels like a Sisyphean task. He is well aware that poverty and suffering are rife in the world and he wonders what on earth to do. Luckily this advisor is sensible and realises that what is needed is some research to understand better the contexts in which they are working and to find out what works.

Fast-forward 10 or so years and the advisor is not much happier. The problem is that lots of money has been invested in research but it seems to just remain on the shelf and isn’t making a significant impact on development. And observing this, the advisor decides that we need to get better at promoting and pushing out the research findings. Thus (more or less!) was born a veritable industry of research communication and impact. Knowledge-sharing portals were established, researchers were encouraged to get out there and meet with decision makers to ensure their findings were taken into consideration, a thousand toolkits on research communications were developed and a flurry of research activity researching ‘research communication’ was initiated.

But what might be the unintended consequences of this shift in priorities? I would like to outline three case studies which demonstrate why the push for research impact is not always good for development.

First, let’s look at a few research papers seeking to answer an important question in development: does decentralisation improve provision of public services? If you were to look at this paper, or this one, or even this one, you might draw the conclusion that decentralisation is a bad thing. And if the authors of those papers had been incentivised to achieve impact, they might have gone out to policy makers and lobbied them not to consider decentralisation. However, a rigorous review of the literature which considered the body of evidence found that, on average, high quality research studies on decentralisation demonstrate that it is good for service provision. A similar situation can be found for interventions such as microfinance or Community Driven Development – lots of relatively poor quality studies saying they are good, but high quality evidence synthesis demonstrating that overall they don’t fulfil their promise.

My second example comes from a programme I was involved in a few years ago which aimed to bring researchers and policy makers together. Such schemes are very popular with donors since they appear to be a tangible way to facilitate research communication to policy makers. An evaluation of this scheme was carried out and one of the ‘impacts’ it reported on was that one policy maker had pledged to increase funding in the research institute of one of the researchers involved in the scheme. Now this may have been a good impact for the researcher in question – but I would need to be convinced that investment in that particular research institution happened to be the best way for that policy maker to contribute to development.

My final example is on a larger scale. Researchers played a big role in advocating for increased access to anti-HIV drugs, particularly in Africa. The outcome of this is that millions more people now have access to those drugs, and on the surface of it that seems to be a wholly wonderful thing. But there is an opportunity cost in investment in any health intervention – and some have argued that more benefit could be achieved for the public if funds in some countries were rebalanced towards other health problems. They argue that people are dying from cheaply preventable diseases because so much funding has been diverted to HIV. It is for this reason we have NICE in the UK to evaluate the cost-effectiveness of new treatments.

In each of these cases, I feel it would be preferable for decision makers to consider the full body of evidence rather than being influenced by one research paper, researcher or research movement. Of course I recognise that this is a highly complicated situation. I have chosen three cases to make a point, but there will be many more cases where researchers have influenced policy on the basis of single research studies and achieved completely positive impacts. I can also understand that a real worry for people who have just spent years trying to encourage researchers to communicate better is that the issues I outline here could cause people to give up on all their efforts and go back to their cloistered academic existence. And in any case, even if pushing for impact were always a bad thing, publicly funded donors would still need some way to demonstrate to taxpayers that their investments in research were having positive effects.

So in the end, my advice is something of a compromise. Most importantly, I think researchers should make sure they are answering important questions, using the methods most suitable to the question. I would also encourage them to communicate their findings in the context of the body of research. Meanwhile, I would urge donors to continue to support research synthesis – to complement their investments in primary research. And to support policy making processes which include consideration of bodies of research.



Can an outsider help solve your problems?

My sister, who knows about these things, tells me that most great innovations happen when someone from one sector/area of expertise moves to a new sector/area of expertise and introduces a new way of dealing with a problem.

Face-palm moment

This kind of surprises me – my experience is that when new people arrive in my sector, they quite often make many of the same mistakes that those of us who have been around for a while long ago tried and discarded. But my sister’s revelation made me wonder whether this slightly negative attitude towards newbies is doing me harm. Is my snootiness depriving me of lots of valuable opportunities to learn?

The answer is probably yes, but I think ‘outsider’ input into problem solving does need to be well managed. It is possible that someone with a new perspective will identify a fabulous and innovative new way to solve a problem – but there is also a high risk that they will jump to the same naive assumptions that you used to have before you became so jaded I mean… experienced.

So here are my top tips for both sides of the equation – and, as usual, my advice is gathered from my own experience of messing this type of thing up!

If you are the highly experienced expert who is getting some ‘outsider’ perspective….

1. Stop being so bloomin’ grumpy! Yes of course you know lots about this and of course the outsider will appear ignorant – but if you can attempt to engage with them enthusiastically – even gratefully – and provide evidence for why certain ideas might not work (rather than rolling your eyes!) you might well get a useful new perspective.

2. Build your credibility as an expert by summarising important bodies of knowledge that you have learnt from – including your own experiences, books, experts, research evidence etc. This will be more helpful and more persuasive than just expecting people to realise that you know best (even if you do!).

3. Don’t be afraid to pinpoint parts of the problem which you already feel well-placed to solve – and other parts where you would welcome some input.

If you are the bright-eyed bushy tailed outsider who has been brought in to advise…

1. Make sure it is clear that you want to listen – this usually reduces people’s resistance. And try to spend as much time understanding what the problem is that people are trying to solve before jumping in with solutions. I find the ‘Action Learning’ approach really useful for forcing you to stop trying to solve a problem before you actually really understand it.

2. Be respectful of people’s knowledge and experience and take the time to listen to how they think the problem should be solved (even if they do seem grumpy!). You may eventually decide to provide constructive challenge to their proposed solutions, but this will never be effective unless you really understand why they are proposing them.

3. Repeatedly invite the experts to challenge any new ideas you have – and develop a thick skin!


And, just in case none of this works, you may also want to check out this post on dealing with disagreements…!



Why your knowledge-sharing portal will probably not save the world

One of the most common interventions that people attempt in order to support evidence-informed policy making is setting up an online portal/one-stop shop/knowledge-sharing community. In some cases, these can be a wonderful resource. For example, Eldis is an excellent source of information about development knowledge, while Scidev is outstanding for keeping up to date with science related to development.

However, for anyone who is thinking of setting up some sort of knowledge sharing portal, it is worth bearing in mind that these successes are definitely the exception not the rule. In fact the internet is littered with abandoned knowledge-sharing portals. Countless examples have been set up, fueled by excellent intentions and much enthusiasm, only to die a death a few years later.

If you are thinking about setting up some kind of portal, I would suggest you ask the following questions first:

1. Is lack of a portal the problem?

I have been asked to advise on numerous projects which aim to set up a repository of information for policy makers to access and use. In almost every case, they have failed for the simple reason that lack of information was not the major barrier preventing policy makers from using information. For example, a few years ago a European NGO set up a knowledge-sharing site in collaboration with an African policy-making institution. The site allowed policy makers to request information, and the NGO then commissioned high quality evidence products from qualified academics. After persevering with this for a number of years, the NGO had to abandon the project because it was not getting any requests for information, and even when it proactively produced evidence products, they were not used. If they had spent some time understanding the context, they would not have fallen into that trap. In this case it was true that the institution was not making use of evidence, but the reason for this was not that evidence was lacking. In fact there was a large amount of evidence available; what was missing was the demand for it – i.e. the incentive to make use of evidence and the skills to understand and incorporate research evidence into policy decisions.

2. Is someone else doing it already?

Don’t succumb to ‘portal proliferation syndrome’. If there are already similar resources, your efforts may be much better invested in supporting them rather than setting up a potential rival. Of course, you may have a slightly different focus for your site than the ones that already exist, but perhaps a bit of compromise might lead to better overall results. Remember that these sites absolutely rely on critical mass – reportedly only about 10% of members of any network will actually contribute to it. By dividing the potential audience between multiple relatively similar sites, you run the risk that none will thrive.

3. Can it be hosted on Facebook?

If you are setting up a site for people to interact with each other and they have to go to a site and enter a username and password before they can do that, it will almost certainly fail! The chances are that most of the people you want to involve in your site are busy and will just not find the time to do this. For this reason, it will have a much higher chance of success if it runs via a platform that people use already. Actually, the simplest way to do this is to use an email listserv – for example, the evidence-based policy in development network mainly functions via its email list because this allows people to interact using a tool that they have to open every day anyway. Similarly, Facebook is very commonly used. You may feel that Facebook does not offer all the bespoke features you are looking for but, given its widespread popularity, your alternative platform would have to offer a LOT of benefits before it would really be a better option.

4. Whose one-stop shop is it?

A common argument is that setting up a one-stop shop will save people time since they will only need to go to one place to find everything they need. The problem is that this argument assumes that there is a large population of people who have similar ‘shopping lists’. So, for example, if you set up a one-stop shop about climate change adaptation in developing countries, your assumption is that there are many people who are interested in climate change adaptation in developing countries (and nothing else). In fact, many of the people who are interested in the products on your site will be interested in a slightly different, but overlapping, theme; they might be interested in climate change in general, or climate change adaptation in all countries, or perhaps adaptations to a range of environmental shocks. Similarly, setting up a site which focusses on ‘development research’ or ‘development policy’ risks excluding a lot of information of interest to people in developing countries (as Enrique of onthinktanks has pointed out on numerous occasions, if you live in a developing country, ‘development policy’ is just ‘policy’). Because we all have slightly different areas of interest, there is a strong risk that your ‘one-stop shop’ will become a ‘one-of-many-stops shop’ – which you have to admit is a bit less attractive!

All this reminds me of a question I was asked at the end of a conference talk I gave on evidence-informed policy making. A well-meaning individual asked why we didn’t just create a portal of all the knowledge of use to developing countries so that it could be easily found. I really struggled not to respond ‘yes, that’s a great idea, and we could give it a name, something like… the in-ter-net?’. A bit facetious perhaps, but it really is worth considering whether making your information available on an open-access repository and ensuring that it is search engine optimised might be a better option than the costly, time-consuming task of creating a new platform.

Anyway, if you are not yet convinced, I strongly recommend you read this to find out some more questions to consider before setting up a portal.



Why communicating internally might be the key to getting your message out

Communication (Photo credit: P Shanks)

I have worked with a variety of research organisations who are struggling with their research communication strategy. One surprising thing I have learnt is that, in many cases, these programmes actually have what seems to be a very well thought through strategy and even a person who is responsible for communications. However, the common complaint I hear from such people is that their communication is failing because the people they are communicating on behalf of… don’t communicate with them! This makes me suspect that poor external communication is often due to poor internal communication, and therefore that any research communication/uptake strategy needs to focus on both aspects. The reason for this is obvious when you think about it – a communication officer can only be as good as the messages s/he has. If they don’t know what is going on internally, they will not be able to transmit these messages to others.

So, how do you make sure that your internal communication is working? The first step is to look for blockages. What messages does your communication officer (or equivalent) need to know, and what is preventing him/her from getting them? Remember that the communication officer does not just need to know what you are doing; they will also need to know how you, and others in your sector, are using language, and what the dominant narratives and discussion points in the field are. Once you know the blockages (whether due to infrastructure, human capacity or organisational culture), you can put in place a strategy to combat them.

Strategies don’t need to be complex. A simple ’round table’ catch up meeting can be a wonderful way for small teams to hear what everyone else is up to. For larger and dispersed teams, an informal sharing document or wiki can work equally well – provided that there are (enforced) deadlines by which people have to update it and then ideally a meeting (in person or virtual) at which people can ask questions on the information provided.

Another area which can usually be improved is the use of email. Some training and guidance on writing effective emails can do wonders. For example, in a previous job the whole organisation received guidance on using better subject lines, and we all started tagging emails with keywords such as ‘Action required’, ‘For information only’ and ‘Urgent’. It really made dealing with your email queue much easier. Promoting a culture where inboxes are cleared regularly (yes, you can!) can also make internal communication much more efficient.

These are just a couple of ideas but I would love to hear what strategies others have tried to make internal communication work better.