kirstyevidence

Musings on research, international development and other stuff

Impact via infiltration


Two blogs ago I linked to an article which I attributed to “the ever-sensible Michael Clemens”. Shortly afterwards @m_clem tweeted:

.@kirstyevidence compares @JustinSandefur to @jtimberlake 🔥—but me, I’m “sensible”.Some guys got it, some don’t. https://t.co/KphJ0ITmr8

I mention this in part because it made me chortle but mainly because it prompted me to look back at that Justin Sandefur interview. And on re-reading it, I was really struck by one of Sandefur’s models for how research relates to policy:

My third model of what research has to do with development policymaking is borderline cynical: let’s call it a vetting model.  The result of your narrow little research project rarely provides the answer to any actual policy question.  But research builds expertise, and the peer review publication process establishes the credibility of independent scientific experts in a given field.  And that — rather than specific research results — is often what policymakers are looking for, in development and elsewhere.  Someone who knows what they’re talking about, and is well versed in the literature, and whose credentials are beyond dispute, who can come in and provide expert advice.

Since that interview was published, I wrote a literature review for DFID which looked at the impact of research on development. And, having spent months of my life scouring the literature, I am more convinced than ever that the Sandefur/Timberlake effect (as it will henceforth be known) is one of the main ways in which investment in research leads to change.

This pathway can be seen clearly in the careers of successful researchers who become policy makers or advisors. For example, within DFID, the chief scientist and chief economist are respected researchers. The significant impact they have had on policy decisions within DFID must surely rival the impact on society they have had via their academic outputs.

And the case may be even stronger if you also examine ‘failed scientists’ – like, for example, me! The UK Medical Research Council invested considerable amounts of funding to support my PhD studies and post-doc career. And I would summarise the societal impact of my research days as… pretty much zilch. I mean, my PhD research was never even published and my post-doc research was on a topic which was niche even within the field of protozoan parasite immunology.


Undercover nerds – surely the societal impact of all those current and former academics goes beyond their narrow research findings?

In other words, I wouldn’t have to be very influential within development to achieve more impact than I did in my academic career. My successful campaign while working at the Wellcome Trust to get the canteen to stock diet Irn Bru probably surpasses my scientific contributions to society! But more seriously, I do think that the knowledge of research approaches, the discipline of empirical thinking, the familiarity with academic culture – and, on occasion, the credibility of being a ‘Dr’ – have really helped me in my career. Therefore, any positive – or indeed negative – impact that I have had can partly be attributed to my scientific training.

Of course, just looking at isolated individual researchers won’t tell us whether, overall, investment in research leads to positive societal impact – and if so, whether the “S/T effect” (I’m pretty sure this is going to catch on so I have shortened it for ease) is the major route through which that impact is achieved. Someone needs to do some research on this if we are going to figure out whether it really is a – or even the – major way in which research impacts policy and practice.

But it’s interesting to note that other people have a similar hypothesis: Bastow, Tinkler and Dunleavy carried out a major analysis of the impact of social science in the UK* and their method for calculating the benefit to society of social science investments was to estimate the amount that society pays to employ individuals with post-grad social science degrees.** In other words they assumed that the major worth of all that investment in social science was not in its findings but in the generation of experts. I think the fact that the authors are experimenting with new methodologies to explain the value of research that go beyond the outdated linear model is fabulous.

But wait, you may be wondering, does any of this matter? Well yes, I think it does because a lot of time and energy are being put into the quest to measure the societal impact of research. And in many cases the impact is narrowly defined as the direct effect of research findings and/or research derived technologies. The recent REF impact case studies did capture more diverse impacts including some that could be classified within the S/T™ effect. But I still get the impression that such indirect effects are seen as secondary and unimportant. The holy grail for research impact still seems to be linear, direct, instrumental impact on policy/practice/the economy – despite the fact that:

  1. This rarely happens
  2. Even when we think this is happening, there is a good chance that evidence is in fact just being used symbolically
  3. Incentivising academics to achieve direct impact with their research results can have unintended and dangerous results

Focussing attention on the indirect impact of trained researchers, not as an unimportant by-product but as a major route by which research can impact society, is surely an important way to get a more accurate understanding of the benefits (or lack thereof) of research funding.***

So, in summary, I think we can conclude that Justin Sandefur is quite a sensible bloke.

And, by the way, have any of you noticed how much Michael Clemens resembles George Clooney?


* I have linked to their open access paper on the study but I also recommend their very readable book which covers it in more detail along with loads of other interesting research – and some fab infographics.

** Just to be pedantic, I wonder if their methodology needs to be tweaked slightly – they have measured value as the cost of employing social science post-grad degree holders, but surely those graduates have some residual worth beyond their research training? I would think that the real benefit would need to be measured as the excess that society was willing to pay for a social science post-grad degree holder compared to someone without one?

*** Incidentally, this is also my major reason for supporting research capacity building in the south – I think it is unrealistic to expect that building research capacity is going to yield returns via creation of new knowledge/technology – at least in the short term. But I do think that society benefits from having highly trained scientific thinkers who are able to adapt and use research knowledge and have influence on policy either by serving as policy makers themselves or by exerting evidence-informed influence.

2 thoughts on “Impact via infiltration”

  1. Thanks Kirsty, I really like this. Good point on the REF – this came up at a recent university seminar on REF impact case studies. One of the presenters explained that she’d been invited to contribute to lots of policy discussions and government guidelines, but she couldn’t count any of this for a REF impact case study because the invitations were based on her expertise built up over 20 years rather than being linked to a specific research project. I imagine academics may find ways to get round this by planning carefully over the current REF period, but recognising the value of this kind of impact might be a better option.

  2. Nice post Kirsty. There’s certainly a lot of truth in this. Interested to hear your thoughts on what the implications would be for research uptake strategies. Should we encourage researchers to stop messing with policy briefs and concentrate on growing Rasputin-like beards and speaking in code to emphasise their expert credentials? Or is being a good communicator part of being a go-to expert these days?

    Interested to follow up your link for point 3 about unintended results – did you mean to point to the same LSE link as in point 3? Possibly a glitch with the URL.

    Best wishes
    Geoff
