Things We Don’t Want to Know?
As researchers and policymakers, we all say we want evidence-based policy; so why is there so little good evaluation evidence, and why do many decision-makers seem reluctant to use it?
I’m back from a trip to Paris, where I gave a keynote at an OECD-European Commission workshop on Place-Based Policies for the Future, alongside great contributions by Alessandra Faggian, Ralf Martin, Ana Garcia-Hernandez and others.
The workshop focused on policy monitoring and evaluation. This is crucial but pretty dry and technical territory; so my talk focused less on toolkits and data, and more on the political economy of evaluation. Why is it so challenging to make and use evaluation evidence? How can we do better?
My slides are here.
[update, 9/23] And here’s the full paper. Below is the intro. Happy reading!
Good monitoring and policy evaluation is extremely important to effective place-based policymaking, especially now, when there is a great appetite for forward-looking place-based interventions. However, robust evaluation evidence is both hard to generate, and surprisingly hard to incorporate into mainstream policymaking infrastructures. This paper will consider why this might be and will suggest some ways forward — for researchers and for policymakers.
This paper first discusses why we should care about monitoring and evaluation of place-based policies. I start by defining place-based policies. I then set out why monitoring and evaluation of place-based policy is especially salient now. I highlight increased awareness of persistent area disparities, shifting views about the effectiveness of place-based interventions, and the current hot topics of big-state industrial policy and industrial missions in the EU, US and UK, all of which involve significant place-based components.
I go on to discuss the main kinds of evaluation questions that evaluators of place-based policies should be asking, show how these questions are connected, and discuss the overall state of the evidence, and the key problem of missing or incomplete evaluation evidence. This problem is particularly germane for ex-post evaluations (as opposed to ex-ante studies), and for impact evaluations (as opposed to process evaluations).
Second, the paper briefly reviews the core components of an evaluation strategy for place-based programmes. I focus on impact evaluations (which start with the question ‘what is the overall effect of policy X on outcome Y?’). I use EU Cohesion Policies as an example of a classical place-based policy, and broadband support programmes as an example of place-sensitive policy. I cover the core design issues, focusing on establishing a counterfactual, then move to the typical toolkit of methods and data types evaluators bring to these cases. I also look at what we know about the overall effectiveness of these interventions.
Third, the paper sets out the challenges to generating and using evidence, both to implement monitoring and evaluation of place-based policies, and to mainstream the findings into political space and into decision-making institutions. In essence, monitoring and evaluation involves both practical challenges — things that are hard to know — but also cultural / institutional challenges, things we don’t want to know.
Practical challenges include establishing causal effects, which is especially hard in area-based settings; grappling with variable data access and quality, long impact timeframes, and varying government capacity across scales and countries; and assessing non-economic outcomes, which evaluation methods find hard to proxy at scale. Other challenges are less tangible but equally, if not more, important. We — academics, policymakers — spend a lot of time thinking and talking about place-based policy. And yet we often lack evidence on its effectiveness, and make that evidence hard to gather; incorporating lessons often feels harder than it should be. I will conclude by setting out some high-level ideas about how we might tackle these challenges, drawing largely on UK experience from the ‘What Works’ movement over the last 10 years.