Note: this article is not about how politicians find numbers sexy. The word ‘fetish’ in this context designates something that is used or done ritualistically rather than pragmatically.
In the world of UK public policy, everyone loves quantitative skills. Making policy ‘evidence-based’ is considered a matter of significant importance, and the National Health Service is held up as the shining example – primarily because of the prominence there of randomized control trials and value-for-money analysis. In the past decade this has led to the development of ‘What Works’ centres, such as the College of Policing’s ‘What Works Centre for Crime Reduction’ and LSE’s ‘What Works Centre for Local Economic Growth’. Based on my research on these centres (from articles such as this), they tend to subscribe to the ‘evidence-based’ movement’s hierarchy of evidence, with randomized control trials at the top and anecdotal experience at the bottom (for those unfamiliar with this hierarchy, my main point is that quantitative data sits at the top).
This increased emphasis on evidence is not inherently a bad thing. Indeed, in many ways it is a positive development. However, I have a number of concerns about the ways these centres interact with policy.
Concern #1: Begging the question – What works… for what?
The title ‘what works’ begs the question of what the interventions are working *for*. On the College of Policing’s What Works Crime Reduction page (link), we are presented with a list of interventions, with data on cost, effectiveness, where it works, and the like. But this cannot answer the question of what the police should be doing in the first place.
Concern #2: The ‘what works’ frame
‘What works’ neglects the reality that problems can be described in multiple ways. For example, are we concerned with young hoodlums who have not been taught proper values, or with oppressed minorities who are lashing out due to opportunity deprivation? There is often no natural way to interpret data – our personal values play a significant role. The ‘what works’ language covers up these ambiguities by assuming a common frame.
Concern #3: Politics and evidence
‘What works’ does not seem to engage with the reality that policy overlaps with politics. Evidence is regularly used as ammunition to support pre-existing positions, rather than as a basis for re-evaluating them. This is not to say I am entirely down on politics – only that evidence will not transform it.
Concern #4: Stifling innovation
When practitioners focus on ‘what works’, their attention is necessarily backward-looking, because we cannot yet have evidence on new ideas and approaches. ‘What works’ therefore cannot help us prepare for future problems, and may even hinder forward-looking policy on the grounds that ‘it’s not evidence-based’.
Concern #5: The gap between theory and practice
Models and quantification represent reality, but there is always a gap between the two, and small errors compound quickly. This is why our ability to forecast the weather drops off so sharply after a few days. It does not mean we should abandon models (we must use them!), but we should do so while remaining mindful that they are not infallible.
I intend to write a more in-depth article on this topic, but here are my immediate thoughts. Let me know what you think.