Key Takeaway
In this Vital City article, Tracy Palandjian and Jake Segal discuss the limitations of causal research in understanding social change and argue for a more open, humble view that embraces uncertainty.
We are highlighting one of our favorite articles of 2024. This article originally appeared in Vital City on March 27, 2024.
What do we do when we don’t know ‘what works’?
Social science research is overdue for a recalibration. Social change is far less predictable than we’d like to believe; the interrelated fields of evidence, measurement and evaluation would benefit from a much heavier focus on feedback, reflection and improvement.
What’s so intriguing about Megan Stevenson’s article, “Cause, Effect, and the Structure of the Social World,” is the magnitude of its claims. Even on its face, it’s a bold challenge to the practical usefulness of causal research in criminal justice. But it’s far more than that, too; Stevenson uses her field of expertise as a window into something deeper, suggesting that, for all our confident assertions about evidence-based policy, we in fact know very little about how change happens in people’s lives.
That claim is enormous, fully against the grain of conventional wisdom and, in our experience, almost certainly true.
At Social Finance, a nonprofit that mobilizes data, funding, and ideas to shape public policy, we often confront what Stevenson describes as “the engineer’s view” of social change: Tweak something here, get more value there. Unlike Dr. Stevenson, we aren’t researchers or philosophers but users of causal research. The models we build — which are about accountability, often linking money with results — make the reliability of that research especially important. So we are serious consumers of the studies she cites, and of how well they apply to real-world issues.
Our experience supports her caution about the limits of social science research. We have found, in our own less-comprehensive but wider-aperture work, a similar pattern: that “evidence” is context-dependent, contradictory, often minor, consistently conservative and rarely replicable. This is true even of policy interventions that top-notch researchers describe as rooted in the best evidence.