Greater Manchester Centre for Voluntary Organisation

Evidence-Based Commissioning: cleverness and persistence can get you almost any result you want

I thought I'd finish this series of posts on evidence-based commissioning with a final note looking at the difficulties presented by complex systems.

In his 1985 paper "Economic History and Economics" in the American Economic Review, the respected economist Robert M. Solow wrote that "the attempt to construct economics as an axiomatically based hard science is doomed to fail". His view was that economic activity is embedded in a social environment so complex that we cannot untangle it to identify, with any precision, the impact of any change we might make. In summary:

"A... complicated system... cannot conduct controlled experiments... hypotheses are themselves complex... narrowly economic activity is embedded in a web of social institutions, customs, beliefs, and attitudes... affected by these background factors.... As soon as time-series get long enough to offer hope of discriminating among complex hypotheses, the likelihood that they remain stationary dwindles away, and the noise level gets correspondingly high. Under these circumstances, a little cleverness and persistence can get you almost any result you want".

A scheme that works in one locality may have no impact elsewhere (or even a greater one) because of differences in the social and community institutions that exist there and in the outlooks and expectations of local communities. We are diverse, and so are our reactions.

In many ways a hard evidence-based approach is unscientific by its nature. Whilst it will most often be used with good intentions, in the hope that it will lead to good decisions, it may also simply offer a commissioner under pressure some comfort in these difficult times. They may not have the resources to fund all the services they'd like to, and relying on evidence is a way of passing the buck, e.g. by giving some kind of justification to those who are disappointed. There may be more worrying uses on occasion: some will use a requirement for evidence as a power play, cherry-picking evidence to justify changes that suit their existing convictions. Providers themselves will use evidence with similar motivations, since a requirement for evidence can narrow the competition. This is understandable, but it doesn't guarantee that services become more effective for beneficiaries.

Solow didn't think we should give up on understanding economic impact, though; instead he argued for a more modest approach:

"There is enough for us to do without pretending to a degree of completeness and precision which we cannot deliver. To my way of thinking, the true functions of analytical economics are best described informally: to organize our necessarily incomplete perceptions about the economy, to see connections that the untutored eye would miss, to tell plausible - sometimes even convincing - causal stories with the help of a few central principles, and to make rough quantitative judgments about the consequences of economic policy and other exogenous events."

Whilst evidence can't guarantee cashable savings, or reduce demand on a specific service such as an A&E unit with any certainty, we can use it to look at problems from different perspectives and to make connections we wouldn't otherwise see. Evidence can aid our decision-making, but it can't be a substitute for it, and in this regard the stories generated from qualitative evidence are just as valuable as quantitative data, if not more so.

At best, approaches which seek to treat evidence from the social sciences as hard data, such as payment-by-results schemes, will encourage gaming of the system: people will ensure that they produce the needed data, whether or not that results in a decent service. At worst, though, evidence-based approaches commit organisations to significant levels of unplanned risk and could be quite devastating if the eventual outcomes are unexpected.