How politics closes down uncertainty

This is the second in a series of blog posts on uncertainty by Andy Stirling. The first one is here and the third in the series is here.


In a previous blog post, I discussed how uncertainty is a subjective state of knowledge, not an objective condition in the world. The example of nuclear accident risks shows how many deep and intractable forms of uncertainty, ambiguity and ignorance can defy single neatly-calculated aggregations. Yet around the world, strong pressures act on academia and policy analysis to pretend at conveniently simple answers.

With even the term ‘uncertainty’ thereby warped into deceptive double meanings, I suggested reviving the old word ‘incertitude’. Clearly encompassing all kinds of tricky intractabilities in knowledge, this may help resist the dangerous simplifications of probability.

But where do these pressures for closure come from? And how do they work?

A bewildering clamour of methods

Of course, probabilities can still be useful. This might be so, for instance, in games of chance or familiar unchanging situations viewed with a managerial eye. With year-to-year patterns in traffic accidents, workplace safety, or winter colds, for example, aggregated probabilities may (if handled cautiously) offer real value. Treated with care in these kinds of states of knowledge, risk assessment may illuminate one part of a complex picture.
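
As a purely illustrative sketch (the counts below are invented), this is the kind of routine, repeat-event setting in which a cautiously aggregated estimate can carry some genuine information, provided next year really does resemble the recorded years:

```python
# Purely illustrative sketch with invented counts: the kind of routine,
# repeat-event setting where a cautiously aggregated estimate can be useful,
# provided next year resembles the recorded years.

import statistics

yearly_incident_counts = [42, 38, 45, 40, 44, 39, 41, 43]  # hypothetical data

mean_rate = statistics.mean(yearly_incident_counts)
spread = statistics.stdev(yearly_incident_counts)

print(f"expected incidents next year: roughly {mean_rate:.0f} (give or take {spread:.0f})")
```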

Yet even here, the ‘truth’ in any probability distribution will always be conditional… on the rules of the game; on the underlying assumptions; on a particular circumstance; on a specific choice of ‘evidence’; on a point of view; on the future repeating the past. So – as with the example of nuclear accident risk calculations – what works for some restricted purpose, or tame scenario, or model, may break down horribly in the unbounded wilds of society and Nature.

This is where the politics of incertitude really bites. The nuclear example showed how – across wide areas of science, technology, the economy and society – complexities are routinely sidelined and expediently favourable numbers manufactured to suit the arguments of incumbent interests. But isn’t this practice quite obviously dangerous? How is it possible for so much high-level policy to be so incomplete? Can so much clever analysis really be so flawed?

Oddly, the last answer is ‘yes’. No technique is more deeply embedded around the world in ‘decision making under uncertainty’, than ‘risk assessment’. In justifying policy making across many different areas, the gold standard is ‘cost-benefit analysis’. Both routinely use probabilities to average out a diversity of inconveniently messy perspectives and possible circumstances. And this is what much of the world’s environmental, health and trade regulation is based on.

For example, pretty much every government – and many large businesses and NGOs – use ‘tools’ like ‘externality assessment’, ‘impact analysis’ or ‘quantitative valuation’ to help convince others which energy policy, health and safety standard or conservation strategy might be considered to be objectively ‘safest’, ‘safe enough’, ‘tolerable’ or even ‘best’.

Each technique routinely delivers its answers with formidable levels of precision. Yet the resulting impression of accuracy is deeply misplaced. For behind the scenes, the same calculations typically yield a variety of equally-rigorous but different possible answers, ranging across many orders of magnitude. Indeed, it is usually the case that, behind the apparently singular (often loudly proclaimed) ‘results’, these methods can justify (deliberately or not) virtually any choice of policy that might be favoured by whoever most influences the calculations.
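
To make this concrete, here is a purely hypothetical sketch (in Python, with invented figures drawn from no real assessment): a toy cost-benefit calculation in which equally defensible choices of discount rate, time horizon and valuation move the headline number across orders of magnitude.

```python
# Purely hypothetical sketch: a toy cost-benefit calculation, showing how
# equally 'rigorous' assumptions can push the headline result across
# orders of magnitude. None of these figures come from any real assessment.

def net_present_value(annual_benefit, annual_cost, discount_rate, years):
    """Discount a constant annual benefit/cost stream back to present value."""
    return sum(
        (annual_benefit - annual_cost) / (1 + discount_rate) ** t
        for t in range(1, years + 1)
    )

# Three assumption sets an analyst could each defend as 'reasonable'.
scenarios = {
    "low discounting, benefits valued highly": dict(
        annual_benefit=5e6, annual_cost=1e6, discount_rate=0.01, years=50),
    "mid-range assumptions": dict(
        annual_benefit=2e6, annual_cost=1e6, discount_rate=0.05, years=30),
    "high discounting, benefits valued low": dict(
        annual_benefit=1.1e6, annual_cost=1e6, discount_rate=0.10, years=20),
}

for name, assumptions in scenarios.items():
    print(f"{name}: NPV = {net_present_value(**assumptions):,.0f}")
```

Each run is internally consistent; it is the choice of assumptions, not the arithmetic, that does the work.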

Across the WTO, OECD, EU; major corporations; NGOs; and a multiplicity of agencies, then – these kinds of self-confident prescription are ubiquitous. Related rhetorics clamour around ‘expected utility’, ‘decision theory’, ‘life cycle assessment’, ‘ecosystem services’, ‘sound scientific decisions’ and ‘evidence-based policy’. What distinguishes all these (and many other) methods, is the claim to be able to derive prescriptively one-dimensional ‘objective’ answers to plural open-ended subjective dilemmas of incertitude.

Apples, Oranges and …

Image: an apple and a cricket ball. Credit: Apple / dullhunk | Cricket ball / uk_pictures | Flickr

But the key problem here, is that the world is multidimensional. The many diverse aspects of these calculations defy reduction to any single parameter. How to compare impacts across different groups? How much illness or injury adds up to a death? How much money is a marshland worth? Which perspectives to include as ‘valid’, and how (if at all) to weight them?

Like comparing apples and cricket balls, the problem with these kinds of incommensurability is intuitively obvious. What is not so well known, is that none of the elaborate theorising has found any solution. Indeed, Nobel prizes have been awarded for proving from first principles that deriving a single generally definitive answer to this kind of essentially value-laden politics is not only difficult to achieve – but fundamentally meaningless and irrational even to attempt.
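
To illustrate (with invented options, criteria and weights, nothing drawn from any real appraisal): a minimal weighting exercise in which two equally defensible sets of weights simply reverse which option comes out ‘best’.

```python
# Purely hypothetical sketch: two invented policy options scored on
# incommensurable criteria. Which option comes out 'best' depends entirely
# on the value-laden choice of weights used to aggregate the scores.

options = {
    "option A": {"health": 0.9, "cost": 0.3, "ecology": 0.4},
    "option B": {"health": 0.4, "cost": 0.8, "ecology": 0.7},
}

# Two weighting schemes, each defensible from a different point of view.
weightings = {
    "health-led weighting":  {"health": 0.6, "cost": 0.2, "ecology": 0.2},
    "economy-led weighting": {"health": 0.2, "cost": 0.6, "ecology": 0.2},
}

for name, weights in weightings.items():
    scores = {
        option: sum(weights[criterion] * values[criterion] for criterion in weights)
        for option, values in options.items()
    }
    best = max(scores, key=scores.get)
    print(f"{name}: scores {scores}, so '{best}' comes out on top")
```

Nothing in the calculation itself can say which weighting is ‘right’; that is precisely the value-laden political question.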

Yet normal policy practice is oblivious to this common sense. Economists and risk assessors often learn the impossibilities in their first-year courses – and then conveniently forget them. Arcane theologies of measurement, normalisation and aggregation, are allowed misleadingly to imply that the problem is solved. The ubiquitous excuse is the desire to ‘be practical’.

It is indeed ‘practicality’ that drives this. But this is not necessarily that of the honest artisan, struggling to make a robust picture under the tough realities of incertitude. Often more apt, is the pragmatism of a courtier, obsequiously seeking favour by supporting privilege. The concrete hoped-for reward is less about robust public truth, and more about far more private advantages. What counts as ‘practical’ is firmly situated inside, rather than outside, the corridors of power.

This may seem a rather sweeping criticism. But to recognise these everyday political pressures on expertise is not to be cynical. Indeed, such realism might be reckoned a basic tenet of democracy. Nor need this picture imply bad faith among any individuals in the fields it describes. Closed communities are always vulnerable to collective wishful thinking. Analysts are no exception. It is democracy that helps such parochial interests align better with the public good.

Nor – depending on politics – is all this closure necessarily self-evidently negative. Both in government and commercial life, leadership is a tough business. Here, if any decision is to be sustained in the face of ever-present adversity and criticism, then closure is the most precious political commodity. Without it, established orders are threatened by instability and paralysis. Overblown precision may be strictly incorrect – but it may easily seem instrumentally necessary.

Either way, where democracy comes in, is in being open about these realities. What is being delivered here so ‘pragmatically’, is often less about speaking truth to power and more about power shaping ‘truth’. And, to be fair, this closing down of incertitude should not be seen as any more definitively ‘false’ than it is ‘true’ in any complete or final sense. After all, it is intrinsic to incertitude that, under these circumstances, ‘truth’ is anyhow (to a large degree) indeterminate.

From bridges to ecologies

Image: Akashi Kaikyo Bridge (suspension bridge), Kobe, by bryansjs / Flickr / cc-by-sa 2.0

All this said, it may still seem to some that this diagnosis is a little unfair or overstated. If so, a simple thought experiment might test it. Consider the precision with which prescriptions are typically expressed under incertitude across different policy areas.

When engineers analyse physical properties in relatively simple material structures like buildings or bridges, it is normal to use error bars. Care is taken with precision. Alternative designs are included. Sensitivity analysis is routinely applied to explore how the outputs delivered by models change with assumptions about input values, parameters, or the model scope or design.
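
As a rough illustrative sketch (a made-up toy model, not any real engineering calculation), a minimal one-at-a-time sensitivity analysis of this kind might look something like this:

```python
# Purely illustrative sketch: one-at-a-time sensitivity analysis on a made-up
# toy model, varying each input assumption across a plausible range and
# reporting how far the output moves.

def safety_margin(load, safety_factor, material_strength):
    """Hypothetical toy model: margin between allowed capacity and demand."""
    return material_strength / safety_factor - load

baseline = dict(load=100.0, safety_factor=1.5, material_strength=300.0)
plausible_ranges = {
    "load": (80.0, 120.0),
    "safety_factor": (1.2, 2.0),
    "material_strength": (250.0, 350.0),
}

print(f"baseline margin: {safety_margin(**baseline):.1f}")

for name, (low, high) in plausible_ranges.items():
    outputs = [safety_margin(**dict(baseline, **{name: value}))
               for value in (low, high)]
    print(f"varying {name}: margin spans {min(outputs):.1f} to {max(outputs):.1f}")
```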

Now consider the more complex challenge of characterising entire social or natural contexts within which all such engineered structures are only a tiny part. It can hardly be claimed that incertitude here is somehow less than that faced by the engineer. Yet economists, risk assessors and environment and policy modellers routinely assert their much more expansive prescriptions with far more singular precision and self-confidence.

When presenting policy findings on economic forecasts, technological risks, ecological effects, or health impacts, for instance, much ‘evidence based’ analysis is strikingly precise and self-confident. Outputs are often expressed to several significant figures. Graphs frequently have no error bars. Sensitivity analysis is almost entirely absent. Alternative options are attended to in far less detail – sometimes barely considered at all. Prescriptions are urged, not conditionally, but as if they were definitive.

Of course, there are always exceptions in such a wide and diverse field. But virtually any broad experience across different settings and policy areas, confirms the above general pattern. The more complex or intractable the issue, the more pronounced the precision and confidence with which policy analysis tends to be expressed. The greater the incertitude, the louder the assertiveness!

It is these patterns that confirm the importance of the political dynamics of closure discussed earlier. What explains the precision, confidence and singular prescriptions in so much risk-based analysis and wider ‘evidence based policy’ is not that associated claims to reduce incertitude are actually true. What is delivered is not ‘truth’, but the precious political commodity of ‘justification’.

By telling singular prescriptive stories under incertitude, analysts aid beleaguered decision makers with what is arguably their main preoccupation: how to justify their decisions. Even if such justifications turn out to be wrong, their clarity, precision and self-confidence can (at least for a time) help to support particular concrete actions. Only in this way can hard-pressed politicians and incumbent institutions hope to procure legitimacy, secure and shape acceptance … and (if things go wrong) manage blame.

The Emperor’s simple clothes

In short, it is by pretending things are more straightforward than they really are, that politics can ‘move forward’. By ‘keeping it simple’, incumbent institutions maintain the necessary fiction that they are ‘in control’ – or ‘taking back control’. Always a mantra favouring entrenched vested interests, this language grows ever more intense around the world.

In the big picture, the ‘corridors of power’ may actually control rather little. But even so, the privileges remain very real. And it is stories of control that form the main entry ticket to this world of privilege.

So the narrowness of the ‘pragmatism’ is clear. Asserting narratives of quantification, singularity and precision typically does little to aid control. But what such simple stories do help with is the handling of what a former British Prime Minister called ‘events, dear boy, events’. This is how privilege stays on top: by surfing the fundamentally uncontrollable intractabilities of incertitude.

So, the ways in which policy pressurises the reduction of incertitude are not a mystery. Nor are they a deliberate conspiracy. The driving gradients of power are just an everyday fact of life. But as modernity spreads, expertise entrenches, corporations consolidate, education commodifies, research instrumentalises, global institutions rigidify and ‘new public management’ rolls ever further on – it becomes increasingly difficult to step outside and point this out.

Surrounded by so many busily calculating courtiers, the ‘evidence based’ emperor is superficially grand and sophisticated. But if we, like the child in the story, step forward to ask these questions about incertitude, none of this dazzling policy clothing can conceal a rather less edifying sight. Stories of control are the self-gratification of power.


Uncertainty

Uncertainties can make it hard to plan ahead. But recognising them can help to reveal new questions and choices. What kinds of uncertainty are there, why do they matter for sustainability, and what ideas, approaches and methods can help us to respond to them?

Find out more about our theme for 2019 on our Uncertainty theme page.