What happens when other people take our cautious ‘It depends…’ as an explicit Yes or No? What are the risks that we face as enterprise-architects when others force us to give a definite ‘Yes’ or ‘No’ in relation to something that’s inherently uncertain?
What can we do if those others base their actions and choices on that ‘definitive’ answer that they demanded from us – even though we’d told them that it was inherently uncertain? And what can we do to protect ourselves from the way people want to blame us – yet never themselves – when things turn out the opposite way to what we were forced to ‘predict’?
Like a lot of people in the EA ‘trade’, I’ve been very concerned about the implications of the ‘earthquake trial’ in Italy. To quote the BBC report:
This week six scientists and one government official were sentenced to six years in prison for manslaughter, for making “falsely reassuring” comments before the 2009 L’Aquila earthquake. But was this fair?
Reading somewhat between the lines, to me this sounds like a classic clash of paradigms:
- the linear-paradigm, which expects and demands that everything can be reduced to simple true/false logic, that everything should be certain, known, predictable, safe
- the flow/probability paradigm, in which everything is somewhat blurry and uncertain, and can only be described in statistical terms, in a modal logic of probability, possibility and necessity.
Many of the sciences will now only describe their work in terms of probabilities: weather-modelling is one obvious example. The catch is that many people want definite answers, a definite weather-forecast, because they need to base concrete real-world decisions on those answers. It should only take a few moments’ thought to realise that that clash has the potential for a really nasty wicked-problem… dangerous for everyone involved…
That’s exactly the kind of clash that we live with every day in enterprise-architecture. For example, it’s the key source of others’ frustration with us whenever we reply to their question with the ubiquitous – and, to them, iniquitous – answer of “It depends…”. It’s why some planners fail to accept that a ‘roadmap for change’ will always be provisional – especially so in times as turbulent as ours, in terms of the scale and scope of changes in technology and just about everything else. And it’s why those so-certain-seeming models created within most of our current toolsets are actually darn dangerous for us, because they give a spurious illusion of certainty that does not exist – and cannot exist – in the real world.
To quote the BBC report again, the real key is communication:
This case is not about the scientists’ ability to predict earthquakes – it is about their statements communicating the risk of an earthquake.
This is why communication is such a key feature in enterprise-architecture frameworks such as PEAF and TOGAF. But again the same challenge will arise: how do we explain the nature and reality of risk? How do we explain something that’s inherently uncertain, to people who live in a world of ‘they-shoulds’, and who in some cases refuse to accept even the existence of uncertainty?
The Italian seismologists understood that an earthquake was unlikely but not impossible. In a press conference, however, the message seemed to be that that meant there was nothing to worry about at all. This is the falsely reassuring statement which formed part of the case against them.
Reading the BBC report, to me it sounds like the L’Aquila scientists and officials were pushed so hard for a non-existent certainty that they ended up giving way out of sheer exasperation:
The government official, Bernardo De Bernardinis – deputy chief of Italy’s Civil Protection Department at the time – is reported to have advised worried residents to go home and sip a glass of wine. He even specified what kind: “Absolutely a Montepulciano.”
This turned out to be a classic example of kurtosis-risk – a seemingly low-probability event that, if it eventuates, will more than wipe out the gains that have been made from ignoring the risk. At L’Aquila, the risk of a serious earthquake was real, though very low; yet the demand for certainty forced a translation from ‘low risk’ to ‘no risk’. So the people went home, and ignored the ongoing minor tremors. Which was not a good choice: the much more serious earthquake did eventuate, all but flattening the town, killing more than 300 people and injuring many thousands more. One of the certain consequences of this kind of shock – especially on this scale – is the search for the scapegoat, for someone (else) to blame: and the scientists who had given the ‘wrong answer’ about the risk were the all too obvious targets.
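The arithmetic behind that kurtosis point is simple enough to show directly. The sketch below uses made-up illustrative numbers – not actual L’Aquila data – to show why ‘low probability’ is not the same as ‘safe to ignore’ once the potential loss is large enough:

```python
# Toy expected-loss comparison (all numbers hypothetical, for illustration only).
# A low-probability, high-impact risk can still dominate the decision.

p_event = 0.02           # assumed probability of the serious event occurring
loss_if_ignored = 1000   # catastrophic loss (arbitrary units) if it strikes unprepared
cost_of_precaution = 5   # cost of taking precautions, paid whether or not it strikes

# Expected loss of each choice
expected_loss_ignore = p_event * loss_if_ignored   # 0.02 * 1000 = 20.0
expected_loss_act = cost_of_precaution             # 5, paid regardless

print(expected_loss_ignore)   # 20.0
print(expected_loss_act)      # 5
```

On these (invented) numbers, ignoring the risk is four times worse in expectation than acting on it – yet the ‘translation’ from low risk to no risk silently sets `p_event` to zero, making inaction look free.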
From an enterprise-architecture perspective, though, notice one of the themes I’ve recently been exploring here: even though it’s quite subtle, there’s a really serious power-issue in play in this case. What’s happened is that the townsfolk not only passed to the experts the (mental etc) work of assessing the risk, but also in effect ‘exported’ the (emotional etc) work of facing the risk. In this sense, ‘export’ is the attempt to offload onto others some form of work that should or can only be done by the self. In both a technical and very literal sense, ‘export’ is an active form of abuse. In reality, the work of facing the risk of earthquake could only be done by the townsfolk, because the risk was, by definition, theirs: and having avoided that work, the supposed ‘natural’ response is to try to blame those to whom the work was ‘exported’ – in other words, to now also ‘export’ the (emotional etc) work of dealing with the consequences of having avoided that work in the first place. That the scientists have now been ‘punished’ for their ‘misleading advice’ merely serves to anchor the delusion that people have a ‘right’ to export the fears to others in this way: in other words, a systemic or structural form of abuse.
Don’t laugh: most current organisations and enterprises are riddled with such forms of abuse – which is why there are so many disastrous power-problems and suchlike in those selfsame organisations and enterprises. Ouch…
Even more to the point, every enterprise-architect is inherently at high personal risk from that type of abuse. It’s an inherent outcome of the type of work that we do: we link across silos and projects that really don’t want to talk with each other, and would much prefer to have someone else take the blame for their own frustrations. Wicked-problems always make things worse in this sense: and, by its nature, enterprise-architecture is wicked – hence, all too easily, ‘the wicked one’ who can be blamed by everyone for everything. On top of that, we’re dealing all the time with inherent-uncertainty, and we’re surrounded by stakeholders who need concrete, actionable answers, and who really don’t want to hear the words “It depends…”. So don’t laugh at the scientists of L’Aquila, or the townsfolk either: it could very easily be you that’s next up for that kind of (mis)treatment – when you might find your stakeholders holding a rather different and more sharpened kind of stake…
Some practical suggestions here:
- do ensure that your stakeholders understand and acknowledge that a probability always remains uncertain – that it is never certain, or reducible to a simple true/false answer
- do ensure that your stakeholders understand that opportunity and risk are inherent flipsides of each other – opportunity always implies risk, and risk also always implies opportunity
- do clarify the nature and scale of each opportunity/risk – preferably with explicit metrics to underpin each assessment
- do ensure that your stakeholders understand the consequences of each opportunity/risk, in terms of its eventuation or non-eventuation, and of the (probable) implications of each choice for action or non-action
- do document the risks, and others’ acknowledgement of those risks
- don’t use terminology that implies any greater level of certainty than is actually the case
- don’t use ‘hard-edged’ diagrams (ArchiMate, BPMN, UML etc) where there is significant risk of their being interpreted as implying a spuriously high degree of certainty
- don’t allow others to ‘export’ their fears of uncertainty onto you – especially through systemic channels which afford you no defence against such actions
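To make the ‘document the risks, and others’ acknowledgement of those risks’ point concrete, here’s one possible shape for a minimal risk-log entry. The structure, field names and example values are purely illustrative – not any standard risk-register format – but they capture the essentials argued above: likelihood stated as a probability band rather than a yes/no, the opportunity flipside recorded alongside the risk, and an explicit trail of who has acknowledged it:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskRecord:
    """One entry in a simple risk log (illustrative structure only)."""
    description: str
    likelihood: str                  # e.g. "low", "medium", "high" - never "none"
    impact: str                      # consequences if the risk eventuates
    opportunity: str                 # the flipside gain that motivates taking the risk
    acknowledged_by: list = field(default_factory=list)
    logged_on: date = field(default_factory=date.today)

    def acknowledge(self, stakeholder: str) -> None:
        """Record that a stakeholder has seen and accepted the risk as stated."""
        if stakeholder not in self.acknowledged_by:
            self.acknowledged_by.append(stakeholder)

# Hypothetical example entry
risk = RiskRecord(
    description="Vendor platform may be discontinued within the roadmap horizon",
    likelihood="low",
    impact="Rework of all integrations built on the platform",
    opportunity="Earlier delivery by reusing the vendor's prebuilt services",
)
risk.acknowledge("Programme sponsor")
print(risk.acknowledged_by)   # ['Programme sponsor']
```

The point is not the code itself but the discipline it encodes: if a stakeholder’s name appears in `acknowledged_by`, the work of facing that risk cannot later be ‘exported’ back onto the architect alone.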
You Have Been Warned, perhaps? But in any case, do take care on this: as we head into ever more turbulent times, this kind of wicked-problem is all too likely to get much, much worse.