We live in an environment of uncertainty and risk (U&R). Can we make choices that moderate these? Of course: buy insurance, take prudent safety precautions, etc. But I believe we can be more strategic than that. I believe we can do the unthinkable: choose not to shoot ourselves in the foot by being foolish.
I suggest we avoid growing "risk like" circumstances into "uncertainty like" ones.
In an earlier post, Positioning Yourself for Uncertainty and Risk, I began to explore this thought. There I laid out three descriptions for the U&R spectrum:
- Risks: when we understand the events, their consequences when they occur, and the likelihood of their occurrence (well behaved and well understood circumstances)
- Uncertainties, subset A: when we can envisage or foresee the event but are limited in our capacity to sufficiently understand its likelihood (some call these Grey Swans; I think of them as "foggy" circumstances)
- Uncertainties, subset B: when we are oblivious to the events themselves (Black Swans, or in my mind, "unknown" or "unknowable" circumstances)
Upon reflection, I would revise the “Uncertainties subset A” description to:
- Uncertainties, subset A: when we can envisage or foresee the event but are limited in our ability to sufficiently understand the likelihood of its occurrence, and/or its implications for us, and/or its adverse impacts on our capacity to act during its occurrence (sometimes called Grey Swans).
There are three critical aspects to this description:
- Most obvious: we can't determine any useful notion of the event's predictability. We may say something like "once in a lifetime event," but realistically, how would you use this notion? Why is this problematic? Because just because an event is supposedly rare doesn't mean it can't happen repeatedly to you over a short period of time. Several months ago I came across an article (sorry, I've lost track of it) suggesting that in the financial world, "once in a lifetime" events happen about every seven years. The irony of this statement was not lost on me; it reinforced in my mind that we do not process hard-to-describe event probabilities very effectively. Really now, would any of us use the term "once in a lifetime" to describe something that happened every few years?
- We have difficulty understanding how the event's consequences will impact us, or even what the consequences are. This is the "What will we unleash?" or "How exposed will we personally be?" question. Think of living in a community where a serious infectious disease is going around. Will you even get it? If so, how serious will it be for you? Even if not, how will your life be specifically impacted by the fact that others you know or interact with get the disease?
- We are unsure how we will be able to act. Will we be "deer in the headlights"? Will we still have meaningful options available to us? Unless you engage in an exercise like scenario planning, how would you even contemplate preparing for some vague event? How much effort should you put into something that, from many perspectives, seems quite hypothetical?
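The first aspect above, our poor intuition for rare-event probabilities, can be made concrete with a small sketch. This is my own illustration, not from the original post: it assumes a "1-in-50-year" event (annual probability 1/50) and independent years, and asks how likely the event is to show up at least once over a 30-year horizon.

```python
# Sketch: why "once in a lifetime" labels mislead.
# Assumptions (mine): annual probability p = 1/50, years independent.

def prob_at_least_once(annual_p: float, years: int) -> float:
    """Probability the event occurs at least once over `years` independent years."""
    return 1.0 - (1.0 - annual_p) ** years

p = 1 / 50  # a "1-in-50-year" event
print(f"{prob_at_least_once(p, 30):.2f}")  # close to a coin flip over 30 years
```

Under these assumptions the "rare" event has roughly a 45% chance of occurring at least once in 30 years, which is part of why such labels communicate so little about what we should actually expect.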
With recent events taking place in Europe, I am struck by how the principals (governments, bureaucrats, etc.) have systematically turned what was an economic facilitating strategy into a convoluted, alarming edifice. European officials are concerned that if Greece leaves the Euro, a domino effect will occur. In fact, it seems nobody knows for sure what will happen or how bad it will be. This is amazing: the architects of the Euro created this looming disaster. It is a dramatic example of taking a risk like environment into an uncertainty like setting.
How do we create uncertainty like settings out of risk like ones?
First: We make simple things more complex. When we add complexity to systems, we lose the capacity to see all the consequences of our choices and decisions. The effects of this include being more cautious, taking more time, or even being naively rash. This is the "wise men take pause (will they even act?)" and "fools rush in" situation. How might we recognize such a system? Try our tax code, or any like system that has a myriad of rules, riders, exemptions, qualifying circumstances, etc. How do these situations create uncertainty? They make it extremely difficult for people to know how best to act, so they "toss a coin" when this is clearly inappropriate.
Second: We make smaller systems large. Large systems by their nature incorporate more opportunities to go wrong; large systems "touch more." The consequence is that these systems get applied to less and less applicable circumstances. All systems are designed to handle specific circumstances "well." Any system that is "unitary" (only one way to do it right/lawfully/acceptably) creates uncertainty, because it will eventually be applied to inappropriate circumstances.
Third: We make systems inappropriately closed. I consider this the bureaucrat's "wet dream," the outcome they especially want to achieve. This is where some human activity is made "rule bound." Rule-bound systems assume they capture the nuances of the complexity and diversity of the environments they routinize. When applied appropriately, they deliver significant benefits (efficiency and ease of mind around choice). However, when they are closed, they are blind to shifting external circumstances that over time (not necessarily long durations, either) make the process useless. In fact, as the environment continues to shift, continuing to use the increasingly obsolete process adds increased danger. The traditional response by these systems' owners is to add additional rules, thereby introducing the first two risk-into-uncertainty actions above.
A humorous example of a closed system comes from an old experience of mine: many years ago I filled out a form that asked me about my gender (Male or Female?). What if you do not consider yourself as fitting into either of the two clearly offered choices? Would you be tempted to circle "or"? Closed systems, by their nature, do not check to see whether the environment they assume is out there (e.g., only two genders) continues to exist. Hence they run the risk (yes, in this case it approaches certainty) of becoming obsolete, and may even become dangerous.
My litmus tests for whether we have inadvertently gotten ourselves into a "risk into uncertainty" situation are:
- Can we even allow ourselves to seriously contemplate walking away from this activity? If we are distressed by the sunk costs of our previous efforts, we have increased our exposure to uncertainty. If we are sorely tempted to do more of what we have been doing, then we have entered the realm of "expecting different results by doing more of what we have been doing."
- Can we even fathom what we are putting on the line if we choose any of the options open to us? If we can't cost out, discern, etc. the consequences, we have moved into an arena of uncertainty. Undertaking actions that make future actions and consequences even more opaque is "expecting different results by doing more of what we have been doing."
- Does extricating ourselves from our situation mean we have to embroil those outside of our current mess, or (even worse) ask those affected (victimized, even?) by our past actions to bail us out? When we hear people making excuses for their actions (like "I didn't mean it!"), we have clearly lost control of the situation and demonstrated our incapacity (even incompetence) to "fix it."
I believe we can make choices and take actions that minimize (remember, we fundamentally live in an uncertain universe) our exposure to moving from seemingly "risk like" to "uncertainty like" circumstances. I will explore this in a future post.