Let’s just call it “Uncertainty” – that is, call a moratorium on the term “Risk”


The next several posts are going to explore my approach to engaging people in understanding and appreciating uncertainty in their lives. Yes, I said appreciate, in its myriad senses (get the nuances – outright enjoy it). This post will make the argument that we should ban the term “risk” from our conversations – vocabulary – minds. What is risk as a concept for most of us? I was introduced to the concept when I took statistics courses in university. I was taught notions such as:

  • Chance (e.g., tossing coins – heads or tails? fifty/fifty chance of either)
  • Distribution (expected profile of outcomes around an event such as tossing a coin). This was, in retrospect, the most dangerous lesson of all: I was shown the “normal distribution”, a.k.a. the bell curve.
  • That we could discern and map out the outcomes (easy for a coin toss, where we could realistically exclude the outcome of it landing and staying on its edge)
  • Randomness, that events happen independently of previous and potential future events. It happens because it happens; again, the coin toss is a great illustration of this
  • Standard deviation, the notion of how events cluster around the average. This looked so nice and symmetrical in a normal distribution
  • Mean (weighted average), mode (most frequent number/observation), median (middle value between the two end points). In the lovely normal distribution they are all the same. In non-normal distributions they are distinct and often quite different from each other. So which should we use when we deal with non-normal distributions? Our everyday conversation would probably only use the terms weighted average or mean (actually they are the same). This can be extremely dangerous in terms of drawing inferences or conclusions from a pile (sorry, “set”) of data. The short sketch just after this list shows the three measures pulling apart.

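To make that last point concrete, here is a minimal sketch in Python, using made-up numbers, of how the three measures behave on a roughly bell-shaped data set versus a skewed one:

```python
# A minimal sketch with made-up numbers: mean, median, and mode agree on
# roughly bell-shaped data, but pull apart once the data are skewed.
import statistics

symmetric = [4, 5, 5, 6, 6, 6, 7, 7, 8]        # roughly bell-shaped
skewed = [1, 1, 1, 2, 2, 3, 3, 10, 40, 95]     # a long right tail

for name, data in [("symmetric", symmetric), ("skewed", skewed)]:
    print(name,
          "mean:", round(statistics.mean(data), 1),
          "median:", statistics.median(data),
          "mode:", statistics.mode(data))

# symmetric: mean 6.0, median 6, mode 6 -- any of the three "works".
# skewed: mean 15.8, median 2.5, mode 1 -- three very different answers to
# the question "what outcome should I expect?"
```
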
So the picture of risk is based on a statistical notion that can conversationally be described as: “The likelihood that some event or circumstance will happen.” When do we face a risk-like situation then? This is an operational (as opposed to conceptual) question. First, and I mean first, we comprehend or know about the event that can befall us or someone else. Otherwise we would not know about it at all, hence there is nothing to consider in terms of its likelihood. Second, and I mean second, we comprehend the consequences/implications of the event. Why bother spending any energy on something that does not matter? (I appreciate that when we learn concepts we may work the learning around trivial/simplistic/conceptual example situations.) Third, and I mean third, we “understand” in a meaningful way the likelihood (a.k.a. odds) that the event and its consequences will occur.

For me it is the issue of likelihood that was most troubling. I will outline my concerns from four perspectives, all rooted in the practical implications of living with risk.

First: the issue of distribution. I have seen no real evidence that we can rely on most events being normal in their distribution. In the first place, the consequences of an event are rarely just binary the way a coin toss is (heads or tails). Consider the issues in your life that you would be most likely (sic) to apply the term risk to. Are the consequences only binary? Not likely (sic). In fact, for most events we care about we are contemplating a variety of consequences (e.g., often near misses or near hits, such as a sale). There is an excruciating amount of complexity, and many intervening and influencing factors that can come into play, in almost all situations we care about in terms of risk. So I find it very problematic to assume that the distribution of a risk event is going to look at all normal. If my concerns are valid, then any lay person’s notion and application of statistics is useless and can be downright dangerous.

Why dangerous? Picture a circumstance where we plot out the distribution of outcomes (consequences); we could very easily see a distribution that is “bi” or even “tri” modal. That is, there is a clumping of observations around two, three, or even more outcomes. So how do you decide (statistically speaking) which is the most important outcome to manage your risk around? Using the “mean” is the most problematic of all. The median is problematic as well. The mode is likely the most useful conceptually, but in practical terms may not be so. Why practically speaking? It will depend in part on what action you take to manage your exposure. The choice to manage one cluster may in fact increase your exposure to another cluster: take the vaccine and increase your risk of an allergic reaction, don’t take the vaccine and increase your risk from the illness itself. The choice of management action introduces a subsequent set of risk factors too, with their own consequential distributions. Life is an arena where any choice sets up a multitude of subsequent exposures, some of which may be worse than the initial situation we were concerned about. So, in conclusion: most distributions are not normal, and this introduces all sorts of complexity into our understanding and our subsequent choices and actions.
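
The “clumping” problem is easy to see with another made-up data set, this one bimodal – say, the net outcomes of a pursuit that is mostly either lost (a small cost) or won (a large gain):

```python
# Hypothetical bimodal outcomes: a clump of small losses and a clump of
# large gains, with almost nothing in between.
import statistics

outcomes = [-10, -10, -9, -10, -8, 100, 100, 98, 100, 102]

print("mean:  ", statistics.mean(outcomes))        # 45.3 -- an outcome nobody experiences
print("median:", statistics.median(outcomes))      # 45.0 -- also in the empty middle
print("modes: ", statistics.multimode(outcomes))   # [-10, 100] -- the two real clumps

# Managing "risk" around the mean or median here means managing around a
# value that essentially never occurs; the two modes are where the action is.
```
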
Second: the issue of likelihood/probability. I propose that almost none of us have a meaningful comprehension of the odds that something could happen. A meaningful comprehension, to my mind, is one that lets us make better decisions based on that knowledge.

Example: You want to buy some recreation property alongside a river. You have found what appears to be the ideal property. What might you sensibly do next? You might ask about the risk of flooding. You are in luck: a recent investigation into this area indicates that your potential property is at risk only from a 1 in 100 year flood. You are middle aged, so you do not have 100 more years to live. So what might you sensibly do next? Ask: when was the last such flood? If the answer were any of the following, what would you decide regarding purchasing the property? (A rough sketch after the list below puts some numbers on the “1 in 100 year” phrase.)

  1. 150 years ago (buy/not buy?)
  2. 25 years ago (buy/not buy?)
  3. 7 years ago (buy/not buy?)

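As an aside, here is a rough sketch of what the “1 in 100 year” phrase implies arithmetically, under the big (and questionable) assumption that each year independently carries a 1% chance of such a flood:

```python
# Back-of-envelope only: assumes flood years are independent, 1% each.
p_per_year = 0.01

for years in (10, 25, 50):
    p_at_least_one = 1 - (1 - p_per_year) ** years
    print(f"chance of at least one such flood in {years} years: {p_at_least_one:.0%}")

# Roughly 10% over 10 years, 22% over 25 years, 39% over 50 years.
# Note what this cannot tell you: whether the 1%-per-year estimate is any
# good, and (on the independence assumption) the answer to "when was the
# last flood?" does not change these numbers at all.
```
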
In terms of the information provided, this information is useless. There are too many other things that are more important to know about. However, if you were an engineer planning to build something in this area, you could make very useful design decisions based on this information (actually the engineer would obtain much more information first). The point here is that an engineer can gainfully act on this flooding likelihood information; for the rest of us, meh. We are often inundated with information on odds and probabilities that are virtually meaningless in and of themselves. Again, events and their associated consequences always sit in a context of many impacting variables. So if the practical value of distributions and likelihoods is often very problematic in itself, then what are we most likely dealing with? It is not risk. Remember the three-part sequence of conditions described above: risk-like situations need all three to be meaningfully present.

Third: understanding event outcomes. There are times when we can envisage an event (snowstorm, flooding, civil unrest, traffic accident, etc.) but be unclear about the implications. Here we get back to the possible consequences that can arise out of an event, which we touched on earlier when talking about distribution shapes. I was struck a few years ago when I saw some footage of a forest fire that ended up going through a subdivision. As you can imagine, the devastation was horrific. Yet there were curious situations where a few homes were completely spared, while many more were partially affected by the fire. How an event affects us can vary so much that our ability to consider beforehand the implications of foreseeable events is problematic. There are some obvious strategies that many of us take for such circumstances (fire insurance – plan for the worst and hope for the best). Yet for many situations that can impact us, we can have a legitimate conversation about how much management action we should take. Consider our participation in health-risk activities: we observe a large variation in so-called risk reduction behaviours, from ignoring the risk completely to being obsessed with healthy living.

Sometimes we can foresee possible events without any clear understanding of the possible implications. Clean the tar spots off your car with a petroleum-based cleaner, then rinse and wash it off, with the runoff going into a storm sewer. The residual cleaner washes into a watershed that is used to fill a fish pond. You can see how this might go on: one event leading to a cascading series of further events, many of which might not happen given other conditions (the cleaner residue never makes it to the fish pond because of the previous week’s weather). A short sketch after the list below makes this chain structure concrete.

Fourth: foreseeing the risk-triggering events. There are times when an impacting event occurs that literally takes us by surprise. We just did not see it coming, or had no idea it could occur. The problem is we often retrospectively rationalize the event as not being a surprise (always after the fact, of course). In my mind there are two reasons we tend to do this:

  • We try to come to terms with the impacts by understanding the event; this means we do a cause-and-effect-like analysis of what occurred. This is helpful if we learn enough and incorporate it into how we think, plan and act in the future (i.e., make smarter choices and act smarter too).
  • We also do it because we are troubled by the fact that it was a random and/or impersonal event. We want to find something, and most frequently someone, to blame: this event should never have happened and someone should be accountable for it. This is a form of self-delusion because, in many cases, the event was not anticipatable. We seem to forget, after the fact, that we didn’t know what we didn’t know. Now that it has occurred, and we know what we now know, we want to impose this new insight on the past. I know there are many people (even businesses and governments) that make use of this practice. Note, I am not considering here those situations that involve malfeasance or wilful neglect of any kind.

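To make the cascading-events idea from the third point concrete, here is a purely illustrative sketch; every probability in it is invented, and the point is the chain structure, not the numbers:

```python
# Hypothetical chain for the tar-spot cleaner example: the fish pond is only
# harmed if every intervening condition actually occurs.
chain = {
    "residue reaches the storm sewer": 0.6,
    "sewer drains toward that watershed": 0.3,
    "enough residue survives dilution and weather": 0.2,
    "pond is filled from that watershed": 0.5,
}

p_harm = 1.0
for step, p in chain.items():
    p_harm *= p
    print(f"{step}: {p:.0%}")

print(f"chance the whole cascade completes: {p_harm:.1%}")   # 1.8%

# Each intervening condition can break the chain, which is why foreseeing
# the initial event tells us surprisingly little about the consequence.
```
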
So where are we then? We are left with the notion that we are often not looking at risk at all. So what are we looking at? We are looking at uncertainty. Borrowing from the ideas and thoughts of Nassim Taleb, we are dealing with:

  • Black Swan events: those we did not anticipate and foresee as possible – the surprises
  • Grey Swan events: those we can foresee and have some information about, BUT we are missing important pieces of the picture – the implications of the distribution, a meaningful comprehension of the likelihood, and what consequences ensue and how.

So how would I be most comfortable in using the notion of risk? I would use it as a notion of movement. For example:

  • We are putting ourselves more into a risk-like situation by moving into a high-crime-rate neighbourhood
  • We are putting ourselves and others more into a risk-like situation by speeding in a school zone

Risking more means increasing the likelihood of exposure to an event, and this we can often understand and acknowledge beforehand (a small sketch after the list below illustrates the point). When rare events happen, and they do, it is because:

  • They are random (lottery like occurrence). S#%& really does happen
  • They occur in fixed locations (rare diseases in the jungle). Going into these environs will knowingly/unknowingly increase our chances of exposure.
  • There is a cause and effect relationship that we don’t understand (at least beforehand). We can’t anticipate the increasing possibility of an event, such as in our earlier flooding example, because we don’t know why the flooding occurs in the first place. These are non-random events in the sense that we can increase our understanding such that we will never be impacted by such events, or can at least mitigate their impact.

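To put a toy number on the “risking more” idea above (the rate here is invented for illustration only): nothing about any single exposure changes; what changes is how often we stand in front of the event.

```python
# Illustrative only: an assumed base rate of 2 incidents per 1,000 exposures
# to some activity; "risking more" = exposing ourselves more often.
incidents_per_1000_exposures = 2

for exposures_per_year in (5, 50, 500):
    expected = exposures_per_year * incidents_per_1000_exposures / 1000
    print(f"{exposures_per_year} exposures/year -> expected incidents per year: {expected:.2f}")

# 0.01, 0.10, 1.00: the per-exposure odds stay fixed, but the choice to
# repeat the behaviour steadily raises what we should expect overall.
```
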
So for me, we are almost always dealing with some form of uncertainty, and thinking of it as such is important as it can influence how we manage our exposure to it. This will be explored in subsequent posts.

Mark
