Can we live in a setting free of uncertainty and risk? No! At best, I suggest, we can choose the sets of uncertainties and risks we are more likely to face and are more comfortable living with.
The concept of risk is one where we have some sense of the likelihood that an event will occur and of the impacts it will have. Uncertainty removes our understanding of the likelihood, and/or the impacts, and/or even our ability to envisage the events themselves. I also suggest that we face uncertainty-like situations far more frequently than risk-like ones.
It should be understood that there are different concepts of uncertainty than the one suggested above. In financial circles, uncertainty is viewed as variability. In physics, uncertainty refers to the principle that we cannot know both where a subatomic particle is and, at the same time, its momentum.
One of the intriguing implications (actually assumptions) about uncertainty and risk (U&R) is that someone cares. If an event occurs and no one is meaningfully impacted, then we (or anyone else) would not consider it a U&R circumstance. What I mean by this is that we have no impetus to act upon the possibility of the event occurring simply because it does not affect us. So U&R is always personal. Hence what you care about may be something I don’t in the least, and vice versa.
So how do we respond to our U&R context? Several strategies come to mind:
- We can freeze and literally cease to function – this is death.
- We can choose to be reckless. “What the hell, we may as well have fun, because tomorrow we will die!”
- We can try to manage our exposure (typically to the impacts and likelihood) without understanding. This often takes the form of worshipping whimsical gods whom we take to be the cause of our plights, and of praying and making suitable sacrifices to appease them. The trait of superstition is present in most of us to some degree or other (highly random, reinforcing endeavours like sports are rife with almost humorous examples).
- We can try to manage our exposure by using science and associated engineering-like disciplines to better understand how and on what basis events will occur, thereby giving us some lead time to take protective action.
- We can buy some form of insurance that enables us to recover from an unfortunate event. This is a transference of impact cost onto someone else who is willing to bet we won’t have the experience.
- We can find a scapegoat/sucker to take the fall for us (e.g., send someone else to fight our war/grievance, get our grandchildren to pay for our overspending/undersaving ways, be faster than our hunting companion when an angry bear chases us both).
- We can organize our affairs to shape our exposure into more survivable forms (be big, be small, be diverse, be strong, be weak, be up high, be down low, be part of a large group, be part of no group, etc.).
The strategies for dealing with risk are different from those we would seriously consider for dealing with uncertainty. Because we can quantify risk, we can purchase cost-effective prevention or mitigation solutions. Insurance, security measures, choices of tires, and so on are all examples of such strategies.
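To make the idea of quantifying risk concrete, here is a minimal sketch of the classic likelihood-times-impact calculation applied to an insurance decision. All the figures (a 2% annual chance of a $50,000 loss, a $1,200 premium) are invented for illustration, not drawn from any real policy.

```python
# Expected-loss comparison: is insurance a cost-effective mitigation?
# All figures below are hypothetical, purely for illustration.

def expected_loss(probability: float, impact: float) -> float:
    """Risk quantified the classic way: likelihood times impact."""
    return probability * impact

annual_loss = expected_loss(probability=0.02, impact=50_000)
premium = 1_200  # hypothetical annual insurance premium

# A purely expected-value view compares premium to expected loss;
# risk aversion (an intolerable one-time $50,000 hit) may still
# justify paying a premium above the expected loss.
print(f"expected annual loss: ${annual_loss:,.0f}")
print(f"premium:              ${premium:,.0f}")
print("buy insurance" if premium < annual_loss else "self-insure (on expectation)")
```

The point of the sketch is that risk, unlike uncertainty, gives us numbers to compare; once the comparison is possible, shopping for cost-effective mitigation becomes an ordinary purchasing decision.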
Strategies for dealing with uncertainty have to address two levels of possible exposure: events we can envisage and events we cannot.
Uncertainties where we can envisage the event are ones where we can try to get a “heads up” on the likelihood. This is the field of leading indicators. Tire pressure monitors tell us when we are approaching a tire-failure circumstance. Monitoring the hydrostatic pressures in a hillside will hopefully tell us when we are more likely to experience a slide. Changes in enrolments in certain education programs indicate we may face shortages of necessary talent in the future. Storms in Latin America may give us the heads up to buy our coffee in bulk before shortages (and attendant price increases) begin to appear on store shelves.
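The tire-pressure example above can be sketched as a minimal leading-indicator monitor: a reading is compared against a threshold, and crossing the threshold buys us lead time to act before the event itself. The threshold and the sequence of readings here are invented for illustration.

```python
# A minimal leading-indicator monitor, sketched after the
# tire-pressure example. Threshold and readings are hypothetical.

LOW_PRESSURE_PSI = 28.0  # assumed warning threshold

def check_indicator(reading: float, threshold: float = LOW_PRESSURE_PSI) -> str:
    """Warn when the leading indicator crosses its threshold,
    giving lead time to act before the event (a flat tire) occurs."""
    return "warn: take protective action" if reading < threshold else "ok"

# Readings drifting downward over time, as a slow leak might produce.
readings = [33.1, 31.5, 29.8, 27.2]
statuses = [check_indicator(r) for r in readings]
print(statuses)
```

The same shape fits the other examples: hydrostatic pressure, enrolment counts, and commodity-weather signals are all readings watched against some threshold that stands in for "the event is getting likely."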
The role of science and science-like examinations is critical to these aforementioned strategies. Developing an understanding of what causes events to occur gives us insight into how to monitor for their emerging likelihood. In areas where strict scientific processes seem inapplicable or more problematic, we can build our theories of causality using more intuitive and sometimes stochastic (where we see and look for correlations) approaches. Note, I consider intuition in a more operational sense, in that it is based on our ability to act appropriately on subtle and nuanced environmental signals.
One intriguing area of thought is to use “systems thinking”. The challenge here is always determining your system’s boundary. For example, we can use radar/sonar to help us detect incoming hostile warplanes/submarines. The system here assumes that electrical/sound impulses will reflect back to our receivers. What if new technology enables stealth fighters/marine craft? Our detection systems will no longer help us (in fact, they may make the situation worse if we rely on them too much). Any system by its nature has a boundary between what is pertinent and what we need not concern ourselves with. An uncertainty-like event in these situations is one that happens when our detection system did not anticipate it, i.e., total surprise.
Can we make our systems-thinking approach very open and extensive? Sure, but you then create the sets of U&R that a lack of timeliness in response permits. Sorry folks, we are always caught in the dilemma of choosing what is worthy of our attention versus what isn’t, and then learning we were incorrect.
In some fields, like marketing, there is a practice where those involved ask themselves where their approaches seem to fail. Paying attention to anomalies is useful; however, it is a hard discipline to keep at. When something is working very well for us, we easily dismiss “outliers” as noise or as the normal anomalies and variabilities of life. For example, our marketing plan has met and exceeded all our goals, but there are still people out there who are not yet our customers. Do we even care? Why would we? The ability to take minimal “contrary” data and elevate it into serious questioning of our success is not likely to happen frequently.
I am going to close by elevating this discussion up a conceptual notch. Dealing with the U in U&R is a practice of theory building and hypothesis testing. Because we can never know everything (even about anything), we make astute, even educated, guesses about why something works the way it does. But remember, a theory (a business model, for example, is such a beast) is a simplistic portrait of how we think something works. Almost by definition, the theory will be wrong, or will become evidently inadequate in some fashion or other. A prudent person knows this, believes this, and lives this. How? They are exceedingly open to evidence that conflicts with their theory (world view) and take the contrariness seriously.
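One way to sketch "taking the contrariness seriously" is to treat confidence in a theory as a probability that each anomalous observation revises via Bayes' rule, rather than as a fixed conviction. The prior (95% confidence) and the likelihoods (an anomaly is three times more probable if the theory is wrong) are invented numbers, purely to show the mechanism.

```python
# A sketch of revising belief in a theory (e.g., a business model)
# as contrary evidence accumulates. All numbers are hypothetical.

def update(prior: float, p_obs_if_true: float, p_obs_if_false: float) -> float:
    """Posterior P(theory | observation) via Bayes' rule."""
    numerator = p_obs_if_true * prior
    return numerator / (numerator + p_obs_if_false * (1 - prior))

confidence = 0.95  # we start very sure our theory is right
# Assume each anomaly is 3x more likely if the theory is wrong.
for _ in range(4):  # four "outliers" we might otherwise dismiss as noise
    confidence = update(confidence, p_obs_if_true=0.1, p_obs_if_false=0.3)
    print(f"confidence after anomaly: {confidence:.2f}")
```

A handful of anomalies, honestly accounted for, is enough to pull a 95% conviction below even odds; dismissing each one as noise keeps the conviction untouched, which is precisely the trap the marketing example describes.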
As for the unforeseeable U in U&R: are you praying to the right deity or deities? I am not being facetious or cynical. If you are uncomfortable with knowing you don’t know, and you don’t know how to better “manage” your exposure, then you had better find someone or something that seems to have the answer.