We are a business culture that tends to appreciate putting quantified values on things. Where uncertainty is concerned, putting a value on probability or likelihood moves us from uncertainty to risk. I would like to explore the limits of this effort by building on a recent article I read.
The article is called “Framing it with Framingham” by Alan Cassels, in the January 2011 edition of Common Ground. Cassels is a drug policy researcher at the University of Victoria.
Cassels examines the value of using medical risk calculators like the Framingham method, which estimates the risk of heart attack or stroke from factors such as age, smoking, diabetes, systolic blood pressure, and total cholesterol. What he found is that age and smoking are the significant risk factors; the others are of significantly lesser value. In other words, by paying attention to age and smoking alone, we have enough information to determine the risk of heart attacks and strokes.
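To make that point concrete, here is a purely illustrative sketch in Python. The weights below are invented for demonstration and are not the actual Framingham coefficients; the only point is that two cheap-to-know factors can dominate a score:

```python
# Toy illustration only: hypothetical weights, NOT the Framingham equations.
# The point: two factors you can know for free (age, smoking) can carry
# most of the signal in a multi-factor risk score.

def toy_risk_score(age, smoker, diabetes, systolic_bp, cholesterol):
    """Return a made-up risk score in [0, 1]."""
    score = 0.0
    score += 0.010 * max(age - 40, 0)           # age dominates
    score += 0.15 if smoker else 0.0            # smoking dominates
    score += 0.03 if diabetes else 0.0          # minor contributor
    score += 0.001 * max(systolic_bp - 120, 0)  # minor contributor
    score += 0.0002 * max(cholesterol - 200, 0) # minor contributor
    return min(score, 1.0)

# The full score requires lab work; the "cheap" score needs no tests at all.
full = toy_risk_score(age=65, smoker=True, diabetes=True,
                      systolic_bp=140, cholesterol=240)
cheap = toy_risk_score(age=65, smoker=True, diabetes=False,
                       systolic_bp=120, cholesterol=200)
print(f"full score:  {full:.3f}")
print(f"cheap score: {cheap:.3f}")  # most of the full score, at no test cost
```

With these made-up weights, knowing only age and smoking status already recovers most of the full score, which is the cost-of-information point made below.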
What is intriguing to me is that, in this case, we do not need any specific medical tests to determine whether we have these risk factors in our lives. The cost of the information can be very low.
Cassels discussed his findings with James McCormack, a doctor of pharmacy at UBC who teaches medical students the use of risk calculators.
McCormack sees value in doing these calculations only if they help people make the changes necessary to reduce their overall risk.
So the second intriguing point for me is this: will the additional information provide sufficient motivation to decide and act?
Cassels asked a couple of additional questions:
- “What’s the impact of medicine’s widespread enthusiasm for risk calculators?” (Answer: No one really knows; that study has not been done)
- “How do they measure up against no risk assessment?” (Answer: No one really knows; that study has not been done)
- “Are we sure we are using risk calculators in ways that aren’t harmful to people?” (Answer: No, we’re not)
This post is in no way an endorsement of forgoing medical tests or of taking drugs for low-risk factors in our lives. It is about the reasonable limits of seeking information when making “uncertainty to risk” decisions. By uncertainty to risk, I mean gaining a fuller appreciation of the likelihood or probability of an adverse event occurring.
There may be any number of reasons for seeking additional information:
- gives us a better sense of the need to take precautionary steps
- gives us a better sense of the cost of action/inaction
- gives us reason to delay making a decision (just one more study, test, royal commission, etc.)
- gives us a better sense about taking a series of risk mitigation measures (e.g., cancer – surgery, chemotherapy, radiation, hormone treatments) by indicating the additional level of protection/risk reduction by taking each subsequent step.
The thing to keep in mind is that additional information can be very expensive to obtain, can result in delays (which may be significant), and may not even change what you already know (the need to do something). Again, as McCormack indicated above, people differ in what they need to be assured of before they will act.
So what does this suggest to me?
- What one piece of information do you need to make up your mind? Is that information readily available? (Think of it as doing an 80/20 on the 80/20.)
- What second piece of information can we get that will provide strong confirmation of, or challenge to, the information above? This deals with situations like Type I and Type II errors in statistics.
- If we act upon the above, and we are wrong, what can we do?
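The Type I/Type II point above can be illustrated with a small simulation (my own hypothetical example, not from Cassels’s article): a noisy yes/no test applied once, versus requiring two independent positive readings before acting.

```python
# Hypothetical sketch: how a second, confirming piece of information
# shifts the two error types. A condition is truly present 20% of the
# time; each test reading is right 80% of the time.
import random

random.seed(42)
TRIALS = 100_000
P_CONDITION = 0.20   # base rate of the condition
P_CORRECT = 0.80     # accuracy of a single reading

def one_reading(truth):
    """Return the test's verdict, correct with probability P_CORRECT."""
    return truth if random.random() < P_CORRECT else not truth

false_pos_1 = false_pos_2 = 0   # Type I: acting when nothing is wrong
false_neg_1 = false_neg_2 = 0   # Type II: not acting when something is
for _ in range(TRIALS):
    truth = random.random() < P_CONDITION
    single = one_reading(truth)
    confirmed = one_reading(truth) and one_reading(truth)  # need both positive
    false_pos_1 += (single and not truth)
    false_neg_1 += (not single and truth)
    false_pos_2 += (confirmed and not truth)
    false_neg_2 += (not confirmed and truth)

print("single reading:  Type I", false_pos_1 / TRIALS, " Type II", false_neg_1 / TRIALS)
print("confirmed twice: Type I", false_pos_2 / TRIALS, " Type II", false_neg_2 / TRIALS)
```

Note the trade-off: demanding confirmation cuts the chance of acting needlessly (Type I) but raises the chance of missing a real problem (Type II). Which error is costlier depends on the decision at hand.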
I believe that, at heart, many of us in business delay making decisions because we are concerned about being wrong, thereby wasting scarce working capital and losing time.
As readers of this blog know, I subscribe to the view that we live in a universe that is rife with uncertainty. This means we will be wrong a considerable amount of the time. So what might we do with that assumption?
- Have a core strategy that you believe will meet most of your needs (e.g., 80%).
- Have two (or at least one) additional “in use” contingencies to cover the remaining need AND provide cushioning flexibility when the core strategy is insufficient. This deals with the likelihood that you could be wrong.
- Have a clear sense of what you should be watching for that provides an early indication (i.e., a leading indicator) that the universe is unfolding in unexpected ways, AND that you will respond to.
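As a back-of-envelope sketch of the arithmetic behind the first two points (the numbers are hypothetical): if the core strategy covers 80% of needs and each of two contingencies catches half of what still slips through, combined coverage climbs quickly.

```python
# Hypothetical numbers showing how contingencies compound on a core strategy.
core = 0.80          # core strategy covers 80% of needs
contingency = 0.50   # each contingency catches half of what remains

coverage = core
for _ in range(2):   # two "in use" contingencies
    coverage += (1 - coverage) * contingency
print(f"combined coverage: {coverage:.0%}")  # 80% -> 90% -> 95%
```

The residual 5% is what the third point, the leading indicators, is there to catch early.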
This third point is interesting for a number of reasons. As a species, we often seem to have a large capacity to ignore early symptoms of change or danger. We also have a capacity to minimize their significance even when we acknowledge the information exists (too early to tell, only a temporary blip, won’t really affect us, etc.). One of the exercises I do with clients is to ask the room, “How many of you have ever run out of gas while driving?” and then, “Was the fuel gauge working?” Even if no one in the room has ever suffered this embarrassment, they all know someone who has.
Living in a universe of uncertainty means being able to decide and act when you know very little about your circumstances. The key is determining whether you know at least enough to make a reasonable call. You will never know enough to be absolutely sure until the moment has passed. At that point you are no longer being strategic; you are being reactive.