In a previous blog (Uncertainty & Risk Management: Implications of not having an exit strategy in an all or nothing decision?) I explored the ramifications of pursuing a course of action without an exit strategy. I understand that there is a certain line of thinking in change management that we have to create a “burning platform” if we are to up the odds of change success. Today’s post explores the implications of learning you have made a mistake in terms of the new platform you have jumped onto. The Lemmings reference is the notion of jumping to certain death. The reference to uncertainty is the notion that if we irretrievably commit to a course of action, we are embracing all the uncertainties and risks that course puts in our path. What if the consequences are such that we recognize we have made a mistake?
Today’s post is inspired by a recent Maria Mathews article (“Calling all country doctors”) in the National Post. She explores the implications of a Healthcare Policy study showing the lack of success in retaining doctors in rural areas. The evidence is that we are incurring additional costs without deriving the additional benefits.
Although this is not a story of impending dire outcomes, it does illustrate a fundamental problem with many important decisions made by people, organizations, and governments: the failure to anticipate that a mistake might be made, and to put in place the wherewithal to detect it early enough to minimize its costs.
The fundamental problem with burning platforms AND having no Plan B is that you will in all likelihood have the Lemmings experience. Will this be death? Obviously that will depend on the issue. But even failure in one form or another can be severe. The irony is that these outcomes are in large part avoidable.
To my mind, the first, easiest, and most sensible action to take when we embark on an important course of action is to put in place the means for determining success. These include predictive or leading indicators as well as any real-time monitoring mechanisms. Leading indicators tell us there is something we should begin to pay close attention to; they are not predictions as such. The value of a leading indicator is that it gives us time to prepare and put in place useful changes so we don’t incur the consequences of a seemingly probable future. This is navigation, pure and simple. Navigation asks and answers three key questions:
- Are we still on track?
- Is there any evidence that we are heading into trouble?
- What responses make sense under the circumstances of the answers to questions 1 and 2?
Is it unreasonable to expect any important decision in life to include the capacity to navigate?
The second sensible action is to act on the navigational information. This is not an automatic given. Many of us find many ways to excuse, deny, or make light of information that casts doubt on the wisdom of our decisions and actions. I have used the example “Have any of us run out of fuel when the fuel gauge is working?” to humorously bring this lesson home to clients.
In my opinion, this failure to put leading-indicator mechanisms in place, coupled with an unwillingness to commit to change when evidence suggests we are not making appropriate headway, is all too common. The Maria Mathews article is an example of this issue.
Not having a Plan B is not to my taste when it comes to preparing for uncertainty and risk in important areas of my life. But let’s say it is the necessary thing to do. Should we then:
- Do so in blindness? This is what we do if we don’t have the capacity to ask and answer questions 1 and 2 above.
- Be stubborn, blithe, or inconsiderate (of other people’s money, in the case of social policy makers) and not change when there is evidence suggesting failure? This is a failure to deal with question 3 above.
No matter what we do in life, we face uncertainty and risks. Of the two, uncertainty is the more problematic, as it puts us in the way of events and circumstances we have not experienced before. Because we do not have much useful information about uncertainty-type events, we cannot deal with them the way we can with risks. What we can do is manage our exposure to the consequences of uncertainty. One approach is to moderate our exposure to one sort of uncertain event or another. A better approach is to know how to exploit the event and actually benefit from it. Nassim Taleb would call the latter being anti-fragile (the former being merely robust or resilient).
Having navigational tools and being predisposed to acting on their readings is an excellent way to manage uncertainty.