May 17, 2010

When to trust your gut

Alden Hayashi, Harvard Business Review

Summary:

Many decision situations do not lend themselves to quantitative analysis. For one thing, the situation may be so complex that quantitative analysis simply cannot be applied. Examples include public relations, hiring decisions, research, marketing, and strategy.

In other cases there just is not enough data to perform quantitative analysis.

Even if data could eventually become available, there are times when decisions have to be made quickly, or else the opportunity is gone. There is no time to gather and analyse data in a systematic and rational manner. Situations like this can be expected to become more common in today’s increasingly turbulent and globalized economy, where things can change at the drop of a hat.

Executives in the strategic positions of organisations often face these types of situations. They have to rely on gut instinct to make their decisions. Although in some cases they are provided with the results of quantitative analysis, the numbers are often biased to show why something is a good thing. For example, the analyses supporting mergers and acquisitions often show why the merger would succeed (from a quantitative point of view). The executives have to rely on their instinct to tell them why it might not work.

The question for a decision maker, then, is twofold: how to tune in to your instincts, and how to tune those instincts.

Executives and researchers have found that you need to let your subconscious knowledge emerge and connect with your conscious knowledge. This can be done through meditative activities such as driving, day-dreaming, showering, and so on – it all depends on what works for you.

Our emotions assist in the decision-making process by filtering out patterns that do not apply and by emphasising patterns that do. In a sense, our emotions sort out and shortlist the considerations that the rational part of our brain can work with. When making decisions, be aware of your emotions and take them into consideration.

Gut instinct is simply based on rules and patterns we have within our subconscious. Some patterns may be built-in (true instincts).  Some are acquired through experience.

The quality of our gut instinct depends on the number of patterns our subconscious stores, the variety of those patterns, and how well it is able to interconnect them. The number of patterns comes from our experiences; the variety comes from the variety of those experiences.

Instincts do not guarantee correct decisions. We need to continually self-assess our decisions and ‘train’ our instincts. We can do this by reviewing our past decisions and examining why they were wrong, or why they were right.

Finally, it is important not to fall in love with your original decision, but to keep flexible and adjust it as new information becomes available.

Contemporary Enterprise-Wide Risk Management Frameworks: A Comparative Analysis in a Strategic Perspective

Per Henriksen and Thomas Uhlenfeldt

Summary:

Many risk management frameworks claim to be holistic and ‘enterprise-wide’. Henriksen and Uhlenfeldt argue that for a risk management framework to be truly holistic and strategic, it must address the strategy creation process and not just the strategy implementation arena. It is in the strategy creation process that many strategic risks are created. Hence, an enterprise-wide risk management system that does not lend itself to use in the strategy creation process falls short of the mark.

The authors investigate 4 ERM frameworks that claim to be holistic: DeLoach EWRM, COSO ERM, FERMA (a precursor to the current IRM Risk Management Standard), and AS/NZS 4360:2004.  Their study reveals that while these frameworks claim to be applicable at the strategic level, they fall short of providing actionable guidance on how risk management can be performed concurrently with the strategic processes.

A key weakness lies in the frameworks’ treatment of consolidating, prioritizing, and communicating key risks.  The very point of ERM is to consolidate the key risks faced by the organisation so that it can allocate scarce resources most effectively. The frameworks provide little, if any, guidance on how this consolidation, prioritisation, and organisational communication can be done.

The frameworks also acknowledge that risks can result in positive opportunities for the organisation, but provide little guidance on how to take advantage of this. Since the frameworks are not integrated with the strategy creation process (where the biggest opportunities can be identified and seized), their take on positive risks is not that helpful. The authors recognise that in the real world, preventing losses is the focus of management, and identifying opportunities is generally the remit of strategy.

Hence, while risk management in theory helps in identifying and seizing opportunities, this is seldom done in practice. The process steps of the frameworks are still heavily slanted toward negative risks.

The frameworks add some value in that they pave the way for common risk language and processes across an organisation.

Apr 17, 2010

7 Deadly Sins – Illusory Correlation

Or ‘magical thinking’, as Massimo Piattelli-Palmarini calls it. This is about perceiving positive correlations even though the supporting data is weak. Sometimes we notice only data that supports our hypothesis and ignore data that doesn’t.

An example of magical thinking goes like this. We come across a few people who exhibit a certain symptom and also a certain illness, and we associate that symptom with the illness, such that if we see that symptom, then we decide that the illness is also present.

You see someone with red spots, and you diagnose measles.

We forget that sometimes the same symptom appears for a different illness, or that the illness is present without that symptom.
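The flaw becomes concrete with a little arithmetic. Here is a minimal sketch, with entirely made-up counts for illustration, that shows how weak the symptom can be as evidence once you count the cases you normally ignore:

```python
# Hypothetical counts for 1,000 cases -- invented numbers, for illustration only.
symptom_and_illness = 30    # red spots AND measles (the cases we remember)
symptom_no_illness = 120    # red spots from other causes (the cases we ignore)
no_symptom_illness = 10     # measles without visible spots
neither = 840

# Probability that the illness is actually present, given the symptom.
p_illness_given_symptom = symptom_and_illness / (symptom_and_illness + symptom_no_illness)
print(round(p_illness_given_symptom, 2))  # 0.2 -- the symptom is weak evidence
```

The salient co-occurrences (30 cases) drive the magical thinking, while the 120 cases of the symptom without the illness, which make the correlation weak, go unnoticed.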

Apr 13, 2010

7 Deadly Sins – Overconfidence

Massimo Piattelli-Palmarini writes in his deliciously written book “Inevitable Illusions” about the 7 deadly sins of our cognitive illusions.

His first sin is overconfidence. This is where we feel certain about our knowledge of something, but our knowledge does not really warrant such confidence.

He describes experiments where subjects are asked to answer questions and then rate how confident they are about each answer. The experiments show that our confidence runs ahead of our knowledge.

We think we know something more than we really know.

The results of the experiments also reveal something sobering: we are most overconfident in the areas we are most knowledgeable about. That is, the gap between our confidence and our actual knowledge is bigger in these areas than in others; hence we tend to make mistakes of overconfidence in our areas of expertise.
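The calibration gap these experiments measure can be sketched in a few lines. The confidence ratings and outcomes below are invented, purely to illustrate the setup:

```python
# Each entry: (stated confidence, whether the answer was actually correct).
# All values are invented, just to illustrate the calibration measurement.
answers = [
    (0.9, True), (0.9, False), (0.8, True), (0.95, False),
    (0.7, True), (0.85, False), (0.9, True), (0.8, False),
]

avg_confidence = sum(conf for conf, _ in answers) / len(answers)
accuracy = sum(correct for _, correct in answers) / len(answers)

# Overconfidence = how far stated certainty runs ahead of actual performance.
overconfidence = avg_confidence - accuracy
print(f"confidence={avg_confidence:.2f} accuracy={accuracy:.2f} gap={overconfidence:.2f}")
```

A well-calibrated subject would show a gap near zero; the experiments Piattelli-Palmarini describes consistently find a positive gap.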

Apr 2, 2010

On Issues Versus Risks

Whenever you find yourself in an introductory presentation on risk management, you can expect to hear a question like: “What’s the difference between an issue and a risk?” The expected answer always seems to be: “A risk is something that may or may not happen, while an issue is something that has already happened.”

Correct enough, but this description falls short of conveying any relationship between the two.

Here’s one I coined, I like, and plan to use and re-use: “Issues are the risks you failed to manage, now come to haunt you.”

The sentence makes clear that many of the issues that you face could have been mitigated if only you had done proper risk management.  The assertion is not always true of course.  Some issues just come from unpredictable circumstances, and no risk management is that perfect.  So surely,  there are exceptions, but the strong assertion of the sentence emphasises just that – that exceptions are the exception.

I believe I originally picked up this relationship from Bill Duncan.  A few years ago he quoted someone he knew who said that in a good risk management process, all the issues that arise will have been previously identified in the risk register.  So it’s not my original idea, but I like the “now come to haunt you” bit, which is mine.

Mar 4, 2010

Pinpointing the Risk

"It is important to correctly identify the cause from the risk", said the presenter of a risk management process overview.
 
I hadn't given much thought to the distinction between the two, and simply implicitly assumed that I knew which was which. But when I tried to articulate how to distinguish between a cause and a risk, I felt stuck. After all, they both seemed to be links in a chain of event and consequence.
 
Ignoring for the meantime that each event E can be a consequence of any number of events, and that E itself can cause any number of consequences, it is clear that from one point of view, an event E2 can be a consequence of an event E1. Similarly, event E3 can be a consequence of event E2. So a specific event is both a cause and a consequence.
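The point that an interior event plays both roles can be made explicit with a tiny sketch (the event names E1, E2, E3 are just the placeholders used above):

```python
# A simple causal chain: each event causes the next one.
chain = ["E1", "E2", "E3"]

# Record, for each event, whether it acts as a consequence (has a
# predecessor that caused it) and/or as a cause (has a successor it causes).
roles = {}
for i, event in enumerate(chain):
    roles[event] = {
        "is_consequence": i > 0,
        "is_cause": i < len(chain) - 1,
    }

print(roles["E2"])  # E2 is simultaneously a cause and a consequence
```

Only the endpoints of the chain have a single role; everything in between is both, which is exactly why "cause" and "risk" feel interchangeable until an objective fixes the point of view.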
 
For example, let us suppose we are concerned about the risks our property is facing.
 
Risk: Fire
Cause: Faulty electrical wiring
Consequence: House burns down
 
In this case, we put "Fire" as a risk in our risk register.
 
But what about "Faulty electrical wiring"?  Isn't it a risk as well?
 
Risk: Faulty electrical wiring
Cause: substandard workmanship
Consequence: Fire, leading to house burning down.
 
So should Faulty electrical wiring then be in the risk register?
 
Kik Piney reminded me that it is essential to be clear first about the objectives when going about identifying risks.  Having just studied ISO 31000:2009, I am aware of the relationship between objectives and risk, but for some reason I left it out.  (I am not too sure about being clear first about objectives before going about identifying risks, because sometimes noticing potential areas where things can go wrong will actually help you know what your objectives are).
 
Now suppose we have decided that our objective is "to protect our property".  In this case, it is clear that the risk is fire:
 
Objective: Protect property
Risk: Fire
Risk: Repossession
Risk: Loss of property due to plane falling on property
Risk: Loss of property due to earthquake
 
"Faulty electrical wiring" is not a risk. Either the property has faulty wiring or it does not.
 
If the objective instead is 'Acquire a problem-free property', then 'faulty electrical wiring' is a risk. The property we are considering acquiring 'may or may not' have this characteristic.
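One way to capture this objective-relativity is a register keyed by objective. The sketch below is a hypothetical structure, not any standard's format, using the two scenarios from this post:

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    objective: str    # the objective the risk threatens
    risk: str         # the uncertain event
    cause: str        # what could give rise to it
    consequence: str  # the impact on the objective

register = [
    RiskEntry("Protect property", "Fire",
              "Faulty electrical wiring", "House burns down"),
    RiskEntry("Acquire a problem-free property", "Faulty electrical wiring",
              "Substandard workmanship", "Fire, leading to house burning down"),
]

# "Faulty electrical wiring" appears as a cause under one objective and as a
# risk under another -- the stated objective decides which role it plays.
for entry in register:
    print(f"[{entry.objective}] risk: {entry.risk} (cause: {entry.cause})")
```

The same string occupies the cause slot in one entry and the risk slot in the other, which is the whole point: the register makes no sense until each row names its objective.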
 
Final point: always relate risks to objectives.  Nothing new here. Just a reminder.

Mar 1, 2010

Project Success

Bill Duncan comments on the definition of project success (link) and touches on the different dimensions beyond merely completing the project 'on time'.  His thoughts sparked a few thoughts as well.

Success can be defined in several dimensions. The more success criteria defined, the greater the chance that they will conflict with each other. Invariably, there will be 'success criteria creep.' A ranking system may be of use to order the success criteria according to importance, to provide guidance whenever conflicts arise. For example, while it may be deemed important to achieve each major milestone according to schedule, is that more important than completing the whole project on time? And is completing the project on time more important than meeting a specified project cost?

Other questions to help rank the success criteria might include:

  • What are the consequences of not meeting this success criterion?
  • Are we prepared to spend more in order to meet this success criterion (otherwise it is just a nice-to-have)? If so, how much?
  • Is it acceptable to fail to meet one success criterion in order to achieve another?
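A simple way to make such a ranking operational is to give each criterion a priority and resolve conflicts in favour of the higher-ranked one. The criteria and ranks below are invented for illustration, echoing the milestone/schedule/cost example above:

```python
# Success criteria ranked by importance (1 = most important).
# The criteria and their ranks are hypothetical, just to show the mechanism.
criteria_rank = {
    "complete project on time": 1,
    "meet specified project cost": 2,
    "hit each major milestone on schedule": 3,
}

def resolve_conflict(a: str, b: str) -> str:
    """When two criteria conflict, favour the higher-ranked (lower number) one."""
    return a if criteria_rank[a] < criteria_rank[b] else b

winner = resolve_conflict("hit each major milestone on schedule",
                          "complete project on time")
print(winner)  # complete project on time
```

The point is not the code but the discipline: once the ranking is agreed in advance, a conflict between two criteria has a predetermined answer instead of an argument.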

 

Giving this a little more thought, I find a relationship between project requirements and success criteria: why did we define success this way and not that way? The answer lies in the requirements. We defined this as a success criterion because it is important. It is important because <project requirement>. A simple example: success criterion A: the stadium is ready for use by March 11, 2011. Why? Because a large event is going to use it on March 25, 2011. Failing to make the stadium available by March 11 means failing to hold the event.

ChatGPT Prompt Engineering for Developers

The company DeepLearning.AI offers a free online course called "ChatGPT Prompt Engineering for Developers" from Coursera. Large L...