# Puzzle 5 – Decision Making

In this mindware strategy tutorial you will learn about:

**Components of a rational decision**

**How to compute expected utilities to make rational decisions**

**How to identify cognitive biases in decision-making**

**Basics of Rational Decision Making**

In a practical sense, a rational decision is one that selects the best (optimal) available course of action in the face of uncertain outcomes.

Cognitive scientists and economists have a ‘conceptual framework’ for understanding how to make good decisions. Here are some of the basics of this framework:

**Actions or Options**

First, making a decision requires a choice, and different choices involve different actions or options: marry or not marry, book an activity holiday in South Africa or a beach holiday in Cyprus, start making a coconut curry or – alternatively – a spaghetti bolognese, take this bet or that bet.

**Outcomes**

Second, the action/option chosen in making a decision results in an *outcome*. If we choose one job over another (decide to accept job X rather than job Y), the action of signing a contract has real consequences in terms of income, location, career development, and so on. There is usually some uncertainty in the outcomes.

**Risk and Uncertainty**

Decisions typically occur in conditions of *uncertainty* – where the outcomes are not entirely predictable, and where an action/option taken may involve risk – the chance of loss. The outcomes of decisions are rarely entirely certain, and it is important to factor this uncertainty into the decisions you make.

**Gains vs Losses – *Utilities***

There are *gains* and *costs* associated with the outcomes of the decisions you make, called the *utility* of those outcomes. Gains are what you desire or value; costs are the opposite. Gains include fun or pleasure, monetary or material gains, rewarding relationships, recognition, status, achievement, new skills and so on. Costs include physical pain, loss of face, financial hardship, stress, unwanted effort or time.

**Expected Utility Theory**

According to expected utility theory, to choose optimally you must multiply the potential utilities (gains or losses) of different courses of action by the probabilities that those actions will lead to those utilities. By doing this we can calculate the ‘expected utilities’ of different actions or options.
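As a minimal sketch, the calculation above can be written as a short Python function (the name `expected_utility` and the (probability, utility) pair format are illustrative choices, not part of the tutorial):

```python
def expected_utility(outcomes):
    """Sum the probability-weighted utilities of an action's possible outcomes.

    `outcomes` is a list of (probability, utility) pairs covering all of
    the action's possible outcomes.
    """
    return sum(p * u for p, u in outcomes)

# A 10% chance of winning $1000 (and a 90% chance of winning nothing):
print(expected_utility([(0.10, 1000), (0.90, 0)]))  # 100.0
```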

Which bet do you choose?

**A. A 10% chance of winning $1000.**

**B. A 50% chance of winning $50.**

**C. Either A or B – both are as good as each other.**

To calculate the expected utility of A we multiply 0.10 x $1000 = $100.

To calculate the expected utility of B we multiply 0.50 x $50 = $25.

So, the rational answer is A! Your System 1 (intuitive) thinking might not agree with this, but objectively, this is the better decision.

_____

How about this one?

Which bet do you choose?

**A. A 10% chance of winning $500.**

**B. A 50% chance of winning $100.**

**C. Either A or B – both are as good as each other.**

0.10 x $500 = $50

0.50 x $100 = $50

So the rational answer is C! Both bets have the same utility – whatever your ‘System 1’ intuitive thinking tells you.
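Both bet puzzles reduce to the same probability-times-payoff arithmetic. A quick check in Python (variable names are illustrative):

```python
# First puzzle: A's expected utility beats B's.
bet_a = 0.10 * 1000   # $100
bet_b = 0.50 * 50     # $25
assert bet_a > bet_b  # rational answer: A

# Second puzzle: the two expected utilities tie.
bet_a2 = 0.10 * 500   # $50
bet_b2 = 0.50 * 100   # $50
assert abs(bet_a2 - bet_b2) < 1e-9  # rational answer: C (either)
```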

**Mindware Strategy Tip: the ‘Maximum Utility’ Decision Rule**

When you can estimate probabilities and utilities of different options or actions –

**Choose the action/option with the highest (summed) expected utility.**
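Assuming you can list each option’s (probability, utility) pairs, this rule amounts to taking the option with the largest summed expected utility. A sketch (the `best_option` name and dictionary layout are illustrative):

```python
def best_option(options):
    """Return the option name whose summed expected utility is highest.

    `options` maps each option name to a list of (probability, utility) pairs.
    """
    return max(options, key=lambda name: sum(p * u for p, u in options[name]))

# The first bet puzzle from earlier in this tutorial:
bets = {
    "A": [(0.10, 1000)],  # expected utility $100
    "B": [(0.50, 50)],    # expected utility $25
}
print(best_option(bets))  # A
```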

**Risk-Averse For Gains, Risk-Seeking For Losses**

Which do you choose?

**A. Gamble 1: On the toss of a coin you win $10 if you get heads or win $50 if you get tails.**

**B. Gamble 2: You will get $30 for certain.**

**C. Either Gamble 1 or Gamble 2. They are equivalent.**

Most people in this situation instinctively choose Gamble 2 (B) – the certain $30.

That is, we show *risk-aversion* for the possibility of getting less than we know we can get for certain. (It’s as if there is a mental comparison between the certain $30 and the possible $10.) This is called being ‘risk-averse for gains’.

But now what about this choice?

**A. Gamble 1: On the toss of a coin, you lose $10 if you get heads or lose $50 if you get tails.**

**B. Gamble 2: You lose $30 for certain.**

**C. Either Gamble 1 or Gamble 2. They are equivalent.**

Most people given this choice instinctively choose Gamble 1 (A).

That is, we are *risk-seeking* for the possibility of reducing our losses when we know that we will otherwise have a bigger loss for certain. This is called ‘risk-seeking for losses’.

**Overcoming the Bias**

But wait! These two choices are actually identical in terms of expected utilities – only the signs differ. If we choose Gamble 2 in the first choice, we should choose it in the second: you contradict yourself if you choose Gamble 2 in the first and Gamble 1 in the second (or vice versa).

We need a rational override here! Using our working memory we can apply expected utility theory.

In the first choice above the expected utilities are:

**A. (0.50 x $10) + (0.50 x $50) = $5 + $25 = $30**

**B. 1.00 x $30 = $30**

The same expected utilities for both options! So the correct answer should be C.

But now let’s look at the second choice above.

**A. (0.50 x -$10) + (0.50 x -$50) = -$5 - $25 = -$30**

**B. 1.00 x -$30 = -$30**

Exactly the same expected utilities! – the only difference is the sign.

Here again of course, you should choose answer C.
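The override is plain arithmetic, and both framings can be checked side by side (a minimal sketch; variable names are illustrative):

```python
# Gain framing: coin-toss gamble vs a certain $30.
gamble_gain = 0.50 * 10 + 0.50 * 50    # $5 + $25 = $30
assert gamble_gain == 30               # equals the certain $30 -> answer C

# Loss framing: the same magnitudes with the sign flipped.
gamble_loss = 0.50 * -10 + 0.50 * -50  # -$5 - $25 = -$30
assert gamble_loss == -30              # equals the certain -$30 -> answer C again
```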

To be rational, it is critical that we are consistent across both ‘gain’ scenarios and ‘loss’ scenarios.