By Tom Parker
Tom Parker is one of the UK’s more active and experienced white water coaches. He is a Level 5 Coach, British Canoeing National Trainer and Rescue 3 International Instructor Trainer. To find out more about the courses he runs, head to www.tomparkercoaching.co.uk

The human factor #2

How flawed thinking leads to danger on the river.

Welcome, once again, to the Human Factor, a series of articles about our decision making processes in white water and how they can lead to incidents. In the first issue, we explored the idea, put forward by Kahneman, that we have two broad approaches to decision making: the intuitive, rapid route he terms System 1 and the slower, more analytical approach he terms System 2.

Both of these systems have their place in our decision making, but problems arise if we use an intuitive approach whilst believing we are being analytical. This tends to happen in the form of biases that affect our thinking without us realising. These biases creep in, in part, because taking an analytical approach to every decision is a big effort. The biases lessen the perception of effort, but at the cost of flaws in the outcome.

Photo: Morriston gun barrel (Tom Parker)

Anchoring – having a reference point

I’d like you to conduct a quick experiment. Take an ordinary die. Roll it and multiply the result by ten. Four would become forty, for example. This represents a percentage. Write it down. Now consider these questions:

Is the percentage of African nations with UN membership higher or lower than the number you just wrote?

What is your best guess of the percentage of African nations with UN membership?

I’m going to make an assumption here, but it is based on a considerable amount of prior experimental evidence. Your best guess will be relatively close to the percentage represented by the die roll, even though that bears no relation to the percentage of states in Africa with UN membership. This is a strong example of the Anchoring effect. It occurs when people consider an answer to a question before analysing relevant data.

The result of the initial consideration often becomes a reference point that subsequent answers stay in proximity to. A classic everyday example is in sale pricing. The item is often advertised as, “Was however much, now only this much.” We compare the new piece of information favourably against the original, regardless of how affordable it is. Whilst the Anchoring effect is particularly prevalent in questions involving numbers, it can occur with broader concepts and beliefs.

A good contextual example that comes to mind is river gauge information. It’s a valuable tool to help us plan but we have to be aware of potential flaws and inconsistencies, rather than taking it as concrete. We need to consider gauge location relative to the sections we want to paddle and how regularly the gauge updates its readings. We need to treat the information as a guide but be critical of it in order to minimise the chance of being anchored to one viewpoint that could restrict or misguide our decisions.

Confirmation bias – nobody likes being wrong…

As humans, once we develop a concept, viewpoint or plan, we like to stick to it, regardless of new pieces of information that would challenge its validity. In fact, we subconsciously undermine or avoid information that could challenge the original idea, while seeking out information that reinforces it, regardless of how valid that new information is.

A simple, prevalent and dangerous example of this is political ideology and social media. I’m sure that we all hold a variety of political viewpoints. However, how many of us surround ourselves, tangibly or through social media, with people who support that ideology, creating something of an echo chamber? This can mean that we never analyse or challenge our own views, potentially leading to conflict.


Devil’s advocate

A river example centres on planning. A group could develop a plan for the day, then fall into the trap of seeking information as the day progresses to lend credit to that plan, regardless of how valid the information is. This can be very tricky to combat. Dr Sara Boilen, a clinical psychologist and backcountry skier/avalanche educator based in Montana, advocates having an experienced member of the team play devil’s advocate for the trip, trying to find valid reasons against the plan.

As we can see, our willingness to anchor to a particular plan, piece of information or concept, followed by our tendency to favour information that supports that original idea regardless of its validity, can cause us problems in a dynamic, adventurous environment where hazards change regularly. Challenging this mindset is tricky, as it requires analysis of plans and information.

“Why?”

Thinking back to the last issue, we know that this will require the use of System 2, the consciously analytical approach, which takes quite a lot of effort. Creating a questioning ethos within the group, where people are empowered to ask, “Why?” can be a powerful defence against these biases. However, this approach also has weaknesses, as we shall explore next issue…