By Tom Parker
Tom Parker is one of the UK’s more active and experienced white water coaches. He is a Level 5 Coach, British Canoeing National Trainer and Rescue 3 International Instructor Trainer. To find out more about the courses he runs, head to www.tomparkercoaching.co.uk
The human factor #4
How flawed thinking leads to danger on the river
Welcome to the final part of The Human Factor, a short series of articles looking at how subconscious bias can affect our decision making on the river. So far, we’ve identified our capacity for conscious and subconscious decision making and the pros and cons of each approach. We’ve identified many subconscious biases that can lead to us making poor decisions without even realising it, and we’ve looked at a simple series of questions that shares the mental load amongst the group when making decisions on the river, helping to mitigate these biases. So, that must mean everything is all sorted, correct? If only that were true. It turns out that there are a number of biases that can reduce the effectiveness of that questioning approach within the group…
“We’ve had enough of experts…”
Humans are social creatures, and part of that is a tendency to form hierarchies within our groups. In our world, those hierarchies quite often tend to be based around perceived expertise, experience and confidence. This can massively inhibit the open, questioning ethos required for the collaborative decision making process to work. The main bias at work here is the association of expertise with infallibility. Those who perceive themselves to be less expert within the group feel less able to ask questions or point out perceived problems, either because they feel they lack the knowledge required to challenge actions or because they feel that the ‘experts’ within the group must have a solid plan for their actions.
This is often referred to as the Expert Halo. Clearly, however, experts are still human, prone to bias and with a finite amount of decision making and analytical capacity, particularly in dynamic, risky environments. Surgery and aviation are both fields that have struggled with this issue in the past, with fatal results, and millions of pounds have been invested in developing approaches to help mitigate the effects of the Expert Halo.
These approaches largely centre around fostering and reinforcing a team-based culture and empowering anyone present to call a stop. Hard as it can be to remember to build this into what we do, those of us with more experience need to actively encourage those with less experience to challenge our suggestions. We will struggle to remove hierarchies, but we can mitigate their effects.
The In Crowd…
We all like to feel accepted within a social group, right? No one really relishes the feeling of being an outsider. This need for acceptance is a powerful bias on our decision making and our willingness to question what is happening. It makes us less likely to question a course of action if we feel that our analysis will adversely affect the rest of the team’s experience, leading to them ostracising us. No one wants to be seen as a ‘mood hoover’, so we keep quiet, despite feeling uneasy about the choices the group is making.
This acceptance bias doesn’t purely occur within tangible groups on the water. People’s decisions are also biased by digital acceptance – the quest for that shot or footage that will get you lots of social media likes can lead to all sorts of dodgy situations. Once again, it’s tricky, but we have to be happy to question each other’s suggestions with purpose and support.
Just because nothing goes wrong doesn’t mean what you were doing was right…
We’ve considered these biases individually but, in reality, they all mix together and compound each other, which makes self-awareness, awareness of our mates and a willingness to ask questions all the more important if we are to avoid these biases tripping us up.
We also have to consider the fact that our decisions may be riddled with subconscious bias and yet nothing goes wrong. That doesn’t mean our performance was solid; it just means that we were lucky. We have to take a retrospective questioning approach too, asking ourselves after the trip why we did what we did and being honest about the answers. Once we admit that we are biased in our thinking, we become more open to people pointing out our bias.
To conclude…
This may all sound very gloomy, an insurmountable problem. It’s not meant to, believe me. We won’t ever rid our thinking of subconscious bias. However, if we admit that we are biased and understand the nature of those biases, we can minimise their effects. By understanding our motivations, gathering information ahead of the trip and sharing ideas with our mates before, during and after our time on the water, we can stop these biases from spiralling out of control, or at least prevent them having such a big effect in the future…