Human Nature
By Captain Karen Speight, Log Board member

Last year, while doing research for a course on human factors, I came across the wonderful, but terrifying, Cognitive Bias Codex (Google it), with around 190 known human biases. Wonderful in that it reassured me that everyone struggles with things I thought were my own personal failings; terrifying in that it reveals the true extent of every human's biased thinking and behaviour. That includes not just pilots, but everyone who works in safety industries: air traffic controllers, surgeons, dentists and all other people who have access to vulnerable, soft, squidgy parts of us. "Surely I don't have all of those biases?" I thought to myself. After all, I'm an all-round great gal, not easily swayed by false information, more self-aware than most, and pretty competent... well, darn me if I didn't just fall into the bias blind spot. Alexander Pope said: "To err is human", and how right he was! Our biases can be difficult to spot, awkward to acknowledge and downright impossible to remove.

Too much information

The bias blind spot is caused by the human mind taking in more information than it can consciously handle. We can't possibly be aware of all the data that comes into our brains in the average minute: there is simply too much of it. Much is lost, but many scientific experiments have shown that, even when people don't consciously remember things, they are still sub- or unconsciously influenced by them. That is where the secondary problem lies, with the bias blind spot. Even when we know that we are likely to judge ourselves as all-round great guys and gals, not easily swayed by false information, more self-aware and competent than others, we still do it, because we don't have conscious access to the parts of our brain that create this bias. It's a cruel twist of fate that, no matter how hard we try, we simply cannot be aware of the flaws in our judgements. Unconscious means unconscious: it's just how the brain is wired.
This could be a blessing in disguise. If we actually knew how biased, incompetent and downright wrong we were most of the time, we would find it too difficult to live with ourselves. It's much easier to spot the flawed thinking of others. If you still don't believe that we are unconsciously influenced by factors we don't even recognise (the introspection illusion), then consider the following experiment, in which shoppers were shown four identical ties or pairs of tights. People consistently chose one over the others and gave a rationale as to why, stating that they preferred the knit, weave or workmanship, and ignoring the only true differences: viewing order and positioning.

Help! The pilot has a blind spot

How does this affect us as pilots? An obvious example is when we sit in a CRM classroom watching crash videos and analysing incidents, secretly thinking: "I'd never do that; what muppets those chaps/chapesses must have been!" If you are reading this knowing that you would never be the type of person to fall into this trap, well, sorry, I think you just did. There are plenty of non-aviation examples of this error. Just consider Stanley Milgram's work on obedience to authority, in which hundreds of experimental subjects administered (fake) shocks to students while thinking that they were actually electrocuting them, with buttons marked anything from 'mild shock' through to 'danger of death'. If you are now thinking "I would definitely never do that", then think again, because the bias blind spot just showed its ugly face again; all the evidence suggests that, actually, you probably would.

Why we ignore our biases

We have all heard the statistic that 90% of drivers think they are better than average, but what about the experiment that suggested more than 85% of people think they are less biased than others?
When we combine the two, we can see why scathingly questioning the decisions of other drivers, while believing that we ourselves are beyond reproach, ensures that the roads continue to be hotbeds of danger and destruction. Yeah, yeah, I know: you personally would never think like that.

Back on the flight deck, we should also consider that, when we make operational decisions, we think that we are aware of how and why we think what we think. We also tend to believe that we are less prone to biases than our colleagues, including the ever-present confirmation bias that all pilots are taught to fear. We don't examine our own heuristics and shortcuts because we can't. We simply don't have access to the subconscious processes driving them.

What can we do?

Getting someone else involved who might be able to see the biases we overlook in ourselves could be a lifesaver. This alone is surely one of the biggest arguments in favour of multi-crew flight decks. Of course, we have to learn to actually take the judgement of others into account and resist the urge to assess them as less competent than ourselves. We also perhaps need to finish Alexander Pope's phrase, "To err is human", with its conclusion, "to forgive divine", and remind ourselves that it is perfectly normal to be biased about our blind spots, but we need to learn to ask others to point them out to us, and forgive them for doing so. After all, Pope was an all-round great guy, not easily swayed by false information, more self-aware and more competent than others... at least, in his own opinion.