Tech Log

Net gains: goalkeeping and aviation safety

By Adam Johns, Safety Consultant

What can we learn from goalkeepers when it comes to safety in the air?

"Every day in aviation, pilots, air traffic controllers and other frontline personnel perform countless correct judgements and actions in a variety of operational environments. These judgements and actions are often the difference between an accident and a non-event. Ironically, data on these behaviors are rarely collected or analyzed."

This is how a 2019 NASA paper entitled Human Performance Contributions to Safety in Commercial Aviation begins. It's an important idea to think about. What NASA and many others are saying is that without humans – and our unique capacity to adapt to a changing situation – successful outcomes in systems such as aviation would not occur at the rate they do. Technology and automation are critical components of the aviation system, but only humans are truly capable of adapting.

The unseen contribution

When you fly a commercial aircraft, things can very occasionally go quite badly. Things can also go better than expected. But the vast majority of the time, things just go normally. More often than not, operational outcomes are unremarkable. You arrive roughly on time without any serious threats, errors or incidents. No heroic piloting to save the day, just run-of-the-mill operations.

But do we ever stop to ask ourselves how this happens? Why are the vast majority of operations unremarkable? Is it because there are no threats faced and no errors made? That's unlikely, since humans are fallible and errors are inevitable. Threats are pretty common, too, especially when operating in challenging airspace, in inclement weather, with an inexperienced crew, or with technical defects. Or all of the above. So, what is happening when nothing happens?

Confession: I'm not a commercial pilot (although I did hold a PPL once upon a time) – I'm a career safety professional.
In my spare time, among other things, I like to play football as a goalkeeper. You may now be wondering what catching footballs has to do with flying planes. In the context of the opening quote, quite a lot, actually.

As a goalkeeper, you are the last line of defence. If you make a mistake, you will likely let in a goal. These mistakes are highly scrutinised. If you lose the game, you'll probably be made a scapegoat and may even be dropped. At the other end of the spectrum, you could pull off an amazing diving save in the last minute and secure victory for your team. You're likely to be celebrated as a hero. The line between success and failure is thin.

[Figure: The goalkeeper's lot – a spectrum running from error leading to defeat (poor performance, highly visible), through normal activity (mostly unnoticed: communicating, organising, sweeping up, starting attacks), to the match-winning save (high performance, highly visible).]

But these two stories only represent the extremes of what a goalkeeper can do. The vast majority of the time, the keeper isn't fluffing a shot or tipping the ball round the post. Most of the time, they are influencing the game through their voice. Since a keeper is idle for most of the game, they have the capacity to observe, and thus to organise, coordinate and generally boost team confidence. These so-called 'non-technical skills' mean the difference between an average keeper and an excellent one. The more a keeper organises their team through constant, effective communication, the less likely the opposition is to take a shot (which represents a threat), thus reducing the need for the keeper to make a save (which opens up the possibility of error).

To the untrained eye, this non-technical performance is completely overlooked. The layperson tends not to notice the role the keeper plays when they aren't making saves. What's more, despite the myriad data collected in football, no-one records the number of times a goalkeeper influences the game without touching the ball.
Only obvious contributions are recorded.

The goalkeeper's role isn't just a defensive one, though. When they have possession of the ball, they have the chance to contribute directly to an attack and assist their team in scoring with a big kick or throw. This adaptability – stopping goals at one end and immediately helping to create them at the other – is crucial to the team's overall success.

Now let's return to flying. Can you see the similarities with how we think about flight safety? We collect data that focuses on incidents, near-misses, threats and errors in order to make the system safer. But, by doing so, we've completely ignored the hidden work that pilots do every day to ensure a safe outcome. With the exception of a periodic Line Operations Safety Audit (LOSA), the communication, organisation and management of threats and errors – and recovery from them – are missed in our routine data collection processes. Since we only collect data about negatives, the presence of safety is naturally defined as the absence of negative events.

[Photo captions: In 2010, England goalkeeper Rob Green was vilified after letting in a soft goal against the USA at the World Cup. In 2018, England goalkeeper Jordan Pickford was celebrated after saving a penalty in a shootout against Colombia at the World Cup.]

Seeing the whole picture

At my former employer, Hong Kong-based Cathay Pacific, we decided to change how we defined safety. Rather than see it as the absence of negative outcomes, we decided to view safety as the presence of capacities to be successful. But if you're going to define safety by its presence rather than its absence, you need new ways to measure it. Capacity is a tricky term, and can refer to having the necessary time, resources and technology to complete a task. I'm talking specifically in terms of people, and the unique capacity they bring to a situation. With the goalkeeping analogy, we've already seen that most of the job goes unseen and unrecorded.
When it comes to pilots, if we want to understand their whole contribution to safety, we need to understand what they do to ensure successful outcomes most of the time. At Cathay, we introduced a new methodology for understanding the whole contribution, called the Operational Learning Review (OLR). This was designed to enhance learning from all operations, but most often it is used to follow up on an air safety report (ASR).

Typically, when investigating an ASR, an airline will seek to identify what pilots may have done wrong and base its improvement actions on this – usually retraining, demotion or a newsletter. The OLR process simply seeks to learn what went on – an objective, judgement-free curiosity about the event, regardless of the outcome and its severity. The goal is to understand what happened and how, from the perspective of the pilots, aiming to discover why their actions made sense to them given the information and resources they had at the time. In most aviation incidents, the outcome could have been much worse, but the crew's ability to recognise and respond to a threat or error stopped an undesirable event from turning into a nasty outcome.

Understanding as a vehicle for learning

Within the OLR, one very deliberate part of the process design requires the team conducting the review to consider system-level learning before individual-level learning. With the right processes in place for understanding the 'what' and the 'how', every operational event presents an opportunity to learn something about the system. Whether it's the SOPs, the automation, the training system, the command upgrade process or how you recruit, the system always contributes to – and influences – an event's likelihood. In the OLR, only when the learning at the system level has been completely extracted can individual learning be addressed.
However, when considering the performance of individual pilots, a just culture is needed to create the psychological safety for them to be accountable for their own learning. Once the system context is understood, the pilot is asked: 'What do you think you need to learn and improve following this event?' This is a very important step but, of course, the company can supplement the pilot's desired actions with its own.

In one case, a pilot asked to use the simulator to practise responding to the scenario in which they had found themselves in the real world. The pilot regained his confidence, and the confidence of management, and was in full control of his own learning throughout. A win-win. This is where the concept of a restorative just culture comes in: seek forward-looking accountability from all parties to learn from an event, rather than simply seeking to hold people to account for their past actions.

Much more can be written about the design of the OLR and how it is helping Cathay to understand the full pilot contribution to safety. But, for now, it's important to highlight that the sole focus of the process is to facilitate organisational learning. Everything we do in aviation safety should further learning, and that requires curiosity. We won't achieve true learning with a narrow focus on threats, errors and undesirable events. Instead, we need to broaden our attention and data collection efforts to understand how these things are normally well managed by crews.

This is particularly important when it comes to making interventions in the system. Whether it's changing regulations, SOPs or training, or adding more automation, if we don't understand the full human contribution to safety, then we're likely to make changes that result in unintended consequences. Humans are the only part of the system that has the capacity to adapt to complex and changing conditions.
Pilots, like goalkeepers, continuously anticipate surprise, monitor for it, respond to it and learn from it. For those with organisational safety management responsibilities, understanding how safety is created every day should be core to their role. Look for what's happening when nothing's happening.