The safety of flight is largely thanks to pilot intervention

By Martin Chalk, Log Board member

I used to have a pet spider, and I trained Spidey to dance to music. One day, however, my brother pulled Spidey's legs off and, after that, Spidey would not dance, no matter how loud the music. My conclusion was that spiders have their ears on their legs. (No spiders were harmed in the writing of this metaphorical story.)

Data is often extrapolated to conclusions that do not then stand up to scrutiny and, in the professional pilot world, I believe we suffer one enormous such error because of the data set commonly used to derive the conclusion. In 2001, Wiegmann and Shappell, in a paper published by the US FAA, concluded that "humans, by their very nature, make mistakes; therefore, it should come as no surprise that human error has been implicated in a variety of occupational accidents, including 70% to 80% of those in civil and military aviation".

How often is the 80% figure in this paper quoted? Those of us who work in aviation have heard it many, many times, particularly from those pushing technical solutions and reduced pilot complements to justify de-skilling and automation. Using the conclusion in this way, however, falls foul of poor data-selection criteria, because it is based on only the tiny proportion of flights that resulted in failure.

My best friend, who is not in aviation (other than as an enthusiastic passenger), often tries to provoke me with comments about pushing the "up" or "down" buttons, or by saying that "the plane just lands itself, doesn't it?" Although well meaning, this is only funny because the general flying public does not realise just how many decades of learning and sacrifice have produced our incredibly safe system, in which the chances of meeting a dramatic end are lower than when you take a bath. The reason the system is so safe is in no small part the effort pilot experts have made over the years, and continue to make, to push back against latent hazards in the system.
These hazards lower safety margins and denude the pilot of the time, skills, knowledge and experience required to operate effectively in a highly complex and dynamic environment.

Pilot interventions

In 2013, the final report of the Performance-based Aviation Rulemaking Committee and Commercial Aviation Safety Team Flight Deck Automation Working Group, also published by the US FAA, noted that "pilots are expected to manage malfunctions as an important part of their tasks. [Data] shows that aircraft malfunctions were noted to be a threat in 20% of normal flights (based on LOSA data)."

The paper also cites a number of other ways in which pilot interventions resolve threats to flight safety, including:

- Adapting to changes in operational circumstances; LOSA data show that only one in 10 flights is completed as originally planned and entered into the flight management system (FMS)
- Managing operational threats, such as adverse weather; LOSA data show that adverse weather is a threat present in almost 60% of normal flights
- Mitigating or managing errors made by themselves or by other humans in the system (other pilots, dispatchers, maintenance technicians, air traffic personnel, equipment designers), such as through altitude-awareness procedures
- Mitigating equipment limitations
- Managing other situations involving unexpected operational risk (such as the decision-making required in the US Airways 1549 Hudson River incident).

In 2019, Jon Holbrook PhD, of the NASA Langley Research Center, brought all this together. He used Boeing data for the 10 years from 2007 to 2016, in which there were 244 million departures and 388 accidents.
Adding this to the 80% human-error factor implicated in accidents, and the 20% pilot-managed aircraft malfunction figure, he produced a thought experiment suggesting that, rather than focusing on the 310 accidents (80% of 388) that may have had a pilot-error component, we should focus on the 48.8 million flights (20% of 244 million) where pilot intervention had resolved a threat to the safety of the flight.

There is a new, emerging hazard, demonstrated in part by the Boeing 737 MAX accidents. When the human is kept out of the decision-making process, there is a potential for failures, or unpredictable circumstances, to result in automation not delivering the desired or predicted outcome. Tightly designed sensor-analysis-action automation has a greater potential for unpredictable and unreliable outcomes; when there are points in the process where a human has a chance to sense-check the decisions, there is less chance of mishap.

After a conference discussion in 2019, the International Committee of the Red Cross produced a paper titled "Autonomy, artificial intelligence and robotics: technical aspects of human control". Although based around the deployment of autonomous weapon systems, it makes some points that would apply equally in our safety-critical environment. The paper suggests the necessity of "human-in-the-loop" systems, in which humans have the situational awareness, time and a mechanism by which to intervene if necessary. It argues that, in an ever more complex environment (both physically and in the software sense), reliability and predictability of outcomes are less certain. Its conclusion, that it doubts autonomous weapon systems "could be used in compliance with international humanitarian law in all but the narrowest of scenarios and the simplest of environments", is a sobering lesson for those pushing for the rapid expansion of aircraft or technological autonomy in civil aviation.
Safety specialists

Pilots place a very high value on, and our industry mostly operates, a just and open reporting culture. This, combined with a willingness to learn and change when reports indicate a threat in the system, has made pilots the safety specialists in one of only a handful of industries that operate any sort of mature threat and error management system.

As our industry starts to recover from the most destructive trading conditions it has ever faced, the role of the professional pilot in keeping the operation safe and reliable will be even more important than before. It will be hugely important to ensure that the efficiencies desired, and necessary, to overcome the revenue lost over the course of the pandemic do not dilute the situational awareness, time and mechanisms available to pilots to undertake their specialist, safety-critical task effectively.

For those of us no longer expecting to take to the skies operationally for a while, there are many opportunities to take our widely transferable skills to other industries. As the BALPA Redundancy Advisory Group suggests, pilots are excellent communicators and great at problem solving, time and process management, conflict resolution, coping well under pressure, and critical thinking. However, if the profession is not recognised by the wider public as safety-conscious yet innovative, regulated yet autonomous, and process-driven yet analytical, we will find it harder to sell our skill set to the wider employment market. I suggest we need to improve our own marketing.