Are pilots forgetting how to fly due to automation . . . A General Aviation Perspective (Part 2)

The following is today’s post from the new Flying North America Blog. As a regular contributor to this exciting media franchise, I would encourage aviation enthusiasts to visit the blog and subscribe, then sit back and leave the flying to us. You will enjoy the ride!

  • Wiener and Curry (1980) reported that the advent of automation brings new problems associated with human computer interaction…
  • …The problems associated with automation are vigilance decrements (Frankmann and Adams, 1962; Heilman, 1995), out-of-the-loop performance problems (Endsley and Kiris, 1995; Thackray and Touchstone, 1989), complacency (Mosier and Skitka, 1994; Mosier and Skitka, 1996), and skill degradation (Wiener and Curry, 1980; Hopkin, 1995)…

from Flight-deck automation: promises and problems by Earl Wiener and Renwick Curry

In yesterday’s post we examined the link between flight deck automation and diminishing pilot skill sets, focusing mainly on commercial or scheduled flight operators.

In this, the second of the two-part series, I will examine the impact automation has on the general aviation industry, and in particular the four critical points of contention raised in the Wiener and Curry report: i) vigilance decrements, ii) out-of-the-loop performance problems, iii) complacency and iv) the aforementioned skill degradation.

vigilance decrements:

“The flight comprised five separate legs, although three legs were subjected to analysis. On the basis of attentional resource theory, it was hypothesized that task performance would differ based on the requirement for memory retrieval. Consistent with the hypothesis, the results revealed a deterioration in those tasks for which there was a substantial requirement for memory retrieval. Further analysis revealed that the deterioration in performance was best predicted by pilots’ perception of the workload associated with the flight and their perception of their ability to exercise control over aircraft during normal conditions.”

from Vigilance decrement during a simulated general aviation flight by Mark W. Wiggins (John Wiley & Sons 2011)

As I contemplated the above article, I could not help but think of the accident involving JFK Jr., his wife Carolyn Bessette and her sister Lauren Bessette, and the role that vigilance decrements played in their tragic deaths.

The NTSB’s conclusion that the crash resulted from “the pilot’s failure to maintain control of his airplane during a descent over water at night, which was a result of spatial disorientation,” was by no means the only reason cited as to why the plane went down into the Atlantic.

Consider, for example, the framework of signal detection theory, in which an observer’s decision-making sensitivity enables him or her (the pilot, in this case) to differentiate between real and false signals and thereby make correct perceptual judgments as conditions of uncertainty vary. Within that context, one has to wonder what role training, or the lack thereof, as opposed to automation, played in the crash, especially given the assertion by some that under the conditions in which JFK Jr. was flying, the accident might well have been avoided had he relied on his plane’s instrumentation. In essence, automation would have worked in his favor had he followed standard training procedures and used his instrumentation to fly the plane rather than flying by visual means.
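To make the signal detection framing concrete, sensitivity is commonly summarized by the index d′, the standardized distance between an observer’s hit rate and false alarm rate. The sketch below is purely illustrative: the rates are hypothetical numbers chosen to show how sensitivity collapses when vigilance degrades, and are not drawn from any accident data.

```python
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity index d': how well an observer separates true
    signals (e.g. a real attitude deviation) from noise."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# A vigilant pilot: detects 95% of real deviations, 10% false alarms.
print(round(d_prime(0.95, 0.10), 2))  # → 2.93

# A fatigued or disoriented pilot: sensitivity collapses toward chance.
print(round(d_prime(0.60, 0.40), 2))  # → 0.51
```

The point of the exercise is simply that the same instruments can be in front of two pilots while their ability to separate real warnings from noise differs dramatically.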

out-of-the-loop performance problems:

“Human supervisory control and monitoring of automated systems, as well as passive system(s) information processing, can all be classified as forms of out-of-the-loop (OOTL) performance. Whether the operator’s task is to decide if process control intervention is necessary, detect a critical system event, or accept or reject the actions of a computer controller, he or she is removed from direct, real-time control of the system. OOTL performance is a critical issue in overall automated systems functioning because it is associated with numerous negative consequences including: (a) operator failure to observe system parameter changes and intervene when necessary (vigilance decrements); (b) human over-trust in computer controllers (complacency); (c) operator loss of system or situation awareness; and (d) operator direct/manual control skill decay. These consequences have been found to impact human performance under both normal operating conditions and system failure modes, with a greater effect on the latter [15], leading to serious problems in operator ability to perform their assigned tasks when working with automated systems.”

from Out-of-the-loop performance problems and the use of intermediate levels of automation for improved control system functioning and safety by David B. Kaber and Mica R. Endsley

The best everyday analogy I can offer is how the use of calculators has affected my ability to do long division in my head as quickly as I could in my younger days.

Don’t get me wrong: 1 plus 1 still equals 2, not 11, in my books. But the point is nonetheless effective in illustrating how automation has shifted human interaction from facilitation to observation. Or to put it more succinctly, it is human nature that the less direct our involvement in a particular task, the more difficult it becomes to pay attention.

This reality, according to the above article, becomes even more problematic when an unexpected or emergency variable comes into play.

As a result, one might ask: had the pilot of the Miracle on the Hudson flight been less experienced than 57-year-old Capt. Chesley B. “Sully” Sullenberger, would the outcome have been different? After all, Sullenberger, who at the time had been a commercial airline pilot for close to 30 years, was also a former fighter pilot whose military training undoubtedly equipped him to handle far more severe challenges than the one he faced on January 15th, 2009.

Miracle on the Hudson (January 2009)

The question this raises is simply whether the issue regarding out-of-the-loop performance is one of discipline and vigilance as opposed to piloting skills.


complacency:

Identified as an out-of-the-loop consequence of flight-deck automation, attributable to what has been referred to as human over-trust in computer controllers, the emergence of a complacent attitude in and of itself seems a fairly straightforward proposition.

However, according to Linda G. Pierce’s book Understanding Adaptability: A Prerequisite For Effective Performance Within Complex Environments, rather than logically processing relevant pieces of information, people often adopt effort-saving strategies called heuristics. Mosier and Skitka (2006) referred to this as automation bias, the tendency to use automated cues as a heuristic replacement for vigilant information seeking and processing.

As someone who, funded through the Government of Canada’s Scientific Research & Experimental Development program, utilized a similarity heuristics methodology under an agent-based model to develop my strand commonality theory, I can say with a degree of confidence that this approach is sound within the somewhat limited confines of a web-based application. The fact that the system that went into production correctly selected the right source for a particular product 97.3 percent of the time tends to confirm the model’s effectiveness.
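For readers curious what a similarity heuristic looks like in practice, here is a minimal sketch in the spirit of that approach: score each supplier by how closely its profile of past successful orders resembles the order at hand, and pick the closest match. Every name, attribute and value below is a hypothetical illustration, not the actual production model.

```python
# Minimal similarity-heuristic matcher (illustrative only).

def similarity(order: dict, profile: dict) -> float:
    """Jaccard similarity between a new order's attributes and a
    supplier's profile of past successful orders."""
    a, b = set(order.items()), set(profile.items())
    return len(a & b) / len(a | b)

def pick_supplier(order: dict, suppliers: list) -> dict:
    # Choose the supplier whose past successes most resemble this order.
    return max(suppliers, key=lambda s: similarity(order, s["profile"]))

order = {"product": "widget", "region": "east", "priority": "rush"}
suppliers = [
    {"name": "A", "profile": {"product": "widget", "region": "west", "priority": "standard"}},
    {"name": "B", "profile": {"product": "widget", "region": "east", "priority": "standard"}},
]
print(pick_supplier(order, suppliers)["name"])  # → B (two attributes match)
```

Within a narrow, well-defined attribute space, this kind of matching can perform very well, which is exactly why its limits outside such confines matter.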

However, when you extend a heuristic approach to other forms of automation, the variables that must be taken into consideration, while still definable, are too broad and varied to be blindly relied upon without ongoing human intervention and analysis. Piloting a plane is one such scenario in which a heuristic dependency is fraught with risks.

While the old computer admonishment of garbage in, garbage out is somewhat apt in this instance, the underlying lesson is that although automation is an important part of present-day aviation, freeing the pilot or pilots to focus on more important tasks, no system can take into account all of the variables associated with human experience and involvement.

Even in those instances, such as with my work in web-based application development, where a relatively narrow set of variables could be defined and effectively managed to produce a consistent outcome, there were still factors that the system could not take into consideration. One example is the effect of inclement weather on a supplier’s or courier’s ability to fulfill an order.

While the system might produce a favorable outcome score based on past performance, it is of little value if all relevant factors such as changing weather patterns cannot be taken into account, and subsequently captured and incorporated into the decision-making process on a real-time basis.

In the above instance, I would review the system’s recommendation of a supplier and then check whether there were external factors that needed to be taken into account (e.g. bad weather) before submitting the order. This in-tandem, coordinated effort between technology and users is essential; in the absence of either one, the eventual outcome becomes highly questionable if not unreliable.
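That in-tandem check can be sketched as a simple human-in-the-loop gate: the automated score alone never places the order, because a factor the model cannot see, such as weather, is checked first. The function and field names here are hypothetical illustrations, not the actual system.

```python
# Illustrative human-in-the-loop gate over an automated recommendation.

def recommend(supplier_scores: dict) -> str:
    """The automated step: pick the supplier with the best past-performance score."""
    return max(supplier_scores, key=supplier_scores.get)

def place_order(supplier_scores: dict, weather_alerts: set) -> str:
    """The human step: hold the order if a factor the model never
    measured (here, a weather alert) affects the recommended supplier."""
    pick = recommend(supplier_scores)
    if pick in weather_alerts:
        return f"HOLD: review {pick} manually before submitting"
    return f"ORDER placed with {pick}"

scores = {"CourierX": 0.973, "CourierY": 0.910}
print(place_order(scores, weather_alerts={"CourierX"}))
# → HOLD: review CourierX manually before submitting
```

The score is only as good as the variables it was built on; the gate exists precisely for the variables it was not.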

Or as W. Edwards Deming used to say, you can’t control what you don’t measure!

skill degradation:

. . . which brings us back to the original concern relating to automation and the question: are pilots forgetting how to fly?

In the book Flight to the future: human factors in air traffic control, Billings has reportedly traced the existence of incidents in modern aviation to problems in the interaction of humans and advanced cockpit automation.

Specifically, Billings maintained that many of the problems are derived from the complexity of cockpit systems and from the difficulties pilots have in understanding the dynamic behavior of these systems, which in turn is related to the relative lack of feedback that they provide.

In this regard, the 1980 Wiener and Curry paper has become a seminal reference point for studies in this area including their identification of two critical elements of consideration pertaining to the advent of automation on the flight deck.

To start, the authors maintain that some of the problems can be attributed not to the automation per se but to the way the automated device is implemented in practice, and that these can for the most part be alleviated by more effective training of the users (or pilots) of the automated system.

The second element involves the problems that can arise from unanticipated interactions between the automated system, the human operator, and other systems in the workplace. This, according to the book’s author, can be particularly problematic in the cockpit environment, in which the introduction of high-level automation with considerable autonomy and authority has produced a situation in which system performance is determined by qualitative aspects of the interaction of multiple agents.

At the end of the day, I doubt that anyone would dispute the merits of automation in the cockpit and its related benefits: more precise navigation and flight control, improved fuel efficiency, the ability to fly safely in all kinds of weather, and a reduction in pilot workload.

That being said, automation should not involve a pilot’s abdication of his or her responsibility as the captain of the ship, so to speak. As such, while pilot training is undoubtedly about acquiring and developing the prerequisite skill sets to take the helm of a plane, the real focus of any program should be combating the natural human inclination for those at the yoke to kick back and leave the flying to something else.

The story of Northwest Flight 188, which reportedly went 150 miles off course should serve as a cautionary example of what can happen when automation dons the captain’s hat and assumes unchecked and unchallenged control of the cockpit.




