Automation is best deployed as an enhanced decision-making tool, not as something a bored human being is tasked to sit and watch.
“When we design our systems, we need to assign appropriate roles to the human and technological components. It is best for humans to be the doers and technology to be the monitors, providing decision aids and safeguards.” – Captain Sully Sullenberger
The past week has seen several high-profile aviation incidents come to light. The first was a preliminary accident report on the crash and fire that destroyed an Emirates Boeing 777 in Dubai last August. The second was the release of the final report by the Australian Transport Safety Bureau (ATSB) on an AirAsia Airbus A330-300 en route from Sydney to Malaysia last year, which suffered navigation and other system failures as the result of erroneous pilot input during preflight.
The Emirates crash tragically took the life of a responding fireman, while the AirAsia incident caused no injuries but did result in a diversion. Each incident, though, had the potential for great loss of life. The improper use of automation can be implicated in both the Emirates and AirAsia events. Let’s take a look at each of these and see if we can draw some parallels.
The crash report on the Emirates flight, released by the UAE’s General Civil Aviation Authority (GCAA), details that the approach was flown by the captain. The autopilot was disconnected for the landing while the autothrottles remained engaged. During the approach, the longitudinal wind component shifted from an 8 kt headwind to a 16 kt tailwind. As a result of this performance-decreasing wind shift, the aircraft made a long touchdown.
An automatic system on the Boeing warned the crew about the long touchdown, and a decision was made to go around. So far so good. Going around rather than accepting a long landing due to shifting winds is the correct decision.
What happened next wasn’t so good. The nose was raised, the flaps were reset, and the gear was retracted, but go-around power was not added until three seconds before the aircraft impacted the runway with the gear partially retracted. The post-crash fire destroyed the aircraft entirely.
Adding power during a go-around is, or should be, instinctive. It’s considered aviation 101, or rather it used to be. Today’s highly automated aircraft, however, all employ autothrottles which advance automatically when the takeoff/go-around (TOGA) button is pushed. This is how go-arounds are performed on automated aircraft.
The 777, however, has a feature which disables the TOGA button after touchdown. This makes sense: after a normal landing, you don’t want the throttles to advance because the TOGA button was touched accidentally. But not every landing is normal. There are times when a rejected landing, or go-around, occurs after touchdown. The reasons vary, but a landing can be rejected any time until the thrust reversers are deployed, even after the gear touch down.
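The inhibit logic described above can be sketched as a toy rule. This is a simplified illustration only, not the actual Boeing implementation; the `weight_on_wheels` condition is an assumption standing in for the real touchdown-detection logic:

```python
# Toy sketch of a TOGA inhibit rule -- NOT actual Boeing 777 logic.
# Assumption: once weight is on the wheels, a TOGA press no longer
# commands go-around thrust, so a rejected landing after touchdown
# requires the pilot to advance the throttles manually.

def toga_advances_thrust(weight_on_wheels: bool, toga_pressed: bool) -> bool:
    """Return True if pressing TOGA will command go-around thrust."""
    if not toga_pressed:
        return False
    # After touchdown the TOGA switches are inhibited to prevent an
    # inadvertent thrust increase during a normal landing rollout.
    return not weight_on_wheels

# In the air: TOGA commands go-around thrust.
assert toga_advances_thrust(weight_on_wheels=False, toga_pressed=True)

# After touchdown: TOGA is inhibited, so the throttles stay put unless
# the pilot pushes them up by hand.
assert not toga_advances_thrust(weight_on_wheels=True, toga_pressed=True)
```

The trap is visible even in this cartoon version: the same button press produces go-around thrust in one state and nothing at all in the other, and the difference is invisible unless the pilot checks the throttles.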
This is what happened to the Emirates 777. It touched down and then attempted a go-around without adding power. Questions remain as to whether the captain actually pressed the TOGA button, but in any case, he should have manually pushed up the throttles for the go-around or verified that the autothrottles had advanced on their own.
Why would he not do that? Easy. It’s called negative conditioning, or negative training. Go-arounds are routinely practiced in all airline simulator training programs, but go-arounds after touchdown are practiced much less frequently. Over time, muscle memory comes to expect the autothrottles to advance on their own, as they always do during a normal go-around.
Put a pilot in a highly dynamic situation such as a windshear landing, and then perhaps throw in a non-routine distraction such as the automatic runway length warning, and voilà, muscle memory takes over and the throttles don’t get pushed up. Automation, which is supposed to make flying easier and safer, may have helped set the stage for a crash like this one.
AirAsia X 223
On March 15 of last year, an AirAsia A330 suffered multiple inflight malfunctions of its navigation display systems, leaving the aircraft unable either to continue to its destination in Malaysia or to return to its origin, Sydney, where ceilings were low. The aircraft eventually landed uneventfully in Melbourne, which had clear weather.
Subsequent investigation revealed that during their preflight checks, the pilots made a single-digit data entry error while programming the aircraft’s position.
Modern navigation systems on today’s commercial aircraft are capable of guiding an airplane to a spot on the other side of the globe with accuracy down to several feet. But in order to know where to go, the computers on the airplane first have to know where they are.
Part of the preflight process is entering the aircraft’s current location as a latitude and longitude. The pilot entering this data made a fat-finger error which placed the entered position thousands of miles from the Sydney airport. After the aircraft departed, the discrepancy between where it actually was and where it believed it was caused the computers to fault, resulting in a nearly complete failure of the navigation system.
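To see how a single dropped digit in the longitude can put the computed position thousands of miles off, here is a quick back-of-the-envelope check. The Sydney coordinates are approximate, and the mistyped value is purely hypothetical, chosen only to illustrate the effect, not the actual value entered on the incident flight:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in statute miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 3959 * asin(sqrt(a))  # Earth radius ~3959 statute miles

# Sydney airport, approximately.
SYD_LAT, SYD_LON = -33.95, 151.18

# Hypothetical mistyped longitude: a digit dropped, 15.12E instead of 151.18E.
bad_lon = 15.12

error = haversine_miles(SYD_LAT, SYD_LON, SYD_LAT, bad_lon)
print(f"Position error: {error:,.0f} miles")
```

A slip of one digit in the longitude field yields an initial position several thousand miles from the aircraft’s true location, which is more than enough to send the navigation computers into conflict once the aircraft starts moving.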
While attempting to identify and fix the problems with the navigation systems, the crew compounded their predicament by cycling two of their three flight computers off and back on. This incorrect procedure caused the loss of other primary flight displays and left the aircraft incapable of flying even a simple approach back to Sydney, necessitating the diversion to Melbourne.
Even after arriving at Melbourne, the aircraft had to make several attempts at a completely manual landing, without the benefit of either the autopilot or the autothrottles. There is little doubt that flying a highly automated aircraft had left the pilots’ manual flying skills somewhat rusty, which is entirely to be expected.
Automation: Friend or Foe?
Overreliance on automation is also well known to cause a deterioration in manual stick-and-rudder piloting skills, which aren’t missed until they are needed. The crash of Asiana 214 in San Francisco several years ago was a perfect example of this.
But as Captain Sully warned in the quote above, automation is best deployed as an enhanced decision-making tool, not as something a bored human being is tasked to sit and watch, as it is today.