ÜBERLINGEN MID-AIR COLLISION 2002
It is evening in the sky over southern Germany. Two commercial aircraft are flying on a collision course: a Russian charter flight from Moscow to Barcelona, and a DHL cargo flight from Bergamo to Brussels. Their courses should be corrected by an air traffic controller in Zurich, but he is doing the job of two controllers, at two different workstations, because his equipment is degraded by ongoing maintenance work. Both planes are equipped with TCAS (Traffic Collision Avoidance System), an automated warning system that is the last line of defense if air traffic control fails to separate planes soon enough.
Less than a minute before the crash, the air traffic controller notices the collision course, and gives the Russian crew a command to descend. Seven seconds later, the automated warning system orders the Russian crew to climb, while ordering the DHL crew to descend.
The Russian crew begins to follow the human’s command to descend, ignoring the later automated command to climb (except for a highly mitigated objection by the lowest-ranked crew member).
The result is a mid-air collision: the DHL crew follows the automated warning to descend; the Russian crew obeys the human air traffic controller and also descends.
Regulations for the use of TCAS state that a TCAS instruction takes precedence over an air traffic controller’s instruction: that is, “Obey the machine.”
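The precedence rule can be made concrete with a small sketch. This is purely illustrative, not any real avionics or regulatory logic; the `Command` type and `resolve` function are hypothetical names invented for the example:

```python
# Illustrative sketch of the "obey the machine" precedence rule:
# a TCAS resolution advisory overrides any ATC instruction,
# regardless of which command arrived first.
from dataclasses import dataclass

@dataclass
class Command:
    source: str    # "ATC" or "TCAS" (hypothetical labels for this sketch)
    action: str    # e.g. "climb" or "descend"
    time_s: float  # time received, in seconds

def resolve(commands):
    """Return the command the crew should follow under the regulation."""
    tcas = [c for c in commands if c.source == "TCAS"]
    if tcas:
        # A TCAS advisory takes precedence; follow the most recent one.
        return max(tcas, key=lambda c: c.time_s)
    # No TCAS advisory: follow the most recent ATC instruction.
    return max(commands, key=lambda c: c.time_s)

# The Überlingen sequence: ATC says descend, TCAS says climb 7 seconds later.
cmds = [Command("ATC", "descend", 0.0), Command("TCAS", "climb", 7.0)]
print(resolve(cmds).action)  # the regulation says: climb
```

The point of the sketch is that the rule is order-independent: the TCAS advisory wins even when, as at Überlingen, the human command came first and the crew was already acting on it.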
Why did the Russian crew obey the human command? One reason suggested by the accident investigation is that they did not have simulator training on TCAS. They had paper-and-pencil training, but not the near-realism of simulator training.
Another possible reason is that the human command came first. They were already committed to the descent when they heard the TCAS command seven seconds later. A Russian pilot with the same training, interviewed after the accident, made the interesting argument that the human voice sounds intense and passionate, while the machine voice is robotic: of course one would obey the voice that sounds more concerned.
HOW TO THINK ABOUT AUTHORITY
I am involved in this question because I am working on a project at NASA doing formal modeling of the FAA’s development of the next generation of air traffic control (NextGen).
The area I am working in is called “Authority and Autonomy (or Automation).” “Authority” covers humans, human-and-machine systems, legal and regulatory structures, and rules of procedure. As a social scientist on the project, I am particularly focused on the authority issues, and on how they interact with what we know about human behavior.
As a first approach to the question of authority, it’s useful to return to Max Weber’s classic question: what is the basis of legitimate authority? If A gives B an order, why should B obey? What makes B believe that A has the right to issue that order, and that B should obey? The issue of legitimate authority is mostly discussed in the context of large-scale state authority or religious authority.
I am looking at legitimacy in the short term: conflicts or ambiguities of authority that take place in minutes or seconds. If the Russian crew had received an instruction from TCAS before they heard from the air traffic controller, they might have obeyed TCAS and disregarded the controller. The ordering could make the difference.
Within the world of aviation, authority questions are often believed to be solvable by training the humans to automaticity. Yet in the moment, human judgment is always required. Further, many of these conflicts of authority are a result of design decisions in complex socio-technical systems. It is important to look at these decisions, rather than assuming that the humans can be trained to work with whatever design is produced.
Other examples of authority questions in the aviation domain include:
Who do I obey if I believe the legitimate authority has gone crazy? (The 2012 JetBlue incident is an example.)
What mode of automation are we in, and therefore what does an automated indication mean?
What happens when multiple authority regimes are giving conflicting commands?
More on these questions later.
Meanwhile, it might be worth looking out for issues of authority and authority deflection the next time a service person tells you: “I’d like to help you here, but the computer won’t let me do it.”
FOR MORE INFORMATION SEE:
Summary of Überlingen accident
Dramatization of Überlingen accident
Official accident report, Bundesstelle für Flugunfalluntersuchung (German Federal Bureau of Aircraft Accident Investigation)