#ifIhadglass I’d probably crash the aeroplane

With luck, the image of Asiana Flight 214 burning out at the end of Runway 28L at San Francisco International Airport should become an icon of our future.

Why? Forget the sterile row between Malcolm Gladwell and whoever. Concentrate on the person at the left, taking a photo on their iPhone. Fortunately, they probably don’t have Google Now, so it won’t have told them to go back to work. One point that kept getting made in this thread wasn’t about Korean culture, or even crew-resource management, but rather about the kind of future they’re selling for all of us. Let’s recap this quote:

maybe we could get the interface down to an iPhone app that would superimpose a bright white line over the camera’s view of the surrounding street just telling us where to walk and what to do and buy all day long. Wouldn’t that be a bit of a relief?

And also this post on the classic “Children of the Magenta Line” lecture.

Now, something which keeps coming up in discussions of how that B772 ended up there (the key text being in here) is that Asiana was an early, and unusually intense, adopter of the practice the aviation world abbreviates to FOQA – Flight Operations Quality Assurance. This comes up all over the place, mixed in with pub-racist comments about Koreans and with people who want to bash Airbus and have apparently forgotten that the plane was a Boeing. The interesting thing is that I don’t think Korean culture was important here, so much as an emergent surveillance culture that is fundamentally global.

So, FOQA. This basically means downloading the quick-access flight recorder at each stop and cramming the data into some sort of big table, then running queries to see if anyone’s been a bad boy. Now, this is not in itself stupid. Much of the CRM revolution in aviation was founded on using data as a way of sidelining authoritarian sky-god types’ machismo and bullshit.
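If you want to see the shape of that, here’s a minimal sketch in Python – the column names, flight IDs and event thresholds are all invented for illustration, not anyone’s real FOQA event set: decode the QAR into a flat table, then query it for whoever tripped an event.

```python
# Hypothetical sketch of the FOQA loop: QAR data already decoded into a flat
# table, then queried for "event" exceedances. All names and limits are made up.
import pandas as pd

qar = pd.DataFrame({
    "flight_id": ["F001", "F002", "F003"],
    "speed_vs_vref_kt": [-8.0, 4.0, 6.0],   # speed relative to Vref at the threshold
    "touchdown_fpm": [-900, -350, -280],    # sink rate at touchdown
    "late_flap": [True, False, False],      # landing flap selected late?
})

# "See if anyone's been a bad boy": query the table against event thresholds.
events = qar.query("speed_vs_vref_kt < -5 or touchdown_fpm < -600 or late_flap")
print(events["flight_id"].tolist())  # -> ['F001']
```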

There are sensible ways of doing this. For example, a large majority of landing accidents result from an unstabilised approach; you might define a stabilised approach as one in which the aircraft is on the centreline, on the glide slope, at the correct speed, with all the landing checklists complete, by, say, 1,000ft of altitude. Otherwise, you should go around and try again. You might decide officially not to care about the go-arounds, and to care instead about anyone who pressed on.
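To make that concrete, here’s a rough sketch of the sensible version – a stabilised-approach gate at 1,000ft using the criteria above. The field names and tolerances are illustrative assumptions, not any airline’s actual stabilised-approach SOP.

```python
# A sketch of a stabilised-approach gate, with illustrative tolerances.
from dataclasses import dataclass

@dataclass
class ApproachState:
    height_ft: float            # height above the runway threshold
    localiser_dev_dots: float   # lateral deviation from the centreline
    glideslope_dev_dots: float   # vertical deviation from the glide slope
    speed_error_kt: float       # indicated airspeed minus approach speed
    checklists_complete: bool   # landing checklists done?

def stabilised(a: ApproachState) -> bool:
    """True if the gate criteria are met."""
    return (abs(a.localiser_dev_dots) <= 1.0
            and abs(a.glideslope_dev_dots) <= 1.0
            and -5.0 <= a.speed_error_kt <= 10.0
            and a.checklists_complete)

def decision(a: ApproachState) -> str:
    # The sensible policy: don't punish go-arounds, worry about pressing on.
    if a.height_ft <= 1000 and not stabilised(a):
        return "GO AROUND"
    return "CONTINUE"
```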

What gets measured gets managed, though, and so this project has an evil twin. What if we were to identify a platonic ideal of the approach into each destination we serve, define the LNAV-VNAV path for it, and check each flight’s exact numbers against that? We could identify the six-sigma limits, and then put the fear of God into anyone who overstepped the mark.
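Sketched out, the evil twin might look something like this: score every flight against an idealised vertical profile for the approach and flag anything beyond the control limits. The reference profile, the data layout and the six-sigma cut are all assumed here for illustration.

```python
# Hypothetical sketch of the "evil twin": deviation from an idealised profile,
# flagged against fleet-wide control limits. All numbers are invented.
import numpy as np

# Ideal altitude (ft) at fixed distances from the threshold.
IDEAL_PROFILE = np.array([3000.0, 2400.0, 1800.0, 1200.0, 600.0, 0.0])

def deviation_score(flown_profile: np.ndarray) -> float:
    """RMS vertical deviation of one flight from the ideal path."""
    return float(np.sqrt(np.mean((flown_profile - IDEAL_PROFILE) ** 2)))

def flag_outliers(fleet_profiles: np.ndarray, n_sigma: float = 6.0) -> np.ndarray:
    """Boolean mask of flights whose deviation exceeds the n-sigma limit."""
    scores = np.array([deviation_score(p) for p in fleet_profiles])
    limit = scores.mean() + n_sigma * scores.std()
    return scores > limit
```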

And that’s roughly what they did.

Of course, there is a response to surveillance. In this case, it took the form of using the automatic pilot whenever possible. The original autoland concept was built on the idea that the computer can (usually, with important exceptions) steer more accurately than you can, but it cannot command as well as you can. As the FOQA monitoring valued the absence of deviation from a programmed path above all else, this was a winning strategy. The monitoring program could value nothing else, after all.

This meant not only a progressive loss of manual skills – the usual take-home from Children of Magenta – but also a loss of competence in using the automatic systems. Rather than responding to an unusual situation by either flying the aircraft yourself, or commanding a different flight path on the mode-control panel, people began trying to manage the automation around it, operating at yet another level of abstraction. When the line it was following didn’t go the right way any more, they tried to get the right line, rather than going the right way. There’s a great example of this in here, if you like 111-page French air accident reports.

(It is very telling that pathological learning seems to characterise human-computer interaction in all kinds of contexts.)

And they will do this to you. Have you registered your vitals with your line manager yet? On the other side of the political tracks, do you know your annual electricity consumption?

The problem with the last one is that it’s perfectly plausible to me that the best option isn’t actually staring at a display to work out whether you’ll be the biggest electricity saver among your Facebook contacts if you run the washing machine now and can you wait before you run for the bus to get into work and are you worried you might blow your stress metrics and get fined? Perhaps it might just be a shit idea? Perhaps, actually, we ought to concentrate on greening-up the national grid? Perhaps your health might be better if they didn’t do this shit to you?

But this is not one of the options provided, just ever more surveillance, imposition, and bullshit. You can’t optimise your way out of a problem unless you can control the target criterion of that optimisation.

The Asiana accident and Children of Magenta are interesting, politically, in that together they amount to a long-running experiment, with a lot of data points, run on an elite population – and one that worked out pretty horribly. However, although it’s generally agreed to be a failure, it’s still hard to roll back.

As an interesting counterexample, look how well the pink-collar artisan class in the cabin did in the crisis compared to the elite-under-surveillance up front. That’s leadership – but then, leadership is usually defined as what you do outside the defined circumstances of normality.

6 Comments on "#ifIhadglass I’d probably crash the aeroplane"


  1. This is a very interesting post, and sparks two quick points: Have you read David Mindell’s _Digital Apollo_? I’m about half-way through, and so far it’s truly excellent; of particular relevance to this is his discussion of the way the astronauts’ role changed (along with that of test pilots) with the development of autostabilising technologies. Whether astronauts should be pilots or mission commanders operating a flight management system was contested all the way through the Apollo program; though their role was in the event mainly the latter, aspects of the former still remained.

    Secondly, what do you make of Langewiesche’s take on FMSs in _Fly by Wire_? I thought he made a pretty compelling argument that the Airbus approach is superior from a safety point of view, so long as pilots are properly trained as mission commanders and in the operation of the FMS.


        1. I think it’s out later this year from Manchester UP; alas at the moment (IIRC) there’s only a hardback release planned.

