The Bookshelf talks with Steven Casey
Greg Ross
Even a powerful piece of technology can be useless if people find it confusing to operate. In an alarm clock or a cell phone, a bad interface is frustrating; in a power plant or an operating theater, it can be deadly.
As a specialist in ergonomics and human factors, Steven Casey has spent years observing the consequences of poor design. In his 1993 book Set Phasers on Stun (Aegean), he collected 20 true cautionary tales that illustrated the real costs of human error in settings from cockpits to passenger ferries.
In this year's follow-up, The Atomic Chef (Aegean), Casey presents another score of stories about the "surprising yet foreseeable events" that can arise due to a misleading gauge, an awkward workspace, poor maintenance or an ill-defined procedure. Very often the outcome is death, even though the machines involved are working properly as designed.
American Scientist Online managing editor Greg Ross interviewed Casey by telephone in December 2006.
At the beginning of the book you quote the ecologist Garrett Hardin: "People are the quintessential element in all technology." Do designers tend to forget that?
Well, it is often forgotten, and not just by designers but by engineers as well. I think the stories in the book speak to that point: they offer examples of instances in which engineers or designers didn't incorporate human factors or ergonomics into the design of the system or the product.
As technology becomes more sophisticated, do you think human error is actually increasing, or is it just becoming more noticeable?
I think it's probably a case of both. The systems are more complex, and it's also more noticeable because as things become more sophisticated, one individual has so much more power to disrupt a system — whether it's a nuclear power plant or an aircraft with 555 people on board. The opportunities for amplifying the consequences of error are increasing all the time.
Instead of user error, you prefer the term "design-induced error." Is it ever appropriate to blame the user?
I use the term as it is commonly used in this field, design-induced error, and the stories that I've selected for the book really focus on the user interface. I've tried to select stories where there is a deficiency in the system, or where the actual interface that the operator uses could have been done better.
I ask because in several of the stories, it seems there were adequate procedures in place, but the users just decided not to follow them.
That is true. In the lead story, for example, "The Atomic Chef," there were procedures in place. There was a very expensive and elaborate system for making the uranium fuel for this breeder reactor, but the three operators, along with their supervisors, did make the decision simply not to use the system that had been put into place to avoid exactly the kind of accident that happened. [In 1999, two workers were killed at Japan's JCO Nuclear Fuel Processing Facility when a solution they were mixing reached nuclear criticality.]
But again, I think it's a good example of a systemic problem within an organization. The whole social aspect of that setting was as important as, say, a one-on-one operator interface — and these "macro ergonomic" issues, I think, will become increasingly important in the future.
You run your own human factors research and design firm. Do clients generally call you before or after something goes wrong?
Most of the time it's before. Occasionally I do get involved in something where there has been a problem, or the client is aware of a problem, or oftentimes they'll know that, say, a competitor's product does a better job in terms of its ease of use. So it's both, really.
Do you see patterns in the issues they're facing?
Most of my work is with vehicles of one kind or another, whether they're automobiles or aircraft or agricultural or construction machines, so the types of things that I deal with are surprisingly consistent from one machine to the next. And I do tend to see similar issues as technologies migrate from one class of vehicle to another — for example, GPS systems migrating from cars into farm equipment.