Dangerous systems usually required standardized procedures and some form of centralized control to prevent mistakes. That sort of management was likely to work well during routine operations. But during an accident, Perrow argued, "those closest to the system, the operators, have to be able to take independent and sometimes quite creative action." Few bureaucracies were flexible enough to allow both centralized and decentralized decision making, especially in a crisis that could threaten hundreds or thousands of lives. And the large bureaucracies necessary to run high-risk systems usually resented criticism, feeling threatened by any challenge to their authority. "Time and time again, warnings are ignored, unnecessary risks taken, sloppy work done, deception and downright lying practiced," Perrow found. The instinct to blame the people at the bottom not only protected those at the top, it also obscured an underlying truth. The fallibility of human beings guarantees that no technological system will ever be infallible.
(Eric Schlosser)
[Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety]