REVIEW: "Minding the Machines", William M. Evan/Mark Manion

Discussion in 'Computer Security' started by Rob Slade, doting grandpa of Ryan and Trevor, Sep 28, 2004.

    BKMNDMCH.RVW   20040527

    "Minding the Machines", William M. Evan/Mark Manion, 2002,
    0-13-065646-1, U$29.99/C$46.99
    %A William M. Evan
    %A Mark Manion
    %C One Lake St., Upper Saddle River, NJ 07458
    %D 2002
    %G 0-13-065646-1
    %I Prentice Hall
    %O U$29.99/C$46.99 +1-201-236-7139 fax: +1-201-236-7131
    %O http://www.amazon.com/exec/obidos/ASIN/0130656461/robsladesinterne
    %O http://www.amazon.co.uk/exec/obidos/ASIN/0130656461/robsladesinte-21
    %O http://www.amazon.ca/exec/obidos/ASIN/0130656461/robsladesin03-20
    %P 485 p.
    %T "Minding the Machines: Preventing Technological Disasters"

    Part one is an introduction. It is ironic, both in terms of the title
    of the chapter ("Technological Disasters: an Overview") and
    particularly the title of the book, that although the authors list
    four categories of disaster causes, the examples given overwhelmingly
    indicate human error, if not outright malfeasance. The
    classifications provided are also confusing: what difference is there
    between human, organizational, and socio-cultural factors? The
    comparison of natural and man-made disasters, and the supporting
    tables, in chapter two raise more questions than they answer: why are
    both types increasing at almost identical rates (in glaring contrast
    to the stated conclusion)?

    Part two looks at the prevalence of technological disasters. (I
    thought we just did that?) Chapter three says nothing new about Y2K.
    The theories of technological disasters, in chapter four, are flawed
    by an overly simplistic view of systems, one which completely ignores
    the inherent tendency of complex systems in general, and digital
    systems in particular, to catastrophic failure modes. As noted, the
    book is heavily larded with tables and figures, most of which have
    little apparent relevance to the text, and some of which actually seem
    to contradict the written material. One example in this chapter
    illustrates how unexplained and poorly captioned the figures
    themselves are: a diagram with six numbered interrelationships is
    followed by a numbered list--for a completely different set of
    factors. In
    chapter five the authors set up an odd, and poorly explained, matrix
    of "systemic dimensions" underlying disasters. "Human Factors
    Factors" (sic) are technological (as opposed to social) systems and
    external (as opposed to internal) systemic factors. The reporting of
    details in the examples in this and other chapters is suspect: despite
    specific and itemized accounts of the Therac-25 tragedy in at least
    two of the references listed for this chapter, the authors insist that
    somehow the type of radiation was at fault, rather than the flawed
    user interface that allowed incorrect dosage settings to be retained
    by the device, even after the operator believed the error had been
    rectified.

    Part three supposedly looks at technological disasters since the
    industrial revolution. Chapter six meanders through a wide variety of
    industrial "revolutions," and then delves briefly into future biotech,
    nanotech, and robotics/artificial intelligence. Chapter seven offers
    a terse and bemusing expansion of the earlier four-part matrix into
    twelve.

    Part four provides an "Analysis of Case Studies of Technological
    Disasters." Chapter eight insists on fitting a number of tragedies
    into the matrix from chapter seven. The reasons for the choices are
    not obvious: the authors insist throughout the book that the Bhopal
    poison gas release was due to "socio-cultural factors" when it is
    clearly, as far as the book recounts, due to greed and a lack of
    provision for safety equipment and procedures. (Another table
    maintains that Bhopal was an "accident" while the sinking of the
    Titanic, with far less impact in deaths and injuries, was a disaster
    and a tragedy.) Chapter nine lists one "lesson learned" from each of
    the "case studies": actually, what all of them have in common is the
    fact that technological disasters have *numerous* causes, not just a
    single one. The Tenerife airliner crash, as only one example, was
    caused by the overloading of a backup airport, fear of regulations
    that made no provision for emergencies, miscommunication, failure to
    verify communications, the pressure of overcrowded facilities, and
    other failures.

    Part five talks about strategic responses. Chapter ten states that
    scientists need to stress professional education and safety. Now, I
    can sympathize with that attitude in large measure: as a virus
    researcher I've been crying in the wilderness about malware for many
    years, and have recently been exhorting corporations to support free
    public security awareness training as a benefit to the enterprise by
    reducing overall levels of risk. I think it a bit unfair, though, to
    put all the weight for safety on the shoulders of the professionals,
    when the rest of society is completely obsessed with time-to-market
    and dancing pigs. Chapter eleven tacitly admits this fact, with case
    studies that demonstrate that in many instances of corporate
    wrongdoing the executives were warned of the dangers in advance. No
    recommendations for specific responses are made. The four legal
    branches of the United States government, and their relationships to
    technology, are listed in chapter twelve: again, no suggestions are
    forthcoming. A fairly standard overview of risk analysis is given in
    chapter thirteen, which, I suppose, might be some kind of endorsement
    of and recommendation for risk analysis itself. Chapter fourteen
    assumes that "democratic" decision making is better than "technical,"
    without ever examining the dangers of social and political influences
    forcing the bad public policy rulings that the case studies in the
    work truly demonstrate.

    This book actually says very little about either technology or
    technological disasters: most of the evidence points to fraud,
    avarice, and other social factors that create almost any kind of
    disaster. For those who really do want to know how to make
    technology safer, it would be best to look elsewhere.

    copyright Robert M. Slade, 2004 BKMNDMCH.RVW 20040527

    --
    ======================

    ============= for back issues:
    [Base URL] site http://victoria.tc.ca/techrev/
    or mirror http://sun.soci.niu.edu/~rslade/
    CISSP refs: [Base URL]mnbksccd.htm
    Security Dict.: [Base URL]secgloss.htm
    Book reviews: [Base URL]mnbk.htm
