This website covers knowledge management, personal effectiveness, theory of constraints, amongst other topics. Opinions expressed here are strictly those of the owner, Jack Vinson, and those of the commenters.

Great Information Disasters

About a month ago, I came across a reference by Duane McCollum to Great Information Disasters edited by Forest Horton and Dennis Lewis.  I found a copy at a local college via inter-library loan.  The book is a collection of "Twelve prime examples of how information mismanagement led to human misery, political misfortune and business failure."  It was published in 1991.  Some of it was a little dry, but it was interesting to see how information mismanagement creates problems just about anywhere. 

Some of the best insights were repeated a number of times:  Don't focus so much on the tactics that you lose sight of the overall goal of what you are doing, whether that is to succeed in business or win a war.  Don't fall so in love with the technologies you use that you can't see when the technology won't work.

I made it through 3/4 of the book before having to return it with my comments.  I assume the remaining sections are just as interesting and informative.  Here is a brief description of each section I read.

  1. Is the West Losing the Information Productivity Contest?
  2. Hitler's Decision to Attack the Soviet Union, 1941.  This was your classic story of a misguided leader with no one strong enough to tell him, "no."  At the same time, the Soviet Union continued trying to work with Germany in an attempt to avoid war.  In the end, Germany attacked and discovered a variety of gaps in its intelligence, chief among them that the Red Army was far stronger than expected.
  3. Three Mile Island: The Information Meltdown.  The 1979 meltdown at Three Mile Island was part of my education in chemical engineering.  The control room was a maze of lights and indicators, manageable during normal operations.  But in an emergency, there are so many things happening that operators cannot see the big picture through all the flashing lights and indicators.  On top of that, there were known problems with the facility that had either been ignored or swept under the rug.  There was even a temperature sensor accurately reading a massive temperature excursion, but since it was never expected to read above 700 degrees, its reading sat pegged at that maximum.  And then the emergency communication system broke down, with some people urging evacuation and others saying "everything's fine."
  4. The Tacoma Bridge Disaster: A Lesson in Disregarding Information?  Every university physics student has seen the dramatic footage of the Tacoma Narrows suspension bridge oscillating and then collapsing in the wind.  This piece claimed that the designer essentially ignored or was unaware of 100 years of information on wind-induced oscillations in suspension bridges.  The most damning data was a table comparing a number of critical parameters for Tacoma and other suspension bridges of about the same era.  Tacoma's parameters were at least an order of magnitude different, if not multiple orders.
  5. Cultural Dissolution, a Societal Information Disaster: The Case of the Yir Yoront in Australia.  This was an interesting account of the dissolution of a native population in Australia.  They were still operating with stone age technology and within a very strict social order built around that technology.  When well-meaning westerners arrived, they brought new technologies that ended up breaking the social order and very quickly destroying the culture.  This happened in the mid-1900s.
  6. Disaster at Arnhem: The Role of Information During the Operation 'Market Garden' in September 1944.  The Allies needed access to the sea port at Amsterdam, then occupied by the Germans, so they planned an attack at Arnhem to isolate Amsterdam.  The attack involved land and air support and was executed very quickly without testing critical communications equipment and without ensuring understanding across multiple command units.  The result was a massive set of gaffes and mistakes and a setback for the Allies.  Radios were broken.  Military units didn't move to where they were expected to be.  The Allies weren't aware of recent German movements.  The Allies didn't take advantage of the Belgian and Dutch underground, who had access to communications.  And there was no one in direct command of the entire operation.
  7. The PPS Information System Development Disaster in the Early 1980's.  This article was set up as a description of a terrible dream from which the project manager awoke, only to realize that the dream was a perfect description of everything that had gone wrong with a massive Personnel Payroll System.  The nine components of what went wrong included several familiar problems with big installations.  My favorite was that the proposed new system was not going to fix anything that was really wrong with the university as a whole.  It wasn't going to increase enrollment or reduce real expenses.  And after spending $4 million in four years, they barely had a workable system.
  8. The Events of October 1987.  This section focused on two disasters in England: one was a massive storm that damaged 1 in 6 homes in England (~ 1 billion Pounds Sterling), and the other the global stock market crash that wiped 20% from the value of the market.  In both cases, the information disaster revolved around relying on technology rather than thinking critically about the data being presented and whether the systems could operate under those conditions.
  9. The Pinnacle of Deception: Civil War Intelligence and Signals in 1864.  This article suggested that the US Civil War was the first war in which significant spying and counter-intelligence efforts were used on both sides of the conflict.  The South fed false information to the North in order to distract them and pull the North away from critical Southern supply lines and weak defenses.  At that time, 140 years ago, spying consisted of monitoring each other's flag signalers and decrypting the messages.  The North was easily fooled into believing the false evidence because it fed into familiar fears about what the South might do. 
  10. $170,000 Down the Drain: The MRAIS Story.
  11. Comments on Gaskill's 'Timetable of Failure'.
  12. The US Stock Market Crash of 1987: The Role of Information System Malfunctions.
