Thursday, April 17, 2014

Trapped in the Net

From the Bookshelf Dept.: I remember reading the book; I remember writing the review. I just can’t figure out who I wrote it for. Local newspaper? Computer magazine? My archives are mum on this point, but re-reading the review reminds me that it was a pretty prescient volume.


WE ALL KNOW THE DESTINY of that new personal computer headed for the home: no matter how ardently you declare it a tool for personal productivity, it becomes a game center and a frivolous Internet access screen. And, to a large extent, a waste of time.

The scenario isn’t all that different in a business environment. As financial trading, for example, became more computer-based, two problems emerged: the abstraction of moving money with a computer, and the undiscriminating tools of automation. Both probably contributed to the stock market crash of October 1987:

“Many experts blamed [the crash] ... largely on computerized trading programs,” writes Gene I. Rochlin in Trapped in the Net, “which kicked in at the first steep decline, turning what would otherwise have been only an unusually bad day into an automated panic, driven entirely by electronic transactions.”

Rochlin, a professor of social science at the University of California, Berkeley, has been studying the effects of computerization on a number of critical areas of our lives, such as finance, the airline industry, the military, and nuclear power plants. While computers are installed with the promise of better control and simplified jobs, nasty, far-reaching effects are emerging with little public scrutiny.

Subtitled “The Unanticipated Consequences of Computerization,” Rochlin’s book takes a sober look at those effects. In the case of the stock market crash, he writes, “markets are symbols – promises rather than tangible goods,” and thus all the more susceptible to the robotic characteristics of computer control. Traders no longer concentrate on the market floors, so regulators can no longer use old-fashioned methods of riding herd on the markets.

The downfall of Barings P.L.C. in 1995 at the hands of Nicholas Leeson is cited as another instance of computerization run amok, in this case granting Leeson unprecedented access, power, and independence. Orange County, CA, treasurer Robert L. Citron was another whose uncontrolled speculation bankrupted the county.

Computerization also demands upkeep; in the wake of the World Trade Center bombing, it was discovered that few of the computer networks affected by the blast had reasonable backup and recovery strategies.

Can computers process and impart too much information? In U.S. Navy lingo, you “have the bubble” when you’re able to make sense of the overwhelming amount of data on display to manage an aircraft carrier–and not even seasoned officers can absorb that information without developing a cognitive model to assimilate it, something that requires both constant access and an almost intuitive way of taking in the data. Thus, duty shifts are limited to two hours, lest one of those specialists “lose the bubble.”

Even having the bubble doesn’t guarantee success. In 1973, the Israeli Air Force brought down a Libyan airplane that had veered off course thanks to a sandstorm over Egypt. Both the Israeli and the Libyan crews had cognitive models for what seemed reasonable in different contexts. For the Israelis, it was a model of attack; the Libyan pilots continued to believe they were approaching Cairo until it was much too late. Similar circumstances prevailed in the 1987 attack by an Iraqi Mirage jet on the USS Stark.

The book presents many such case studies. The Airbus A320, a civilian airliner with an unprecedented amount of computer control, has crashed spectacularly; other problems, both in the cockpit and in the air traffic control tower, are chillingly detailed. Air traffic controllers hang onto many old-fashioned aspects of their job, such as the lengths of paper, called Flight Progress Strips, on which are scribbled the identity and coordinates of nearby sky traffic. This ensures that the controller remains actively involved with what’s out there, and the system endures in spite of attempts to put computers in more control.

But air traffic controllers are highly visible, Rochlin observes, and thus easily able to get their way. Such isn’t the case in factories and plants, particularly at potentially dangerous sites like nuclear power plants. Again, specialist employees develop cognitive systems for dealing with computer-provided information, and they resent the changes wrought by technicians with no comparable real-world experience.

Trapped in the Net doesn’t offer much in the way of solutions, but it is certainly valuable as a warning about this dangerous interface between humans and machines. Rochlin’s writing style is drily professorial, but his research is impressive and his presentation logical and effective. Far from the popular sci-fi tales of a sentient machine taking cruel control of the world, he shows us that a scary scenario is already taking place as computers fail to replicate human cognitive skills–or numb those skills into oblivion. You’ve already had a taste of it at your bank and possibly on the job; much more frightening are such problems at thirty thousand feet.

Trapped in the Net by Gene I. Rochlin
Princeton University Press, 293 pp., $29.95

– 16 October 1997
