Magazine article The Futurist

Human Factors: The Gap between Humans and Machines

Article excerpt

The most advanced fighter plane the United States has, the F-16, can perform without structural damage at a remarkable 12 times the force of gravity--12 G's. Since its introduction more than a decade ago, several dozen Air Force pilots have died flying the F-16, because even the best pilots with the best training and the best equipment can function at no more than 9 G's. That 3-G difference is what psychologists call "human factors" -- the gap between human and machine capabilities.

As advanced technologies become increasingly pervasive in the workplace, the problem of human factors will increasingly confront managers. The fact that few managers have even heard of human factors -- let alone thought of how to deal with them -- is cause for great concern.

Information-systems authority Paul Strassman notes that, over the last decade, data-processing budgets for U.S. corporations increased an average of 12% a year, while productivity increases averaged only 2% a year. According to Strassman, managers are beginning to be disturbed by this discrepancy. Although we may not really know how to measure productivity in a service economy, the evidence appears unassailable that whatever it is we are measuring has not appreciably benefited from the massive influx of technology.

Productivity expert George H. Kuper, head of the Manufacturing Studies Board of the National Academy of Sciences, says that managers are reluctant to face up to the "social revolution" required to effectively integrate technology into the workplace. The implication, of course, is that vast sums of money have been spent on advanced technologies without any real understanding of how people might interrelate with these technologies.

In offices, the most troubling manifestation of the human factors problem has to do with information. Machines can now give us more information more quickly than we can possibly absorb it. There is no possibility of understanding all the information or of utilizing it effectively.

Food companies, for example, used to get market information monthly. Now they get it weekly, and marketing people are still struggling to process one week's information and make the right decisions based on it when the next week's information arrives. Before long, they will be receiving that information daily, and the prospect fills them with dread. Yet, they cannot avoid getting it, because they fear appearing to do less than their competitors.

We are in the grip of a seemingly irresistible technological imperative that, in effect, says, "Go, at all costs, go." But we must pause and think about what we really need and whether available technology can help us get what we need.

Philosopher Daniel Dennett says that information technology can ruin our lives unless we think of some radical ways to get it under control. He points out that, in the past, people could lead virtuous lives of "unavoidable ignorance." We can no longer be ignorant with a clear conscience. We now have unlimited opportunities to know, and this information obligates us to act. At the same time, as Dennett points out, "we drown in the available information, unable to make ...decisions."

Anthony Smith, director of the British Film Institute, adds that the overabundance of information has resulted in a loss of the advantages of distance and doubt. Where information is complete, there is no room for intuitive speculation. When information is immediate, the time available to reflect, ponder, and consider in decision making is shortened.

In other words, the availability of information mandates its use. As a consequence, there is an implied obligation to use all the available information when making any decision. And when the information is more than the human brain can process effectively -- as it increasingly is -- the result is not faster decision making, as was optimistically anticipated, but a delay in or even abdication of decision making. What happens is what author Stan Lee, in his wry satire on the information age entitled Dunn's Conundrum, called "negative information" -- that is, information that reduces rather than increases one's knowledge. …
