An unknown man cruised into the corporation's private parking lot, flashing the security guard his driver's license instead of a company ID. He entered corporate headquarters and advanced through the marble and glass lobby. The officer at the sign-in podium never even looked up. From there, it was a quick elevator ride to the executive suite where he displayed to the secretary a "personal and confidential" letter for the CEO. "Sure, take it in yourself," she said. "The boss is in there now."
The man delivered his message with a flourish, tossing it onto the CEO's walnut desk. The boss looked up in horror. "Gotcha!" it read. The perpetrator was the security contractor's district manager, out to prove that a security breach could happen easily.
Many opinions were voiced in the aftermath of the stunt. Some said the problem was the low wages paid to the security officers. Others thought it was the lax work ethic of the employees. Still others focused on a lack of proper training. The eventual outcome - and the one that the district manager wanted - was a 50 percent increase in the contractor's permanent working hours. But was this the best approach to better security?
No. A better approach is human factors psychology, the application of behavioral science to workplace performance issues. Human factors methods, especially signal detection theory (SDT), can be used to diagnose performance problems, reduce threats, improve security officer morale, and save time and money.
Hit or miss. A major element of a security officer's job is to detect rare events. Once in a while, a driver like the district manager tries to enter a private lot with a false ID. Similarly, an occasional visitor will fail to sign in at the lobby security podium. The security officer's duty is to decide whether or not these actions indicate that danger is present.
When faced with a potential "rare target," the security officer's decision results in one of four outcomes, three of which have consequences. The officer can take action and intercept a genuine threat - a hit, as it is called; the officer can take action in the absence of a threat, raising a false alarm; the officer can take no action and experience no negative outcome - a correct rejection; or the officer can take no action and allow a crime or other security breach to occur - a miss.
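The four outcomes form a simple two-by-two matrix of threat versus response. A minimal sketch in Python (the function name is my own, not terminology from the article):

```python
# Classify a security decision into one of the four SDT outcomes,
# following the article's labels: hit, miss, false alarm, correct rejection.

def sdt_outcome(threat_present: bool, officer_acted: bool) -> str:
    if threat_present and officer_acted:
        return "hit"
    if threat_present and not officer_acted:
        return "miss"
    if not threat_present and officer_acted:
        return "false alarm"
    return "correct rejection"

print(sdt_outcome(True, True))    # hit
print(sdt_outcome(False, True))   # false alarm
```

Only the correct rejection carries no consequence; the other three cells are where rewards and penalties attach.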
A hit could stop a thief or an ex-employee bent on revenge. For security officers and their supervisors, a hit might mean a bonus. A miss can lead to real trouble - an injured employee, stolen money or merchandise, or theft of company secrets. Too many misses and a security contractor might be terminated. The third outcome with a consequence - a false alarm - will be tolerated to a point, but too many false alarms give an officer a detrimental hair-trigger reputation.
Signal detection theory. Signal detection theory is a systematic, data-based approach to human vigilance that can help maximize the likelihood of a hit, whether the target is a false ID or a weapon in a suitcase, while minimizing false alarms. Applied wherever rare events must be caught, SDT has improved the detection of defects by inspectors in automotive, glass, and aircraft manufacturing operations. SDT has also been used to increase a physician's chance of detecting hard-to-find breast tumors.
Two factors fundamentally affect the detection of rare targets, regardless of whether the target is a tumor in human tissue, a bubble in a glass windshield, or a mugger in the bushes. First is the detectability of the target, or the ratio of the strength of the stimulus to the "background noise." Second is the bias, or expectancy, that a target will actually appear.
Detectability. In metal detection, for example, an easy target such as a .44 Magnum revolver would have a very high hit rate and a low false alarm rate. A difficult target, on the other hand, would be a weapon such as the largely plastic Glock 17, which contains less than half the steel of a .44 Magnum. The chances of both a hit and a false alarm are lower with a difficult target. A weapon is rated "impossible" to detect when the chance of a hit is no better than the chance of a false alarm.
In general, detectability depends on the ratio of the signal - or target - to the background noise. For example, glare on an x-ray screen increases background visual noise and lowers the signal-to-noise ratio. Therefore, hits will be less likely. Likewise, when a security officer forgets his or her reading glasses and can't see properly, the signal-to-noise ratio declines and the chance of a miss increases.
The higher the ratio of signal to noise, the greater the chance of detection and the lower the chance of a false alarm. The district manager who made it into the boss's office had a fairly low signal-to-noise ratio: dressed in a business suit, he blended into the crowd. Had he worn a tie-dyed T-shirt, his signal-to-noise ratio would have been higher.
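In SDT, detectability is conventionally quantified as the sensitivity index d', the separation between signal and noise expressed in standard-deviation units. Under the usual equal-variance Gaussian assumption, d' can be computed from the hit and false alarm rates alone; a minimal sketch using only Python's standard library (the example rates are hypothetical):

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Equal-variance Gaussian sensitivity index:
    d' = z(H) - z(F), where z is the inverse normal CDF."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# An easy target (a mostly-steel revolver at a metal detector, say):
# high hit rate, low false alarm rate.
print(round(d_prime(0.95, 0.05), 2))  # ≈ 3.29

# An "impossible" target: hit rate equals false alarm rate, so d' = 0.
print(round(d_prime(0.50, 0.50), 2))  # 0.0
```

A d' of zero is exactly the "impossible" case described above: the officer's "I see it" responses carry no information about whether a threat is present.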
Response bias. Bias is the second factor affecting whether a security officer will catch an intruder, spot a false ID, or see the weapon in the ghost images of an x-ray. Bias can be defined as the officer's expectancy that a rare target will appear.
The effects of bias on hits and false alarms can be significant. When targets appear rarely, they become unexpected (out of sight, out of mind). When a target does appear, it is easily missed. Likewise, since the officer is biased against saying "I see it," he or she is also likely to have a low false alarm rate. On the other hand, when the chance of a threat is high, the chances of both hits and false alarms increase. In other words, because the officer expects more targets, he or she is biased toward saying "I see it" both when it is there and when it is not.
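The trade-off this paragraph describes can be sketched with the standard equal-variance Gaussian model, in which bias is a criterion c: a conservative criterion (c > 0, appropriate when targets are rare) and a liberal criterion (c < 0, when targets are expected) yield different hit and false alarm rates at the same detectability. The specific numbers below are illustrative, not data from the article:

```python
from statistics import NormalDist

def rates_for(d_prime: float, criterion: float):
    """Predicted hit and false alarm rates for an equal-variance Gaussian
    observer with sensitivity d' and criterion c (c = 0 is neutral;
    c > 0 is biased against saying "I see it")."""
    cdf = NormalDist().cdf
    hit = 1 - cdf(criterion - d_prime / 2)
    fa = 1 - cdf(criterion + d_prime / 2)
    return hit, fa

# Same detectability (d' = 2), two biases: a conservative officer who
# rarely expects targets versus a liberal one who expects them often.
for c in (1.0, -0.5):
    h, f = rates_for(2.0, c)
    print(f"criterion {c:+.1f}: hits {h:.0%}, false alarms {f:.0%}")
```

Shifting the criterion moves hits and false alarms together, which is the pattern the text describes: expecting more targets buys more hits at the price of more false alarms.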
Recently, a three-month experiment confirmed that the frequency of a threat does indeed affect the chance of detecting the threat. False images of handguns and knives were occasionally projected onto x-ray images of carry-on baggage screened at an airport checkpoint. A seven-fold increase in test weapons among normal baggage led to a 33 percent increase in the probability of a hit, along with a 30 percent increase in the probability of a false alarm.
At one airport security checkpoint in 1989, there was an 88 percent hit rate for test objects surreptitiously passed through the checkpoint by the FAA and the airlines. When the number of test objects hidden in baggage was raised by 20 percent, the bias of the security screeners was altered and the probability of a hit rose to 95 percent. As expected, false alarms also increased slightly, but stayed within tolerable limits.
Reinforcements. Reward or correction affects bias in the same way that the rate of threat presentation does. If, for example, an x-ray operator gets $50 for every threat detected, the probability of a hit will increase along with false alarms. Likewise, if the operator's job depends only on detecting a few standard test objects, that is exactly what he or she will look for, to the exclusion of other threats.
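The payoff logic in this paragraph has a standard SDT formulation: the criterion that maximizes expected payoff is the likelihood-ratio value beta = [(V_correct-rejection + C_false-alarm) / (V_hit + C_miss)] x P(noise)/P(signal), where a lower beta means a bias toward saying "I see it." A minimal sketch with entirely hypothetical payoff figures:

```python
def optimal_beta(p_signal: float, value_cr: float, cost_fa: float,
                 value_hit: float, cost_miss: float) -> float:
    """Payoff-optimal likelihood-ratio criterion for an SDT observer:
    beta = [(V_cr + C_fa) / (V_hit + C_miss)] * P(noise) / P(signal).
    Lower beta means a bias toward responding "I see it"."""
    payoff_ratio = (value_cr + cost_fa) / (value_hit + cost_miss)
    return payoff_ratio * (1 - p_signal) / p_signal

# Hypothetical figures: threats appear 1% of the time. Raising the reward
# for a hit from 1 to 50 (the $50-per-detection bonus) lowers the optimal
# criterion, shifting bias toward more hits AND more false alarms.
print(optimal_beta(p_signal=0.01, value_cr=1, cost_fa=1,
                   value_hit=1, cost_miss=10))   # ≈ 18.0
print(optimal_beta(p_signal=0.01, value_cr=1, cost_fa=1,
                   value_hit=50, cost_miss=10))  # ≈ 3.3
```

The same formula captures the target-frequency effect from the preceding paragraphs: increasing p_signal alone also lowers beta, which is why seeding more test weapons makes screeners more willing to call a threat.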
Rewards do not have to be monetary. They can be compliments, small trophies, notes from the boss, or other kinds of positive reinforcement. This point was shown dramatically during an experiment at a major European airport. Detection of simulated weapons concealed in baggage declined from 85 percent to 35 percent over a three-month period simply because positive feedback to the x-ray operators was totally eliminated.
Corrective feedback is also necessary. The perpetrator of the "Gotcha!" message made it through the lobby of the company's headquarters without a challenge because the officer at the sign-in podium had never been complimented for catching someone who failed to sign in, nor reprimanded for failing to catch them. Without consequences for good and bad performance, overall performance declines.
Each person in the security chain at the headquarters in the opening example - the parking lot guard, the security officer, the executive secretary, each of whom was charged with detecting a rare event - was biased against saying "Stop." The best approach is to alter staff bias with simulated targets and to provide positive feedback when targets are caught and corrective consequences when they are not.
Aptitude. People differ in their ability to detect visual signals within visual noise. For security companies charged with finding the odd weapon in a blizzard of suitcases or the forged ID card among hundreds of good cards, it is important to measure aptitude for the job before an applicant is hired. An aptitude test that assesses a person's ability to find visual signals within visual noise can help rule out those who are likely to make mistakes.
Aptitude test scores have been shown to be highly correlated with detection ability. In one study, a paper-and-pencil aptitude test reliably predicted the success of potential analysts in finding contraband in x-ray images of intermodal shipping containers.
A human factors research program was aimed at selecting and training operators of a high-energy x-ray system for U.S. Customs. The operators analyzed x-ray images with a suite of computer-based enhancement tools to decide whether to pass a container through customs or recommend that it be searched. Before being trained, the fifteen candidates took the aptitude test.
It was found that the test score and the performance measure were highly correlated. For example, the highest scorer on the test, a former helicopter pilot who specialized in low-level night flying, also had the highest performance score. The score on the fifteen-minute preemployment test neatly predicted the results of a costly, week-long training course as well as a month of on-the-job training.
Whether a security officer will detect a rare threat depends on the individual's aptitude for detection, the physical properties of the threat, the background noise, the bias of the security officer, and the reinforcement or correction given for performance. Before management decides whether new equipment or additional staff are needed, the company should carefully analyze these performance factors.
Michael B. Cantor, Ph.D., lives in Atlanta. He is a human factors psychologist and consultant. Bill Maddox and Larry Parrotte of Argenbright Security, and Andy Johnson of Analytical Systems Engineering Corporation, assisted with this article.