Chapters 2 through 6 show a substantial gap between public opinion as viewed by practitioners and as measured by polls. The most obvious interpretation of the gap is that practitioners are misreading the public. But another interpretation is possible: that the polls themselves offer a misleading or incomplete picture. They may fail to get at dimensions of public opinion that are particularly important to those who make policy. Pollsters may not be asking the right questions, or they may be asking them in a way that does not reveal the whole story.
Our project was structured to explore the possibility that the practitioners might be right. Once our interviews confirmed the existence of the gap, we entered a dialogue with practitioners aimed at generating challenges to the findings of public opinion surveys. We pursued this dialogue through a series of workshops in which we presented practitioners with public opinion data and encouraged them to challenge it. In particular, we sought to elicit ideas for poll questions that might reveal aspects of public attitudes on international engagement not captured in previous surveys.
Three such workshops were carried out in the spring of 1996 with officials from the State Department, the National Security Council, the Defense Department, the Office of Management and Budget, congressional staff members, journalists, and representatives of several non-