Reporting with the Tools of Social Science: 'We Had Put the Social Scientists on Notice That Journalists Increasingly Would Be Competitors in Their Field.'
Doig, Stephen K., Nieman Reports
Cynics would say "precision journalism" is an oxymoron, like "unbiased opinion" or "civil war." But precision is an ideal to be sought in journalism, though not often achieved.
As defined by Knight Ridder reporter Philip Meyer in his groundbreaking 1973 book of the same name, precision journalism is the use of the tools of social science to replace, or at least supplement, reporters' time-honored methods of citing anecdotal evidence and doing educated guesswork. Today, thanks to Meyer's call, I'm one of hundreds of investigative reporters who have crafted serious stories using such tools as survey research, statistical analysis, experimentation and hypothesis testing. It's social science done on deadline.
As it happens, I got a glimmering of these methods even before I discovered journalism as a career. In my freshman year at Dartmouth in 1966, I slogged my way to a so-so grade in calculus, the last math course I ever took. I remember little calculus today, but I did learn something my professor, John Kemeny, had coauthored two years earlier: the computer language BASIC. I thought it very cool that I could peck a few lines of English-like instructions into a teletype machine and seconds later a mainframe computer somewhere on campus would calculate precinct-level vote percentages for my American Government homework.
However, it was 15 years before I got a chance to start applying such methods to my journalism. The problem was that much of what Meyer recommended could best be done with a computer, which during the 1970's meant big-iron mainframes that only universities or corporations could afford. It was nearly a decade before personal computers were developed and became usable by nontechies like me.
By then, in 1981, I was a reporter in The Miami Herald's state capital bureau, and I had bought an Atari 800 computer to play with at home. I quickly realized that my expensive toy could help me do my job better. I relearned BASIC, then persuaded my editors to buy one of the new-fangled IBM PCs for me to use at work. At one point, I spent a week writing and debugging a program that would take a legislative roll call vote and produce cross-tabs not only by party but also by such other revealing political demographics as race, gender, geography, leadership position, and source of campaign contributions. It would even write the roll call agate we appended to legislative stories. (Today, of course, such an application could be built in minutes with off-the-shelf database software.)
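As the parenthetical suggests, the week of BASIC I spent then collapses to a few lines now. A minimal sketch of that kind of roll-call cross-tab in Python; the legislators, demographic fields, and vote records here are invented for illustration, not drawn from the original program:

```python
from collections import Counter

# Hypothetical roll-call records: (legislator, party, gender, vote)
votes = [
    ("Smith", "D", "F", "Yea"),
    ("Jones", "R", "M", "Nay"),
    ("Lee",   "D", "M", "Yea"),
    ("Khan",  "R", "F", "Yea"),
]

def crosstab(records, dim):
    """Tally Yea/Nay counts grouped by one demographic dimension."""
    idx = {"party": 1, "gender": 2}[dim]  # column holding that dimension
    return Counter((rec[idx], rec[3]) for rec in records)

print(crosstab(votes, "party"))   # Yea/Nay counts broken out by party
print(crosstab(votes, "gender"))  # the same vote, broken out by gender
```

Extending this to race, geography, leadership position, or campaign contributions is just a matter of adding columns to the records and entries to the dimension map, which is why off-the-shelf database software handles it so easily today.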
A couple of years later, I got to meet Meyer for the first time. He did a day-long training seminar at the Herald attended by a few of us, including Richard Morin, who later would go on to be the polling director for The Washington Post for nearly 20 years. Rich and I, in particular, came away from that seminar inspired to be precision journalists.
So I spent the next decade at the Herald teaching myself, in bits and pieces, the social science tools I hadn't had enough sense to study when I was in college, from statistics to cartography. An example of my academic cluelessness came in 1991, when I was working with Knight Ridder colleagues Dan Gillmor and Ted Mellnik on a project about racial segregation in the United States using just-released census data. I spent days trying to noodle together some way to measure the degree of segregation in a community, but nothing useful emerged. I finally mentioned my frustration to Mellnik, who then mentioned it to a friend who was a sociologist. "Oh, you want the dissimilarity index," the friend promptly replied, giving Mellnik a citation for an article describing it in a scholarly journal from the 1950's.
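The tool the sociologist named is simple once you know it exists: the index of dissimilarity measures what fraction of one group would have to move to a different subunit (say, a census tract) for the two groups to be evenly distributed. A minimal sketch in Python; the tract counts below are invented for illustration:

```python
def dissimilarity_index(group_a, group_b):
    """D = 0.5 * sum over tracts i of |a_i/A - b_i/B|,
    where A and B are the citywide totals of each group.
    0 means perfect integration, 1 means complete segregation."""
    total_a = sum(group_a)
    total_b = sum(group_b)
    return 0.5 * sum(abs(a / total_a - b / total_b)
                     for a, b in zip(group_a, group_b))

# Hypothetical tract-level counts for two groups in one city:
group_a = [100, 200, 300]
group_b = [300, 200, 100]
print(round(dissimilarity_index(group_a, group_b), 3))  # prints 0.333
```

A value of 0.333 would mean a third of either group would need to change tracts to even out the distribution; computing it for every city and county is then just a loop over the census file.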
Armed with that already-invented tool, a month after the decennial census data was released we produced our analysis of how segregation had--or hadn't--changed in every state, county and city across the country since 1980. …