David Goodstein, Academe
Scientists aren't saints. Although few falsify results, the field is so competitive that many misbehave in other ways.
My career in scientific misconduct began more than a decade ago. That's when I realized that federal regulations would soon make it necessary for all universities to develop formal rules about what to do if the unthinkable were to happen and scientists at their institutions were suspected of fraudulently misrepresenting the results of an investigation, or the procedures needed to replicate those results.
Since then, scientific misconduct has become a virtual academic subspecialty for me. I have given lectures, written articles, and taught courses about it. I have also drafted regulations, seen them adopted by my institution (the California Institute of Technology), copied by other universities, and, much to my dismay, put into action in a high-profile case at Caltech.
During that case, I had the remarkable experience of seeing a skilled lawyer, with a copy of my regulations highlighted and underlined in four colors, guide participants in following every word I had written, whether I had meant it or not. Through all of that, I have learned things about conduct and misconduct in science that I would like to share with you.
Let me begin by stating right up front what I have come to believe. Serious misconduct, such as faking data, is rare. When it does occur, it is almost always in the biomedical sciences, not in fields like physics, astronomy, or geology, although other kinds of misconduct do happen in these fields. Science is self-correcting, in the sense that a falsehood injected into the body of scientific knowledge will eventually be discovered and rejected. For just that reason, dissemination of falsehoods is never the purpose of those who perpetrate scientific fraud. Still, active measures to protect science are needed, because if the record became badly contaminated by fraudulent results, it would no longer be self-correcting.
For a long time, the government made a mess of trying to protect science. Government agencies performed poorly in this area partly because they mistakenly tried to obscure the important distinction between real fraud and lesser forms of misconduct.
In addition to these observations, I have also concluded that we scientists are complicit in presenting to the public a false image of how science works, which can sometimes make normal behavior by scientists appear suspect. Let me try to explain what I mean by all of this.
First, a word about terminology. People are touchy about words in this business. When a philosopher colleague and I decided to offer a course in this subject, we wanted to call it "Scientific Fraud." But the faculty board, in its wisdom, didn't want us teaching that to our students, so we had to call it "Research Ethics." The federal government, in all its gyrations, has to this day studiously avoided using the word "fraud" in connection with scientific misconduct, because in civil law that word has a specific meaning. I, however, am not afraid to call a fraud a fraud.
Intent to Deceive
Fraud means serious misconduct with intent to deceive. Intent to deceive is the very antithesis of ethical behavior in science. When you read a scientific paper, you can agree or disagree with its conclusions, but you must be able to trust its account of the procedures used and the results produced by those procedures.
To be sure, minor deceptions arise in virtually all scientific papers, as they do in other aspects of human life. For example, scientific papers typically describe investigations as they logically should have been done rather than as they actually were done. False steps, blind alleys, and outright mistakes are usually omitted once the results are in and the whole experiment can be seen in proper perspective. Also, the list of authors may not reveal who deserves most of the credit (or blame) for the work. …