Journal of Research Administration

Why Do Ethical Scientists Make Unethical Decisions?

Never let your sense of morals get in the way of doing what's right.

~Isaac Asimov

In December of 2002, the Office of Science and Technology Policy defined research misconduct as "fabrication, falsification, or plagiarism" (FFP)--the "high crimes"--in proposing, performing, or reviewing research, or in reporting research results (OSTP.gov). However, as discussed below, some commentators suggest there is a much wider--and grayer--area of misbehaviors and faulty decisions that are not captured in this limited definition. If these troubling practices are allowed to continue unchecked, they will eventually erode any attempt to establish a solid foundation for the responsible conduct of research.

Martinson, Anderson and de Vries (2005) state that serious misbehavior in research is important for various reasons, not least because it damages the reputation of science and undermines public support for it. They suggest that, in light of the public's penchant for headline-grabbing cases of scientific and medical misconduct, the research community can no longer afford to ignore the ever-widening array of integrity issues.

The question always is: "Why?"

Martinson, Anderson and de Vries (2005) surveyed several thousand early- and mid-career US scientists funded by the National Institutes of Health (NIH) and asked them to report their own behaviors. Although the survey did not attempt to link specific behaviors to specific incidents, the results yielded a range of questionable practices. These results force a closer examination of the "negative aspects of the research environment."

The modern scientist faces intense competition for limited research grants, which can create many scenarios for compromise that extend well beyond FFP (Martinson, Anderson and de Vries, 2005). The survey authors state: "In ongoing analysis, not yet published, we find significant associations between scientific misbehavior and perceptions of inequities in the resource distribution processes in science." These behaviors undermine the scientific process, could lead to "misuse of public monies," and generally foster an environment that lacks integrity (Mitchell, 2005). Lower (2005) is more blunt: "Corporate America provides a research environment that is not particularly conducive to good scientists or good science."

I suggest that there is more to the issue than simply succumbing to the pressures of "publish or perish" or the demand to "show me the money." One needs to consider where core belief systems come from and how they may be affected by outside influences.

Values, beliefs, morals, ethics and integrity are intricately interwoven concepts and are consistently--albeit mistakenly--used synonymously. Benefiel (n.d.) says that values are learned from childhood; these are the beliefs that children absorb from those who raise them and from their immediate surroundings. Benefiel (n.d.) goes further to say that morals are the intrinsic beliefs, developed from one's value system, about how one "should" behave in any given situation, and that ethics are how one actually does behave in the face of difficult situations that test one's moral fiber.

Kidder (2005) describes moral courage as, simply, the courage to be moral. To be considered moral, he says, our behavior must adhere to one of five core moral values: honesty, respect, responsibility, fairness and compassion. As one attempts to examine past incidents of scientific misconduct, the inherent breaches of research integrity, and the prevailing conditions that cause ethical scientists to make unethical decisions, one needs to understand these basic or core values, agree on some common definitions, and recognize the influence of outside forces.

Kidder (1995) notes four basic paradigms of ethical decisions: 1) justice versus mercy--fairness, equity and even-handed application of the law often conflict with compassion, empathy and love; 2) short-term versus long-term--difficulties arise when immediate needs or desires run counter to future goals or prospects; 3) individual versus community--this can be restated as us versus them, self versus others or the smaller group versus the larger; and 4) truth versus loyalty--honesty or integrity versus commitment, responsibility or promise-keeping. …
