Regulations on experiments involving human subjects are not always followed to the letter.

The watchdogs that oversee the ethics of human research projects can sometimes provoke scientific misconduct. That is the counter-intuitive conclusion of a series of papers to be published over the next few months. The authors, who specialize in research ethics, say they have evidence that some ethics panels are alienating researchers and inadvertently promoting deceit.

Patricia Keith-Spiegel of Simmons College in Boston, Massachusetts, says she began her studies after hearing of cases at other US institutions where scientists who felt they had been mistreated by institutional review boards (IRBs) went on to violate research rules. Experiments involving human subjects in the United States, from social-science studies to medical research, must be approved by an IRB. Researchers acknowledge that the boards are necessary to ensure that subjects are treated ethically, but they sometimes complain that the boards fail to understand the research involved and do not explain their decisions properly.

As an example, Keith-Spiegel cites a researcher she knows who, frustrated by lengthy IRB review times, routinely began collecting data before receiving approval. Another researcher admitted to omitting parts of research protocols submitted for review after receiving demands for numerous “picky” changes. Typical IRB requests include changes to consent forms or restrictions on the type of questions that subjects can be asked.

“I realized that there are scientists who want to do things the right way but who are having to distort their research protocols because of perceived unreasonable or ridiculous demands from IRBs,” says Keith-Spiegel.

These fears are backed up by a survey of misconduct rates among 3,000 researchers funded by the US National Institutes of Health. Published earlier this year, it found that a third of respondents had engaged in one of ten types of misconduct in the past three years (see Nature 435, 718–719; 2005, doi:10.1038/435718b, and B. C. Martinson et al. Nature 435, 737–738; 2005). Further analysis of the survey data, to be published next March in the Journal of Empirical Research on Human Research Ethics, shows that misconduct rates were highest among researchers who felt that they had been unfairly treated by other governing bodies in science, such as funding review panels. A similar relationship is likely to exist between misconduct and the perception of unfair treatment by IRBs, says Brian Martinson of the HealthPartners Research Foundation in Minneapolis, Minnesota, lead author on the two studies.

Keith-Spiegel has also studied the issue by asking scientists for their opinions on fictional scenarios in which an IRB refused a researcher permission for a study. When the IRB responded in a curt manner, rather than explaining its decision, respondents empathized with the rejected researcher and recommended a lighter punishment if that researcher went ahead and ran the study anyway. The results are still being analysed, but Keith-Spiegel says they seem to suggest that researchers are more open to committing misconduct if they feel wronged by an IRB.

Keith-Spiegel and Martinson say that their findings can be explained by organizational justice theory, a well-established framework for studying workplace relationships. Studies of work environments outside science have shown that employees are more likely to commit misconduct if they feel that their managers are not giving them due reward or are treating them unfairly. In a paper due to appear in next month's Ethics and Behavior, Keith-Spiegel argues that the same relationship can exist between IRBs and scientists.

Ethics committee chairs who spoke to Nature say they try to avoid such problems by maintaining a good relationship with scientists. “I've certainly heard of problems,” says Leigh Firn, chairman of an IRB at the Massachusetts Institute of Technology. “But we don't see ourselves as the ethics police. Unless it is something of substance we won't request changes.” Brian Shine, a consultant at Oxford Radcliffe Hospitals Trust, UK, and chairman of a local ethics committee, adds that he invites researchers to meetings to discuss potential problems and always writes to them afterwards to clarify the discussion.