Research Replication in Behavior Analysis

Research replication is a core tenet of science. Thus, the surprising recent report that only 60 of 100 attempted replications of psychology studies succeeded has raised general concerns about “scientific” findings in psychology and their application. The report is particularly troubling because, in these failures to replicate, there was no evidence that the original experimental data had been fabricated or otherwise manipulated. Although there may be valid reasons for such failures to replicate (and none of the studies were related to behavior analysis), the findings raise the question of how trustworthy behavior-analytic research is as well.

A major difference between the “traditional” research described in that report and behavior-analytic research is that behavior analysis has replication built into every experiment. An effect is shown by comparing the treatment condition to a control condition, and every participant serves as her own control. Because of this, within every experiment there are as many replications of the effect, called “direct replications,” as there are participants. Another type of direct replication occurs when, after an effect has been shown, the treatment is withdrawn and the baseline in the absence of any treatment is re-established. Reinstating the treatment then rounds out the analysis by showing that the effect recurs. In some situations, of course, it is unethical to withdraw and reinstate a treatment that is working, but in such cases behavior analysts use other techniques to ensure that the treatment is having its desired effect.
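To make the logic of direct replication concrete, here is a minimal sketch using entirely made-up numbers. It shows how a reversal (ABAB) design demonstrates the effect separately for each participant; the participant labels and response rates are hypothetical, not data from any actual study.

```python
# A minimal sketch of how a reversal (ABAB) design provides direct
# replication: each participant's behavior is compared against that
# same participant's own baseline. All numbers are invented for illustration.

# Hypothetical responses per minute for three participants across
# baseline (A1), treatment (B1), withdrawal (A2), and reinstatement (B2).
data = {
    "P1": {"A1": 12, "B1": 34, "A2": 14, "B2": 36},
    "P2": {"A1": 8,  "B1": 25, "A2": 9,  "B2": 27},
    "P3": {"A1": 15, "B1": 40, "A2": 16, "B2": 41},
}

for participant, phases in data.items():
    # The effect is directly replicated within a participant if behavior
    # changes with treatment, returns toward baseline when the treatment
    # is withdrawn, and changes again when the treatment is reinstated.
    replicated = (
        phases["B1"] > phases["A1"]
        and phases["A2"] < phases["B1"]
        and phases["B2"] > phases["A2"]
    )
    print(f"{participant}: direct replication = {replicated}")
```

Each line of output reports whether that one participant, serving as her own control, showed the effect; no group comparison is needed.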

If a participant fails to show an effect at a given treatment level, behavior-analytic research does not “average out” the failure by lumping all of the data together and taking an average effect of the treatment across all the participants (see “Average Joes and Josephines”). Rather, the treatment often is “tweaked” by trying a different level until an effect is shown. Of course, sometimes such tweaking still does not work, but, again, instead of being averaged out, the failure is reported alongside the positive findings and identified as an anomaly that invites further analysis.
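A small numeric illustration, with hypothetical participants and made-up response rates, of why averaging can be misleading: a group mean can look like a solid effect even when one participant showed none.

```python
# A minimal sketch, with invented numbers, of why behavior analysts report
# individual results rather than only a group average: an average can hide
# a participant for whom the treatment did not work.

baseline  = {"P1": 10, "P2": 12, "P3": 11, "P4": 13}   # responses per minute
treatment = {"P1": 30, "P2": 28, "P3": 11, "P4": 31}   # P3 shows no effect

changes = {p: treatment[p] - baseline[p] for p in baseline}
average_change = sum(changes.values()) / len(changes)

print(f"Average change: {average_change:.1f} responses per minute")  # looks like a clear effect
for participant, change in changes.items():
    status = "effect shown" if change > 0 else "anomaly: invites further analysis"
    print(f"{participant}: change = {change:+d} ({status})")
```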

Effects of treatment often vary as a function of the exact levels of the treatment conditions that are studied. For example, the response rate of a pigeon varies in a very orderly way in relation to the frequency with which a key-peck response is reinforced: more frequent reinforcement yields higher response rates. Showing such an orderly effect for each participant as the treatment varies, called a “systematic replication” in behavior-analytic parlance, gives the investigator or therapist increasing confidence that the treatment effects are “real.” Such systematic replication can occur within an experiment, when a participant is exposed to a number of treatment conditions whose values differ slightly from one another, and across many experiments that involve variations on the treatment.
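As a rough sketch of what systematic replication looks like in data, the hypothetical values below check whether each subject shows the same orderly relation, higher reinforcement frequency yielding a higher response rate, across several treatment values. The subject names, reinforcement rates, and response rates are all assumed for illustration.

```python
# A minimal sketch of systematic replication: each subject is checked for
# the same orderly relation (more frequent reinforcement -> higher response
# rate) across several treatment values. All numbers are hypothetical.

# Key pecks per minute at reinforcement rates of 20, 60, and 120 per hour.
subjects = {
    "Pigeon 1": [(20, 35), (60, 58), (120, 74)],
    "Pigeon 2": [(20, 28), (60, 51), (120, 69)],
    "Pigeon 3": [(20, 40), (60, 62), (120, 80)],
}

for subject, observations in subjects.items():
    # Sort by reinforcement rate, then keep only the response rates.
    rates = [response_rate for _, response_rate in sorted(observations)]
    # The relation is systematically replicated in this subject if response
    # rate increases monotonically with reinforcement frequency.
    orderly = all(earlier < later for earlier, later in zip(rates, rates[1:]))
    print(f"{subject}: orderly relation replicated = {orderly}")
```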

There are never airtight guarantees in any science that an effect is replicable. In behavior analysis, as in other sciences, there are cases of fraud and scientific misconduct. Such unethical behavior is rare, but it does occur. Barring these egregious incidents, which cause the public to lose faith in any science’s findings, we must look to the methods of the science to understand why a finding reported by one investigator is not replicated by another. Perhaps a requirement of scientific reporting should be that a finding has to be replicated before it can be reported. Such is regularly the case in behavior analysis, through the direct and systematic replications discussed herein, and it seems a good criterion to apply to every psychological study.

Posted by Andy Lattal, Ph.D.

Dr. Andy Lattal is the Centennial Professor of Psychology at West Virginia University (WVU). Lattal has authored over 150 research articles and chapters on conceptual, experimental, and applied topics in behavior analysis and edited seven books and journal special issues, including APA’s memorial tribute to B. F. Skinner.