The Behavioral Interface

At a recent luncheon I found myself in conversation with two entrepreneurs who, over the past 16 months, had created what seemed to be an excellent behavior recording and management system for tablets, smartphones, and PCs. The system was designed for behavior analysts who develop and carry out behavioral treatment programs for young children with autism spectrum disorders. Its merits were its flexibility in allowing the therapist to individualize the behavior being recorded for each child, its ability to present the results of interventions visually, in graphs, on demand, and its ability to quickly summarize treatment programs and their effects for the funding agencies that provide oversight of such programs. I was impressed by all of these facets of the system they had created and told them so.

As our conversation continued, my two luncheon companions began discussing a problem they had observed: many therapists in autism treatment centers failed to record data correctly, even when the recording system was straightforward and required little effort on the therapist's part. They were particularly dismayed that the children's very futures were on the line when therapists failed to implement treatment programs and record data as required by both the treatment protocols and the oversight agencies. Indeed, they noted, their recording system was in part a response to such observations in their own treatment centers.

The luncheon scenario I describe is all too common, not only in working with children with ASD, but any time there is a seemingly foolproof technological solution to a behavioral problem. Computer technology can be a great boon to all types of human performance management, but everyone who has ever worked at managing human behavior knows it is no panacea. Any technological solution has to be implemented by humans and used by other humans, and both the implementation and the use are problems not of computer engineering but of behavioral engineering. Technology may be a necessary part of good solutions to many human problems, but it is only as good as its use. If people won't use something, or don't use it correctly, what's the point? It isn't helping solve the problem, and it is frustrating all of the people who spent valuable time and money on its development.

My point is simple: new engineering or other mechanical technologies require new human technologies to get the system working as it was designed to work. My luncheon companions expressed dismay that therapists could be so neglectful of their duties to children with serious behavior problems as not to record necessary data appropriately, or even to implement programs as they were designed. There is a substantial literature in applied behavior analysis on this very problem, under the topics of compliance and treatment integrity. Not surprisingly, people can be made more effective at implementing and maintaining treatment programs, including recording data, simply by rearranging environments to optimize performance. This can be done in different ways, depending on the circumstances, but all of them basically involve managing the consequences of proper adherence to expected implementation standards.

I hope my luncheon companions will heed my suggestions about optimizing their seemingly excellent technological innovation by recruiting people skilled in behavioral engineering and technology to help implement it. By doing so, they can achieve the full potential of their valuable addition to the armamentarium of tools for helping children realize their own potential.

Posted by Andy Lattal, Ph.D.

Dr. Andy Lattal is the Centennial Professor of Psychology at West Virginia University (WVU). Lattal has authored over 150 research articles and chapters on conceptual, experimental, and applied topics in behavior analysis and edited seven books and journal special issues, including APA’s memorial tribute to B. F. Skinner.