The "Why" Effect: Good Behavior Analysts Seek to Understand
Want to know a skill that’s underrated?
Understanding the real connection between cause and effect in our ABA interventions.
It's essential, yet I've noticed a tendency to overlook this critical aspect, especially when dealing with problem behaviors.
Here’s a scenario from several years back that illustrates this point perfectly.
I was reviewing a case where a behavior analyst was addressing a child's attention-maintained problem behaviors. The child sought out social interaction and general reprimands, which were a consistent outcome of their problem behavior at home. During our discussion about the behavior plan, the analyst mentioned considering a weighted blanket as part of the intervention.
Now, first, a disclaimer: I have very little issue with weighted blankets or sensory tools—they can be beneficial. In fact, I have very little issue with any proposed intervention as long as it’s ethical, safe, and implemented with the best interests of the child in mind.
And I especially don’t have a problem with it if it’s hypothetically sound.
But, in this case, there was no clear link between the use of a weighted blanket and the child's need for attention. This mismatch got me thinking about how often I see the “kitchen sink” approach, where we throw everything but the kitchen sink at a clinical issue. At this point, we’re merely guessing and grasping at straws.
And, we should be as honest as we can about it with ourselves.
It’s crucial for us as behavior analysts to truly align our strategies with the identified functions of behaviors.
And we have to think critically.
We have to truly assess the plausibility of our hypotheses. In this example, if a behavior has a history of attention as a consequence, how does a weighted blanket address that? Is that really a reasonable assumption? Or are we just guessing?
At this point we’ve potentially gone from behavior analysis to behavior alchemy.
Remember the law of parsimony?
That’s the one that suggests we start by offering the simplest and most reasonable hypothesis when solving a problem. A weighted blanket to combat attention-maintained problem behavior is not a simple or reasonable proposal.
Now, if the learner seemed to engage in negative behavior to seek out interactions that resulted in pressure and resistance, then we have an entirely different story. A weighted blanket really could be appropriate.
Moving on, visual schedules are another tool I value greatly. They are fantastic for providing structure and predictability, which can alleviate anxiety and help individuals who struggle with transitions or uncertainty about reinforcement. However, using visual schedules as a go-to for all problem behaviors, without understanding the underlying cause, can lead to ineffective interventions.
Take, for example, a non-verbal child whose problem behavior functions to gain access to an iPad. An elaborate visual schedule with multiple pictures might help learners who are heavily reinforced by daily predictability. But if they’re not, then why are we using it? For this kiddo, who loves his screen time, it does very little to address problem behaviors that are likely occurring due to an inability to communicate.
This brings me back to the fundamental point: the importance of aligning our interventions with the true function of the behavior.
We shouldn’t simply throw random interventions at a problem behavior. Instead, the data we have on the problem behavior should educate us on the interventions we use.
It sounds straightforward, but it’s a common area where even the most seasoned professionals can stumble.
In my experience, even well-intentioned plans can miss the mark if they're not rooted in a thorough understanding of what's driving the behavior.
Aubrey Daniels, a significant figure in the field of Organizational Behavior Management, often discusses how “fads” come along in corporate problem solving, such as the “open concept office.” These fad solutions end up overused and applied in the wrong contexts. They don’t work, and they don’t solve the problem. They may even fail so badly that people never use them again. And that’s unfortunate, because they might be useful in some contexts but not others. No one bothered to understand the cause-and-effect piece. No one bothered to ask themselves “why” the solution did or didn’t work. And so they scrapped it.
Naturally, misidentifying the cause can lead to ineffective solutions. This is as true in clinical settings as it is in corporate environments!
So, some food for thought: next time you’re proposing a potential solution for a problem behavior, ask yourself why this solution is appropriate. Why do you think it will work?
Martin Myers is a BCBA with a passion for helping improve the field of ABA. He is the creator of BxMastery, with over 4,000 goal ideas, sequenced, to inspire your programming. With 10+ years of experience in the field, he’s dedicated to empowering others and fostering positive change through effective leadership and communication. Connect with Martin on LinkedIn, Facebook, Instagram, and TikTok for more insights and updates.