Editor’s Note: Whenever we talk about PBIS, we talk about implementation. We can’t talk about implementation without talking about fidelity. So, what is implementation? How do we know when we’re doing it right? How can we do it better? This is the first in our series exploring the basics of implementation and how to assess your own to get the outcomes you want to see.
When was the last time you read a journal article just because? Sat down on the good side of your couch – you know your couch has a good side – with a hot cup of coffee, curled up under the blanket, and really tucked in to read about baseline conditions, validity, and latent class analyses.
Never. My answer to that question is: Never.
When I read journal articles, I do it slightly sweaty, armed with a dictionary, a thesaurus, and the ever-open arms of Google.
For all the ways educational research is juicy and full of promise, the way we talk about research findings can be overly complicated. If a journal article loses us at its abstract, those findings don’t stand a chance of seeing the light of day in a classroom.
In a recent survey of 510 educators, the Institute of Education Sciences (IES) got a firsthand look at the growing divide between the research it funds and the practices teachers actually use. About 85% of those who responded said they believe research is useful. However, 45% said that the research they find doesn’t translate easily into practice. That’s a problem IES Director Mark Schneider wants to solve. “In years past, IES has spent much of its budget and energies identifying what works for whom under what circumstances. But that’s only part of our job. Just as important: We need to figure out the best channels to get that information into the hands of teachers, so that more students have teachers who are using the most effective, evidence-based methods.” Schools need evidence-based strategies. Researchers know what those strategies are. So, how do we get those strategies into the hands of teachers in a way that doesn’t involve an arsenal of reference materials? In a word: Implementation.
The National Implementation Research Network (NIRN) focuses on all the ways we implement things, specifically looking for what works, what doesn’t, and how we can do it better. According to them, there’s a good formula for success:
Effective Innovations x Effective Implementation x Enabling Contexts = Socially Significant Outcomes
It’s basic multiplication. Put a zero in any of these boxes and you get a zero out the other side. #Math. In practice, this can look something like what happened to me the other day when I tried a new recipe. I had all the ingredients. I had all the tools. I know my way around my kitchen. I was ready. I set out to make a pomegranate glaze. [So fancy, I know.] I ended up with straight-up edible glue. My whisk stuck to the bottom of the bowl. I had to soak it overnight. The recipe wasn’t bad; I just made a mistake along the way.
The same thing happens in schools. We have effective innovations – strategies backed by research, replicated across multiple settings. We have enabling contexts – schools invested in using good strategies to create positive changes in their buildings. What’s missing? You guessed it. Effective implementation.
Before you jump headfirst into swinging dough around and teaching everyone else how to swing their dough, here are seven things the folks at NIRN want you to be sure to include in your implementation to make it the best it can be.[1]
1. Choose Key Players Carefully
Some programs are designed to be implemented by anyone, without any specific qualifications, degrees, or personality types. In those cases, you’re one step ahead of the curve with as broad a base of participants as you can imagine. Other programs require careful thought about who will help you implement. Schools with Check-In Check-Out programs know the person checking in and out with students doesn’t need to be a specialist, or even a member of your PBIS team. This person is someone with a natural connection to students, with time available in the morning, and the charisma you need to get students in and out on a positive note. That’s the person you ask to help. Think carefully about the kinds of things you need someone to do, the amount of time you’ll need them to do it, and which skills they’ll need in order to hit the ground running as you implement.
2. Teach the Why of the Intervention During Training
You might be inclined to teach every in and out of the intervention during your pre-service or in-service training days. The thing is, it’s not likely anyone can master an intervention over the course of two days. What is possible? Building a foundation. Pre-service and in-service trainings should give people the basic theory, philosophy, and values associated with the practice. Help them understand the why of it. Give them the space to ask questions, air their concerns, and work through their hypotheticals. These trainings are great places to solidify buy-in from the very people you’re asking to implement, so work toward that.
3. Take Advantage of Coaching
Coaches bring experience to your implementation. They know the intervention. They’ve seen how other schools have adapted it to fit their context. They know what works and they’ll keep you from straying too far from using the intervention the way research intended. A coach’s perspective will take you from just following the steps to really embedding an intervention seamlessly in your school’s existing systems.
4. Regularly Assess Fidelity
To know whether you’re implementing the important parts of an intervention, you’re going to have to assess fidelity. When the time comes to check in on your implementation, use a fidelity assessment to answer three things:
- Are you doing what you said you would do?
- What’s going well?
- Where are the places you could improve?
Sometimes assessing fidelity means taking a survey as a team. Sometimes a fidelity check means sitting in a classroom to observe the intervention as it happens. However you do it, don’t feel bad about the things that aren’t going the way you thought they would. Use the results to build an action plan to improve your implementation.
5. Use Data to Make Decisions
In the very first meeting I ever attended here at PBISApps, we looked at no fewer than six pieces of data in the first hour. We learned pretty early on: if you have a hunch about something, you’d better find the data to back it up. Without data, you’re just guessing at what to do next. Fidelity data will tell you how closely you’re implementing the intervention the way research says you should. Outcome data will tell you the kind of impact your implementation has on students. Don’t hope you’re doing the right thing; let data guide you every step of the way.
6. Get Your Administrator On Board
No amount of buy-in will get any intervention off the ground without an administrator right there with you. Research suggests implementation is more strongly related to a principal’s actions than to any teacher’s personal characteristics or ability.[2] Administrators make implementation so much smoother by allocating time, resources, and the authority to change structures if necessary.
7. Make It Fit In Existing Systems
Coming out of winter break, it’s pretty clear to me how much I value our regular family routines. The small change of one kiddo not going to school throws the whole thing out of orbit for two weeks. We all feel the shift. That’s just our family of four. Expand that to an entire school, dozens of teachers, hundreds of students, and an ecosystem of existing frameworks, routines, and initiatives. If you add a new strategy without considering how it fits with what you’ve already got going on, you’re likely to overwhelm and frustrate the people you need to implement it. As you tackle new interventions, regularly consider how they fit with the rest of your systems and make adjustments as your data suggest you should.
Odds are high you’re implementing some initiative, intervention, or framework in your building. To be sure that work results in the positive effects you’re after, it should be based on evidence. Transforming that research into practice takes effective implementation from everyone. That means: Make sure you have the right people involved, especially your administrator, and that they all understand why they’re doing it. Take advantage of good coaching wherever you can. Use data, both fidelity and outcome, to evaluate how closely you’re doing everything the way research hoped you would and to verify those practices deliver the outcomes you expect. Finally, embed all of this within the things you’re already doing so you don’t overextend the limited time and resources available. Every part of this puzzle works together, one element compensating where another falls short, to bring research’s best practices into your real-world setting.