Teach By Design
Data-based decision making
Survey
Implementation
Apr 9, 2019

The Best Decisions Take Guts and Data

Using data to make decisions is important. What role does your intuition play in the process?


Over the last three months, Teach by Design has focused on implementation. We talked about the seven core elements of successful implementation. Then, we talked about how to measure your implementation – both how closely you did what you said you’d do and whether you saw positive outcomes because of it. We highlighted two surveys: the Tiered Fidelity Inventory (TFI) and the School Climate Surveys for students, family, and staff. Well, if you take a survey, you’re going to end up with data. If you know anything about us at all, once you end up with data, we’re going to tell you to make some decisions. Which brings us to this very article. How do you use data not just to make decisions, but to create effective solutions?


Ironically, I set out to highlight research data demonstrating that decisions are more effective when people use data to inform them. The research I found included next to no percentages, statistics, or standard deviations. Instead, I landed on real-life, human statements from focus groups and interviews conducted in nine high schools across the United States.1 Researchers identified these schools by their commitment to continuous improvement and their use of data in their decision-making processes. Researchers expected to hear all the ways teachers depend on data as they make decisions that influence the way they teach in their classrooms. What they found instead: teachers are more likely to rely on their intuition than on information downloaded from a database alone.


Over the course of 186 interviews about their school’s practices, 98 interviews about their school’s culture, and 19 focus group discussions, researchers found 40% of teachers used systematic data to make their decisions. That seems pretty good. The sentiment changes a little when you find out just as many teachers only used their intuition or gut feelings when they made decisions about their classrooms. You can almost feel how a researcher – someone dedicated to collecting and analyzing data – might react when they hear a teacher describe their decision-making process this way:
“I think that once you’ve been doing a job for a couple of years, I think all teachers have instincts and you can kind of like feel when things go right or wrong. Now I know that that’s something that’s pretty vague, but really I think good teachers have that. Almost like an intuition...”

[GIF: a child clutching her chest in shock]

When I think about an administrator’s reaction to student behavior, everyday decisions about reading groups, or small choices about where to put someone on a seating chart, the critical side of my brain hopes there are data involved at some point. On the other hand, using data as the only source of information has its own set of problems. Data are flat. They give you facts about something, but no soul. They give you a snapshot of a moment, but no story. Data come to life when you know the context they came from and the stories that created those numbers. That’s why 15% of the people these researchers interviewed – the 15% who used a combination of data and intuition – might be onto something. In his TED Talk, Sebastian Wernicke shares a perfect example of how this exact idea led to the development of two very different TV shows.

A Tale of Two Shows

Roy Price is a senior executive with Amazon Studios. He’s responsible for picking the TV shows Amazon will produce. Specifically, he wants to pick shows that generate between a 9 and 10 rating on IMDB’s scale. These are the kinds of shows where once the season ends you find yourself Googling when the next season premieres. These are shows like The Wire, Game of Thrones, and Breaking Bad. In one effort to create the next binge-worthy series, Roy and the good folks at Amazon released eight different pilot episodes to their audiences for free. Roy watched millions of people as they watched each episode and he analyzed all their data. When the results came in, the data told him that Amazon should create a sitcom about four Republican US Senators. They created that show. Do you remember it? Me neither.
Over at Netflix, Ted Sarandos has a similar mission to create the next series you can’t stop watching. His approach is a little different. He takes the mountain of data already available to Netflix – ratings, viewing histories, what people say they like and don’t like – and he analyzes all of it. Armed with that information, as well as his experience working in this medium and his instincts about good content, Netflix decided to produce a drama series about one US Senator.

[Image: House of Cards]

Yeah. THAT drama series about one US Senator.
House of Cards was one of the very first binge-worthy shows. It received critical acclaim, 33 Emmy nominations, eight Golden Globe nominations, and an 8.8 on that coveted IMDB scale.

How can two people, both good at their jobs, both using data to inform their decisions, end up with wildly different outcomes? I’ll let Sebastian Wernicke tell you more about it.

By looking at data, we find details about something we’d otherwise miss if we relied solely on our own experiences. We break down a problem to its smallest parts to know every in and out of the situation. That’s just one step of several in a decision-making process. The second step Sebastian talks about in his TED Talk is the part where we take all these pieces and we put them back together into a solution. Data do a great job of breaking a problem apart, and a lousy job of telling us what to do next. Our brains are good at that. It’s our experience, the intangible ways we relate to the data, that leads us to solutions better suited to our specific contexts.
What does this mean for you? Right, we were talking about surveys.
At the end of the school year, your team has a mountain of data at its disposal and a responsibility to create a plan to improve next year. You could take the pieces needing improvement and just slide them over to your action plan verbatim. Or, you could use those pieces, alongside your own experiences, to inform the solutions you come up with as a team. I sat down with Bert Eliason, Celeste Rossetto-Dickey, and Danielle Triplett to find out how they guide teams through the action planning process. Here are a few of our best tips to help you find that balance between using data and using your gut.

Create an Action Planning Agenda

First things first: Every meeting needs an agenda. If someone schedules a meeting and doesn’t have a set of things to talk about, run…run far, far away. An agenda lets everyone know what’s included in the discussion as well as what isn’t. Give people the chance to prepare for the discussion by collecting their thoughts ahead of time and coming with their questions ready to ask. Your action planning agenda might look different, but here are some topics you should be sure to consider:

  1. Review the big picture, including national, statewide, or district-level data
  2. Revisit your team’s annual action plan
  3. Review the end-of-year data for your school
  4. Discussion time!
  5. Select and prioritize items for next year’s action plan
  6. Assign items to team members and set goals for when they will be completed
  7. Plan how to share decisions with stakeholders after the meeting
Data do a great job of breaking a problem apart, and a lousy job of telling us what to do next...It’s our experience, the intangible ways we relate to the data, that leads us to solutions better suited to our specific contexts.

Data Points and Gut Checks

The example agenda above relies on data to inform decisions. Wherever there are data, there are gut reactions to them. Here are the data points we suggest reviewing and examples of how to incorporate your experiences into your discussion time.

The Big Picture Review

At the beginning of your meeting, start by looking at the bigger picture of what’s happening nationally, statewide, and in your own district. Your school isn’t the only one implementing PBIS; check out how other schools rate their own fidelity.

The Data Points

To know what’s happening in the big picture, there are a few places you can look. Nationally, schools taking the TFI score around 74% on Tier I implementation, 69% on Tier II, and 62% on Tier III. If you’re curious about how that looks across each TFI subscale, check out the full evaluation brief here.
Every year, PBISApps publishes national ODR averages and every SWIS school has the option to include that average as a line on their Average Referrals per Day per Month report. Check out more information about the 2017-18 national averages here.
Your school’s district coordinator knows how implementation is going locally and the state coordinator can send you information about PBIS implementation at the state level.

Gut Check

When it comes to looking at how other people are doing, our natural instinct is to compare ourselves to them. Once you know how other schools in your district, state, or the country score on the TFI, the immediate next question is: What’s our TFI score? Before you answer that question, take a minute to reflect on the national, state, or district data you just saw. Some questions to ask are:

  • What would it feel like to be in a school like that?
  • What resources do schools have to assist with implementation?
  • Does the average seem high or low? Why?
  • What have I heard from other schools in our district or state? Does their experience align with or contrast with the data I see?
  • What changes happened at the district or state level affecting overall implementation?
  • What was my experience with the district coordinator like this year?

The Annual Action Plan Review

Your team started out the year with a plan. Your plan consisted of goals you set and some number of things you were going to do to accomplish them. You’ve looked at this plan at nearly every team meeting, and now it’s time to revisit it one more time.

Data Point

The only real data you need here is the original action plan. If you never wrote a formal plan, somewhere you wrote down your goals for the year. Pull them out. Read them.

Gut Check

Ever walk into the kitchen and forget why you went in there in the first place? I’m asking you to remember what things were like seven months ago. I know. It’s a big ask. Think back to those first few days of school, fresh off summer break...

  • How did it feel to be at school?
  • What were my hopes?
  • What was the first team meeting like?
  • How did those feelings, hopes, and meetings change over the course of the year?
  • Did we commit to too much or could we have done more?

End-of-Year Survey Data Summary

This summary is critical. Knowing how you end the year informs every goal you have moving into next year. From these data, you’ll decide how much improvement you want to see and where you’ll focus your effort. Take the time to summarize the data ahead of your meeting. Here’s how.

Data Point

PBIS teams should expect to have two types of survey data to review: fidelity and outcome data. The TFI is your fidelity data; the School Climate Surveys are your outcome data. Someone from the team should take the time ahead of your meeting to review these data and bring a summary to your action planning meeting. Conducting an initial review ahead of time makes the meeting so much more productive. Decide who will summarize the information and give them the time to do it.


If the data summary task falls on you and you don’t know where to start, here’s what you do (with a rough code sketch of the same steps after the list).

  1. Start with the survey’s Total Score Report. Compare your school’s average to the national, state, or district average.
  2. Move to the Subscale Report. Find which subscales scored best and which ones scored lowest.
  3. Finally, look at the Items Report. For the subscales with the best scores, write down the top three items you think your team should commit to sustaining next year. For the subscales with the lowest scores, write down the three items you want the team to discuss further so you can improve them next year.
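
If your school’s scores end up in a spreadsheet rather than in the reporting tool, a few lines of code can do this first pass for you. The sketch below is only an illustration, not a PBISApps feature: it assumes a hypothetical CSV export named scores.csv with subscale, item, score, and national_avg columns, then compares your total to the national average, ranks the subscales, and pulls the top and bottom items for discussion.

```python
# A minimal sketch of the pre-meeting summary described in the steps above.
# Assumptions (hypothetical, not a real PBISApps export): a file "scores.csv"
# with columns: subscale, item, score (0-100), national_avg (0-100).
import csv
from collections import defaultdict
from statistics import mean

by_subscale = defaultdict(list)   # subscale name -> list of (item, score)
national = []                     # national average recorded for each item

with open("scores.csv", newline="") as f:
    for row in csv.DictReader(f):
        by_subscale[row["subscale"]].append((row["item"], float(row["score"])))
        national.append(float(row["national_avg"]))

# Step 1: total score compared to the national average.
all_scores = [score for items in by_subscale.values() for _, score in items]
print(f"School total: {mean(all_scores):.1f}% | National: {mean(national):.1f}%")

# Step 2: rank subscales from strongest to weakest.
ranked = sorted(by_subscale, key=lambda name: mean(s for _, s in by_subscale[name]), reverse=True)
print("Strongest subscale:", ranked[0], "| Weakest subscale:", ranked[-1])

# Step 3: three items to sustain (from the strongest subscale) and
# three items to discuss for improvement (from the weakest subscale).
for label, subscale, descending in [("Sustain", ranked[0], True), ("Improve", ranked[-1], False)]:
    top_three = sorted(by_subscale[subscale], key=lambda pair: pair[1], reverse=descending)[:3]
    print(f"{label} ({subscale}):", ", ".join(item for item, _ in top_three))
```

Whatever tool you use, the goal is the same: walk into the meeting with the comparison, the strongest and weakest subscales, and a short list of items already in hand.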

Gut Check

It’s time to move beyond the scores and start asking yourself how you relate to the data you see. The data describe a school you just spent the last nine months working in. Now is your moment to tell your stories, describe your experiences, and highlight your highs and your lows.

  • What was my contribution to the way implementation worked this year?
  • How did my classroom feel?
  • Is there a subscale or an item that stands out to me as something I relate to?
  • What connections can I make between the datasets?
  • Was there an event or something significant that happened during the year that I think shifted our implementation for better or worse?
  • What did I hear from my grade level team?
  • How did the playground feel? How would I describe the hallways or the cafeteria?

Now What?

It’s time to make some decisions. Ideally, the discussions you have in your meeting bring you to themes. Within those themes, select and prioritize the items for your action plan. Don’t forget to include items around your team’s logistics.

  • Is anyone leaving the team next year?
  • Is there a change in roles on the team?
  • Will the current meeting schedule (day, time, location) work next year?
  • Are there concerns related to administrator support or turnover to be addressed?


Once you have the items selected, decide on your next steps. The action planning meeting is a great time to reflect and talk about what worked and what didn’t work so well. Don’t get so lost in the discussion that you forget about the true meeting purpose: Come up with a plan – complete with goals and assignments – to share with your staff and with families.
When you do it well, your action planning meeting should incorporate both the data you collected throughout the year as well as your lived experiences as teachers and PBIS team members in your school. An expert perspective is incredibly valuable to making effective decisions. You are the experts in your building. Connect the data points to everything you know about your students, your resources, your building, and your implementation. Create the solutions that work for your context, not just the ones spit out of a database.


Special thanks to the following people for their contributions and their time given to this article:


Celeste Rossetto Dickey, M.Ed.

Celeste has worked in education since 1979. She spent seven years as an elementary and middle school teacher and another thirteen years as an elementary school counselor. She’s worn many hats in the PBIS world: PBIS Coordinator, PBIS Trainer, MTSS Coordinator. She currently works half time with the PBIS Applications Training Team. Celeste is passionate about helping all students succeed using the PBIS/MTSS framework in schools.


Danielle Triplett, M.Ed.

Danielle started her career in education as a middle school language arts teacher. With her insight into the important role systems play within schools and classrooms, she became a PBIS coordinator in the Gresham-Barlow school district. Today, Danielle is a research assistant at Educational and Community Supports working on an instructional alternative to exclusionary discipline.


Bert Eliason, Ed.D.

Bert came to PBISApps after spending 18 years as a teacher and 15 years as a middle school principal. As an administrator, he was actively involved in implementing PBIS. He currently works as a research associate on the PBISApps Training Team. He often writes about PBIS implementation, the use of PBISApps, and racial disproportionality in school discipline. He is a Northwest PBIS Network board member and a strong believer in proficiency-based education and equitable outcomes for students and staff.

1. Ingram, D., Louis, K., & Schroeder, R. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.



About

Megan Cave

Megan Cave is a member of the PBISApps Marketing and Communication team. She is the writer behind the user manuals, scripted video tutorials, and news articles for PBISApps. She also writes a monthly article for Teach by Design and contributes to its accompanying Expert Instruction podcast episode. Megan has completed four half marathons – three of which happened unintentionally – and in all likelihood, will run another in the future.
