So what is it that most of these workshops offer? They should be titled "Test-Score-Driven Decision-Making," because that's what most of them are really about. They are about using test scores to drive all school improvement activities. They're most often about not moving a muscle until you've consulted the numbers. Those who push making decisions based on test scores alone elevate those scores to "sacred" status because they are somehow seen as "objective" and above reproach. Their "objectivity" is certainly debatable, but test scores are clearly not all the data we have at our disposal as school leaders. I can't help but resent the fact that these individuals actually think the decisions I make as an administrator aren't data-driven. I submit that every decision I make as principal is data-driven; otherwise I would be leading a school by the old proverbial "seat of my pants." But perhaps the data I use isn't "sacred" enough to be considered data by the data purists.
So what do I mean when I say my decision-making is always data-driven? It means that every time I make a decision, I use data. At every decision point, I begin by gathering data. For example, if I notice a student is failing multiple subjects, I have used data (grades) to determine that the student is in trouble. I might then call that student into my office and ask questions to gather additional data. "Why are you struggling in Algebra I?" I might ask. Or, "Is there something that's keeping you from being successful in biology?" The answers to these questions are data. I would add that this kind of data allows me to make much better inferences about a student's classroom learning, and what to do about it, than test scores alone would.
The problem with the term "data-driven decision-making" is that people almost always mean "just test scores." They wall off a whole world of data as irrelevant because it isn't "sufficiently objective." It's almost as if speaking to a teacher to get their impression of a student's performance is somehow tainted because it isn't numerical. Don't get me wrong. I do see test scores as useful. They can tell us some of what we need to know about a student's learning. What they do not tell us is the whole picture. For example, a student who repeatedly gets poor test scores in reading could lead us to any number of inferences about the reasons for that performance. Those might be:
- They had poor teachers in previous years.
- The school they attend is more war zone than school.
- They didn't have breakfast. In fact, perhaps they are homeless.
- They didn't get any sleep the night before the test.
- They hate the subject being tested.
Making decisions based on test scores alone is educational malpractice; there's simply no other way to state it. Data-driven decision-making can consider test scores, but it should also include the qualitative stuff that drives data purists nuts. But then, those people don't live in the real world anyway.
The point of today's post isn't to create excuses for bad test scores. The point is to make clear that "data-driven decision-making" should never, ever be just about test scores. Those pushing "promotion-based-on-test-scores" schemes for students and "teacher-evaluation-based-on-test-scores" schemes would do well to remember that our best decisions about the welfare of the kids, and the teachers, in our schools are not based on the latest end-of-grade tests or the latest SAT scores. The best decisions we make might glance at those scores, but they are grounded in data much closer to what students actually need. Our best decisions are made when we take the time to investigate and gather much more than numbers on a spreadsheet.
So, to keep from having to read any more of these "data-driven" workshop emails, I have set up a Gmail filter to send them straight to my trash folder. The whole idea that anyone in education could make decisions without data is just plain ludicrous.
As a self-proclaimed data junkie, I found this post refreshing. I recently entered the field of education (4th grade teacher) after a long stint in automotive. I immediately felt the pressure to raise test scores and found tools to feed me data on the specific standards where specific kids were falling down. As a result, my class scored top in the county. However, I had a little boy who lost his mother this year. Smart kid, but his grades reflected the personal tragedy he was processing on a daily basis. He needed love and a caring adult in the environment where he spent most of his days--not small-group instruction to boost his scores.
I also had two kids taken off their ADD meds this year. They had stopped growing, the pediatrician was concerned, and I wanted to fully support those parents through the transition. Predictably, their scores fell as well; they had to relearn how to focus med-free. I don't work in an inner-city school, so I can only imagine the challenges faced by those teachers. I do love data, but I am learning to appreciate its limitations and the value of non-numeric 'data'.
Thank you. I'm glad you found this post helpful. The data you describe above is just as vital as test scores. We can't ignore the qualitative stuff about our kids and still truly make inferences about whether they are learning or not. Thanks for reading.