Two days ago, I posted about Teacher Match and Hanover Research, two companies that are now using value-added statistical modeling to predict prospective teachers' ability to raise test scores. ("Using Statistical Models to Predict Future Effectiveness of Teacher Candidates: A Snake Oil Approach") As I pointed out, there are some major flaws, especially in the assumptions about teaching, in using this approach as even a part of a new teacher candidate selection process. Here are some more thoughts on this heinous practice:
1. It elevates standardized testing even higher in the decision-making processes for schools. This is using imperfect assessments to make decisions about whether a new teacher can raise test scores. States haven't done the validation studies to prove that the assumptions they make based on scores are valid. States develop tests on the cheap, or they purchase ready-made tests of questionable validity that were not designed for the purposes for which they are being used. Tests do not deserve this level of emphasis. This practice, by default, views raising test scores as the goal of good teaching.
2. It makes the hiring processes of schools and districts even more mysterious. In one district where Teacher Match is used, a source inside that district reported a major decrease in applicants because new candidates were being asked to submit to this mysterious process before being hired. I suspect this would be a big problem with any of these kinds of products. Besides, who wants to go into teaching to become the best test-score raiser in the business? These voodoo products will only make it harder to find teaching candidates, not easier. They give teachers the wrong message up front: your primary job is to raise test scores.
3. It is just another expensive drain on already scarce educational resources. One contract with Teacher Match showed a district paying well over $30,000 per year for the service. In tight budgetary times, when teachers are spending hundreds of dollars of their own money on school supplies, it amazes me that a district could morally justify spending this kind of money on a statistical gimmick. Districts are throwing more and more money into these statistical quackery schemes when there are so many other pressing needs.
4. School districts, as I have witnessed many, many times in my 24 years as an educator, are purchasing products like Teacher Match based entirely on the promises and marketing of the companies. Instead of accepting a company's word that its product will do what it says it will, districts should demand independent, peer-reviewed research. If a company can't produce those studies, tell them to come back when they can. And, because I am not a firm believer that high test scores equal good teaching, they need to use measures other than test scores to prove their product is effective.
5. The fact that companies like Teacher Match and Hanover Research even exist in the education industry now is due to the Obama administration's insistence on elevating the importance of test scores in everything a school does. This legacy will leave public education in worse shape than George Bush's No Child Left Behind. Arne Duncan and his Department of Education believe that data is data and any old data will do as long as it is "objective." This shows immediately that he and his cohorts do not have a clue about education. When you have non-educators like Duncan running half his Department of Education, you get these kinds of detrimental approaches to education.
6. A major assumption behind Teacher Match and other statistical quackery products like it is that schools can be operated like businesses whose business is churning out high test scores. This assumption about public education is wrong. Because of current federal policy, public schools are being viewed even more like a business whose product is high test scores. That might be acceptable if your goal as an education system is to produce "high-quality test takers." What the education policy of President Obama and Arne Duncan is doing is destroying the culture of public education, test score by test score.
Teacher Match's Educator's Professional Inventory and Hanover Research's Paragon K12 are the latest value-added voodoo products to be peddled to school districts. They will only serve to elevate the importance of test scores even higher than it already is. Districts even thinking about purchasing this snake oil should be ashamed of wasting limited education money on such products. There comes a time when you have to realize statistics aren't going to tell you everything you really need to know. Not everything can be reduced to numbers subject to statistical analysis. My fear is that some administrators who see test scores as the sole goal of their school are going to use this data as the only basis for hiring someone. Can you imagine a profession where whether you can produce high test scores determines your entry, and whether you can keep producing those high test scores determines whether you can stay? That, folks, is a factory model of educational delivery if I have ever heard of one!
Excellent post. This is just silly. Districts are paying for analysis that predicts a preservice teacher's ability to affect test scores? Value-added models are not reliable even for rating teachers who have actually taught actual students. Now we're paying for predictive modeling? Campbell's Law is playing out in public education far more extensively than I would ever have thought possible.
Educational leaders and teachers are going to need to stand up at some point and refocus the system so we can begin to do the work that is necessary to prepare students for the current century. These types of programs are distractions we can't afford if we're going to provide every child with the education he or she deserves.
I agree wholeheartedly. These are distractions and sidestreets that prevent us from doing the difficult work of making schools work for students. The whole problem behind these kinds of measures is that companies sell this kind of product and educators, who should know better, buy it. Public education has become one big market for all kinds of tomfoolery, and this is just one of many examples.
Thank you for this. Teachermatch is voodoo of the most odious kind.
I am a Board Member in a district considering this product. I have repeatedly asked for their research and predictive validity studies. They will not provide more than selected data and graphs. Even if I thought placing predictions of value-added scores based on a questionnaire near the center of the hiring process and professional development was a good idea (I do not), I would not do so without reviewing the evidence that the product does what they claim. I am amazed that other districts have done so.
Unfortunately, the fact that these companies choose to hide their processes behind claims of "proprietary reasons" is enough to make me even more cautious. In my opinion, we should run the other way as fast as we can when a company simply expects us to trust it on these claims. If they can't provide unquestionable data, send them packing.