Monday, May 18, 2015

Educators Don't Much Care for the Fraser Institute

Well, it's that time of year again. The Fraser Institute published its annual school report for BC elementary schools this spring, then followed it with its version for BC secondary schools. For those of you in jurisdictions other than BC, the Fraser Institute is an "independent" think tank that collects data and uses it to publish reports, mostly on the apparent inefficiency of our public institutions. We describe them as right-wing because it seems they are against paying taxes and believe that privatization leads to better service, and never you mind any of those messy issues around equity. The school reports themselves are designed to be easy to digest, melting down a year of provincial achievement metrics into a single number per school, then sorting the schools into rank order for easy public consumption. The reports create a bit of a fuss and seem to sell newspapers. Still, compared to high-stakes testing in the United States, where principals and teachers actually lose their jobs and schools actually get closed, what happens here really isn't that interesting. I like to refer to it as low-stakes shaming.

The elementary report numbers are based entirely on the results of our Foundation Skills Assessment (FSA) tests: reading, writing, and numeracy tests taken by students at the grade four and grade seven levels. They take about eight hours to administer in those years, and the results have no impact on students' report card grades or on whether students move on to the next grade.
Not everyone likes our provincial standardized test program, but I tend to defend it, mostly because the tests take up less than 0.03% of a student's instructional time from kindergarten through grade twelve, including practice. For a very minor investment in time, our standardized tests do provide valuable information for individual students, for educators, and to some extent for our system, primarily as something to triangulate against other school-level assessments. However, these tests were never designed to determine the apparent success of a school, because the cohort sizes are typically much too small to be statistically reliable for that purpose. The results are also trailing data, rather out of date by the time the Fraser report is published, and therefore not of much use to educators. It may be encouraging to see that our grade four and grade seven students did well on their FSA tests fifteen months ago, for example, but it hardly informs our ongoing practice; schools and parents have had access to that test information for some twelve months by the time the report comes out. As well, if you are a parent looking for an elementary school, a rating that came out this month is similarly out of date, and may be even less helpful as a result. To be blunt, using the FSA tests to generate a school rating is just plain dumb, and that is about the nicest thing I can say about it.
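If you want to sanity-check that time claim, the arithmetic is easy to sketch. The hour counts below are my own rough assumptions, not official Ministry figures, and the exact decimal moves around with the inputs, but any reasonable numbers land at a tiny fraction of one percent:

```python
# Back-of-the-envelope check of the instructional-time claim.
# All inputs are rough assumptions for illustration, not official figures.

hours_per_school_year = 900    # assumed BC instructional hours per year
years_k_to_12 = 13             # kindergarten through grade twelve
fsa_hours_total = 8            # assumed total FSA hours across grades 4 and 7

total_instructional_hours = hours_per_school_year * years_k_to_12  # 11700
share = fsa_hours_total / total_instructional_hours

print(f"FSA share of K-12 instructional time: {share:.3%}")
# With these inputs, 8 / 11700: well under a tenth of one percent.
```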

The secondary report numbers are based on Graduation Program data, which are likely more important to secondary educators. Fully 75% of a secondary school's number is calculated from the results of the exams a BC student must take to graduate: English 10, Science 10, a Math 10, a Social Studies 11, and a Language Arts 12, each intended to take about two hours to complete. These exams, now in their final year of use in our province, count for 20% of a student's final grade at the grade ten and eleven levels, and 40% at the grade twelve level. The remaining 25% of a secondary school's number is based on transition rates for grade ten and eleven students and on the percentage of grade twelve students who graduate in the current year. We all want our students to pass the exams, transition from grade to grade, and complete grade twelve on time; this is our key work at that level. Though not as dumb as the elementary rating system, I have several issues with the data balance selected to generate the numerical rating. I wouldn't give the exams 75% of the weight; that is too high. I wouldn't calculate transition the way they do, which statistically penalizes a school for having all of its grade eights move forward (imagine that!). I am also not sure about the male/female balancing as an indicator, as it tends to punish smaller cohorts: a small school could have perfect gender balance over five years, yet show statistical volatility from year to year and be hurt in the calculation every time.
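To make that weighting concrete, here is a rough sketch of how a 75/25 composite number could be assembled. To be clear, this is my own illustration, not the Fraser Institute's actual formula; in particular, the even split of the remaining 25% between transitions and graduation is a guess:

```python
# A rough sketch of a 75/25 composite rating, for illustration only.
# This is NOT the Fraser Institute's actual formula; the even split of
# the remaining 25% between transitions and graduation is assumed.

def composite_rating(exam_score, transition_rate, grad_rate):
    """All inputs assumed normalized to a 0-10 scale."""
    exam_part = 0.75 * exam_score                                  # provincial exams
    progress_part = 0.25 * (0.5 * transition_rate + 0.5 * grad_rate)
    return exam_part + progress_part

# A school with solid exam results but weaker transitions: the exam
# component dominates, which is exactly my complaint about the weighting.
print(composite_rating(exam_score=7.0, transition_rate=5.0, grad_rate=8.0))
```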


Even the data point that compares school achievement against community education levels, which is closely correlated to household wealth and has never been included in the actual calculations, is completely absent from this year's publication. BC schools do an amazing job of narrowing the achievement gap between our richest and poorest students; at only 5-7%, it is among the smallest in the world. In a report that ranks schools, a 5% head start is a big advantage when it's not factored into the calculation process. Imagine a track meet where the wealthiest schools receive a 5-7% head start in every race, but that minor detail never makes it into the newspaper, and you are getting the idea. Sugata Mitra has described a strong negative correlation between standardized test scores and a school's distance from a region's primary population center. I often wonder what impact this type of factor would have on a school rating system if it were to be included.
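To see what that does to a rank order, here is a toy example with made-up numbers: two schools that add exactly the same value for their students, differing only in where their communities start:

```python
# Made-up numbers: an unadjusted ranking rewards demographics.
# Both schools add the same value; only the starting point differs.

baseline_wealthy = 75.0   # assumed incoming achievement, wealthier community
baseline_poorer = 70.0    # assumed incoming achievement, ~5% behind at the start
school_effect = 8.0       # identical "value added" by each school

results = {
    "Wealthy-area school": baseline_wealthy + school_effect,  # 83.0
    "Poorer-area school": baseline_poorer + school_effect,    # 78.0
}

# Rank by final score alone, the way an unadjusted report card does:
for name, score in sorted(results.items(), key=lambda kv: -kv[1]):
    print(name, score)
# The wealthy-area school "wins" despite doing exactly the same work.
```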

Finally, there is the factor that averages schools against each other. Your school's achievement can improve, but if it improves slightly less than the provincial average, your overall number will actually go down, giving the false impression that you are in decline and inviting the local newspapers to pounce on an easy negative headline opportunity. Principals love that. Think about it: our schools have been improving steadily over the last decade, and a significant number of them still have to face the stigma of public failure, and the harm that comes with it, because of this averaging factor. Ouch. Peter Cowley of the Fraser Institute has been known to say, "If you don't like the way we do our report, feel free to make your own." One day I might just take him up on that.
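Here is that averaging effect in miniature, again with invented numbers and an invented normalization: the school's raw results go up, yet because the published number is relative to the provincial mean, it goes down:

```python
# Invented numbers showing the averaging effect: a school improves in
# absolute terms but slips because the province improved faster.

def relative_rating(school_score, provincial_mean, midpoint=5.0):
    """Toy relative rating where `midpoint` means exactly average.
    This normalization is invented for illustration only."""
    return midpoint * school_score / provincial_mean

# Year 1: the school scores 70% against a provincial mean of 70%.
print(relative_rating(70, 70))   # 5.0 -> exactly average

# Year 2: the school improves to 72%, but the province moves to 75%.
print(relative_rating(72, 75))   # 4.8 -> the published number DROPS
```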

Data limitations aside, the Fraser Institute report on schools rubs educators the wrong way because it is misleading. BC parents generally don't know how the numbers are generated, and many assume they are based on something far more complex, astute, and current than these particular standardized tests, taken by a small number of students over a short period of time. In addition, the design of the ranking could encourage some educators to make decisions that are not in students' best interests, such as excluding students from exams they might actually pass, "perching" grade eleven students who are missing some credits, or assigning them to a separate alternate school, as these practices can improve a school's numerical rating, at least in the short term.


Most importantly, the very act of ranking schools is an unhealthy, unethical process. It pits us in public education against each other if we choose to buy into it; it certainly encourages privatization; and, worst of all, it undermines some of the schools and teachers who are doing the very best, and frankly most important, work with our most vulnerable learners. For these reasons, educators don't much care for the Fraser Institute, and rightly so.




I will augment this rant with a disclosure of sorts. Often the people who criticize these rankings work in schools at the lower end of them, and that can smack of sour grapes. I'm always impressed when a school at the very top of the rankings describes the system as unimportant or unethical, and I am pleased this is happening with increasing frequency, especially as our system becomes more and more interested in competency development. Though I've never worked in one of those high-ranking schools, my own history with the rankings is actually somewhat positive. My little secondary school was recognized, not once but twice, as the fastest improving in the province. Our Grad Program needed an overhaul, we did the work we needed to do to get it in shape, and I'm proud of that. The media did descend upon us when the report came out, and our rising achievement became a bit of a positive for our learning community. It also became a bit of an advertisement for our work with assessment for learning, and for being fully committed to student success, both of which I believe in completely. Ironically, though, the attention we received from this upturn in trailing data was based for the most part on work we had been doing fully two years prior, and it was no surprise to me or anyone else who had stayed awake during my frequent data sessions. It was also odd and somewhat uncomfortable that this thing that had been kicking sand in our face for so long was helping us to somehow become more credible. Odd, uncomfortable, and might I say, even a bit dirty. As in, "Thanks so much for swooping in after the fact and finally permitting people to believe we were doing something right."

Still, I stand by the information in the previous paragraphs, and when I got the opportunity to say those things to a reporter, it wasn't sour grapes driving the conversation. The schools in our district are primarily focusing on competency development and doing just fine on their core skills too, thank you very much.


At the end of it all, we are compelled to do what we do. I seem to be able to forgive the cat for taking out the Christmas tree ornaments every year, so too should I be able to forgive accountants for their counting. However, we are all concerned about the impact this has on our public education system because of the apparent agenda behind it and the way the media often presents the information. I wonder how many extra papers were sold by this easy negative headline pounce! How very, very sad. 
