AAPS School Board member Andy Thomas has written an excellent letter that deconstructs the ridiculous color-coding system--a green-yellow-red scheme that gave some of the best schools in the state (including some in Ann Arbor) "red" scores, and some average-at-best schools "green" scores.
I will admit that my first reaction to the whole scoring system was that it is so ridiculous--in part because it looks almost entirely at test scores--that we should just ignore it. In general, my approach to testing is that if I don't believe in it as an evaluation tool for schools (and I don't), then I shouldn't engage with it or treat it as valid.
BUT--the school systems themselves don't get that luxury. So while we are working on that "
stop overtesting" thing, I appreciate someone pointing out how idiotic this so-called evaluation is.
I am publishing the entire letter. I have added the color, and also some "color commentary" (clearly identified as Ed. Notes).
If you would like to see what the letter looks like all nicely laid out, on some nice AAPS stationery, you will find it right here.
I invite you to share it!
The Letter
Dear Superintendent Flanagan:
I am writing to express my concerns over the Michigan
Department of Education’s “Dashboard and Accountability Scorecard” and its
color-coded rating scale.
According to your spokesperson, Jan Ellis, the color scale “is meant to
be a fairly easy way for the public to understand from a variety of
measurements how their school buildings and districts are doing.”
Well, it may be easy to understand the color scale: Green is the best, followed by lime,
yellow, orange and red, which is the worst. When I learned that two of Ann Arbor’s high schools –
Pioneer and Skyline – received the lowest possible rating of “red”, my reaction
was… I SAW RED!!!
I imagine that many parents will simply look at the color
rating and make a judgment regarding the quality of a particular school. That is, after all, the intent – to
make it easy to measure how a school is doing. Some parents may use the color scale to select what school
or district they want their children to attend. Who would want to send their child to a school given the
lowest rating the state can assign?
As it so happens, I have a son who attends Pioneer High
School – and his experience is absolutely inconsistent with the “red”
designation. According to “U.S. News & World Report”, Pioneer ranked 11th out of 873 Michigan high schools; Skyline ranked 28th. I am also a
member of the Ann Arbor Board of Education, and am regularly briefed regarding
the various measurements of academic success for Pioneer and Skyline – and by
most objective measurements, these schools are excellent.
So where is the disconnect? As usual, the devil is in the details. But first, let’s look at some of the
other data the State provides regarding our schools.
The Michigan Department of Education provides a “top to
bottom” percentile ranking of all Michigan public schools (including charter
schools) on its website.
[Ed. Note: In the percentile ranking, being closer to 100% is better than being close to 1%.]
Pioneer
ranks in the 93rd percentile of all schools in the State. Skyline is not far behind, with a
percentile ranking of 89. So what is the relationship between the percentile ranking and the color code? Apparently there is none.
And which high schools earned the highest “green”
rating? I could find only three high schools in the State’s database that were rated “green”. I wanted to compare their achievement data to Pioneer’s to see if they scored an even higher percentile. What I found was that none of the three “green” high schools even received a percentile ranking. Furthermore, two of the schools were listed as “closed”, and the third (Ashley) had no published achievement data, presumably due to its small size.
Apparently, for a school to receive a “green” rating, it
must either be closed or must have such a small number of students that no
meaningful achievement data is available.
Let’s move on to the next-best rating of “lime”. At least here, there are some schools
we can compare to Pioneer:
Percent of Students Proficient by Subject

| Subject | Pioneer | Mayville | Reed City | White Cloud | Lake City |
|---|---|---|---|---|---|
| Math | 80.88% | 35.71% | 31.86% | 35.29% | 39.39% |
| Reading | 93.31% | 66.67% | 81.42% | 80.88% | 80.3% |
| Social Studies | 86.14% | 56.13% | 52.25% | 67.16% | 60.27% |
| Science | 76.25% | 45.24% | 33.83% | 44.12% | 49.29% |
| Writing | 88.8% | 66.67% | 57.41% | 46.27% | 66.67% |
| Color Rating | Red | Lime | Lime | Lime | Lime |
As you can see, achievement for these four “lime” schools is
significantly lower across the board than for Pioneer. So what gives?
As I understand it, the color rating is not based on the
overall achievement level of the school, but on a rubric that includes a number
of subcategories of the student population. These include: students who rank in the bottom 30% of all
scores, various racial and ethnic sub-groups, students with disabilities, and
economically disadvantaged students.
Points are awarded to each of the academic subjects for each
subgroup. A maximum of 2 points is awarded per subject, for a total of 10 possible points for any given subgroup. Additional points are added for the completion rate (i.e., graduation rate) for each subgroup, and for
“other factors”, including educator evaluations and compliance factors. The actual number of points is added
up, and divided by the maximum number of possible points. The result is a percentage, which is
used to rate the schools. The
higher the percentage, the “better” the color score of the school.
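[Ed. Note: To make the arithmetic concrete, here is a minimal sketch of the scoring described above, in Python. The data layout and function are my own illustration of the letter's description, not the MDE's actual implementation.]

```python
# Hypothetical sketch of the scorecard arithmetic the letter describes.
# Each cell of the rubric is an (awarded, possible) pair; cells marked
# N/A simply don't appear, so they add nothing to either total.

def scorecard_percentage(cells):
    """Return the percent of possible points that were actually awarded."""
    awarded = sum(a for a, _ in cells)
    possible = sum(p for _, p in cells)
    return 100.0 * awarded / possible

# Toy example: one full-credit subject cell, one zero-credit cell,
# and a 3-point "other factors" category.
cells = [(2, 2), (0, 2), (3, 3)]
print(f"{scorecard_percentage(cells):.1f}%")  # -> 71.4%
```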
Here is the way Pioneer’s score was calculated:
| Category | Math | Reading | Soc Stud | Science | Writing | Graduation Rate | Total Points Awarded | Total Points Possible |
|---|---|---|---|---|---|---|---|---|
| All students | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 12 | 12 |
| Bottom 30% | 0 out of 2 | 0 out of 2 | 0 out of 2 | 0 out of 2 | 0 out of 2 | N/A | 0 | 10 |
| African American | 0 out of 2 | 0 out of 2 | 1 out of 2 | 0 out of 2 | 0 out of 2 | 2 out of 2 | 3 | 12 |
| Asian | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 12 | 12 |
| Hispanic | N/A | N/A | N/A | N/A | N/A | 2 out of 2 | 2 | 2 |
| White | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 12 | 12 |
| Economically Disadvantaged | 0 out of 2 | 0 out of 2 | 1 out of 2 | 0 out of 2 | 0 out of 2 | 2 out of 2 | 3 | 12 |
| Students with Disabilities | 0 out of 2 | 0 out of 2 | N/A | 0 out of 2 | 0 out of 2 | 2 out of 2 | 2 | 10 |
| Educator Evaluations | | | | | | | 3 | 3 |
| Compliance Factors | | | | | | | 3 | 3 |
| Totals | | | | | | | 52 | 86 |
| Percent of Points Possible | | | | | | | 60.5% | |
Now, here is how Mayville’s score was computed:
| Category | Math | Reading | Soc Stud | Science | Writing | Graduation Rate | Total Points Awarded | Total Points Possible |
|---|---|---|---|---|---|---|---|---|
| All students | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 12 | 12 |
| Bottom 30% | 0 out of 2 | 0 out of 2 | 0 out of 2 | 0 out of 2 | 0 out of 2 | N/A | 0 | 10 |
| African American | N/A | N/A | N/A | N/A | N/A | N/A | 0 | 0 |
| Asian | N/A | N/A | N/A | N/A | N/A | N/A | 0 | 0 |
| Hispanic | N/A | N/A | N/A | N/A | N/A | N/A | 0 | 0 |
| White | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 2 out of 2 | 12 | 12 |
| Economically Disadvantaged | N/A | N/A | 2 out of 2 | N/A | N/A | 2 out of 2 | 4 | 4 |
| Students with Disabilities | N/A | N/A | N/A | N/A | N/A | N/A | 0 | 0 |
| Educator Evaluations | | | | | | | 2 | 2 |
| Compliance Factors | | | | | | | 2 | 2 |
| Totals | | | | | | | 32 | 42 |
| Percent of Points Possible | | | | | | | 76.2% | |
Both Pioneer and Mayville got the maximum number of points
in the “all students” and “white” categories (although it is hard for me to
understand how Mayville could get the maximum number of points in math for
having only a 35% proficiency rate).
Both schools received zero points for the “bottom 30%” category. The difference between the schools is
that Pioneer is much more diverse.
There were not enough African-American, Asian, Hispanic or disabled
students in Mayville to be statistically meaningful. So they were awarded no points for these sub-groups, but
neither were these sub-groups included in the “possible points” column. So Pioneer has more than twice as many possible points, and because Pioneer received no points for some of these sub-groups, its average was dragged down.
In other words, schools with very little diversity will have
higher scores than those with wider diversity. Each additional line of sub-categories is another chance for
a school to be marked down.
[Ed. Note: What follows here is an excellent analogy, in case you were having trouble with the math.]
(To draw a somewhat ridiculous analogy: It is as though you were comparing the
GPAs of two students. The first takes
only one class (basket-weaving) and receives an A. The second takes basket-weaving, calculus, English, physics
and Latin IV. Even though the second student gets an
A in basket-weaving and three of his other four classes, his GPA will be lower
than the first student’s if he gets a B in calculus.)
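[Ed. Note: The denominator effect is easy to verify with the totals from the two tables above. A quick sketch, using only the letter's own numbers:]

```python
# Totals from the letter's two rubric tables: Pioneer earned 52 of its 86
# possible points; Mayville, whose minority and disability subgroups were
# all N/A (and so never entered its denominator), earned 32 of 42.
for name, awarded, possible in [("Pioneer", 52, 86), ("Mayville", 32, 42)]:
    print(f"{name}: {awarded}/{possible} = {100 * awarded / possible:.1f}%")

# Prints "Pioneer: 52/86 = 60.5%" and "Mayville: 32/42 = 76.2%".
# Every extra subgroup row is another chance to lose points, while a row
# of N/As costs nothing -- so the less diverse school scores higher.
```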
So that is how the scores are derived. But how does this relate to a school’s
overall color rating? According to
the website, the cut-off scores for the various color designations are as
follows:
Green: 85% or higher
Lime: 70% to 84.5%
Yellow: 60% to 69.9%
Orange: 50% to 59.9%
Red: Below 50%
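[Ed. Note: Expressed as code, the mapping looks like this -- a sketch using the cut-offs exactly as the letter lists them; the state's handling of boundary values may differ.]

```python
def color_for(percentage):
    """Map a scorecard percentage to a color, per the letter's cut-offs."""
    if percentage >= 85.0:
        return "Green"
    elif percentage >= 70.0:
        return "Lime"
    elif percentage >= 60.0:
        return "Yellow"
    elif percentage >= 50.0:
        return "Orange"
    return "Red"

print(color_for(60.5))  # Pioneer's 60.5% -> "Yellow" (before the audit check)
```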
So, based on
this standard, Pioneer just made the cut-off for a score of “yellow” – not
exactly stellar, but still much better than its actual designation of
“red”. But there is a catch. After all the scores are calculated, an
“audit check” occurs. If a school
fails to pass certain audit criteria, the result will be an “automatic red” (or
as I call it, “automatic flunk”).
One of these criteria is that if a school has more than two subgroups with less than 95% participation in assessment in any of the academic cells, it receives an “automatic red” – regardless of the school’s percentile ranking or its overall score on the rubric.
This
is what tripped up Pioneer.
[Ed. Note: For those of us who are opposed to over-testing, this is an extremely significant issue. In a subgroup of 50 students, 95% participation means at least 48 must be tested--so if more than 2 students opt out, that subgroup falls below the bar, and once more than two subgroups fall short, the school automatically "flunks," even if every student who was tested passed. Which explains, to some extent, why the school districts have so much anxiety about people opting out of tests. The stakes are high for the schools.]
Pioneer’s
participation rate among three subgroups was below the target of 95%
participation:
| Subgroup | Subject | Students Enrolled | Students Assessed | Percent Assessed |
|---|---|---|---|---|
| Economically disadvantaged | Mathematics | 70 | 66 | 94.29% |
| Hispanic | Social Studies | 47 | 43 | 92.41% |
| Economically disadvantaged | Science | 70 | 66 | 94.29% |
Had only one additional economically disadvantaged student been assessed in mathematics and science, the percentages for both categories would have risen to 95.71%, and Pioneer would have been classified as a “yellow” school. Of the 600 Michigan schools that received the “red” designation, nearly half were due to this “automatic flunk” provision.
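[Ed. Note: The audit arithmetic checks out. Here is a sketch of the rule as the letter describes it, using the enrollment numbers from the table above:]

```python
# Pioneer's flagged cells from the table: (subgroup, subject, enrolled,
# assessed). Per the letter, more than two subgroup/subject cells below
# 95% participation triggers the "automatic red."
flagged = [
    ("Economically disadvantaged", "Mathematics", 70, 66),
    ("Hispanic", "Social Studies", 47, 43),
    ("Economically disadvantaged", "Science", 70, 66),
]

below_bar = [cell for cell in flagged if cell[3] / cell[2] < 0.95]
print(f"Cells below 95%: {len(below_bar)}")  # 3 -> automatic red

# One more assessed student flips both economically disadvantaged cells:
print(f"67/70 = {100 * 67 / 70:.2f}%")  # 95.71%, clearing the 95% bar
```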
The “automatic flunk” is apparently designed to punish schools that allow even a small number of students to fall through the cracks when it comes to taking assessment tests. I would point out once again that this has the effect of punishing only those districts with highly diverse student populations. If a school has no subgroups (or no subgroups large enough to be considered statistically significant), it is exempt from the “automatic flunk” provision. The more subgroups a school has, the greater the likelihood that it will miss at least one student in at least one subgroup.
The “automatic flunk” provision also offers a somewhat
different perspective on the question I raised earlier: “Who would want to send their child to
a school given the lowest rating the state can assign?” Given the way “automatic flunk” works,
the question might be more reasonably expressed as, “Who would want to send
their child to a school in which four out of 70 economically disadvantaged
students are not properly assessed in math or science?” My guess is that most parents would
answer these two questions quite differently.
Finally, let’s compare some overall measurements of success,
including percentile ranking, math proficiency and color score for a number of
high schools:
| School | Percentile Rank | Math Proficiency | Color Designation |
|---|---|---|---|
| Ashley | Not available | Not available | Green |
| Mayville | 6 | 35.71% | Lime |
| West Bloomfield | 44 | 55.91% | Yellow |
| Portage Northern | 77 | 57.72% | Orange |
| Pioneer | 93 | 80.88% | Red |
Notice a trend?
One would expect a strong correlation between performance measurements
(such as math proficiency and percentile ranking) and color designation. However, for these schools at least, the higher the student achievement, the worse the color rating.
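[Ed. Note: The inversion is stark enough to spot by eye, but here is a sketch that makes it explicit. The numeric color ranks (1 for green through 5 for red) are my own device; Ashley is omitted for lack of data.]

```python
# Percentile rank vs. color rank (1 = Green ... 5 = Red) for the four
# schools with published data. As the percentile rises, so does the
# color rank -- a perfectly inverted relationship.
schools = [
    ("Mayville", 6, 2),           # Lime
    ("West Bloomfield", 44, 3),   # Yellow
    ("Portage Northern", 77, 4),  # Orange
    ("Pioneer", 93, 5),           # Red
]

for name, percentile, color_rank in schools:
    print(f"{name:16s} percentile {percentile:2d}   color rank {color_rank}")
```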
So, to summarize my findings:
· The only ways for a high school to get a green designation are 1) to have so few students that no statistically significant measurements can be obtained, or 2) to close.
· The best way to get a lime designation is to have a school with no minorities present – no minorities, no achievement gap.
· If you are a large, diverse school and have even a small number of students in the various subgroups who are not tested, you receive an “automatic red.”
· Schools with very high achievement scores can nevertheless receive a lower color designation than schools with very low achievement scores.
Given these findings, I believe the color rating scheme used by the Michigan Department of Education is not only arbitrary, meaningless and useless, it is actually destructive. It completely fails the stated objective of providing “a fairly easy way for the public to understand from a variety of measurements how their school buildings and districts are doing.” In fact, anyone relying on these color ratings would, in all likelihood, be completely misled regarding the quality of a given school.
It’s high time to toss this system into the nearest trash
can and start over.
Andy Thomas, Trustee
Ann Arbor Board of Education
Consider subscribing to Ann Arbor Schools Musings by Email!