
Monday, March 26, 2012

Testing the Year Away: Julie's Comments

My friend Julie and I went to last week's school board meeting to raise some concerns about the amount of testing going on in the district, and some specific concerns with the NWEA MAP test. And this is what Julie had to say:


These concerns have been compiled from various sources including my own observations, concerns shared with me by other parents in various elementary schools in the district, as well as feedback from teachers and students.

·      The new MAP test overlaps with much of the testing and assessment already being done.  We keep adding tests but don’t remove any, leading to TOO MUCH testing and taking up valuable instruction time and school resources.  SRI, reading assessments, Fast Math, MEAP, NWEA MAP, and classroom assessments are all conducted within the first 6-8 weeks or so of each school year.

·      People are concerned that the upcoming Tech Millage will be used to upgrade equipment for the primary purpose of supporting this increased testing.  Many parents who are willing to support a tech bond for the purpose of improving our kids’ education are much less likely to support a bond used for more testing of our kids.

·      We understand that Lansing is beginning to require comprehensive teacher evaluation overhauls.  However, the MAP was never designed to be used for this purpose and has no statistical validity in this context, by NWEA’s own report (see attached).  If this test was added to the district to fulfill this requirement, it will yield flawed data.  If this test was added strictly for the purpose of evaluating children, it is redundant and provides questionable benefit at very high cost in dollars, resources, and time.

·      The MAP test is currently administered three times a year, leading to narrowing of curriculum and more “teaching to the test.”  The frequency of this testing forces teachers into a linear pattern of teaching, completely opposite the project-based, in-depth style of teaching often referred to as “Best Practices.”  Ann Arbor Open is an extremely successful and very popular program precisely because it emphasizes project-based, in-depth learning.  The Board and the AAPS administration have repeatedly indicated that they would like to take the things that are “working well” at Ann Arbor Open and Community, and help bring them into other classrooms throughout the district.  This test does precisely the opposite, further decreasing time for pursuing children’s interests, or delving deeply into subjects of interest or relevance.

·      The fact that the students’ scores are immediately visible to the students following the test leads to comparisons, competition, and anxiety, all of which are unnecessary and counter-productive, and work to undermine attempts to create cooperative, collaborative, safe learning environments in our classrooms.  The AAPS has clearly chosen NOT to use grades in the elementary schools, and this was done for a reason.  It is well-known that focus on letter grades leads to extrinsic motivation instead of intrinsic love of learning, and these MAP scores have already produced anxiety in kids, and have altered cohesion in classrooms.

·      Feedback from children taking the test reveals that some kids have already learned that the test can be “shortened” and made easier just by answering randomly.  Other students have found themselves sitting in a single test-taking session for a full two hours or even longer, an excessive amount of time in elementary school.

·       The cost of purchasing and continuing to run this test, in dollars, is simply not worth the dubious “value” it provides.  Our dollars are slim and must be used in the absolute best interests of the children, and this test is not that.

Action Points
There are several points on which action can be taken immediately; doing so could help the Board elicit more feedback and information about this Pilot program, and soften some of the consequences in the meantime.
Ideally, I believe the Board should consider putting this test on “HOLD,” and not utilize it next year, until it can more fully evaluate the test’s unintended consequences and its validity in the context for which it was purchased (state mandates for teacher evaluations).
Immediately, I believe the following could and should be done:
1.     If the test continues to be given, drop it from three times a year to two, and drop testing for K-2 students.
2.     Insist that the NWEA remove the line of code that enables students to see their scores.  As the consumer, we can refuse the product if it is not so altered.
3.     Consider inviting regional and local experts on Teacher education and effectiveness to discuss current understanding of what comprehensive teacher evaluations should look like.
4.     Seriously consider discontinuing one or more of the overlapping fall student assessments, if the MAP is to be used again next year.
And, of utmost importance:
5.      Circulate a FEEDBACK SURVEY to all teachers, principals, parents and students.  This test was purchased as a Pilot, and the only way to evaluate a pilot is to get feedback.  The survey should be anonymous, so that teachers and administrators feel they can speak freely without consequence.

I want to thank the Board for their time tonight, and for their time in considering this document.
[Julie did give the board her contact information as well. If you want to contact her, send me a note and I will forward it along.]

16 comments:

  1. I agree wholeheartedly and I hope that the board will listen to parent feedback on this topic!! Thanks Julie and Ruth!

  2. My son is in second grade at A2O. His MAP tests revealed him to be scoring very well in math and reading, both times. This came as no surprise. He also rarely works to that level in the classroom unless it is a topic that interests him, like the multicultural fair or a focus study. This is not a situation that more testing is going to address, and in fact, things like MAP testing might mask his classroom issues, which are very clear to his teacher without standardized testing.

  3. This post is very encouraging! I have had many of the same concerns that you and Julie have had about the NWEA MAP tests and am very glad to find more people in our community that share these concerns.
    I have done some homework asking questions and voicing concerns about the MAP test. In January, I contacted the principal at our school and asked about having the scores turned off at the end of the tests. This email eventually led me to John Van Riper, the AAPS Director for Information Technology.
    Mr. Van Riper told me that “The issue here is not what AAPS can do but is, instead, a corporate decision made by NWEA related to a product based on an operating/hardware system that is out of date and going away. I am being honest when I say that, purely from a business perspective, I am sympathetic to their position. It was only due to their willingness to work with us this year and support a dying product that we were able to accomplish what we did. I know - this is not a good position to be in but the older the equipment we have gets - the more situations like this we will have to deal with. I wish it were different. Please don't stop communicating with me. If there is anything I can do or anything I can clarify, I am more than happy to do so.”

    The action plan that Julie presented to the school board addresses all of my concerns regarding this test (I am so grateful not to be the only one asking questions!). Thank you for all your hard work. Please let me know how I can help!

    I’d also like to take a quick moment to say thank you to Mr. Van Riper. He was efficient, informative and willing to take a few minutes to find thorough answers to all of my questions.

    Shelly Rettell, AAPS parent

  4. Hi, Julie here! Thanks for posting this, Ruth. I am concerned about Mr. Van Riper's comments that our old technology is part of the problem. My concerns about the test are much more philosophical and larger in scope than anything newer technology could fix. I don't want this conversation to become sidelined into support for the millage, because the concerns here are bigger and broader than that. Dr. Green had a similar response at the BOE meeting. In fact, not only does this have very little to do with our aging technology, but the fact that the administration is holding it up as a potential reason for supporting the millage makes me more reluctant to support it in the first place. (I mention this in my notes.)

    I really believe that we need to stop this test for next year. The pilot has raised too many concerns. Costs do not outweigh benefits. I'm not even sure WHY we are giving this test. Teacher evaluation? Not valid. Student "benchmarking?" What about all the SRI and fast math tests, and the extremely thorough and specific report cards that come home three times per year? I trust those way more than this test, and they are not rife with the same set of problems we see here.

  5. One more thing. A parent friend of mine pointed this out.... In the younger set (K-2), those kids with access to computers at home are likely to fare better than those without (there are kindergarteners who have never even seen a mouse!). This can skew the data based on socioeconomics. More problems.
    Julie

  6. Julie,

    I agree that this should not become a conversation about the millage. My interaction with Mr. Van Riper originated with asking that the test score be eliminated from the screen at the end of the test and eventually evolved into the comment that I posted. The technology piece (in my mind) is separate (or hopefully should be). But I do think it is interesting that this is the response that I received. And like you, my response is to be reluctant to support the millage if testing is the primary use of that technology (or if that is the reason that the district gives).

    The information you brought to the board meeting was researched and well thought out. My hope is that many parents and educators agree with the action plan and are willing and able to talk about it and support it. When I talk to parents, they often do not like the additional testing but are unsure what they can do to prevent it and keep it from escalating - it seems inevitable. You have started a great conversation - how do we keep it going?

    Shelly

  7. Hi Shelly! I think the Board and the Administration need to hear from parents individually. And teachers and students as well, anyone who has concerns. I'm afraid they sometimes act in a bit of a vacuum.... unless we tell them how we feel about stuff, how do they know? The more they hear from the community, the more they'll be inclined to understand the very serious issues here.

    The district has lots of interest in keeping AAPS parents engaged and satisfied with our schools. If the increased testing is interfering with that, they need to know.

    I'm glad you are lending your voice as well, Shelly!
    Julie

  8. Disclaimer: I'm an a2 public high school teacher and parent of an elementary school kid in the district.

    I'm not a fan of our current testing craze, but I'm a little nervous about the conflation of anti-testing sentiment with the tech millage. Had a long conversation with our building tech guy the other day, who is terrified the millage won't pass. He already is the busiest guy in the building trying to keep our aging tech afloat. An important piece of non-computer tech for my program died in October. If the millage doesn't pass, it is just gone forever, at least acc. to our tech guy, who is a pretty apolitical straight shooter in my experience.

    I can't speak for elementary school testing, but at the high school level, in terms of using tech for testing, twice a year, my students do SRI testing in the computer labs. Yeah, it's time not in the classroom, but I actually find the scores very useful as an educator, though I always take them with a grain of salt. (I did not feel positively about My Access, the computer-based writing assessment, and am glad it is gone.) Beyond those two days of SRI testing, I'm not aware of any tech used for testing. Our laptop carts are checked out almost every hour of every day, but teachers are using them for curriculum. The computer labs are signed out weeks in advance. I have no idea what percentage of millage funds would be used primarily for the purpose of testing, but my suspicion is that it would be very small. There might be other good reasons to reject the millage, but I don't think anti-testing sentiment is one of them. I would assume most of the funds are going toward updating technology that teachers use every day to advance classroom curriculum.

    I could be wrong, though. Is there some kind of spending proposal that is available to the public, or does the district prefer to keep these things vague? (You'd think I'd know, but I don't.)

  9. While it's a lot of testing, after the experience we had in AAPS, I have to say that this kind of in-depth testing is very warranted. My kid went to a Title 1 school in the district, and, hiding behind the non-grade grades, he actually wasn't taught much. I also know that African American kids coming out of that elementary weren't taught that much either. "You know, that's an at-risk kid," some teacher would say in low tones, as if that explained why the only first grader in the class who couldn't read a thing, a cute little African American girl, was left to her own devices during reading time. Seriously, this kind of scrutiny is needed in the district.
    The kids will live through the testing, and they'll be fine.

  10. In answer to this question: "Is there some kind of spending proposal that is available to the public, or does the district prefer to keep these things vague?" (Anon 8:21 PM), I'm assuming that you are asking about the proposal for how the tech millage monies will be spent. In fact, there is a detailed listing on the AAPS web site right here.

    A very large amount of the tech bond monies will be used for infrastructure. Nevertheless, the association between testing and the tech bond persists because--to give one example--when we complained about not having access to the computer labs for weeks on end due to testing, the response from Dr. Green was that passing the tech bond would mitigate that problem. I personally don't want to pay for testing, since I don't believe it is necessary, and while I do believe in the importance of a technology infrastructure, to the extent that funding technology will make testing ever more ubiquitous, I am leery of funding technology. Yet, I do hear what you are saying, anon 8:21.

    Anon 8:41, I hear you describing a situation that is unacceptable, and yet I am not at all clear that testing is the fix that will right the wrongs. It seems clear that in the example you give, the teachers were aware that this student was performing well below average. They didn't need testing to tell them that. What they needed was someone to hold them accountable for doing something about it.

  11. Exactly, and unless you get some documentation you can't hold someone accountable.

  12. I wholeheartedly agree with many of the points Julie made, but the one I find most disturbing is the fact that the district intends to use this test for teacher evaluation when it was not designed to do this. As someone with experience with the tests used for Special Education evaluation and qualification for services, this strikes me as similar to using a Conners ADHD rating scale to diagnose dyslexia. If someone did that in a MET meeting they'd be laughed out of the room because, as Julie notes, the results would be statistically invalid. But here we have an entire district spending a crazy amount of money (and taking away valuable instruction time) to use a flawed instrument to evaluate its teachers. Absolutely absurd.

  13. My kids have been taking this test on and off (due to out-of-state moves) since first grade (oldest is now in 6th grade), and we've come to look forward to its results in Sept., Jan., and May. While MEAP measures proficiency, the MAP measures growth, and while teachers may have a clear idea about our sons' abilities, parents may not always know their individual strengths and weaknesses in the core subjects. MAP enables us to see that. In one school, it also formed the basis of student-led conferences - students discussed their strengths and set goals to address weaknesses. In one school, the administration and teacher added more time to Math instruction and skill review when the growth target was not reached. So, there is value when deployed properly. I think the points raised by Ruth and Julie are valid and can be addressed by the administration. The viewing of the score immediately following the test is unnerving, and I'll ask my sons if this is the case for them as well (or perhaps this is due to the older OS running on AAPS' desktops and an older version of MAP); a one-on-one skills discussion with the teacher would be preferred. I look forward to reading more on this and will check out the link shown on NWEA's report re: teacher evaluation.

    I will say that my sons tend to perform above grade level and that may contribute to my pro position on MAP. However, if I had a child that scored less than proficient on MEAP, I'd next want measures of growth at regular intervals, and MAP does that. With the higher cut scores, more students are scoring below proficient and having evidence of growth is valuable.

    -dswan/common_cents

  15. Here is an article that indicates that yes, this test is indeed intended by our district to be used for teacher assessment: http://www.annarbor.com/news/ann-arbor-school-board-to-consider-assessment-program-to-measure-individual-student-growth/.

    There is so much wrong with this, I don't know where to start. If you read the attachment above, it gives a good analysis of the statistics. For example, students are not randomized into classrooms. Perhaps students with challenging behavior issues are steered away from brand-new teachers till they get their feet wet. Or 'we know teacher X does well with such-and-such type of student.' I actually had a teacher tell me this week that instead of taking on the challenge of teaching ALL kids, including those with difficulties, which she enjoys... this sort of thing will make her more inclined to want a high-achieving, "easier" group to teach each year. Of course it will!

    And what about science teachers? There's no science on this test. Or art? Music? Librarians? Special ed teachers?

    I would add.... my kids' teachers were able to tell me precisely where they thought the test accurately reflected my kids' strengths and weaknesses, and where they thought it didn't. They had a much more nuanced understanding of my kids than the test did. Why don't we trust these people who have taken on the job of educating our kids???

  16. Common Cents, I'm interested in your positive experiences with the test. I haven't taken or seen the test, so I don't have a sense of whether it is a "good" test (as a test) for assessing students. And perhaps if we dropped some other testing (SRI, Fast Math, MEAP--oh wait, we can't drop the MEAP), then there could be a place for it. Here, middle school students don't have full-length conferences at most of the schools unless they are having trouble or their parents are insistent on it. And the information I got on my son's test (I will see if I can post a picture of it) was just a single line score. So I don't see it being used as feedback for students the way that you describe unless some things were to really change. It does make me think, though, that if the district did a thorough evaluation and decided they wanted to keep the test, they should see what the "best practices" are for getting the most out of the test. And in that case, the district you came from might be a good model.

