After analyzing and re-analyzing the MCAS scores we published over the past few days, we have some thoughts and feedback. As part of our review, we looked at the 2009-2010 Annual Report to see how MVRCS felt it compared to its sending districts on the prior year's MCAS scores (we provided 2010 scores, whereas the Annual Report deals specifically with 2009 scores). Beyond the difference in testing years, it is worth noting that MVRCS does its analysis by district, while we provided scores for individual schools within each sending district. In the 2009-2010 report, MVRCS claims to have met its goal of test scores 19-32% higher than those of the sending districts (although it reaches this conclusion using 'weighted composite averages' rather than plain district scores). We found the whole approach unnecessarily complex, confusing, and a convenient smoke screen to make MVRCS appear superior to its sending districts. One goal we found particularly interesting was the claim that MVRCS made adequate yearly progress (AYP) every year....but ONLY in certain subgroups. Our reason for pointing this out is that, numbers aside, MVRCS apparently doesn't consider Special Education students to be (statistically) significant, as that group did not make AYP. Unlike MVRCS's analysis, we did not weight by the number of students coming from each sending district (weighted composite averages), nor did we look at districts as a whole; instead we looked at individual schools within each district.
Our main reaction to the MCAS scores we recently published was pure disappointment: MVRCS didn't do nearly as well as some of the other schools in its sending districts. In a few instances, schools within Melrose, Stoneham, or Wakefield scored lower than MVRCS, but in most of those instances the percentages were close. With the exception of Science, MVRCS did do comparatively well in the higher grades, but we found the scores for the lower grades quite alarming. As for the upper grades, MVRCS students did not perform well in Science, though it should be noted that none of the districts fared any better (the highest results were a tie between Wakefield and MVRCS, each with 45% of students performing at or above grade level). We were incredibly disappointed and alarmed that only 50% of the 5th graders at MVRCS are reading at or above grade level. Additionally, only 48% of these same students are at or above grade level in Math, and 35% in Science. That means half or more of the current 6th graders were below the state standards for 5th grade last year. Thinking back to the recruitment ad MVRCS took out in Malden last October, we suspect they may have been too quick to put Malden down and toot their own horn. Last year's 4th grade class did equally poorly, with 46% performing at or above grade level in English and 51% in Math. Again, that leaves roughly half performing below the state's grade-level standards. Last year's 3rd graders did slightly better but were once again below most Stoneham, Wakefield, and Melrose schools.
The scores from last year show an improvement over the 2008 and 2009 MCAS scores and are in line with the 2007 scores. We're not sure what this means beyond the school's inability to maintain consistently acceptable MCAS scores. We'll also be interested to see how these scores are disguised and discussed in the next MVRCS Annual Report. The various test scores can be found on our website and/or at the DESE website using the links below.