In the 1990s, Massachusetts introduced the mandatory student testing program Massachusetts Comprehensive Assessment System (MCAS). According to the Massachusetts Department of Education, "based on 1998 MCAS test scores, all schools were placed in one of six baseline categories, Very High, High, Moderate, Low, Very Low or Critically Low. Each category has a point range for expected improvement. Using the average of the 1999 and 2000 MCAS test scores, each school was then rated as to whether they met their individual goal." In January 2001, the Boston Globe ran a front-page article on the improvement results released by the Department of Education. The article read in part:
A state report rating public schools on three years of MCAS results shows some of Massachusetts's highest-performing schools "failed to meet" expectations, a description that angered many educators across the Commonwealth. In the state's first attempt to give schools targets to reach on the Massachusetts Comprehensive Assessment System exams, 56 percent of 1,539 schools did not improve enough to meet those goals. The remaining 44 percent "exceeded, met, or approached" their growth targets, according to the report released yesterday.
The results largely bode well for urban districts that had the most ground to make up. But the state's top public schools, in wealthier districts such as Harvard, Newton, and Wellesley, performed so well in 1998 that boosting scores any higher was nearly impossible. As a result, many of those schools received an "F," for "failed to meet" expectations.
State Education Commissioner David P. Driscoll said the ratings are fair because each school is simply compared to itself, but the formula is drawing fire from communities with some of the state's best MCAS scores. Brookline High School, for example, with 18 National Merit Scholarship finalists and the highest SAT scores in years, missed its test-score target, a characterization blasted by Brookline Schools Superintendent James F. Walsh, who dismissed the report.
"This is not only not helpful, it's bizarre," Walsh said. "To call Brookline, Newton, Medfield, Weston, Wayland, Wellesley as failing to improve means so little, it's not helpful. It becomes absurd when you're using this formula the way they're using it." Acknowledging the apparent disconnect between the scores and the labels, Driscoll urged high-achieving districts not to take the results out of context. At a news conference he sounded themes that could have been delivered to poorly performing schools with dismal MCAS scores, instead of top-scoring schools. "It doesn't mean they're failing," Driscoll said. "It just means they didn't meet that standard. There could be any reason."
Critically (and briefly) comment on the results reported in this article. Why may some of the state's top public schools have failed to meet their improvement goals? Can Education Commissioner Driscoll legitimately claim that "the ratings are fair"?
Answer: Students at the state's top public schools were already performing at the very high end of the achievement spectrum when the baseline MCAS scores were obtained in 1998. Thus, these top-performing schools had little room for improvement, whereas lower-achieving schools had more room for growth. In other words, there appears to be a ...
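The statistical point in the answer can be illustrated with a small simulation, using entirely hypothetical numbers (none of the parameters below come from actual MCAS data). Each school is given a stable "true" performance level plus year-to-year noise; schools measured at the very top in 1998 partly got there through favorable noise, so their 1999/2000 average tends to drift back down, while the lowest-scoring schools tend to drift up. The function and variable names here are invented for illustration.

```python
import random

random.seed(42)

def simulate(n_schools=1000, noise_sd=5.0):
    """Return (1998 score, 1999/2000 average) pairs for hypothetical schools."""
    schools = []
    for _ in range(n_schools):
        true_score = random.gauss(70, 10)          # assumed spread of true performance
        s1998 = true_score + random.gauss(0, noise_sd)
        s1999 = true_score + random.gauss(0, noise_sd)
        s2000 = true_score + random.gauss(0, noise_sd)
        schools.append((s1998, (s1999 + s2000) / 2))
    return schools

schools = simulate()
schools.sort(key=lambda s: s[0])                   # sort by 1998 baseline score
bottom = schools[:100]                             # lowest-scoring decile in 1998
top = schools[-100:]                               # highest-scoring decile in 1998

def avg_change(group):
    """Mean change from the 1998 baseline to the 1999/2000 average."""
    return sum(later - base for base, later in group) / len(group)

print(f"bottom decile mean change: {avg_change(bottom):+.2f}")
print(f"top decile mean change:    {avg_change(top):+.2f}")
```

In this sketch the top decile shows a negative average change and the bottom decile a positive one, even though no school's true performance moved at all, mirroring how improvement targets keyed to a 1998 baseline can flatter urban districts and penalize the highest-scoring schools.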
The reasons why some of Massachusetts's top public schools failed to meet their improvement goals are examined, and it is determined whether Education Commissioner Driscoll can legitimately claim that "the ratings are fair."