The state Department of Education has come up with a new measuring tool to assess public school student performance in the Common Core curriculum era. The new assessment program replaces the Academic Performance Index (API) used for many years by the state. Although the new California School Dashboard that was rolled out earlier this month provides some useful data to help school districts identify areas in need of improvement, it has proved to be in considerable need of improvement itself.
Thankfully, the first issuance of the new “high-tech report card” — so described by state Superintendent of Schools Tom Torlakson — is considered a field test. The Dashboard’s “design and features will be changed over time based on user feedback,” according to information at caschooldashboard.org, where the first Dashboard rankings of the state’s public schools can be found.
Those rankings cover absenteeism, suspension rates, English learner progress, and performance in English language arts and math. The problem is that, with this new metrics system, “performance” is assessed not only by test scores but also by the rate of improvement in those areas from one year to the next — all in one measure. This offers the school community a misleading picture of how closely students are actually coming to meeting academic standards at their particular grade level.
Measuring improvement is essential. But to lump that measure in with student achievement as measured by tested performance skews the assessment. Why not two assessment categories: level of improvement and performance measured against academic standards?
For an example of a misleading assessment in the first Dashboard report card, compare the ranking of Stevenson Elementary with that of Theuerkauf, both in the Mountain View Whisman School District. Although only 45 percent of Theuerkauf students met or surpassed state standards in English language arts, it was given the same high ranking in that category as Stevenson, where 84 percent of students met or exceeded the standards.
On the positive side, the new measuring tool can help schools target possibly systemic problems in areas such as student absenteeism and suspensions. The school district is already working with principals at some of its schools to lower the troubling rate of suspensions; in several of those schools, the high suspension rate particularly affects children with disabilities.
The state education department deserves kudos for seeking feedback from the school community on the Dashboard “field test.” Next fall, it intends to put a fine-tuned assessment tool in place and begin using the Dashboard data for individual school accountability purposes. In the next few months, the state will focus on improving the first version of the Dashboard. We hope that Version 2 is less confusing and provides school districts with meaningful data to help them improve student achievement.