
NYC to Release Teachers' 'Value-Added' Ratings: Why It's Not Fair | The Nation

Dana Goldstein

Education, health, women's issues and politics.


The New York Times and WNYC are preparing to publish online the “value-added” ratings of 12,000 New York City teachers—an estimation of each teacher’s impact on his or her students’ standardized test scores in math or English.

Value-added, a tool developed by economists, is highly controversial, and the Times and WNYC acknowledge that the measure is volatile. For math teachers, the margin of error in estimating a teacher’s impact on students’ test scores can be as large as thirty-five points on a 100-point scale; for English teachers, it can reach fifty-three points. A state court ruled against the United Federation of Teachers’ attempt to prevent the city from releasing the data to news organizations.
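To get a feel for why margins of error that wide matter, here is a minimal simulation. All of the numbers in it are invented for illustration (the spread of "true" teacher effects, and reading the thirty-five-point margin as roughly a 95 percent interval); it is not the city's actual model, just a sketch of how large measurement error scrambles rankings:

```python
import random

random.seed(0)

# Hypothetical setup: each teacher has a modest "true" effect on a
# 100-point test (standard deviation of 5 points is an assumption),
# but the value-added estimate carries measurement error on the order
# of the reported margins.
n_teachers = 1000
true_effects = [random.gauss(0, 5) for _ in range(n_teachers)]

# Treating a 35-point margin of error as roughly a 95% interval
# implies estimate noise with a standard deviation near 35 / 2 = 17.5.
noise_sd = 17.5
estimates = [t + random.gauss(0, noise_sd) for t in true_effects]

# How often does the noisy estimate put a teacher on the wrong side
# of the median -- e.g., rate a genuinely above-average teacher as
# below average?
median_true = sorted(true_effects)[n_teachers // 2]
median_est = sorted(estimates)[n_teachers // 2]
flips = sum(
    1 for t, e in zip(true_effects, estimates)
    if (t > median_true) != (e > median_est)
)
print(f"{flips / n_teachers:.0%} of teachers land on the wrong side of the median")
```

Under these illustrative assumptions, a large share of teachers end up ranked on the wrong side of the median purely by noise, which is the volatility the Times and WNYC acknowledge.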

In 2010, the Los Angeles Times created an online database of value-added scores, searchable by teacher name. Ever since, the question of whether to publicly release such reports has split the standards-and-accountability school reform movement. New York Mayor Mike Bloomberg and US Secretary of Education Arne Duncan support publication with names attached. But yesterday, Bill Gates wrote a Times op-ed arguing that although value-added is a useful tool when combined with more holistic evaluation methods, such as classroom observation, he opposes releasing individual teachers’ value-added scores to the public, calling publication a “shaming” device. Teach for America founder Wendy Kopp is also on the record opposing publication.

For what it’s worth, I agree with Gates and Kopp: value-added is a promising tool, but it must be further refined and deployed with extreme caution. My friends at GothamSchools, the best independent news source covering the New York City public schools, have decided not to publish the data reports with names attached, citing value-added’s high volatility from year to year, as well as questions about the reliability of New York’s standardized tests, on which the value-added scores are based. The GothamSchools team also points out that teachers who earn high value-added ratings may simply be teaching to the test.

Stephen Lazar, a Brooklyn public school English and social studies teacher, has published a list of what value-added can’t measure:

  • They don’t tell you that last year I taught 100% of our juniors who are special education students and/or English Language Learners, even though I only taught 50% of our juniors. They also don’t tell you I requested these most challenging students.

  • They don’t tell you that I spent six weeks in the middle of the year teaching my students how to do college-level research. I estimate this costs my students an average of 5-10 points on the Regents.

  • They don’t tell you that when you ask my students who are now in college why they are succeeding when most of their urban public school peers are dropping out, they name that research project as one of their top three reasons nearly every time.

  • They don’t tell you which of my students had a home and a healthy meal the night before the test.

Lazar’s whole list is worth a read, and reminds you just how difficult—and difficult to measure—a teacher’s work is.
