

Test-score 'bomb': How far is too far in teacher accountability push?

The Los Angeles Times is planning to publish a database that shows how much students' test scores have improved under individual Los Angeles-area teachers. Some say it's a useful teacher accountability tool. Critics say it's not a fair portrait.

By Staff writer / August 17, 2010


Should parents – and everyone else – have the right to see just how much kids’ test scores have changed under individual teachers?


According to the LA Times, the answer is yes.

An analysis the paper has done of just such data – and its plan to publish a database of more than 6,000 third- through fifth-grade teachers later this month, along with their students' results – has touched off a national debate. It's the first time such information will be made public for a district.

Already, union leaders have denounced the idea and called for a boycott of the paper, while education officials including Education Secretary Arne Duncan have weighed in to support such transparency. But even some of the biggest advocates of the data – and of using it to evaluate teachers and to make hiring and firing decisions – are critical of the way it’s being used here.

“I’m all in favor of using these tools, but I think it’s way early in the game to be putting individual teachers’ names in the paper with certain scores and suggesting this is a highly informative look at how good a job they’re doing,” says Frederick Hess, director of education policy studies at the American Enterprise Institute.

The data in question is called “value added.” In theory, it’s a way to link teachers with the students they taught and the gains those students made. Since it just looks at growth – not overall proficiency – it should minimize many of the external factors that often affect performance, and focus attention on what contribution the teacher made.
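To make the idea concrete, here is a minimal, illustrative sketch of a value-added calculation in Python. It is not the model the LA Times used; the teacher names and scores are hypothetical, and it assumes the simplest possible version of the approach, crediting each teacher with how much their students' scores grew compared with the average growth across all students.

```python
# A minimal, illustrative sketch of the "value added" idea described above --
# NOT the LA Times' actual model. Teachers and scores here are hypothetical.
from collections import defaultdict

# Hypothetical records: (teacher, student's prior-year score, current-year score)
records = [
    ("Ms. A", 310, 345), ("Ms. A", 280, 300), ("Ms. A", 350, 390),
    ("Mr. B", 305, 315), ("Mr. B", 290, 295), ("Mr. B", 340, 355),
]

# District-wide average growth serves as the expected gain for every student.
overall_gain = sum(post - prior for _, prior, post in records) / len(records)

# Value added: how much a teacher's students grew relative to that expectation,
# so the focus is on growth rather than overall proficiency.
gains_by_teacher = defaultdict(list)
for teacher, prior, post in records:
    gains_by_teacher[teacher].append((post - prior) - overall_gain)

for teacher, gains in gains_by_teacher.items():
    print(f"{teacher}: value added = {sum(gains) / len(gains):+.1f} points")
```

Real value-added models are far more elaborate – they typically control for prior achievement statistically, pool several years of data, and attach error margins to each estimate – which is part of why critics question how the numbers should be interpreted.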

But this is the first time individual teachers' value-added results will be made public. Some argue that doing so will misrepresent teachers, making some look better or worse than they actually are. Others say the public is ill-equipped to understand what the data mean and what their limitations are.

“This puts it out to the public before it puts it into the hands of educators and trains them how to use it,” says Paige Kowalski, a senior associate at the Data Quality Campaign, which works with states to get better data systems in place. “To just throw it out there kind of sets it off with a bomb.”
