http://www.cleveland.com/metro/index.ss ... alue-.html
Ohio is one of 32 states experimenting with more rigorous, data-driven systems of grading teachers. In Ohio's case, teachers earn one of five value-added ratings that are a key part of their overall "final grades." Those grades range from "Most Effective" at the top end of the scale to the "Least Effective" rating Plecnik received.
The rating is based on a statistical measure called "value-added."
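The article (below) notes that the details of Ohio's actual model are proprietary and not public, so the following is only a minimal, hypothetical sketch of the core idea behind any value-added measure: compare each student's actual score with a score predicted from prior performance, and average the differences for a teacher's students. The function names, scores, and the "typical gain" figure are all invented for illustration.

```python
# Hypothetical sketch of a value-added calculation. Ohio's actual
# model (run by SAS Institute) is proprietary and far more complex;
# this only illustrates the basic concept: actual growth minus
# predicted growth, averaged across a teacher's students.

def predict_score(prior_score, typical_gain):
    """Naive prediction: prior score plus the typical statewide gain."""
    return prior_score + typical_gain

def value_added(students, typical_gain):
    """Mean of (actual - predicted) over a teacher's students.

    `students` is a list of (prior_score, current_score) pairs.
    Positive means students gained more than predicted; negative, less.
    """
    residuals = [
        current - predict_score(prior, typical_gain)
        for prior, current in students
    ]
    return sum(residuals) / len(residuals)

# Invented example: three students, statewide typical gain of 10 points.
students = [(400, 415), (380, 388), (420, 432)]
print(round(value_added(students, 10), 2))  # → 1.67
```

A real model would adjust predictions using several years of prior scores and then attach a confidence interval to the teacher-level average, which is part of why the full specification matters to the teachers being rated.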
What's riding on these grades varies depending on how school districts choose to use them. But more than a teacher's pride is at stake. The ratings could eventually become part of decisions about how much teachers are paid, what classes they teach and, if a district has to lay off teachers, their place on the list of who stays and who goes.
Many policymakers view this data-driven approach to sizing up teacher performance as crucial to weeding out bad teachers and rewarding good ones. But some teachers see the measure as a flawed attempt at quantifying something that isn't easily quantifiable.
With better measures of teacher quality in place, supporters say, the best teachers can be encouraged to continue teaching by offering them bigger raises, recognition and extra perks like more planning time. The worst teachers can be given intensive training. And if they don't improve, they can be encouraged to leave the teaching profession through the stigma of receiving low marks, or be fired.
Plus, value-added is relatively cheap and easy to put in place compared with other school improvement efforts like those that involve hiring additional staff.
But many teachers believe Ohio's value-added model is essentially unfair. They say it doesn't account for forces that are out of their control. They also echo a common complaint about standardized tests: that too much is riding on these exams.
"It's hard for me to think that my evaluation and possibly some day my pay could be in a 13-year-old's hands who might be falling asleep during the test or might have other things on their mind," said Zielke, the Columbus middle school teacher.
A StateImpact/Plain Dealer analysis of initial state data suggests that teachers with high value-added ratings are more likely to work in schools with fewer poor students: A top-rated teacher is almost twice as likely to work at a school where most students are not from low-income families as at a school where most students are from low-income families.
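To make the "almost twice as likely" figure concrete, here is a toy calculation with invented counts; the article does not publish the underlying numbers, so these are placeholders, not the StateImpact/Plain Dealer data.

```python
# Invented counts for illustration only -- NOT the actual
# StateImpact/Plain Dealer figures, which aren't given here.
top_rated_in_low_poverty = 190   # top-rated teachers at schools where most students are NOT low-income
top_rated_in_high_poverty = 100  # top-rated teachers at schools where most students ARE low-income

# The relative likelihood the analysis describes:
ratio = top_rated_in_low_poverty / top_rated_in_high_poverty
print(f"{ratio:.1f}x as likely")  # → 1.9x as likely
```

A ratio near 2 is what "almost twice as likely" means here, and it is also the pattern critics point to when they argue the measure may partly reflect student poverty rather than teacher quality.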
The newspaper helpfully provides a complete listing of teachers who have been rated over the past two years, in case anyone wants to look them up. I wonder how teachers feel about their evaluations being made public that way, even if their ratings are public records.

Plecnik is through. She's quitting her job at the end of this school year to go back to school and train to be a counselor -- in the community, not in schools.
Plecnik was already frustrated by the focus on testing, mandatory meetings and piles of paperwork. She developed medical problems from the stress of her job, she said. But receiving the news that, despite her hard work and the praise of her students and peers, the state thought she was Least Effective pushed her out the door.
"That's when I said I can't do it anymore," she said. "For my own sanity, I had to leave."
Modern solutions to education seem to involve an awful lot of beating teachers over the head with this sort of thing.
Start billing parents for every F Johnny gets and maybe that will result in improved grades. Hell, it's as good an idea as anything else that politicians dream up.
Er, um, sorry, no, that really doesn't wash. A teacher who is being evaluated according to these statistical methods and rules should have the right to see the criteria by which they are judged.

Some of the confusion may be due to a lack of transparency around the value-added model.
The details of how the scores are calculated aren't public. The Ohio Department of Education will pay a North Carolina-based company, SAS Institute Inc., $2.3 million this year to do value-added calculations for teachers and schools. The company has released some information on its value-added model but declined to release key details about how Ohio teachers' value-added scores are calculated.
The Education Department doesn't have a copy of the full model and data rules either.
The department's top research official, Matt Cohen, acknowledged that he can't explain the details of exactly how Ohio's value-added model works. He said that's not a problem.
"It's not important for me to be able to be the expert," he said. "I rely on the expertise of people who have been involved in the field."