Schools of education around Indiana have traditionally done all they could to prepare their students so they would be ready to handle a classroom full of energetic kids.
But that’s no longer enough.
The schools are increasingly looking to extend their reach beyond the commencement ceremony to help their graduates be more effective in their first few years of teaching.
Why? Because now more than ever, schools of education can get egg on their faces if their graduates underperform in classrooms.
Data released in April by the Indiana Department of Education show the public, for the first time, how the previous three years of graduates from specific teacher training programs performed in the classroom. Those numbers are based on annual teacher evaluations mandated by the Legislature in 2011 that now play a large role in whether teachers earn raises.
Until now, teacher training programs have proved their worth by helping graduates pass professional licensing exams and get jobs. But going forward, the reputations of schools of education could rise and fall based on how those teachers actually perform in the classroom.
The data have a host of problems, including inconsistencies, that hinder their usefulness. But in spite of those problems, the mere publishing of the teacher effectiveness data has already spurred schools of education to action.
Some are offering continuing education for free to their recent graduates. Others are developing programs that essentially move the first two years of teaching—instead of only one semester of student teaching—inside the teacher training program, to ensure no one graduates unless they prove their effectiveness.
The idea is to prepare teachers more like physicians. Medical education doesn’t end when students receive their MDs. Instead, they must complete a residency program in which they treat patients under close supervision of veteran doctors. Only then are MDs allowed to treat patients on their own.
“I think you’ll see more residency models. It’s possible you will see longer pre-service partnerships between training programs and districts,” said Lindan Hill, director of the Marian University Academy for Teaching and Learning Leadership. “It will change a lot of things. Frankly, I’m looking forward to those things.”
Indeed, the pressure on schools of education to ensure graduates are performing well is only going to grow in the next few years.
A new law, which took effect July 1, requires the Department of Education to publish on its website the scores the graduates of each teacher training program earn on their teacher licensing exams, as well as how many times they had to take the tests to pass.
House Bill 1388 also requires the department to compile reports based on surveys that gather principals’ feedback on the quality of first-, second- and third-year teachers in their schools. The reports will be broken down by each teacher preparation program in Indiana.
The initial push to evaluate teachers annually, to base those evaluations in part on student performance on standardized tests, and then to link those evaluations back to teacher training programs came from President Obama’s administration. In its 2009 Race to the Top competition, the U.S. Department of Education encouraged states to create “longitudinal data systems” that could show the connections.
Indiana didn’t win Race to the Top funding, but former Indiana schools chief Tony Bennett pushed hard, anyway, to make that evaluation vision a reality. Bennett, who was voted out of office nearly two years ago, saw the data as one of several efforts—including his more controversial changes to the rules for teacher licensure—to transform the way teachers are trained.
“It caught every institution by surprise that we were going to be evaluated,” said John Jacobson, dean of the School of Education at Ball State University, referring to 2011 changes to the law. But most education schools supported HB 1388.
“Obviously, we don’t want any of our graduates to be less than effective. That’s our hope,” Jacobson said.
More than hope
Ball State appears to have done better than most, according to the teacher effectiveness data. In the 2012-2013 school year, 21.3 percent of its graduates in their first three years of teaching were rated highly effective compared to a state average of 19.5 percent.
Also, 3.1 percent of Ball State’s recent graduates were rated as ineffective or improvement necessary compared with a statewide average of 3.7 percent.
In May, however, Ball State trustees decided hope wasn’t cutting it anymore. They approved a plan to ensure all Teachers College graduates are “profession-ready.”
Ball State is overhauling student teaching this year with what it calls a “co-teaching, co-planning” model. It has trained 600 K-12 teachers to work side-by-side with student teachers, planning lessons together, teaching those lessons jointly, and then giving the student teachers pointers on how to do better.
All student teachers will now be formally evaluated six times over the semester, and they must pass each evaluation to graduate.
Also, each student teacher must complete a project in which he or she designs a unit of instruction, a pre-test to assess students’ abilities, a post-test to gauge what they learned, and a self-assessment to gauge whether the student teacher’s approach to teaching that unit worked, or not, and why.
“No longer does a teacher with a student teacher walk out of the classroom,” Jacobson said.
If Ball State students still struggle in their first year of teaching—as most first-year teachers do, according to numerous research studies—Ball State has developed a series of online instructional “modules” the teachers can take to improve in specific areas.
The modules, each of which lasts four weeks, cover such skills as classroom management, lesson planning, differentiated instruction and assessment.
Ball State developed the modules a year ago and offered them to all teachers in Indiana for $199 each to help veteran teachers improve their ratings from effective to highly effective and thereby earn a raise from their school districts.
But this year, Ball State offered the modules for free to its graduates in their first year of teaching who were judged ineffective or “improvement necessary” by their principal on the state-mandated evaluations. Ball State calls this its Cardinal Commitment.
“We have a commitment to these teachers,” Jacobson said.
Butler University in Indianapolis is taking an approach similar to Ball State’s, though a less formal one. It is trying to get the Indiana Department of Education to reveal the identities of its graduates who were rated ineffective or improvement necessary.
“Many of our faculty still work with first-year, second-year, third-year students,” said Debra Lecklider, associate dean at Butler’s College of Education, adding that knowing which of the graduates are struggling could help it focus those efforts. “We really want to work with the Department of Education on this data so we can use it.”
In the data released by the Department of Education, Butler had the lowest percentage of its teachers rated as less than effective, at just 1.4 percent. The percentage of its teachers rated highly effective was slightly above the state average, at 20 percent.
Marian University, a small Catholic school in Indianapolis, is making even more radical changes.
It recently launched a program with The New Teacher Project, a New York-based not-for-profit that trains teachers for the neediest school districts. Students in that program, who already have a bachelor’s degree in some other field, teach in K-12 schools on temporary licenses for two years while taking teacher training courses at Marian.
The New Teacher Project evaluates the students along the way, and the results of those evaluations determine whether the students graduate.
“They’ve got to be either effective or highly effective after two years. If not, then we don’t license them,” said Hill, Marian’s director of its teacher academy.
Marian had the second-highest percentage of its graduates rated less than effective—7.7 percent—and the second-lowest percentage of teachers rated highly effective—12.1 percent.
Hill, however, waved off those statistics. He said Marian’s results might be a bit lower, anyway, because it gears its program to send teachers into high-need school districts, where the challenges are the greatest.
On top of that, Hill said, it’s simply not believable that nearly 97 percent of teachers in their first, second or third year of teaching are effective—as the statewide statistics showed. Not when lots of other studies show that first-year teachers, no matter where they were trained, just aren’t very good.
“You’re not going to have those kinds of numbers, if you’ve got a system that reflects activity, and reflects performance, and reflects student learning outcomes,” Hill said. He suspects the teacher evaluation systems were not rigorous enough, or that school administrators used them more to encourage struggling teachers than to give a truly accurate assessment of performance. “The longitudinal data will turn that into a bell curve.”
Even if the teacher evaluations were rigorous, the data have far too many problems to be relied upon yet.
• For all but the largest teacher training programs, there haven’t been enough graduates evaluated to have much statistical meaning.
• The data also, according to the schools of education, have numerous errors—such as teachers assigned to colleges they never attended.
• On top of that, about one in four school districts designed their own teacher evaluation systems rather than following a model system developed by the state.
• And some districts, if they renewed their contracts with teachers unions before the law on teacher evaluations took effect, have not yet had to implement evaluations.
All schools are tracking and analyzing the new teacher effectiveness numbers. In fact, they are required by the accrediting organizations that oversee them to track just about any meaningful data they can—such as grade point averages, pass rates on professional licensing exams, placement rates and surveys they conduct after graduation with former students and the principals they work for.
Not all schools are swinging into action, however. That’s partly because of the limitations of the data.
But some schools are also sticking with their traditional mission: prepare undergraduates to be solid first-year teachers, then offer graduate-level programs as those teachers, on their own, seek to improve their skills and move up their district’s pay scale.
Staff at the School of Education at Indiana University-Bloomington will continue to look at teacher effectiveness data to pick up ways the university can improve its program for future students.
Those programs already look pretty good. According to the teacher effectiveness data, 23.1 percent of IU-Bloomington’s recent graduates were highly effective, above the state average. And 3.2 percent of its graduates were rated ineffective or improvement necessary, which was a tick below the state average.
But IU educators aren’t deliberately reaching out to help their graduates.
“Currently, that would be an unfunded mandate for us, and the [K-12] schools,” said Jill Shedd, IU-Bloomington’s assistant dean for teacher education. However, Shedd acknowledged there have been conversations “over the years” about extending the reach of IU’s program beyond graduation.
“Our goals and our expectations are to prepare well, first-year teachers,” Shedd said, “and to prepare them with a professional foundation so that they continue to grow.”•