Editorial: Publish OMET results to raise response rate
January 28, 2013
Toward the end of last semester, among the exams and final papers, you might recall a string of emails from the Office of Measurement and Evaluation of Teaching beginning with “Online survey reminder for Professor…”
OMET was probably your best friend some days, cheerfully filling your inbox with a gentle reminder to do your part to help Pitt’s teaching quality.
The good news for people who enjoyed these notifications is that many more of these emails may be on the way. According to a story in the Jan. 24 issue of the University Times, the faculty and staff newspaper published by the University, these “pester emails” were effective at getting students to respond.
Overall, however, response rates were low. Online implementation did save the University about $35,000, but students only returned 50 percent of requested evaluations, down from the typical average of 70 percent.
If the University continues with plans to move to only online evaluations by fall 2013, these rates will need to improve.
The decline wasn’t entirely unanticipated. Pilot studies by the Pitt Advisory Council on Instructional Excellence suggested digital evaluations wouldn’t be greeted with overt enthusiasm.
We aren’t surprised, either. While the University Times story shows that deans and administrators are concerned about low response rates, their proposed solutions seem to revolve around new advertising plans and more frequent in-class announcements from professors.
But solutions restricted to marketing will not drastically improve response rates. The fact is that shifting evaluations from a class-time activity to a leisure-time one makes a lower response rate inevitable.
The only way to improve rates is to make completing evaluations worthwhile for students. Short of bribery through iPad raffles or other prizes, that means letting students see the fruits of their labor.
OMET results must be made public. Or at least partially public.
The new online survey method makes sense, and it is in everybody’s interest that response rates increase. Digital forms are cheaper and more environmentally friendly. Plus, devoting 15 minutes of class to evaluating the professor isn’t a wise use of valuable lecture time.
But out-of-class evaluations present a burden to students. The finals-week gestalt leaves little room for thoughtful reviews or honest reflection. Too many other activities simply take priority over professor evaluation.
Therefore, to make OMET evaluations a priority, there must be some benefit for students. When it is unclear whether professors are taking suggestions seriously or being offered training and guidance based on student feedback, it is hard to justify stopping studying (or partying, frankly) to spend 15 minutes writing about a teacher’s greatest assets.
The public availability of evaluations is not a revolutionary concept. Many other schools, including Carnegie Mellon, allow students to see evaluations. It’s not an outrageous demand: If we are paying tuition, shouldn’t we be able to closely evaluate the product?
Another, perhaps less dramatic, method of improving response rates would be extending the deadline for completing the evaluations beyond finals week. Even an extension of a few days would not only give students breathing room after finals are complete, but might also lead to better evaluations: Judging a class before it is over is a lot like reviewing a film before the conflict is resolved.
Regardless of how OMET attempts to improve response rates, it must acknowledge that it can’t just market its way to high levels of student participation. Pester emails are no substitute for real methods of making the evaluations more meaningful and simpler for students.