
Finals Edition: Professors embrace software meant to combat plagiarism

Jonathan Misurda was grading a student’s computer programming assignment a few years ago when he noticed something odd.

A line of code the student had ostensibly written contained the last name of a student who had been in one of Misurda’s previous courses.

“It was not a common last name,” Misurda said, and it indicated that some of his students were passing around old coursework. 

While educators often rely on their instincts to spot students’ attempts to cheat, some are using online programs to help them home in on work students might have copied from one another or plagiarized from published texts without adding citations.

Pitt maintains licenses that allow instructors to use two such services — Turnitin and SafeAssign — to check work students submit. 

Cynthia Golden, director of the Center for Instructional Development & Distance Education, said such services are becoming increasingly popular among instructors at the University, and about 150 instructors use Turnitin in a given semester. She based this estimate on the number of accounts instructors set up to use the service in their courses. 

While it would be harder to count the instructors who use SafeAssign, a separate service available to instructors through Courseweb, Pitt instructors are showing a growing interest in such online tools.

“I can say for sure that there has been increased interest in these tools during the last year,” Golden said in an email. “Our educational technology center has been getting more inquiries about it than in the past.”

Turnitin and SafeAssign operate similarly, pointing out sections of students’ work that resemble other authors’ work.

Educators who use the services can upload students’ work themselves or require students to post their own work.

The services check students’ submissions against sources publicly available through Internet searches and their own databases of published work, which include sources such as journals and periodicals, according to each respective website. Both services also store old assignments to compare them with subsequent submissions.

The services then generate reports that flag matches between students’ work and other text.

The programs do occasionally produce false matches. 

According to an Inside Higher Ed report from 2009, researchers at Texas Tech University found that the services, especially Turnitin, frequently flagged nonplagiarized work because of coincidental similarities to other texts. 

But Chris Harrick, vice president of marketing for iParadigms, the company that owns Turnitin and is based in Oakland, Calif., cautioned against accusing students of plagiarism based solely on a report from the service.

“We’re just a tool to detect unoriginal content,” Harrick said.

For example, a paper might contain common turns of phrase, or a student might use citations the program doesn’t recognize. It is up to the instructor to decide whether a student has cheated on an assignment.

Michael Kessler, the director of undergraduate studies for Pitt’s Department of Philosophy, said that an instructor wouldn’t accuse a student of cheating because a single sentence resembled one found in text elsewhere.

“If big chunks of it look like they were taken from another source, then it probably is plagiarized,” Kessler said. 

While they are not foolproof, Kessler said, services like Turnitin and SafeAssign automate a process that would take instructors much longer to perform on their own.

Turnitin takes an average of 25 seconds to compare a student’s paper with the materials in its databases, submissions from others in the same course and online sources, according to information on its website.

Carol Bové, a senior lecturer in Pitt’s Department of English, said she is considering using Turnitin next fall to save time as she grades papers, though she still has reservations about the service. 

She said she worries that using the service might give students the impression she is suspicious of them and might upset the delicate “atmosphere of trust” that helps students learn.

But Panos Chrysanthis, a computer science professor, said such services hold one advantage for educators who must confront students who might have cheated: Students can’t claim they have been unfairly singled out.

“The great thing about these automated tools is that you cannot blame the TA or professor,” Chrysanthis said.

Writing-intensive disciplines are not the only areas where instructors are using online programs to check for plagiarism.

Misurda, a computer science lecturer, said he used Moss, a service available to educators for free through Stanford University’s website, for two or three semesters while he taught computer engineering.

The program allows instructors to submit the computer programs students have handed in and then compares the assignments to one another.

In programming assignments, some students might take code from their peers and then change superficial elements of it to make it look like their own work, according to Misurda.

Such shallow changes might escape a cursory inspection by a grader. But when the substance of the program, the portion that contains the instructions for the computer, matches another student’s work, Moss recognizes the similarity immediately, according to Misurda.
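The idea Misurda describes can be illustrated with a toy comparison. The sketch below is not Moss’s actual algorithm (Moss uses a more sophisticated fingerprinting technique called winnowing); it is a minimal, hypothetical example of the same principle: replace variable names with a placeholder so cosmetic renaming can’t hide a match, then measure how many short token sequences two programs share.

```python
import re

# Keywords are kept as-is; everything else that looks like a name
# becomes the placeholder "ID", so renaming variables changes nothing.
KEYWORDS = {"def", "return", "for", "in", "if", "else", "while", "range"}

def normalize(code):
    """Strip comments and reduce identifiers to a placeholder."""
    code = re.sub(r"#.*", "", code)  # drop comments
    tokens = re.findall(r"[A-Za-z_]\w*|\d+|[^\s\w]", code)
    return ["ID" if re.match(r"[A-Za-z_]", t) and t not in KEYWORDS else t
            for t in tokens]

def kgrams(tokens, k=5):
    """All overlapping runs of k consecutive tokens."""
    return {tuple(tokens[i:i + k]) for i in range(len(tokens) - k + 1)}

def similarity(a, b, k=5):
    """Jaccard overlap of the two programs' token k-grams (0.0 to 1.0)."""
    ga, gb = kgrams(normalize(a), k), kgrams(normalize(b), k)
    return len(ga & gb) / max(1, len(ga | gb))
```

Under this scheme, two programs that differ only in variable names score near 1.0, while genuinely different programs score near 0, which is roughly why superficial edits fail to fool structural comparison tools.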

He added that, as with the case when he recognized the name of one of his old students in someone else’s work, educators can often spot cheating without software.

“[Moss] is really about verifying an instinct and catching things that are a little more subtle,” he said.

Bové said that she has entered text from students’ papers into a search engine when she’s suspected that parts had been plagiarized. For example, a paper that seems more polished than she would expect might make her skeptical about whether it is the student’s own work.

“Oftentimes, it doesn’t turn anything up,” she said. “But once or twice, it has.”

Like Bové, Misurda said he considers checking for plagiarism an unpleasant task that is secondary to his role as an educator. Software, he said, doesn’t address the root causes of cheating: he suspects most students copy work from elsewhere, despite their better judgment, when they grow desperate against deadlines.

“The problem with throwing technology at this problem is that it’s not a technology problem,” he said. “It becomes pretty easy [for students] to say, ‘I’m stuck. Let me Google this.’”

 
Pitt News Staff
