
Finals Edition: Professors embrace software meant to combat plagiarism

Jonathan Misurda was grading a student’s computer programming assignment a few years ago when he noticed something odd.

A line of code the student had ostensibly written contained the last name of a student who had been in one of Misurda’s previous courses.

“It was not a common last name,” Misurda said, and it indicated that some of his students were passing around old coursework. 

While educators often rely on their instincts to spot students’ attempts to cheat, some are using online programs to help them home in on work students might have copied from one another or plagiarized from published texts without citation.

Pitt maintains licenses that allow instructors to use two such services — Turnitin and SafeAssign — to check work students submit.

Cynthia Golden, director of the Center for Instructional Development & Distance Education, said such services are becoming increasingly popular among instructors at the University, and about 150 instructors use Turnitin in a given semester. She based this estimate on the number of accounts instructors set up to use the service in their courses. 

While it would be harder to count the instructors who use SafeAssign, another service independently available to instructors through Courseweb, Pitt instructors are showing a growing interest in the online services.

“I can say for sure that there has been increased interest in these tools during the last year,” Golden said in an email. “Our educational technology center has been getting more inquiries about it than in the past.”

Turnitin and SafeAssign operate similarly, pointing out sections of students’ work that resemble other authors’ writing.

Educators who use the services can upload students’ work themselves or require students to post their own work.

The services check students’ submissions against sources publicly available through Internet searches and against their own databases of published work, such as journals and periodicals, according to the services’ websites. Both services also store old assignments to compare with subsequent submissions.

The services then generate reports that flag matches between students’ work and other text.
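
In broad strokes, this kind of matching can be pictured as breaking a submission into short overlapping phrases and checking whether any of them also appear in a reference collection. The Python sketch below illustrates only that general idea; the function names are invented here, and it does not reflect the actual matching algorithms Turnitin or SafeAssign use.

```python
# Hypothetical sketch of phrase-level matching, not the algorithm Turnitin
# or SafeAssign actually uses: break texts into overlapping five-word
# phrases and report any phrase a submission shares with a reference text.

def phrases(text, n=5):
    """Return the set of overlapping n-word phrases in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def flag_matches(submission, reference_texts, n=5):
    """Return the n-word phrases the submission shares with any reference text."""
    reference_phrases = set()
    for doc in reference_texts:
        reference_phrases |= phrases(doc, n)
    return phrases(submission, n) & reference_phrases

if __name__ == "__main__":
    references = ["imitation is the sincerest form of flattery that mediocrity can pay to greatness"]
    paper = "as the saying goes imitation is the sincerest form of flattery in academic life"
    for match in sorted(flag_matches(paper, references)):
        print(" ".join(match))
```

A real service would also have to handle quotation, citation and paraphrase, which is why the resulting report is only a starting point for an instructor’s judgment.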

The programs do occasionally produce false matches. 

According to an Inside Higher Ed report from 2009, researchers at Texas Tech University found that the services, especially Turnitin, frequently flagged nonplagiarized work because of coincidental similarities to other texts. 

But Chris Harrick, vice president of marketing for iParadigms, the Oakland, Calif.-based company that owns Turnitin, cautioned against accusing students of plagiarism based solely on a report from the service.

“We’re just a tool to detect unoriginal content,” Harrick said.

For example, a paper might contain common turns of phrase, or a student might use citations the program doesn’t recognize. It is up to the instructor to decide whether a student has cheated on an assignment.

Michael Kessler, the director of undergraduate studies for Pitt’s Department of Philosophy, said that an instructor wouldn’t accuse a student of cheating because a single sentence resembled text found elsewhere.

“If big chunks of it look like they were taken from another source, then it probably is plagiarized,” Kessler said. 

While they are not foolproof, Kessler said, services like Turnitin and SafeAssign automate a process that would take instructors much longer to perform without them.

Turnitin takes an average of 25 seconds to compare a student’s paper with the materials in its databases, submissions from others in the same course and online sources, according to information on its website.

Carol Bové, a senior lecturer in Pitt’s Department of English, said she is considering using Turnitin next fall to save time grading papers, but she still has reservations about the service.

She said she worries that using the service might give students the impression she is suspicious of them and might upset the delicate “atmosphere of trust” that helps students learn.

But Panos Chrysanthis, a computer science professor, said such services hold one advantage for educators who must confront students who might have cheated: Students can’t claim they have been unfairly singled out.

“The great thing about these automated tools is that you cannot blame the TA or professor,” Chrysanthis said.

Writing-intensive disciplines are not the only areas where instructors are using online programs to check for plagiarism.

Misurda, a computer science lecturer, said he used Moss, a service Stanford University makes available to educators for free, during two or three semesters in which he taught computer engineering.

The program allows instructors to submit the code students have handed in and then compares the assignments to one another.

In programming assignments, some students might take code from their peers and then change superficial elements of it to make it look like their own work, according to Misurda.

Such shallow changes might escape a cursory inspection by a grader. But when the substance of the program — the instructions it gives the computer — matches another student’s work, Moss recognizes the similarity immediately, according to Misurda.
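
That intuition can be shown with a short, hypothetical Python sketch (this is not Moss’s actual technique): if identifier names and literal values are normalized away, two programs that differ only in variable names reduce to the same token structure.

```python
# Hypothetical illustration of structural comparison, not Moss's actual
# method: normalize away identifier names and literals, keeping only the
# token structure, so superficial renaming no longer hides copied code.
import io
import tokenize

def structure(source):
    """Return a Python program's token structure with names and literals normalized."""
    tokens = []
    for tok in tokenize.generate_tokens(io.StringIO(source).readline):
        if tok.type == tokenize.NAME:
            tokens.append("NAME")          # variable, function and keyword names
        elif tok.type in (tokenize.NUMBER, tokenize.STRING):
            tokens.append("LIT")           # literal values
        elif tok.type == tokenize.OP:
            tokens.append(tok.string)      # operators and punctuation
    return tokens

original = "def total(values):\n    s = 0\n    for v in values:\n        s += v\n    return s\n"
renamed = "def add_up(nums):\n    acc = 0\n    for x in nums:\n        acc += x\n    return acc\n"

# The two programs differ only in surface names, so their structures match.
print(structure(original) == structure(renamed))  # prints: True
```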

He added that, as in the case when he recognized the name of one of his former students in someone else’s work, educators can often spot cheating without software.

“[Moss] is really about verifying an instinct and catching things that are a little more subtle,” he said.

Bové said that she has entered text from students’ papers into a search engine when she’s suspected that parts had been plagiarized. For example, a paper that seems more polished than she would expect might make her skeptical about whether it is the student’s own work.

“Oftentimes, it doesn’t turn anything up,” she said. “But once or twice, it has.”

Like Bové, Misurda said he considers checking for plagiarism an unpleasant task that is secondary to his role as an educator. Software, he said, doesn’t address the root causes of cheating: He suspects most students grow desperate when they’re up against deadlines and then copy work from elsewhere against their better judgment.

“The problem with throwing technology at this problem is that it’s not a technology problem,” he said. “It becomes pretty easy [for students] to say, ‘I’m stuck. Let me Google this.’”

 
Pitt News Staff
