APU Careers & Learning Online Learning Original

Making the Most of the Turnitin Originality Assessment Tool

By Dr. Gary Deel
Faculty Director, School of Business, American Public University

This is the second of two articles on using Turnitin to assess the originality of student work.

In the first article, I discussed how some of the Turnitin audit settings can be adjusted to render more meaningful results. So what can good instructors do after the audit settings have been optimized?


Once the settings have been adjusted and the audit has been rerun, the instructor’s legwork begins to ensure the validity and usefulness of the results. Fortunately, in addition to providing an OV score, Turnitin provides an annotated transcript of the submission with the passages that were flagged as matches highlighted. The highlighted excerpts are color-coded to correspond to the various match sources, and Turnitin lists those sources in order from most to least prevalent.

So if the most prevalent match was 20 percent — meaning 20 percent of a student’s paper was flagged as a match to a particular source — that source would appear first on the list. If the next most-prevalent source was 15 percent, it would appear second, and so on.
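For readers who think in code, a rough sketch of that ranking logic might look like the following. The source names and percentages are made up for illustration; they are not pulled from any actual Turnitin report.

# Hypothetical matched sources and the share of the paper each accounts for.
matched_sources = {
    "example-web-article.com": 20.0,   # 20 percent of the paper matches this source
    "journal-database-entry": 15.0,    # 15 percent matches this one
    "previous-student-paper": 5.0,
}

# Order the sources from most to least prevalent, as the report does.
ranked = sorted(matched_sources.items(), key=lambda item: item[1], reverse=True)

for rank, (source, pct) in enumerate(ranked, start=1):
    print(f"{rank}. {source}: {pct:.0f}% of the submission")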

Using the Annotated Transcript, Instructors Can Quickly Scan Unoriginal Content

Using the annotated transcript, instructors can quickly scan the areas of unoriginal content and determine whether they’re worthy of concern, and if so, why. For example, perhaps a short paragraph is flagged as a perfect match to a segment of a web article. Why might the instructor care about this?

The first issue would be to see whether the student properly cited the source. If she did, then there are no plagiarism concerns; if she did not, that would be grounds for a reduction in grade and perhaps even disciplinary action, depending on the frequency and severity of the offense.

Moving beyond plagiarism, another question an instructor might ask about a flagged match is whether it was necessary for the student to quote the original author rather than paraphrase his or her thoughts. Paraphrasing should be the default, and it still requires a citation; quoting the original author’s words verbatim is preferable only when some unique phrasing or important context is needed.

It is quite common for students to provide direct quotes not because the exact wording matters, but because copying and pasting from the source article is faster and easier than summarizing it in the student’s own words. So if the student took the easy route and included direct quotes where they weren’t necessary, this would be another point of potential feedback.

Most Matches Are Fair Game for Originality Audits, but Occasionally Some Content Can Be Ignored

Finally, instructors should look at each instance of flagged content to determine whether it should appropriately be considered in the calculation of the OV score. Generally, most matches are fair game for originality audits, but occasionally some content can be ignored.

An example would be a bibliography excerpt that matches the student’s own previous submissions in other classes where the same sources were cited. Obviously, this kind of information will be consistent from one paper to the next, and it is not normally the type of unoriginal content that warrants credit deductions.

Turnitin Can Be Directed to Ignore Matches in the OV Calculation

If matches are identified that should not be included in the OV calculation, Turnitin can be directed to ignore them using the “exclude source” button. Alternatively, the instructor can simply deduct the match’s share from the total OV score. For example, if a student’s bibliography match accounts for five percentage points of the OV score, the instructor can subtract five points from the overall score to get an idea of the net result.
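That subtraction is simple enough to express as a quick sketch; the numbers here are hypothetical.

def net_ov_score(ov_score: float, excluded_match_pct: float) -> float:
    """Approximate the OV score after removing an excluded match.

    Both values are expressed as percentages of the whole submission,
    so the excluded match's share can simply be subtracted.
    """
    return max(ov_score - excluded_match_pct, 0.0)

# Hypothetical example: a 40 percent OV score with a bibliography match
# that accounts for 5 percentage points of it.
print(net_ov_score(40.0, 5.0))  # 35.0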

But sometimes deductions like this are a bit more complicated. Imagine a situation where an instructor determines that certain matches to a given source should be ignored but other matches to the same source should be included. How might this be addressed?

Unfortunately, Turnitin doesn’t always have the ability to distinguish between parts of a single source match. In these cases, as an instructor I often just do a manual calculation of the portion to be ignored: I add up the number of words in those passages, divide that total by the submission’s overall word count, and subtract the resulting percentage from the OV score.
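Here is a minimal sketch of that manual calculation, assuming you have already counted the words in the passages to be ignored and the total word count of the submission. The figures in the example are hypothetical.

def adjusted_ov_score(ov_score: float, ignored_words: int, total_words: int) -> float:
    """Subtract the ignored passages' share of the paper from the OV score.

    ignored_words / total_words gives the fraction of the submission that
    should not count against originality; convert it to percentage points
    and remove it from the reported score.
    """
    ignored_pct = (ignored_words / total_words) * 100
    return max(ov_score - ignored_pct, 0.0)

# Hypothetical example: 150 words to ignore in a 2,000-word paper
# with a reported OV score of 30 percent.
print(round(adjusted_ov_score(30.0, 150, 2000), 1))  # 22.5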

How Much Outside Content Is Too Much Outside Content?

What happens after all the filtering work is completed and the final OV score has been calculated? How do we interpret it? This is perhaps the topic with the largest degree of ambiguity. How much outside content is too much outside content? What is appropriate and what is not?

Instructors’ opinions obviously differ. Turnitin applies a color scheme to its OV scores that suggests, though does not dictate, how these scores should be viewed. In Turnitin, OV scores between 1 and 24 percent are coded green; scores between 25 and 49 percent are yellow; scores between 50 and 74 percent are orange; and scores of 75 percent and up are red.
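For reference, the banding those colors represent can be written as a simple lookup. This sketch just restates the thresholds above; the function name is mine, not Turnitin’s.

def ov_color_band(ov_score: int) -> str:
    """Map an OV score (percent) to the color band described above."""
    if ov_score >= 75:
        return "red"
    if ov_score >= 50:
        return "orange"
    if ov_score >= 25:
        return "yellow"
    return "green"  # 1-24 percent (and, by this simple rule, 0 as well)

print(ov_color_band(18))  # green
print(ov_color_band(62))  # orange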

On some level, this color scheme suggests that scores between 1 and 24 percent (green) are not alarming. But the color-coding is just a simple way of dividing the score scale into four equal bands, not a proposed philosophy for scrutinizing originality.

Surely there will be assignments where scores of less than 25 percent are worthy of criticism. For example, if students are asked to write about a personal experience (as opposed to a research paper), we would expect the OV scores on submissions to be extremely low, so even a score well within the green range might warrant scrutiny.

There may also be occasions when scores in the yellow, orange and red ranges are less suspect. For example, if we design an assignment with a template form and require students to use the template for their submissions, then any content that is part of the template will inevitably be flagged as a match across all student submissions. These matches can obviously be ignored, though, regardless of their weight in the OV scores.

Instructors Should Adopt Policies for Originality Scores Appropriate for the Coursework

In the end, instructors should adopt policies for originality score interpretations that are appropriate for the coursework and consistent with the rigor expected in the classes they teach.

Of course, some of the work in the Turnitin originality audit process can be tedious. Interpreting originality scores requires careful consideration of the kinds of assignments instructors give and what they expect of their students.

Whether we like it or not, the algorithms involved in Turnitin’s auditing software haven’t evolved enough yet to do all this work without making mistakes along the way. There are simply too many variables and corner cases. However, if we instructors commit ourselves to applying the Turnitin tool in a well-thought-out way, we can give our students more helpful and more consistent feedback.

About the Author

Dr. Gary Deel is a Faculty Director with the School of Business at American Public University. He holds a JD in Law and a Ph.D. in Hospitality/Business Management. He teaches human resources and employment law classes for American Public University, the University of Central Florida, Colorado State University and others. 
