I’m not usually one to moan or highlight things that annoy me, but when I read this article this morning I had that sinking feeling and a face-palm moment!
What this article totally missed, and what I’ve been saying to academics (and anyone who’ll listen), is that these online plagiarism detection systems are only as good as the people interpreting the results.
The TurnItIn ‘score’ is no measure of originality, despite being called an ‘originality report’. It is a measure of what percentage of the submitted text matches known sources (student papers, journals, books, Internet sources, etc.). So a paper that gets a 50% score means that 50% of its text can be found in other sources known to the system; it is not saying that 50% has been copied or plagiarised. These matches could be down to poorly referenced work, badly quoted and/or badly cited work, and even popular quotes. The 50% is only an indication to the academic that further investigation is required – that they need to read the report, check the matched sources and look at the paper itself before drawing any conclusions.
This annoyed me most:
“It is clear that this type of cheating is virtually undetectable by academics when students take precautions against being caught,”
If you rely solely on something like TurnItIn then yes, I agree, it is virtually impossible to detect cheating. But a student’s submitted paper should never be viewed in isolation – the academic(s) should have other opportunities for students to submit written assessments, along with email evidence and even forum/online comments, so that the student’s writing style can be seen. Then, when the online paper comes in, the academic can quite easily see that the style, language, grammar, punctuation, etc. are different, sometimes wildly so. From there the different pieces of work can be compared and an informed opinion formed.
“But [Dr Lisa Lines] argues that much more radical steps will be needed to combat the use of essay mills, including greater use of exams and requiring students to give oral presentations on the topic of completed essays.”
Again, no. You don’t need to add more exams or more ‘radical steps’; you just need to be prepared to get to know your students – their writing styles, their use of grammar and language. I’m sure that once you know this, it’ll be far easier to spot work that is out of the ordinary for that student, even without relying on TurnItIn.
TurnItIn and other systems like this are to be used as part of a wider assessment strategy. The main focus of that strategy though should be the relationship between academic/teacher and student.
The scary part of this reliance on TurnItIn [other plagiarism detection tools are available] is this – I worked with one academic (a few years ago now) who actually based students’ grades on the TurnItIn score. Yes! They admitted they didn’t even look at the detailed report or the paper itself. Anything with a 50% match or over got a grade of 50% or less (but never a fail). Anything between a 20% and 50% match got a grade of 50–60%, and anything with less than 20% of the text matched got a higher grade, over 60%.
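To make plain just how mechanical that practice was, here is a rough sketch of the rule as described above (the function name and grade bands are my illustration of the anecdote, not anything that academic actually wrote down – and, to be clear, this is exactly what *not* to do):

```python
def grade_band(match_pct: float) -> str:
    """Map a TurnItIn match percentage straight to a grade band.

    This reproduces the flawed rule described above: the grade is
    derived purely from the match score, with the report and the
    paper itself never being read. Illustration only.
    """
    if match_pct >= 50:
        return "50% or less (but never a fail)"
    elif match_pct >= 20:
        return "50-60%"
    else:
        return "over 60%"
```

Note that nothing in this rule distinguishes a badly referenced but honest essay from wholesale copying – which is exactly the point: the score alone cannot tell you that.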
On another related topic, this article on the Guardian website – An essay I bought online was so bad I want a refund – but the firm won’t pay up – was also worrying. But at least the piece did address the bigger picture here: it’s not about the ethics of a refund for the paper, it’s about the integrity of a student trying to subvert the system. And from a law student, too?!
“The use of these types of websites not only raises serious questions about whether an individual is meeting the standards required, but also whether somebody has the right character to enter a profession where honesty and integrity is crucial.”