Posts Tagged ‘turnitin’
Every semester I provide formative feedback on written work submitted by graduate students. Before I do this, the students submit their assignments to Turnitin via an institutional LMS to determine the extent to which their work matches other work in the database.
Every semester I get at least one email from a concerned student worrying about the matching score. The worry is good in that the plagiarism talks they attend have an impact. However, the worry is also bad because it shows they misunderstand what plagiarism is and how tools like Turnitin work.
Turnitin runs on formulae and algorithms. It has a huge database of references and previously submitted work, and any new student work is compared against this content. The extent to which the new content matches the existing work is expressed as a percentage, which I call the matching score.
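Turnitin's actual formulae are proprietary, so the following is only a toy sketch of the general idea: break a submission into overlapping word sequences and report what fraction of them appear anywhere in a database of stored documents. All function names here are illustrative, not Turnitin's.

```python
def ngrams(text, n=3):
    """Split text into a set of overlapping word n-grams."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def matching_score(submission, database):
    """Percentage of the submission's n-grams found in any stored document."""
    sub = ngrams(submission)
    if not sub:
        return 0.0
    stored = set()
    for doc in database:
        stored |= ngrams(doc)  # pool n-grams from every stored document
    return 100 * len(sub & stored) / len(sub)
```

Note that this toy score, like the real one, says nothing about *why* text matches: a correctly cited quotation and a copied passage both raise it equally.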
Some students seem to think that the matching score is the same as plagiarism. This is not necessarily the case.
If a student uses a template provided by a curriculum committee or tutor, the headers and helping text will match. If another student correctly and ethically cites common quotations and lists references, these will match with other existing work. All of this means that the matching scores go up, but it does not mean the students have plagiarised.
In 2009, I provided examples of how the scores alone are not valid or reliable indications of plagiarism. A low score could hide plagiarism, while a high score could actually come from the work of a conscientious student with lots of correctly cited references.
Both the students and I have the benefit of not just the quantitative matching scores, but also the qualitative highlights of matching texts. The latter should allay fears of plagiarism or flag what is potential plagiarism. The student can take remedial action and I can determine whether a score actually indicates plagiarism.
The problem with the system is the human element. Grading teams, administrators, librarians, advisers, and supervisors often arbitrarily set ranges of matching scores to mean no plagiarism, possible plagiarism, or definite plagiarism. The numbers are an easy shortcut because they take out human decision-making. The reports with highlighted text require reading and evaluation and thus mean a bit more work.
Both faculty and students need to be unschooled in focusing on numbers and playing only the numbers game. Life is not just about what can be quantified. Neither is the quality of a student’s assignment and their mindset on attribution.
One of my favourite tools for written feedback is Turnitin’s document viewer.
The name “viewer” is a misnomer, since the tool also allows an evaluator to annotate digital documents.
The screenshot above is of the side bar of the tool and this is what makes it useful and powerful. (I pixellated some of the customised content to respect the work of others and left one of my examples plain to see.)
As a lone evaluator, I can add frequently used comments to the side bar. When I notice something in an essay that triggers concern, I highlight some text in the essay and click on the button that represents that comment.
For example, I might find that someone has a misplaced trust in “learning styles”. I highlight those words in the online document and click on my “learning styles” button. My entire comment (text in bottom window) is added to the document as feedback in a speech bubble.
Users who receive feedback do not need to install anything or have a browser extension. They revisit their graded work and hover a cursor over the speech bubbles in order to read the feedback.
Even better than this convenience is another affordance: When evaluating as a group, each member can add their own comments which other evaluators can see and use. The tool becomes a pool of distilled wisdoms in the form of critical feedback.
Unfortunately, this tool is limited to educational institutions that pay for the service and add-on. I had long wished for a similar tool that was more open and preferably free. I might have found something close in the form of JoeZoo.
JoeZoo is a Chrome extension and I have yet to explore it fully. It promises the efficiency of reusing comments, but it does not seem to pool shared wisdoms.
As with Turnitin, students receiving feedback on their work do not need to install anything on their computing systems to view the feedback.
If, like me, you are security conscious, you might block third-party cookies in your browser. If you do this, JoeZoo will not work. To get around this issue, you will need to create this cookie exception: [*.]googleusercontent.com.
My ideal feedback and grading tool would be a hybrid of these two tools.
- Very simple to use like Turnitin’s side bar
- Visually appealing for teachers and students like JoeZoo
- Shared or pooled comments like Turnitin
- Free and open like JoeZoo
Disclosure: I have not been paid or otherwise compensated for mentioning Turnitin or JoeZoo.