Another dot in the blogosphere?

Posts Tagged ‘turnitin’

Recently I shared my thoughts on Turnitin’s latest attempt, Feedback Studio. I gave a lightning review of its iOS app and commented on how form did not meet function in its web app.

A colleague of mine also used the same tool to grade and provide feedback on student essays. He contacted Turnitin directly by email over a form-does-not-meet-function issue: Papers were not arranged in alphabetical order once the tool was launched from an LMS.

He described the problem clearly and provided a simple programming solution. The various tech and other support people he communicated with practised tai-chi, i.e., they deflected and redirected.

I will not share the details of the email exchanges because they were restricted conversations. But I will say this — they were amusing and frustrating to read.

I had a wry smile on my face as I identified immediately with the frustration of trying to get someone from tech support to recognise and empathise with a problem.

I actually LOL-ed when I read the standard signature that the Turnitin folk used: “Revolutionizing the experience of writing to learn”. What was revolutionary about bad design, low empathy, and deflective service?

The email exchange and my own experience reminded me of Seinfeld’s Soup Nazi.

Video source

The Soup Nazi wanted things done a certain way and was closed to feedback. Any suggestion (no matter how good) or complaint (no matter how valid) was ignored and summarily dealt with — no soup for you!

My colleague and I were only thinking of improving the service and helping other users. This would ultimately benefit our learners if Turnitin took our critiques in the spirit they were offered.

Turnitin seemed to behave more like Turnyoudown. Perhaps some revolutions are the dictatorial sort.

Turnitin’s Feedback Studio needs some serious feedback.

Yesterday I shared how its web application, integrated into an institutional LMS, kept logging me out and had UI controls reminiscent of the 80s.

If its web application was unstable and finicky, then its iOS app was bare-bones and underwhelming.

Turnitin’s Feedback Studio

I was hoping that I could do on the iOS app much of what I could already do with the web application. I was disappointed early on.

As I accessed Turnitin from an institutional LMS (Blackboard), I had to log in with an “activation code”.

According to the instructions (as of 10 September 2017), I had to first log in to the institutional LMS on my iPad, pick any assignment, and click on an information (i) icon to reveal a “Generate Code” button.

When I tapped on the button, nothing happened. I could not get a code with the iOS app.

Hoping that the code was not tied to a device, I decided to try this on my laptop. Clicking on the button using my laptop gave me the code I needed. I had to use this workaround because Turnitin’s instructions did not work.

The UI of the app is simple. At first I was disappointed: the features I did not use were plain to see, while the ones I really needed did not seem to be available.

UI of Turnitin’s Feedback Studio

What was clearly visible were tappable areas for a rubric, summary comment, voice comment, and similarity (matching scores to other artefacts in the database) at the top of the page. I did not rely on any of these.

I do not even use the scoring element because 1) I keep the marks elsewhere, and 2) the point of this assignment is for students to respond to feedback via a reflection and to incorporate changes in the next assignment. Provide a score and the learning stops (and the badgering for marks begins)!

The actual tools for providing specific formative feedback, i.e., highlighting, commenting, selecting canned responses, etc., were not obvious. There was no initial-use help on screen. Such a job aid is practically a standard feature from app creators who practise user-centric design.

Thankfully, tapping on the screen a few times revealed the highlighting, commenting, and type-over functions. I managed to mark up and comment on a student’s work. In the screen capture above, I pixellated the work (grey) as well as my comments, canned comments, and highlights (all in blue).

Highlighting was somewhat laborious as the app selected an entire line when I wanted to focus on one word. It was also not easy to select several sentences in a paragraph, but I suspect that this problem is common to apps that display PDFs.

As I was trying this at home where the wifi was fast and stable, the markups in the app synchronised with the web version almost immediately. A better test might be at a public hotspot or on public transport where the signal is less reliable. This would test Turnitin’s claim that any app edits would update the web versions when a reliable connection was established.

I am not sure I would recommend the app for processing class upon class of scripts. The typing of comments alone would be a pain. An external keyboard might alleviate this issue, but not everyone has one. There is also the option of audio feedback, but this does not highlight specific parts of an assignment.

I would not recommend this app to the paper and pencil generation. I would hesitate to do the same even to those who consider themselves mobile savvy. I would not want my recommendations to be soured by association with an app that feels like it is in perpetual beta.

The basic tenet of most types of design is that form must meet function. This principle is applied in the design of cars, buildings, furniture, websites, human-device interfaces, etc.

I tweeted this as I was in a neighbourhood library grading and providing formative feedback on assignments.

The library itself had questionable design. It shared a wall with a community centre. With boisterous activity come happy noises. That is to be expected at a community centre. When the noise leaks into the study area, the people in the library half of the building become unhappy.

I became doubly unhappy as I was at the library to grade and leave feedback on assignments. The screenshot below illustrates the problem.

Turnitin Feedback Studio logs me out while I'm providing feedback!

Turnitin calls its “improved” tool Feedback Studio. It logs me out in the background while I am providing feedback on an assignment. I find out only after trying to leave feedback on a document and being shown the error message on screen.

I cannot even click on the “OK” button and have to exit the session and start all over again. This means having to close the pop-up window where the assignment and feedback are, returning to the Blackboard interface from which Turnitin was launched, and refreshing that page.

When I do that, I find out that I am not logged out from Blackboard. There seems to be some sort of invisible timer or quota for this “log out” problem. I have discovered that I can process six to eight assignments before the problem rears its ugly head.

This is an unwelcome distraction when I have about 30 scripts per class and a few classes worth of assignments to process. I am never sure whether my next set of comments is going to be saved with the assignment. I only find out when the error message pops up and I have to retype everything I did earlier.

I also cannot scroll the contents of the assignment window with a touch pad or mouse. I have to move the cursor to the scroll bars and move them up/down or left/right. Have we regressed to Apple’s single mouse button and ancient UI era?

The previous version of the same tool did not do these things. It was marginally less pretty, but it let me do my job efficiently and effectively.

It is one thing to be frustrated with the quality of student assignments. It is another to be antagonised by an unstable system. To the application designers and developers I say: Form must meet function. It does not matter if it looks nice but functions like an airhead.

Every semester I provide formative feedback on written work submitted by graduate students. Before I do this, the students submit their assignments to Turnitin via an institutional LMS to determine the extent to which their work matches other work in the database.

Every semester I get at least one email from a concerned student worrying about the matching score. The worry is good in that the plagiarism talks they attend have an impact. However, the worry is bad because they misunderstand what plagiarism is and how tools like Turnitin work.

Turnitin runs on formulae and algorithms. It has a huge database of references and previously submitted work. Any new student work is compared against this content. The extent to which the new content matches with the existing work is a percentage that I call the matching score.

Some students seem to think that the matching score is the same as plagiarism. This is not necessarily the case.

If a student uses a template provided by a curriculum committee or tutor, the headers and helping text will match. If another student correctly and ethically cites common quotations and lists references, these will match other existing work. All this means that the matching scores go up, but it does not mean the students have plagiarised.
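To make the idea concrete, here is a minimal sketch of how a matching score could be computed by comparing word trigrams in a submission against a database of existing texts. This is my own illustration of the general technique, not Turnitin’s actual algorithm:

```python
def trigrams(text):
    """Break a text into overlapping three-word sequences."""
    words = text.lower().split()
    return {tuple(words[i:i + 3]) for i in range(len(words) - 2)}

def matching_score(submission, database):
    """Percentage of the submission's trigrams found anywhere in the database."""
    sub_grams = trigrams(submission)
    if not sub_grams:
        return 0.0
    db_grams = set().union(*(trigrams(doc) for doc in database))
    return 100.0 * len(sub_grams & db_grams) / len(sub_grams)
```

Note that a correctly cited quotation raises this kind of score just as surely as copied prose does, which is why the number alone says nothing about intent.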

In 2009, I provided examples of how the scores alone are not valid or reliable indications of plagiarism. A low score could hide plagiarism while a high score could actually come from the work of a conscientious student with lots of correctly cited references.

Both the students and I have the benefit of not just the quantitative matching scores, but also the qualitative highlights of matching texts. The latter should allay fears of plagiarism or highlight what is potential plagiarism. The student can take remedial action and I can determine if a score actually indicates plagiarism.

The problem with the system is the human element. Grading teams, administrators, librarians, advisers, and supervisors often arbitrarily set ranges of matching scores to mean no plagiarism, possible plagiarism, or definite plagiarism. The numbers are an easy shortcut because they take out human decision-making. The reports with highlighted text require reading and evaluation and thus mean a bit more work.
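The arbitrary bucketing described above can be sketched as follows. The cut-off numbers are invented for illustration; that arbitrariness is exactly the point:

```python
def classify(matching_score, low=15, high=40):
    """Bucket a matching score using arbitrary cut-offs.

    This is the shortcut criticised above: fixed thresholds stand in
    for the human reading of the highlighted report.
    """
    if matching_score < low:
        return "no plagiarism"
    if matching_score < high:
        return "possible plagiarism"
    return "definite plagiarism"
```

A conscientious student with many correctly cited references could easily land in the “definite plagiarism” bucket, while a low-scoring paraphrase of a single source would sail through.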

Both faculty and students need to be unschooled in focusing on numbers and playing only the numbers game. Life is not just about what can be quantified. Neither is the quality of a student’s assignment and their mindset on attribution.

One of my favourite tools for written feedback is Turnitin’s document viewer.

The name “viewer” is a misnomer since the tool also allows an evaluator to annotate digital documents.

Turnitin's assignment viewer and markup tool.

The screenshot above is of the side bar of the tool and this is what makes it useful and powerful. (I pixellated some of the customised content to respect the work of others and left one of my examples plain to see.)

As a lone evaluator, I can add frequently-used comments to the side bar. When I notice something in an essay that triggers concern, I highlight some text in the essay and click on a button that represents that comment.

For example, I might find that someone has a misplaced trust in “learning styles”. I highlight those words in the online document and click on my “learning styles” button. My entire comment (text in bottom window) is added to the document as feedback in a speech bubble.
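The workflow above boils down to a simple data structure: a short label maps to a full prepared comment, and annotating a highlighted span attaches the full text. Here is a hypothetical sketch (the label, comment wording, and function are my own inventions, not Turnitin’s API):

```python
# Hypothetical canned comments; the label is what appears on a side-bar button.
canned_comments = {
    "learning styles": (
        "Be careful about claims based on learning styles; the research "
        "evidence for matching instruction to styles is weak."
    ),
}

def annotate(document, start, end, label, comments=canned_comments):
    """Attach the full canned comment to a highlighted span of the document."""
    return {
        "span": (start, end),
        "excerpt": document[start:end],
        "comment": comments[label],
    }
```

The efficiency comes from writing the long comment once and reusing it with one click per essay.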

Users who receive feedback do not need to install anything or have a browser extension. They revisit their graded work and hover a cursor over the speech bubbles in order to read the feedback.

Even better than this convenience is another affordance: When evaluating as a group, each member can add their own comments which other evaluators can see and use. The tool becomes a pool of distilled wisdoms in the form of critical feedback.

Unfortunately, this tool is limited to educational institutions that pay for the service and add-on. I had long wished for a similar tool that was more open and preferably free. I might have found something close in the form of JoeZoo.

JoeZoo is a Chrome extension and I have yet to explore it fully. It promises the efficiency of reusing comments, but it does not seem to pool shared wisdoms.

However, JoeZoo seems more visually appealing than Turnitin and offers useful options like feedback categories, rubrics, and grading scales.

As with Turnitin, students receiving feedback on their work do not need to install anything on their computing systems to view it.

If you are security-conscious like me, you might block third-party cookies in your browser. If you do, JoeZoo will not work. To get around this issue, you will need to create this cookie exception: [*.]

My ideal feedback and grading tool would be a hybrid of these two tools.

  • Very simple to use like Turnitin’s side bar
  • Visually appealing for teachers and students like JoeZoo
  • Shared or pooled comments like Turnitin
  • Free and open like JoeZoo

Disclosure: I have not been paid or otherwise compensated for mentioning Turnitin or JoeZoo.
