Flubaroo or flub, ah poo!
Posted on October 6, 2012
I think I tested the limits of what Flubaroo and Google Forms can do in a ‘live’ grading of a large number of participants. I used the two tools to create a quiz near the end of interactive talks. Some background on my talks.
The positive things I took away from the implementation were that the quiz:
- was a means of taking attendance in situ
- kept learners on their toes
- reminded them of some key concepts
But there were also things that could have worked better.
I do not think that Flubaroo was designed to process so many submissions so quickly. The sign-ups for each of the five sessions ranged from 50 to over 200 student teachers.
The Flubaroo “wizard” reminded me that the grading could take a minute or two. But that is an eternity when you need the results in real time so that the audience can see them and I can immediately give out prizes to the five who answered correctly and quickly.
I have concluded that a few things might have held this process back:
- Flubaroo is not designed with large crowds and immediate feedback in mind
- The constant influx of submissions slows the process down
- The lecture theatre’s wifi cannot handle all the simultaneous connections
I cannot confirm or change the first factor. I can only look for another tool designed specifically for that purpose.
To mitigate the second factor, I can close access to the quiz and ask participants to stop submitting while the data is analyzed in real time.
Wifi is the least likely bottleneck because a quiz submission is a quick, one-off transfer. I doubt many participants were doing anything data-intensive at that time.
I noticed that Flubaroo worked very quickly at my first session, which was attended by about 50 participants. The second day saw 340 participants over two sessions, and the processing lag was obvious. The last two sessions drew a very similar number of participants, but the processing time was even longer.
I do not blame the tool. I can only blame myself for the choice of the tool and the design of the task.
A tool designed to do one thing but appropriated to do something even a bit different (or not used properly) will not perform as expected.
I opted to risk a quiz at the end of a lecture knowing that the processing time might be a speed bump. But in this case, I do not blame the quiz component. (Warning: Rant ahead!)
I stepped up to fill a void in content because the topic of open learning is something I believe in. But under the circumstances, the only instructional option was delivery by lectures. So I merely softened the blow by making them interactive.
I blame our reliance on face-to-face lectures. I will say this again: Face-to-face lectures are one time, one size, one circumstance, one need fits all. They are out of sync with the times and learner expectations.
Giving lectures techno bells and whistles will not pull them into the 21st century when they are 15th-century relics (lectios by medieval Schoolmen; see article). Some Aussie universities are rethinking lecture halls and even the term “lecturer”.
This week I stepped back in time and was reminded of why I moved with the times. I am going to refrain from looking or walking back.
What we need is change. We need to chip away at mindsets that do not challenge lectures. We need to stop making the excuse that lectures are efficient or that other ways are difficult. Doing that is like Noah complaining that it is too cold and wet to work on the ark.
Do or drown.