Another dot in the blogosphere?

Posts Tagged ‘analytics’

Twitter is like a rapidly flowing stream of consciousness. Follow more people or engage in an exciting chat and your Twitter timeline might look like a torrent.

Thankfully, there are ways to take the occasional snapshot and to catch a few fish so that you can get a sense of what is going on. I briefly describe three tools and strategies.

Storify allows you to archive and tell a story of, say, a Twitter-based chat. To the uninitiated, a Twitter chat can be disorienting because the chat is not threaded, the @handle replies easily lose context, and the conversation appears in reverse chronological order. Storify is a good way to consolidate a chat.

Here is an example of the affordances of Storify based on a chat on self-directed learning that #edsg had six months ago. In the example, I illustrate how to use forward chronology, chunking, and commenting to add value to an archive.

Twitter analytics used to be available only to corporate or paying customers. Now everyone on Twitter has access to their own dashboard. Now everyone can see how much (or more likely, how little) reach or impact they have!

Instead of focusing on how each and every tweet is doing, you might be more interested in where your followers are from. Assuming you have authentic followers, you can click on the “followers” section of the Twitter analytics dashboard to see this. I did this when prompted after an #NT2t question a few weeks ago.

[Screenshot: Twitter follower locations]

I discovered that about a third of my followers were from Singapore, followed by the USA, Australia, UK, and everywhere else. About 60% of my followers were interested in education-related news, which is great as that is what I mostly tweet about.

I amplify my blog reflections by posting Twitter blurbs via WordPress. I do so just once a day, between 10 and 11am Singapore time, so this does not necessarily reach all my readers. Despite this, there are more visitors to my blog from the USA than from Singapore.
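For readers elsewhere, that posting window lands at less convenient hours. Here is a minimal Python sketch of the conversion; the date is arbitrary and only the 10am Singapore start time comes from the paragraph above:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Where a 10am Singapore posting window lands for readers in other time zones.
# The date is arbitrary; only the 10am start time comes from the post above.
sg_post = datetime(2014, 10, 15, 10, 0, tzinfo=ZoneInfo("Asia/Singapore"))

print(sg_post.astimezone(ZoneInfo("America/New_York")))  # 2014-10-14 22:00, the night before
print(sg_post.astimezone(ZoneInfo("Europe/London")))     # 2014-10-15 03:00, before dawn
```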

The Twitter analytics snapshot hints at other entry points and visitor expectations. For example, I now have some evidence that my Singapore followers are more content to just read the Twitter blurbs while the ones from the USA take the trouble to read my blog and/or find my content some other way.

Brid.gy is a trackback and feedback-to-source tool I discovered only a few weeks ago. It takes a bit of setting up to integrate it with a blog, but once that is done, it might fill an important gap.

Most bloggers discover that they lose conversations and comments about their blog content because these happen on Facebook, Twitter, and other platforms. Brid.gy attempts to find some of them and link them back to the blog.

The system is not perfect, though. My WordPress count of tweets often exceeds the number of Brid.gy returns, but that is probably because of the Twitter preferences of my followers.

For example, Brid.gy seems to track back only the auto-tweets WordPress sends to Twitter on my behalf. Twitter followers who tweet comments and link to my blog independently of my tweet, reply to those tweets, or auto-link to other services like scoop.it are not detected.
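To make the mismatch concrete, here is a toy reconciliation in Python. The tweet IDs are entirely invented, and Brid.gy's actual backfeed works through its own service, not code like this; the sketch only illustrates the kind of gap I am describing.

```python
# Toy reconciliation of tweet mentions, with invented IDs.
# "bridgy_backfeed" stands in for whatever Brid.gy manages to link back to the blog.
wordpress_auto_tweets = {"1001", "1002", "1003"}   # auto-tweets WordPress sent on my behalf
independent_tweets = {"2001", "2002"}              # followers tweeting my posts on their own
bridgy_backfeed = {"1001", "1003"}                 # mentions Brid.gy linked back

all_mentions = wordpress_auto_tweets | independent_tweets
missed = all_mentions - bridgy_backfeed
print(sorted(missed))  # ['1002', '2001', '2002'] never make it back to the blog
```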

Collectively, using these tools is like trying to catch fish from the stream with a net. You are not going to get a perfect catch all the time, but it is far better to try and to know than to operate blindly.

Tweets are fleeting and might represent a collective stream of consciousness. Anything that bobs its head several times in that stream tends to stand out.

If you are not a celebrity and something you tweeted was retweeted 66 times, you might be happy to note the agreement, endorsement, or share-ability of the idea. (By comparison, if you are a celebrity, you could tweet that you pooped and it could be retweeted thousands of times.)

However, your poopless joy should be tempered with context. Take the analytics of a recent tweet of mine.

To date, the tweet has been viewed 2,145 times, retweeted 66 times, and has an engagement rate of 3.1%. Relative to the number of views, that is a low return. In a good week, each of my tweets gets 3,000 to 4,000 views within three days. Given more views, the engagement figure is likely to drop.
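As I understand it, the engagement rate is simply total engagements (clicks, retweets, favourites, replies, and so on) divided by impressions, so the figures above are easy to sanity-check. A rough sketch:

```python
# Rough sanity check of the dashboard figures quoted above.
impressions = 2145
engagement_rate = 0.031                     # 3.1% as reported by Twitter

engagements = impressions * engagement_rate
print(f"{engagements:.0f} engagements")     # roughly 66-67 interactions in total

# The same handful of interactions spread over a better week's worth of views:
print(f"{engagements / 4000:.1%}")          # about 1.7%
```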

That is what playing only the numbers game gets you.

Exploring the context of tweeting further, I note that Twitter analytics does not capture modified tweets. For example, someone might tweet the URL of my tweet or tweet a variation of it.

Consider another example.

I took this screenshot in October 2014 of a popular blog entry I wrote in February of the same year. The blog entry now has 107 tweets (and an unknown number of retweets). If I focused just on the numbers, I could work out how many tweets it gained per month or on average over the year.
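For what it is worth, that numbers-game calculation is trivial (assuming roughly eight months between February and October 2014):

```python
# The numbers-game reading of that screenshot: average tweets gained per month.
tweets = 107
months = 8          # roughly February to October 2014
print(f"{tweets / months:.1f} tweets per month")   # about 13.4
```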

I would rather focus on the fact that something simple I wrote still has traction today. The WordPress dashboard tells me how the entry gets found, e.g., from Google searches or other bloggers’ efforts.

My point: Numbers can be used to tell a story, to inform the making of a story, or to bypass the story altogether. People who focus on playing the numbers game do not care for the story. The lowest-hanging fruit is all that matters to them. This is like focusing on grades instead of learning.

I prefer to get the fruit that is not within immediate reach. It takes more effort to climb, but the fruits of my labour are much sweeter. I also get a better view as a bonus. That is just my way of saying that I would rather use the numbers to tell a story, even if that requires a bit more work.

When I first heard the news that Twitter made its analytics dashboard available to all, I jumped on it straightaway.

I was surprised to learn of my reach, or what Twitter calls impressions. That said, I have no doubt that others have a far wider reach.

But now I am wondering about the reliability of the analytics dashboard.

I discovered the analytics tools on 28 Aug. However, the date and time seem to be set to another time zone. That said, my reach for 27 Aug, as recorded on the morning of 28 Aug, was 27,263 (see the screen capture below).

[Screenshot: tweet activity analytics (1)]

The analytics engine was already collecting data for 28 Aug, as evidenced by the small bar to the right of the highlighted one.

On 29 Aug, I checked for activity on the 28th. I accidentally moused over the 27th and noticed that the count had gone up to 28,842 (see the screen capture below).

[Screenshot: tweet activity analytics (2)]

I am not sure why the numbers changed.

Perhaps the counts got adjusted for the time and date difference. Perhaps older tweets were getting views two or three days after being posted and their hit counts were not yet stable.

The numbers seem to settle about two days into collection. It might be best then to monitor on a weekly or monthly basis.
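If you want to automate that kind of monitoring, one way to act on the lesson is to tally impressions only for days old enough for the counts to have settled. This is a minimal sketch with invented daily figures, not anything Twitter itself provides:

```python
from datetime import date, timedelta

# Tally impressions only for days at least two days old, since fresher counts still shift.
# The daily figures below are invented for illustration.
daily_impressions = {
    date(2014, 8, 25): 24100,
    date(2014, 8, 26): 26550,
    date(2014, 8, 27): 28842,   # this figure kept changing for a day or two
    date(2014, 8, 28): 18020,   # still being collected, so likely to change
}

today = date(2014, 8, 29)
stable_cutoff = today - timedelta(days=2)

stable_total = sum(count for day, count in daily_impressions.items()
                   if day <= stable_cutoff)
print(stable_total)  # ignores 28 Aug, whose count has not settled yet
```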

That was lesson number one.

What is worrying is the low engagement. I have read a few reviews by other individual tweeters [example from Gizmodo] and they say the same thing.

Each of my tweets gets between 4,000 and 5,000 views. But you can count on two hands (and occasionally include the feet) the number of reader interactions with the tweets. These include retweeting, favouriting, clicking on embedded resources, etc.

The tweets with higher interactions tend to be question-oriented. Ask a question and you are likely to get responses. The tweets with lower interactions are information-oriented. Provide something of value and the consumer consumes. Do not expect a thank you, feedback, or a pass-it-on.
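A toy way to see that pattern in your own tweets, assuming you label each one yourself (the labels and interaction counts below are invented):

```python
# Average interactions per tweet, grouped by a hand-assigned label.
tweets = [
    {"type": "question", "interactions": 9},
    {"type": "question", "interactions": 12},
    {"type": "information", "interactions": 2},
    {"type": "information", "interactions": 1},
    {"type": "information", "interactions": 3},
]

for kind in ("question", "information"):
    counts = [t["interactions"] for t in tweets if t["type"] == kind]
    print(kind, sum(counts) / len(counts))
# Question tweets average noticeably more interactions than information tweets here.
```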

This behaviour is not unique to Twitter. When I was privy to my former department’s Facebook reports, our engagement rate was equally low.

If I were a company, I would be concerned that customers were not engaging with me. As an educator carefully curating and sharing, I might be a bit concerned about the viewing habits of my informal audience and learners.

I used to say that today’s learner seems to move at twitch speed. This is not another way of saying they have short attention spans, because they do not. Anyone who has observed someone immersed in a game or in a state of flow will realize how much focus gamers have.

I mean to say that they move superficially from one resource to another due to the breadth of information presented to them. Their rallying cry seems to be tl;dr (too long, didn’t read).

Now I am tempted to say that my followers move at Twitter speed. That might sound like superficial consumption, but at least they read, and they read lots of seemingly disparate information. It is the brain foraging, as this MindShift article points out.

So another lesson might be to leverage Twitter the way the learner expects: not so much in a forced, provide-feedback-in-a-classroom way, but in an informal, scattered-goodies way or through a curiosity-driven, #hashtag-focused chat.

The K-12 Horizon Reports have been forecasting the rise of learning analytics for a few years now. But as the field is new, I think there are different perceptions of analytics.

I think that there are at least three dimensions of learning analytics.

  1. Administrative
  2. Learner
  3. Learning

Administrative analytics provide a God-mode overview of who is using WHAT and HOW OFTEN. But they do not offer insight into how the resources and tools are implemented pedagogically. In an LMS, administrative analytics might look like a dashboard that tells an administrator or policy maker how many courses embed YouTube videos, employ discussion forums, or use the assignment feature.

Learner analytics are like progress reports for the instructor. Imagine a dashboard that shows activity completion rates, which students are stuck at which points, and test scores. Learner analytics let you know WHO is WHERE.

 
Real learning analytics are the holy grail, and they rely on Big Data. A system with that kind of backend might take the frontend form of an omnipresent AI tutor that monitors a learner’s progress and offers help. I imagine this to be like the now defunct Microsoft Clippy, but on steroids.
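One way I picture the three dimensions is as different questions asked of the same activity log. The sketch below is hypothetical: the event fields, the course, the learners, and the pass threshold are all invented for illustration.

```python
# Three kinds of analytics as three questions over one (invented) activity log.
events = [
    {"course": "CS101", "learner": "ann", "tool": "forum", "quiz_score": None},
    {"course": "CS101", "learner": "ann", "tool": "quiz", "quiz_score": 35},
    {"course": "CS101", "learner": "ben", "tool": "video", "quiz_score": None},
]

# 1. Administrative: who is using WHAT and HOW OFTEN.
tool_usage = {}
for e in events:
    tool_usage[e["tool"]] = tool_usage.get(e["tool"], 0) + 1

# 2. Learner: a progress report for the instructor (WHO is WHERE).
scores = {e["learner"]: e["quiz_score"] for e in events if e["quiz_score"] is not None}

# 3. Learning: the system itself reacts to an individual learner.
for learner, score in scores.items():
    if score < 50:  # an arbitrary threshold for this sketch
        print(f"Nudge {learner}: revisit the module before retrying the quiz")
```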

The Amazon store might have equivalents of all three. Amazon can monitor its stock and order more from its suppliers automatically (administrative analytics). It can project what to sell based on the browsing and purchasing habits of its customers (learner analytics). It can also aggressively make recommendations to its users (learning analytics).

It is not a perfect analogy nor a comprehensive analysis, but it helps me make sense of this complex phenomenon.

This is an exception that I am making to a rule. I am responding to an email request to feature an infographic.

I am featuring it partly because the person asked nicely, had credentials, and responded to my queries. I am also including it here because it addresses an emerging but important trend that not many people understand.

Learning analytics is an edtech trend to watch. I highlighted this to folks from Blackboard, with the help of the 2011 K-12 Horizon Report, when we met at eLearning Forum Asia 2011 (eLFA).

Based on a tool demonstration they provided some months later, I did not get a sense that Blackboard really understood what learner analytics was. I only saw administrative analytics, not learning or learner analytics.

The infographic below provides a better picture of this [source].

A learning analytics system does not just mine data. It reacts and responds as an intelligent system to every learner. It augments a human instructor by providing more immediate feedback and by personalizing learning.

Bottom line: A good learning analytics system is not designed with an administrator or KPIs in mind. It is designed for the learner first and foremost.

