Another dot in the blogosphere?

Posts Tagged ‘instructional design’

Martin Weller weighed in on learning design (LD, otherwise called instructional design or ID) in his latest reflection on “good online learning”.

He began with an outline of the ground that LD has covered over the last decade or so, e.g., the incorporation of learning analytics (2007), Conole’s (2014) 7C model, Godsk’s (2014) efficient learning design, and Buss and Georgsen’s (2017) methodology for transitioning from face-to-face to online teaching.

Then Weller offered two ways of thinking about LD: 1) pausing the way teaching is done, and 2) using shared frameworks for common reference.

He said: 

What learning design attempts to do is throw a pause in the implementation, where an educator can consider questions such as: “if I want to teach topic X, what is the best method to do so?”; “I have had a lot of activity type Y, maybe I should vary this?”; “what is the workload of these different approaches on students?”; “what can I do with this new technology that I couldn’t do before?”

This is a reflective approach that more teachers and educators should adopt if they want to walk the talk of being learner-centred. Even if choices are made for us in macroscopic areas like the number of hours, classes, technology platforms, venues, etc., we can still make microscopic decisions about resources, tools, and strategies.

His other thought addressed an issue that has always bugged me: Progressive educators tend to be naturally reflective and/or creative, but their processes are rarely transparent to others. So if they used a common framework of LD (and perhaps had a stake in designing that framework), then their practices might offer a shared reference that others can learn from and contribute to.


I like the way Weller simultaneously dived into details and rose above for broader perspectives. Now I wonder if I can incorporate these higher ideals into a possible workshop on the design of online learning. At the moment, I think probably not. These ideas are more suitable for a Masters or doctoral programme than a workshop.


This might sound like I am splitting hairs, but there is a difference between tapping an icon on a screen and clicking on it.

Tapping requires you to quickly touch a graphic element on a touch-enabled screen with your finger. This is typically done on a mobile phone or a slate device.

Clicking normally requires you to use a computer mouse to move the cursor over a graphic element on the screen to select or activate that element.

One might argue that the end result is the same, e.g., you run an app, so the terms have practically become synonymous. They are not.

This is not about being pedantic. It is about keeping to a principle of instructional design — guiding a learner as precisely and carefully as possible. 

The words we use matter because they have different meanings in different contexts. We also do not want to miscommunicate, especially online, because social cues and immediacy might be missing.

TPACK+ model
Reproduced by permission of the publisher, © 2012 by

If you asked me what the most important things to take away from the TPACK+ model of technology integration are, I would suggest:

  1. Planning for technology integration is only effective if you concurrently consider the nature of the content, pedagogical strategies, and technological affordances. This is the “sweet spot” of the TPACK+ model.
  2. An even more vital consideration is the context. This might not be obvious in the model because it is labelled at the bottom. However, it surrounds the entire model. Context should dictate decisions about technology integration.

I take context very seriously and model this for my courses and workshops. I do this by first finding out as much as I can about my learners.

For a course that just ended two nights ago, I had to make changes to adapt to participants who were collectively different from those that took the same course just five months prior.

Why? This batch of learners was youthful. Seventy-one percent (71%) were teachers while the rest were leaders or managers. The same proportion had either less than one year (9.7%) or no (61.3%) official teaching experience.

Five months ago, the proportion was about even between the newbies and the more experienced educators. The batch before that was almost the polar opposite: Almost two-thirds were experienced teachers while the rest were fresh faces.

If I did not conduct a survey, I could have simply gauged their experience and ICT readiness by their preferred technology. Given the choice to bring a device, my most recent class had a total of only two or three laptops. Everyone else was clutching an Android or iOS device. The earlier batches were laptop dominant and I had to cater for power strips all over the room.

The shifts were visually and qualitatively obvious to me. The shifts were clearer with quantitative data. But both forms of sensing were pointless if I did not adapt to the changes in context.

While there are many contextual elements — for example, physical environment, time of day, overall energy of learners, social cohesiveness — the technology context was a key consideration if I was to provide similar content and leverage on powerful pedagogical strategies.

To those ends, I used the new Google Sites as it seamlessly adjusted to screens on large or small devices. I embedded tools and resources that were mobile-friendly.

Access and consumption were flawless. However, creating on mobile is still an issue. For example, mind mapping tools like Coggle and even Google Docs still do not work evenly across different mobile browsers. Some of my participants could view, but not edit. Fortunately, they were grouped with others who could. Therein lay another benefit of group work.

This is the bottom line: It is important to sense shifts in the ground; it is just as important to adapt to changes. Just as there are differences between individuals, one group of learners is different from the next. I reflect more so I need to react less.

I will say one thing about the classic instructional design (ID) model, ADDIE, as represented in the graphic below: It is pretty.

It is also pretty misleading. It is oversimplified and thus misrepresents the complex processes in ID.

I have a Masters in this field. ID was also the foundation of my Ph.D. When I was introduced to the ADDIE model, I learnt about its theoretical underpinnings and its practical limitations.

Simplifying ID processes to an acronym and representing them in a graphic is a convenient distillation of complex processes. This is fine if you are doing this as a reflective and visible learning task as you develop expertise.

However, if used purely as an illustrative or teaching tool, the graphic is a shortcut that bypasses praxis (theory married with practice) and application (theory in action).

For one thing, ADDIE is not five discrete phases in a linear, non-overlapping progression. The practical realities of any well-managed ID project should prevent its straight and unquestioned use.

For example, rapid prototyping might see tight cycles of design, development, and testing even before implementation. This not only breaks the linear chain, it also makes evaluation an overarching process that is reflexive and reflective.

Both a beginner and an expert might use ADDIE, but do so differently. ADDIE might be dogma for a beginner; it is a loose and pliable framework for an expert.

Put another way, ADDIE might seem like a good start. The problem is that it can also be a convenient stop if its users do not critically examine each component separately and as part of a whole.

It is one thing for instructional designers to try to summarise what they do with the help of ADDIE. It is another to use the graphic to teach someone how to do instructional design.

I would not presume that abdominal surgery is anaesthetise, cut open, dig around, sew up, revive. The surgeon is a professional in whose hands a patient’s immediate future depends, and oversimplifying surgery is an insult. An instructional designer is also a professional who has to juggle complex tasks, but the returns on these are not obvious in the short term.

ID is not something that you can understand or master from a tweet, no matter how rich and juicy the tweet is. To accept that you can get away with that is lazy thinking. This leads to lazy action and lazy ID, and that in turn leads to poor instruction and learning experiences.

Please do not oversimplify, misrepresent, or mislead. Not with ADDIE or with anything else.

Oh, and the image is not an infographic. But that is another long story…

In Singapore’s foodie culture, a crowd or queue is a sign of good eats. Following the crowd might seem like a gamble worth taking.

I read the article embedded in this tweet and was reminded why it is not always wise to do what everyone else is doing.

Microsoft’s Skype found out the hard way that following the social app crowd is not a good thing. Instead of leveraging on its strengths or developing something new, it tried mimicking Snapchat. Some users responded by giving Skype paltry ratings at app stores.

I suggest three takeaways that apply to educational technology integration, instructional design, and app development.

Do different
Going with the flow takes less effort than swimming against the current, so this might make sense in the development of curricula, course elements, and applications. However, it can amount to doing the same thing as everyone else, or merely doing the same thing differently.

Are you just delivering content and attempting to engage instead of designing to challenge and empower users? Doing the latter is more difficult, but this might be more worthwhile in the long run.

Sense accurately
According to the article, Skype Corporate VP Amritansh Raghav said that the new features of Skype were requested by users. Whether you are head of ICT or lead designer, you cannot listen only to your noisiest stakeholders because they might be a vocal minority.

You may choose to make data-informed decisions, but you need to know how accurate your sensing tools are and whether the data are biased.

Needs, not wants
In 1989, Steve Jobs famously declared that the user is fickle [source].

You can’t just ask customers what they want and then try to give that to them. By the time you get it built, they’ll want something new.

Jobs relied more on his intuition than market research. Since most of us are not like Jobs, what can we do?

I say we give the user — or in education, the learner — what they need, not what they want. Being learner-centred does not mean pandering to their desires. It means being focused on their needs and future, not our hangups and past.

One more thing…
The author of the article did not like the garish colour scheme of the new Skype. There is an easy solution: Opt for the dark, monotone one in settings.

I am taking a weekend break from ruminating on PSLE2021 [Part 1: An important undercurrent] [Part 2: The Dark Side] [Part 3: Differentiation vs granularity].

It is depressing to think about what we put kids through and to process the piecemeal change that is PSLE2021.

Video source

So I lighten my own mood with a YouTube video that carries important advice.

Instructional designers and teachers can learn something from the online fascination with Taylor Swift’s legs.

If more people seem to be interested in Tay-Tay’s legs than in climate change, what might carriers of the latter and more important message do?

The advice at the end of the video is this: Change tactics from persuasion by guilt to persuasion by charm. No one likes being nagged or told they are wrong.

This does not mean you cannot be critical or point out flaws. It does mean saying the same thing differently, e.g., with wit and charm.

