Another dot in the blogosphere?

Posts Tagged ‘horizon’

I stopped tweeting about or recommending any Horizon Reports after being privy to the processes behind one such report and reading the work of Audrey Watters [latest example].

I had insights into Singapore’s 2012 Horizon Report. Almost five years ago, I described how the trends identified then were heavily influenced by entities with edtech aspirations and how those trends were out of sync with other reports.

Audrey Watters has always been critical of the reports because the trends are disjointed. For example, a likely trend mentioned five years ago is not the norm now. While this might be due to the difficulty of forecasting, it does not explain why a long-term trend in one report fails to reappear as a mid-term trend in a later one.

This lack of continuity might be because the self-selecting groups that form the leadership and advisory boards come from different sectors. They are like the proverbial blind men describing the elephant based only on the part each can feel.

My simple-minded critique of the Horizon Reports is that they are aptly named. You can try walking to the horizon, but you will never reach it. You cannot. You will only see more horizon.

You can also walk in any direction depending on the paths and barriers in front of you. As you change direction, you see different things.

Viewed this way, the reports are meanderings of guides who cannot be sure where to go or what to anticipate. The takeaway? Woe to anyone who buys into what these blind men say as they cling to different parts of the elephant.

This week there were at least two critiques of the Horizon Reports following the release of the 2015 Higher Education report.

Downes and Watters both lamented the lack of pattern and continuity in the projections on educational technology.

There is an underlying assumption that needs to be questioned: That edtech trends can be predicted with certainty of implementation and schedule. This is like saying you know where the horizon is. By the time you get to where you think it is, the horizon has moved. 

The New Media Consortium reports include disclaimers to this effect, of course, but how many people actually read the fine print? My guess: about as many as read the iTunes user agreement.

The basic methodology might also be misunderstood. The reports are often the results of modified Delphi methods, and each panel of experts may be independent of those behind other years’ reports. These reports are not longitudinal studies; they are snapshots of thoughts. This could explain the lack of continuity.
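The snapshot effect can be sketched as a toy simulation (all names, panel sizes, and scores below are invented for illustration, not drawn from NMC’s actual process): if each year’s report comes from an independently convened panel voting on the same pool of trends, the top picks can shuffle from year to year even though nothing about the trends themselves has changed.

```python
import random

TRENDS = ["mobile learning", "e-books", "learning analytics",
          "games and gamification", "3D printing", "MOOCs"]

def panel_ranking(trends, seed):
    """One year's panel: a small, self-selected group scores trends.

    Each panel is seeded independently to mimic the modified Delphi
    rounds described above: a fresh set of experts whose votes are
    not tied to any previous year's report.
    """
    rng = random.Random(seed)
    # Ten hypothetical panellists each score every trend from 1 to 5.
    scores = {t: sum(rng.randint(1, 5) for _ in range(10)) for t in trends}
    return sorted(trends, key=scores.get, reverse=True)

# Three "annual reports", each a snapshot from an independent panel.
reports = {year: panel_ranking(TRENDS, seed=year) for year in (2012, 2013, 2014)}

for year, ranking in reports.items():
    print(year, "->", ranking[:3])
```

The top-three lists rarely agree from year to year, even though the underlying pool of trends never changed: snapshots, not a longitudinal study.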

Each panel is likely to have an agenda or include influential members with agendas. I hinted at this in Singapore’s first (and only?) report two years ago [1] [2]. The main “sponsor” had an e-book agenda, and it featured prominently in the report. But e-readers and slates replacing paper and unnecessarily heavy school bags remain a futuristic fantasy in the average Singapore school.

I do not disagree with the critiques by Downes and Watters. I hope I have added to the pool of insights and shed a sliver of light on why there does not seem to be continuity.

These insights are important if Horizon Reports are taken from their descriptive domain and co-opted by administrators or policy makers to prescribe change. This has already happened with PISA scores and rankings. Such studies and reports are not gospel truth; they merely shed spotlights and laser points on large systemic issues.


The NMC’s Horizon Report for Higher Education 2014 is out.

As in previous years, it highlighted trends to look out for over the next one to five years. According to the report, these two trends might drive change within a year:

  • Social media ubiquity (our e-Fiesta 2014 topic!)
  • Integration of online, hybrid, and collaborative learning (what a catch-all!)

The next two trends might take three to five years:

  • Data-driven learning and assessment*
  • Learners as creators

The last two trends might take more than five years to drive change and are the vaguest of all:

  • Agile approaches to change (another catch-all)
  • Evolution of online learning (yet another)

Following a similar 2-2-2 pattern, the group also highlighted edtech developments in their document:

  • Flipped classroom
  • Learning analytics*
  • 3D printing (really?)
  • Games and gamification (at least they are listed as two separate entities)
  • Quantified self (what?)
  • Virtual assistants*

Unlike reports in other years, this one also mentioned challenges to the adoption of educational technology:

  • Low digital fluency of faculty
  • Relative lack of rewards for teaching
  • Competition from new models of education
  • Scaling teaching innovations
  • Expanding access
  • Keeping education relevant

Here are some of my preliminary reactions.

A lot of what gets listed depends on who the NMC includes in the expert panel, how aware they are, and what agenda they have. It is worth looking back at previous reports (see 2013’s report for example) for clear patterns and outliers.

That said, anything to do with technology is difficult to predict because technology companies and policymakers can shift the goalposts overnight.

I am not sure why trends and edtech developments were listed separately, or whether they are different at all. For example, the items I asterisked (*) are all linked. Some might argue they are one and the same, but at different phases or based on different understandings and implementations of the same thing.

I am glad to see the “challenges to adoption” in this report. While previous lists might have seemed like wishful thinking and crystal-ball gazing, the addition of the challenges injects some reality.

Yesterday I reflected on the Horizon Report. I mentioned some things that surprised me. Here are some excerpts that did not.

From the executive summary comes this statement about mobile devices (p.4):

Students do not learn to use these technologies in school, but on their own and at their own pace. Tools such as mobile apps breed discovery of new information for users, and there is a need for schools to leverage and promote these informal learning experiences while integrating them with in-school learning.

On authentic learning (p.4):

The Singapore advisory board also felt that schools do not sufficiently incorporate real-life experiences in their curricula. Models such as challenge based learning, which encourages students to solve local and global problems, are interesting to schools, but have not gained enough traction and are not yet widespread. In order for students to be engaged in the material they are learning, there is a need for it to be tied to their own lives and the community around them.

Elsewhere in the report was the disconnect between learning and assessment (p.19):

There is a disconnect between the goals of assessment and personalised learning. As personal learning environments and other models of individualised learning are gaining traction in schools, forms of assessment for these models are lagging. Whereas the goal for personalised learning is to create experiences that appeal to a student’s specific learning style, pace, and needs, many current assessment tools focus on scalability and the capacity to extract data from a standard set of assignments — such as multiple choice tests and papers. The major challenge ahead for assessment is to capture ways to measure the quality of learning from different types of student outputs, including videos, and other rich media.

On formative assessment (p.20):

Assessment is an important driver for educational practice and change, and over the last years we have seen a welcome rise in the use of formative assessment in educational practice. However, there is still an assessment gap in how changes in curricula and new skill demands are implemented in education; schools do not always make necessary adjustments in assessment practices as a consequence of these changes. Another assessment gap is related to the lack of innovative uses of digital media in formative assessment. Many tools are still tied to outdated LMS and do not have the ability to assess critical data sets, such as 21st Century Skills acquisition.

On educator adoption of educational technologies (p.20):

Most academics are not using new and compelling technologies for learning and teaching. Many teachers and administrators have not undergone training on basic digitally supported teaching techniques, and most do not participate in professional development opportunities. This issue is due to several factors, including a lack of time, a lack of expectations that they should, and the lack of infrastructure to support the training. Many think a cultural shift will be required before we see widespread use of more innovative ideas and technologies. Many caution that as this unfolds, the focus should not be on the technologies themselves, but on the pedagogies that make them useful.

The Horizon Reports are so named because they offer projections. But the selection I have highlighted tells us what is already happening now.

It does not (and should not) take an external report to tell us these things. But I hope that the validation prompts action.

This week’s #edsg chat prompted me to read the almost month-old K-12 Horizon Report for Singapore.

It is rich with information but not terribly enlightening if you already follow the trends. An out-of-touch administrator or policymaker might be alarmed or informed by it.

But I was surprised by some things. Take this comparison table of the trends identified by the think tanks from Singapore, the US, and Australia.


Our outlook beyond the first year diverges from the other two groups. For example, our think tank did not seem to think that Digital Identity was a big enough blip on the radar in the two- to three-year time frame (see table above).

In my humble opinion, this contradicts the top-ranked trends in another comparison table (below). Digital identity and the practices associated with it should have emerged as concerns when discussing the challenges to educators and shifting educational paradigms.


I also find it interesting that our trend analysis identifies gamification whereas the US and Australia identify game-based learning. I think this stems from 1) a lack of clear understanding of the differences between the two, 2) a fear of non-serious games for education, and 3) an overly serious and conservative outlook.

Another seemingly curious emphasis might be e-books. But if you have insights into who was part of the think tank, you will realize why (hello, NCS!).

It is not easy projecting trends, of course. Nor was I privy to the process, despite the wiki that the think tank had. But I do know a few members of the committee and what they might bring to the table.

I wonder if NMC might leverage wiki practices and culture in the future to intelligently crowdsource this difficult task. I also wonder how many of the folks are active bloggers, tweeters, and Facebookers in this field who hone their thoughts daily and publicly. Could there not be better representation of voices and trends in education without stifling the need for commercial partners to weigh in?

NMC also relies on experts who make themselves available as well as on the Delphi method. But surely the method can evolve to be more relevant, inclusive, and powerful, can it not?
