Another dot in the blogosphere?

Posts Tagged ‘research’

The writers of Quartz, some of whom I have described as using lazy writing, wondered why one of the world’s wealthiest countries is also one of its biggest online pirates.

The country was Singapore, “the world’s fourth richest country, measured by gross national income per capita and adjusted for purchasing power”. Quartz wondered why Singaporeans still resorted to piracy despite having access to Netflix.

Does it assume that 1) there are no poor people in Singapore, 2) everyone here has heard of Netflix or other legal video streaming platforms, 3) the rich people here (all of us!) subscribe to something like Netflix, and 4) having access to legal streaming should reduce piracy significantly?

These are flawed assumptions. By not trying to answer its own question, Quartz revealed lazy thinking and research.

For example, it did not mention that Netflix Singapore offers only about 15% of the TV shows carried by Netflix USA. (The exact figure varies over time and is available in the table at this site.)

It did not mention that we have relatively low-cost fibre optic broadband plans.

Telcos here now push 1Gbps plans. One needs only a cursory examination of this chart maintained by the Infocomm Media Development Authority (IMDA) of Singapore to see that the plans hover around S$50 now.

The limited access to the full Netflix USA library, combined with ready access to high-speed Internet, points to our ability to get the same resources elsewhere.

Quartz decided to call our behaviour kiasu, a Hokkien term for the fear of losing out. That is a catchall label that avoids actual thought and explanation. The label is convenient: You are all just like that despite your money and access.

Like most sociotechnical phenomena (behaviours shaped and enabled by technology), the underlying reasons are nuanced. I have suggested just two and backed them up with data.

“Caution” (CC BY 2.0) by dstrelau, on Flickr

Last week I read this blog entry, Give a kid a computer…what does it do to her social life? It summarised a research paper that claimed to study how computers influenced social development and participation in school.

The paper might seem like a good read, until you realise its limitations. The blogger pointed these out:

A few caveats of these conclusions should be borne in mind. First, the study only lasted for one school year. Second, having a smart phone, with the constant access it affords, may yield different results. Third, children were given a computer, but not Internet access. Some kids had it anyway, but the more profound effects may come from online access.

The single-year study is quite a feat, even though a longer longitudinal study would have been better. The researchers were probably limited by schooling policies and processes, such as access to students and how students are grouped.

I am more critical of the other study design flaws.

My first response was: Computers only, really?

Phones are the tools, instruments, and platforms of choice among students. You can take away their computers, but you can only pry their phones from their cold, dead hands. If you wanted to study the impact of a technology set that was key to social development and school participation, you should focus on the influence of the phone.

My second response was: Not consider Internet access, really?

That is like studying the impact of cars on air quality or travel stress by limiting the cars to a thimble of fuel. Much of what we do with computers and phones today requires being online. You can focus on what happens offline with these devices, but this is such a limited view. This is like saying you observe what happens in one minute out of every hour and claim to know what happens all day.

There might be a need to study the impact of, say, a 1:1 programme, but this would likely happen in the larger context of Internet-enabled phone use. It does not make sense to study the impact of non-Internet computer use in a silo.

My third response on reading the abstract was: Self-reporting via surveys, really?

There is nothing wrong with surveys per se, particularly if they are well designed and valid. However, self-reporting is notoriously unreliable because participants’ memories are subject to time, contextual interpretation, emotion, and other confounding factors.

Given that the study was quasi-experimental, where were the other data collection methods to triangulate the findings? These methods include, but are not limited to, observations, interviews, focus groups, document analysis, and video analysis.

While my critique might sound harsh, this is the norm of academic review. If a study is to inform theory or practice, it must be rigorous enough to stand up to logical and impartial critique.

There is no perfect study, and on-the-ground situations can be difficult. But if researchers do not manage the circumstances and design with better methods, then their readers should read critically with informed lenses. If the latter do not have them, this doctor offers this free prescription.

Although I am no longer an academic, I see research opportunities everywhere. One set of untapped research is in Pokémon Go (PoGo).

I am not talking about the already done-to-death exercise studies or about the motivations to play and keep playing.

I am thinking about how sociologists might add to PoGo’s trend analysis. Number crunchers have already collected data on its meteoric rise and now its declining use. While these provide useful information to various stakeholders, I wonder if anyone has considered the impact of PoGo uncles and aunties.

I am not the first to observe how much older players have started playing PoGo. I tweeted this a while ago and someone just started a thread in the PoGoSG Facebook group about uncles and aunties at play.

A quick search on Twitter with keywords like “pokemon go” and “auntie” or “uncle” might surprise you.

The PoGo aunties and uncles are quite obvious here. So far I have noticed three main types: Solo aunties, uncles in pairs or small groups, and auntie-uncle couples. There are more types, of course, but these three are common enough to blip frequently on social radar.

But I would not be content with just describing the phenomenon. I would ask if they contribute to the “death” of PoGo just like how the older set adopted Facebook and how teens then migrated to Snapchat.

We should not underestimate the impact of uncles and aunties. After all, there must be a reason for this saying: Old age and treachery will always overcome youthfulness and skill.

Since some people would rather watch a video bite than read articles, I share SciShow’s Hank Green’s 2.5-minute critique of “learning styles”.

Video source

From a review of research, Green highlighted how:

  • the only study that seemed to support learning styles was severely flawed
  • students who believed they favoured one style over others actually benefitted from visual information regardless of their preference

This is just the tip of the iceberg of evidence against learning styles. I have a curated list here. If that list is too long to process, then at least take note of two excerpts from recent reviews:

From the National Center for Biotechnology Information, US National Library of Medicine:

… we found virtually no evidence for the interaction pattern mentioned above, which was judged to be a precondition for validating the educational applications of learning styles. Although the literature on learning styles is enormous, very few studies have even used an experimental methodology capable of testing the validity of learning styles applied to education. Moreover, of those that did use an appropriate method, several found results that flatly contradict the popular meshing hypothesis. We conclude therefore, that at present, there is no adequate evidence base to justify incorporating learning-styles assessments into general educational practice.

In their review of research on learning styles for the Association for Psychological Science, Pashler, McDaniel, Rohrer, and Bjork (2008) came to a stark conclusion: “If classification of students’ learning styles has practical utility, it remains to be demonstrated.” (p. 117)

In Deans for Impact, Dylan Wiliam noted:

Pashler et al pointed out that experiments designed to investigate the meshing hypothesis would have to satisfy three conditions:

1. Based on some assessment of their presumed learning style, learners would be allocated to two or more groups (e.g., visual, auditory and kinesthetic learners)

2. Learners within each of the learning-style groups would be randomly allocated to at least two different methods of instruction (e.g., visual and auditory based approaches)

3. All students in the study would be given the same final test of achievement.

In such experiments, the meshing hypothesis would be supported if the results showed that the learning method that optimizes test performance of one learning-style group is different than the learning method that optimizes the test performance of a second learning-style group.

In their review, Pashler et al found only one study that gave even partial support to the meshing hypothesis, and two that clearly contradicted it.
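The three conditions above amount to a fully crossed design. Here is a minimal sketch of that allocation in Python; the group names, sample size, and seed are my own illustrative assumptions, not details from Pashler et al., and the style classification is randomised here only as a stand-in for an actual style assessment:

```python
import random

def crossed_design(learners, styles=("visual", "auditory")):
    """Sketch of the experiment Pashler et al. require:
    1. each learner is classified by presumed learning style
       (randomised here as a stand-in for a real assessment);
    2. within each style group, learners are randomly assigned
       to one of the instruction methods;
    3. all learners later sit the same final test."""
    random.seed(42)  # fixed seed so the illustration is reproducible
    design = []
    for learner in learners:
        presumed_style = random.choice(styles)  # condition 1
        taught_method = random.choice(styles)   # condition 2
        design.append((learner, presumed_style, taught_method))
    return design

plan = crossed_design([f"learner_{i}" for i in range(8)])

# The meshing hypothesis predicts that "matched" learners
# (taught_method == presumed_style) outperform mismatched ones
# on the common final test (condition 3).
matched = [row for row in plan if row[1] == row[2]]
```

The point of the crossing is that every style group experiences every method; without it, a difference in test scores could be explained by the method alone rather than by any style-method match.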

Look at it another way: We might have learning preferences, but we do not have learning styles; such labels are either self-fulfilling prophecies or harmful pigeonholes. If we do not have visual impairments, we are all visual learners.

Teaching is neat. Learning is messy.

Learning is messy, and teaching tries to bring order to what seems to be chaos. The problem with learning styles is that the notion provides the wrong kind of order. It has been perpetuated without being validated. A stop sign on learning styles is long overdue.

After reading this review of research on homework, my mind raced to how some people might resort to formulaic thinking.

This was the phrase that seeded it:

Based on his research, Cooper (2006) suggests this rule of thumb: homework should be limited to 10 minutes per grade level.

What follows were examples and an important caveat:

Grade 1 students should do a maximum of 10 minutes of homework per night, Grade 2 students, 20 minutes, and so on. Expecting academic students in Grade 12 to occasionally do two hours of homework in the evening—especially when they are studying for exams, completing a major mid-term project or wrapping up end-of-term assignments—is not unreasonable. But insisting that they do two hours of homework every night is expecting a bit much.
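Cooper’s rule of thumb reduces to trivial arithmetic, which is part of its appeal. A minimal sketch of it in Python, where the two-hour ceiling is my reading of the caveat rather than a cap Cooper states:

```python
def homework_minutes(grade_level, per_grade=10, ceiling=120):
    """Cooper's rule of thumb: roughly 10 minutes of homework
    per grade level per night, kept under about two hours."""
    return min(grade_level * per_grade, ceiling)

print(homework_minutes(1))   # Grade 1: 10 minutes
print(homework_minutes(2))   # Grade 2: 20 minutes
print(homework_minutes(12))  # Grade 12: about two hours
```

The caveat, judging when two hours is actually reasonable, is where the real thinking lives; no formula captures it.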

If you assume that people would pay more attention to the caveat than to the formula, you assume wrongly. Doing the former means thinking harder and making judgements. The latter is an easy formula.

Most people like easy.

If those people are teachers and administrators who create homework and homework policies, then everyone who is at home will likely suffer from homework blues.

Am I overreaching? I think not. Consider another example on formulaic thinking.

I provide professional development for future faculty every semester, but this semester was a bit different. In a “social” space in the institution’s learning management system (LMS), a certain 70:30 ratio emerged.

A capstone project for these future faculty is a teaching session. The modules prior to that prepare them to design and implement learner-centred experiences. At least one person played the numbers game and asked what proportion of the session should be teacher-centred vs student-centred.

I advise, in person and in assignment feedback, that the relative amount is contextual. My general guideline is that student-centred work tends to require more time, since the learners are novices, and that planning should reflect that.

However, once that 70:30 ratio was suggested in the social space, it became the formula to follow. It was definite and easier than thinking for and about the learner. It allowed future faculty to stay in their comfort zone of lecturing 70% of the time and grudgingly attempt student-centred work 30% of the time.
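To see just how little thought the ratio demands, the entire “formula” fits in a few lines of Python; the one-hour session length is hypothetical:

```python
def split_session(total_minutes, teacher_ratio=0.7):
    """The 70:30 formula, taken literally: returns a
    (teacher-centred, student-centred) split in minutes."""
    teacher = round(total_minutes * teacher_ratio)
    return teacher, total_minutes - teacher

print(split_session(60))  # (42, 18) for a one-hour session
```

That it fits in a few lines is exactly the problem: it substitutes arithmetic for judgement about the learner.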

But guess what? When people follow this formula or do not plan for more student-centred activities and time, they typically go over the 70% teacher talk time and rush the actual learning. This pattern is practically formulaic.

Formulaic thinking is easy, but that does not make it right or effective. In the case of the course I mentioned, the 70:30 folk typically return for remediation. It is our way of trying to stop the rot of formulaic thinking.

The video below highlights some research that would not pass muster today.

Video source

Today we are guided by the principle of “do no harm”, or at least “do the least harm”.

I wonder if the same could be said about social experiments that are a result of non-researchers tinkering with systems and policies.

For example, how much social experimental harm has the PSLE caused?

Make no mistake: The PSLE has been very successful as a social experiment. It has become the operating standard, it shapes expectations, and we cannot seem to think outside it.

However, we need to ask ourselves if the PSLE embedded in our collective psyche is a good thing. Simply calling it harmful invites disbelief in some quarters and helplessness to do otherwise in others.

Just because something is successful does not make it helpful or harmless. Pandemic diseases spread with us as carriers and our technologies as enablers. Often we do not even know we are helping the disease spread until it is too late.

This doctor is highlighting some symptoms of PSLE. Are you feeling OK?

There are many things that could be said about research.

As a former academic, I share just three truisms:

  1. Publish or perish.
  2. To steal from one is plagiarism. To steal from many is research.
  3. Practice without research is blind. Research without practice is sterile.

I share a variation of the third truism as an image quotation I created some time ago.

Practice without theory is blind. Theory without practice is sterile.

Most young academics learn the first truism as graduate students by being mentored or by observing professors carefully. If they end up in Research I universities, publish or perish is a constant mantra: their jobs depend on how much and how well they publish.

The second truism is sneaked into various contexts and said half in jest. It is the recognition that we stand on the shoulders of others, be they giants or not. Combined with the first truism, research can often be a dog-eat-dog world.

The third truism and its couplet are something some researchers ignore. To build and stay in ivory towers, no doubt funded by generous research grants, it helps to spout the rhetoric that research adds to the pool of knowledge. It does not have to actually make a larger impact.

Research that is based on practice and informs practice is vital, but it is still sorely lacking particularly in education. Some experts play the old game because they are far removed from the ground.

If you are a practitioner, do not be tempted to ignore research as a result. Instead, set up the conditions for, and demand, research that informs practice.

