Another dot in the blogosphere?

Posts Tagged ‘study’

This is my response to newspaper articles [Today] [STonline] on a study by Singapore’s Institute of Policy Studies (IPS). I also respond in longer form to tweets about the articles or study.

First some background, disclosures, and caveats.

According to one article, the study was “a quantitative look at the views of 1,500 citizen or permanent resident (PR) parents with children in local primary schools on their perceptions about Singapore’s education system at that level”.

I am not linked to the IPS nor do I have a stake in what it does. As an educator, I have a stake in how people process reports of such studies because it reflects our collective capacity to think critically.

My intent is to provide some insights based on my experiences as a teacher educator and researcher. In the latter capacity, I have had to design and conduct research, supervise it, and be consulted on designs, strategies, and methodologies.

However, without full and immediate access to the actual IPS report and data, I have to take the newspaper articles at face value. I also have to assume that the research group implemented the survey-based study rigorously and ethically.

The headline by the Today paper was click(bait)-worthy. It was not the only finding, but the paper thought it would grab eyeballs.

At least two people tweeted to ask whether other stakeholders, like teachers and the students themselves, were asked about the impact of the PSLE.


I understand their concerns, but this was probably not on the research agenda. I say this not to dismiss the importance of their questions.

Good research is focused in order to be practical, to manage limited time and resources, and to shine a spotlight on a fuzzy issue. The questions about teachers and students could be addressed in another study.

It might help to view the study as a snapshot of early-stage policy implementation. MOE has announced the policy of “every school, a good school” and shared upcoming changes to the PSLE. The big question is: what is the buy-in?

MOE can more easily manage buy-in among teachers and students. Parents are a different matter, so the study rightfully focused on that group.

The study was not about making any comparison. It was about taking a snapshot of public opinion.

This is also not a question that the IPS could seek answers to in mainstream schools here. Apart from international, private, and most special needs schools, primary schools here subscribe to the PSLE and do not offer alternatives like e-portfolios. Some home-schooled children even take the PSLE.

This is actually a critical question that needs to be asked.

Our Prime Minister hinted at it in his National Day Rally speech in 2013 and MOE responded with some changes — IMO superficial changes — in late 2016.

If enough stakeholders question the timing or value of the PSLE, then the follow-up questions revolve around the WHEN and HOW of change.

According to the ST article, “the sample of parents… had a proportionate number of children in almost all the 180 or so primary schools here.”

This could mean that there were fewer than ten parents representing each school on average. We cannot be sure whether some schools were over- or under-represented, nor can we be certain that the respondents were representative of parents in general. This is why national surveys rely on large returns.

That said, surveys, whether voluntary or solicited, tend to be taken by those who want to have their say. You can never be absolutely certain whether you are missing a silent majority or hearing only from a vocal minority. However, a large return tends to balance things out.
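For a rough sense of scale, here is a back-of-envelope sketch in Python. The 1,500 respondents and roughly 180 schools come from the articles; the margin-of-error formula is the textbook approximation that assumes simple random sampling, which the IPS sample may or may not have used.

```python
import math

respondents = 1500   # parents surveyed, per the news articles
schools = 180        # approximate number of local primary schools cited

# If spread evenly, how many respondents does each school contribute?
per_school = respondents / schools
print(f"About {per_school:.1f} parents per school on average")  # ~8.3

# Textbook 95% margin of error for a proportion, worst case p = 0.5,
# assuming simple random sampling (an assumption, not a known fact here)
margin = 1.96 * math.sqrt(0.5 * 0.5 / respondents)
print(f"95% margin of error: about {margin * 100:.1f} percentage points")  # ~2.5
```

In other words, 1,500 is a respectable sample for national-level percentages, but it says next to nothing about any individual school, and no formula corrects for the self-selection described above.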

The study seemed to rely on descriptive statistics; at least, that is what the papers focused on. If so, inferential statistical analysis was not part of the design. If it had been, there would have been specific research questions based on hypotheses.

Not every study needs inferential analysis. If this was a snapshot or preliminary study, the descriptive statistics paint a picture that highlights further questions or helps policymakers suggest future strategies.
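To make that distinction concrete, here is a minimal sketch of what a purely descriptive summary looks like. The responses below are invented for illustration and are not from the IPS survey.

```python
from collections import Counter

# Invented Likert-style responses to a single survey item (not IPS data)
responses = ["agree", "agree", "neutral", "disagree", "agree",
             "strongly agree", "neutral", "agree", "disagree", "agree"]

# Descriptive statistics simply summarise what was observed
for option, n in Counter(responses).most_common():
    print(f"{option}: {n} ({n / len(responses):.0%})")

# An inferential analysis would go further, e.g. testing a hypothesis
# like "agreement differs by household income" with a chi-squared test,
# which requires research questions and hypotheses set out in advance.
```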

Overall, I do not fault a study for attempting to paint a broad picture that no one else seemed to have a clear view of. It sets the stage for further inquiry and critical analysis.

But I do have one more potshot to take and it is directed at the newspapers.

The contrast between what each paper highlighted from the same study could not have been starker.

To be fair, both papers had a few articles on the same study to highlight different topics. But what the newspapers choose to tweet is an indication of what they value. This is no different from what any of us chooses to tweet.

I chose to call out the subjectivity of any press that thinks of itself as objective or impartial. Every study and every press article carries bias; some have more and some have less.

As content creators, we should make our bias transparently obvious. As critical thinkers and doers, we should try to figure out what the biases are first.

Image: “Caution” (CC BY 2.0) by dstrelau, on Flickr

 
Last week I read this blog entry, Give a kid a computer…what does it do to her social life? It summarised a research paper that claimed to study how computers influenced social development and participation in school.

The paper might seem like a good read, until you realise its limitations. The blogger pointed these out:

A few caveats of these conclusions should be borne in mind. First, the study only lasted for one school year. Second, having a smart phone, with the constant access it affords, may yield different results. Third, children were given a computer, but not Internet access. Some kids had it anyway, but the more profound effects may come from online access.

A single-year study is quite a feat, even though a longer longitudinal study would have been better. The researchers were probably limited by schooling policies and processes, like access to students and how students are grouped.

I am more critical of the other study design flaws.

My first response was: Computers only, really?

Phones are the tools, instruments, and platforms of choice among students. You can take away their computers, but you can only remove their phones from their cold, dead hands. If you wanted to study the impact of a technology set that was key to social development and school participation, you should have focused on the influence of the phone.

My second response was: Not consider Internet access, really?

That is like studying the impact of cars on air quality or travel stress by limiting the cars to a thimble of fuel. Much of what we do with computers and phones today requires being online. You can focus on what happens offline with these devices, but that is such a limited view. It is like observing one minute out of every hour and claiming to know what happens all day.

There might be a need to study the impact of, say, a 1:1 programme, but this would likely happen in the larger context of Internet-enabled phone use. It does not make sense to study the impact of non-Internet computer use in a silo.

My third response on reading the abstract was: Self-reporting via surveys, really?

There is nothing wrong with surveys in themselves, particularly if they are well designed and valid. However, self-reporting is notoriously unreliable because participants’ memories are subject to time, contextual interpretation, emotion, and other confounding factors.

Given that the study was quasi-experimental, where were the other data collection methods to triangulate the findings? Such methods include, but are not limited to, observations, interviews, focus groups, document analysis, and video analysis.

While my critique might sound harsh, this is the norm of academic review. If a study is to inform theory or practice, it must be rigorous enough to stand up to logical and impartial critique.

There is no perfect study and on-the-ground situations can be difficult. But if the researchers do not manage the circumstances and design with better methods, then their readers should read critically with informed lenses. If the latter do not have them, this doctor offers this free prescription.

According to this BBC report, Northumbria University ‘life-threatening’ caffeine test fine, two sports science students were supposed to be given 300mg of caffeine in a study. Instead, they received 30,000mg (over one-and-a-half times the lethal dose) due to a miscalculation.

The two human subjects recovered after dialysis and intensive care. The university was fined £400,000 (almost SGD717,000 at the current exchange rate).

The numbers obviously matter in this case. Insufficient attention to the calculation of the dose ultimately led to a hefty fine. The university was fortunate not to add two to the number of deaths on campus.
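The broader lesson is that hundred-fold errors are cheap to catch if the numbers are checked against a sensible ceiling. Here is a minimal sketch of that kind of guard; the 300mg and 30,000mg figures are from the BBC report, while the 500mg ceiling is an illustrative assumption, not the university’s actual protocol.

```python
# Figures from the BBC report
INTENDED_DOSE_MG = 300
ADMINISTERED_DOSE_MG = 30_000

# Illustrative safety ceiling for a single study dose (assumed, not a medical value)
MAX_SINGLE_DOSE_MG = 500

def check_dose(dose_mg: float) -> float:
    """Refuse any caffeine dose above the stated ceiling before it is given."""
    if dose_mg > MAX_SINGLE_DOSE_MG:
        raise ValueError(f"{dose_mg} mg exceeds the {MAX_SINGLE_DOSE_MG} mg ceiling")
    return dose_mg

check_dose(INTENDED_DOSE_MG)      # passes
check_dose(ADMINISTERED_DOSE_MG)  # raises ValueError: 100 times the intended dose
```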

Then there are cases where numbers should matter less, or even not at all.

This WaPo article, Trump pressured Park Service to find proof for his claims about inauguration crowd, reported how Trump sought numbers to confirm his perception that his inauguration crowd was not as small as reported by the press.

The article provides insights into how some people, not just Trump, play the numbers game. They take a perspective built on bias or limited information, and then seek data to back it up.

The article was a reminder of what NOT to do, because this is like coming to a conclusion first, then conducting a study, collecting data, and massaging the results and discussion to fit that conclusion.

If we jump on a schooling tangent, this is similar to the conventional, deductive way of teaching: present a basic concept and then build it up with examples and practice. While this approach might work from a content expert’s point of view, it ignores another method.

A less often used method is induction. Here, phenomena, data, and noise are collected and processed first, before one arrives at generalisations or conclusions.

The deductive method generally goes from general to specific while the inductive one goes from specific to general. Instruction can consist of both, of course, but we tend to practice and experience more deductive methods because that is how most textbooks are written and how experts try to simplify for novices.

There is nothing wrong with the deductive method in itself. It is the over-reliance on that strategy and the imbalance that is the problem.

Likewise, playing the numbers game like Trump and worrying about how they indicate reputation or bruised ego can make you focus on what is relatively unimportant. It can tip the balance the wrong way.

Since some people would rather watch a video bite than read articles, I share SciShow host Hank Green’s 2.5-minute critique of “learning styles”.


Video source

From a review of research, Green highlighted how:

  • the only study that seemed to support learning styles was severely flawed
  • students with perceptions that they had one style over others actually benefitted from visual information regardless of their preference

This is just the tip of the iceberg of evidence against learning styles. I have a curated list here. If that list is too long to process, then at least take note of two excerpts from recent reviews:

From the National Center for Biotechnology Information, US National Library of Medicine:

… we found virtually no evidence for the interaction pattern mentioned above, which was judged to be a precondition for validating the educational applications of learning styles. Although the literature on learning styles is enormous, very few studies have even used an experimental methodology capable of testing the validity of learning styles applied to education. Moreover, of those that did use an appropriate method, several found results that flatly contradict the popular meshing hypothesis. We conclude therefore, that at present, there is no adequate evidence base to justify incorporating learning-styles assessments into general educational practice.

In their review of research on learning styles for the Association for Psychological Science, Pashler, McDaniel, Rohrer, and Bjork (2008) came to a stark conclusion: “If classification of students’ learning styles has practical utility, it remains to be demonstrated.” (p. 117)

In Deans for Impact, Dylan Wiliam noted:

Pashler et al pointed out that experiments designed to investigate the meshing hypothesis would have to satisfy three conditions:

1. Based on some assessment of their presumed learning style, learners would be allocated to two or more groups (e.g., visual, auditory and kinesthetic learners)

2. Learners within each of the learning-style groups would be randomly allocated to at least two different methods of instruction (e.g., visual and auditory based approaches)

3. All students in the study would be given the same final test of achievement.

In such experiments, the meshing hypothesis would be supported if the results showed that the learning method that optimizes test performance of one learning-style group is different than the learning method that optimizes the test performance of a second learning-style group.

In their review, Pashler et al found only one study that gave even partial support to the meshing hypothesis, and two that clearly contradicted it.
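To see what a supportive result would have to look like, here is a toy sketch of the analysis Pashler et al describe. The group means are invented purely to illustrate the logic; they are not from any real experiment.

```python
# Invented mean test scores by (learning-style group, instruction method)
scores = {
    ("visual_learners", "visual_instruction"): 72,
    ("visual_learners", "auditory_instruction"): 70,
    ("auditory_learners", "visual_instruction"): 71,
    ("auditory_learners", "auditory_instruction"): 69,
}

def best_method(group: str) -> str:
    """Return the instruction method with the highest mean score for a group."""
    by_method = {m: s for (g, m), s in scores.items() if g == group}
    return max(by_method, key=by_method.get)

visual_best = best_method("visual_learners")
auditory_best = best_method("auditory_learners")

# The meshing hypothesis is supported only if the best method differs
# between the style groups (a crossover interaction).
if visual_best != auditory_best:
    print("Crossover interaction: consistent with the meshing hypothesis")
else:
    print(f"Both groups did best with {visual_best}: no support for meshing")
```

With these made-up numbers, both groups score highest under visual instruction, which is the sort of non-crossover pattern Green described: visual information helped regardless of stated preference.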

Look at it another way: we might have learning preferences, but we do not have styles that are either self-fulfilling prophecies or harmful labels that pigeonhole us. If we do not have visual impairments, we are all visual learners.

Teaching is neat. Learning is messy.

Learning is messy, and teaching tries to bring order to what seems to be chaos. The problem with learning styles is that they provide the wrong kind of order. The idea has been perpetuated without being validated. A stop sign on learning styles is long overdue.

 
I have been reading the opinion articles in local rags and social media about whether kids with promising talent should work as soon as possible or stay in school.

Conventional wisdom, particularly in a place like Singapore, favours schooling because paper qualifications seem to be what employers recruit on and reward. But that tide is changing, particularly in fields that do not require specific professional qualifications and where drive, experience, and attitude matter more.

I do not see why we have to think along the traditional lines of either starting/continuing work or furthering one’s schooling/education. Why not both?

After some basic schooling, much of what needs to be learnt is done on the job (OTJ). Some OTJ training and development is provided at the workplace; sometimes a vendor provides it. Sometimes the worker signs up for Coursera; sometimes s/he takes a night class.

Then there are those who take courses online, face-to-face, or a combination, but also work part-time, are apprentices, or have internships in their fields of interest.

We have workers who realize that they must be learners and we have learners who are working on the side.

These days you can have your cake and eat it too. You can start with a culinary diploma, set up a cake shop, and learn more trade skills from pastry chefs on YouTube. You can also start with one job, wish to dabble in some frosting, switch careers, and get the necessary qualifications one way or other.

There are many permutations and combinations for how a person becomes a pastry chef. Or anything else for that matter. Open your eyes and ears and ask around. The exceptions are becoming the rule. There is no one size or method that fits all.


I found this photo, taken by @garystager, on Twitter.

I do not have to guess that he took the photo here in Singapore because the Twitter geo tag tells me it was taken in the eastern part of our main island.

Signs like these are very common at fast food joints and upmarket coffee shops because students frequent these spots and deny other customers seats by spending long hours there.

Locals do not bat an eyelid because such signs are the norm. It takes outsiders to find them unusual or funny. When they do, they hold up a mirror with which we should examine ourselves.

Why is it not just socially acceptable but even expected that kids study in places meant for relaxation, entertainment, or a quick meal? You might even spot mothers or tuition teachers drilling and grilling their charges at fast food restaurants.

This is almost unique to Singapore, but I suspect it happens (or will happen) elsewhere. Where? Any place that has high PISA scores.

So here is a tongue-in-cheek proposition for the OECD: why not investigate the relationship between studying at places like Pizza Hut and performance in PISA tests?

Policymakers worldwide might not be aware of, or care about, the effect that the tuition industry might have on Singapore’s PISA scores. But McDonald’s is everywhere. It might be an untapped cure for test score ills.

ReadWriteWeb ran an article on “how the blogosphere links to and embeds YouTube videos”.


Video source

According to that study, YouTube videos that might be considered educational, e.g., science and how-to videos, ranked very low. But that discounts teachers who simply show YouTube videos in class or embed them in other platforms like Twitter, Facebook, or wikis. At least, I hope they do.

So what? So nothing, really. I just really like the Muppet version of Bohemian Rhapsody.

