Another dot in the blogosphere?

Yesterday I reflected on how our Number One ranking in the OECD's problem-solving test raised more critical questions than it provided model answers.

This tweet gave me more fuel for thought.

The processes behind the products of learning are just as important, if not more so. A Number One ranking is a product of a combination of complex processes. Actually, it is a by-product, because we are not schooling kids for a worldwide competition.

Topping the ranking boards can send unintended and undesirable messages. Among them might be:

  • We are the best, so there is no need to change.
  • Let’s maintain the ranking for the sake of being Number One.
  • We have little or even nothing to learn from others.
  • This is a competition to be the best, so we must guard our secrets.

Unlikely as these messages might be, they can still be normalised actively or passively. The press or authorities might actively laud these accomplishments uncritically. We might passively believe everything we hear by not questioning the processes and products of rankings.

If we want learners to be resilient and creative in the face of failure, teachers and educators must first model such thinking and actions. A single-minded focus on narrow measures of success does not reveal the stories and iterations of moving forward by falling. I say we ignore rankings and do what ranking tables and agencies do not or cannot measure.

STonline reported that a sample of Singapore students topped an Organisation for Economic Co-operation and Development (OECD) test on problem-solving.

I am glad to read this, but only cautiously so. This is partly because the press tends to report what is juicy and easy. I am cautious also because such news is not always processed critically from an educator’s point of view.

For example, how did the OECD test for problem-solving ability? According to an excerpt from the article above:

Screen capture of original article.

There were no other details about the authenticity, veracity, or adaptability of the software-based simulation. Only the makers of the software and the students who took the test might provide some clues. This test system is a closed one and lacks critical observers or independent evaluators.

Perhaps it would be better to raise some critical questions than to make blanket statements.

The product of problem-solving is clear (the scores), but not all the processes (interactions, negotiations, scaffolding, etc.). So how can we be certain that this problem-solving is authentic and translates to wider-world application? Our Ministry of Education (MOE) seemed to have the same concern.

MOE noted that the study design is a standardised way of measuring and comparing collaborative problem-solving skills, but real-life settings may be more complex as human beings are less predictable.

Our schools might have alternative or enrichment programmes — like the one highlighted in Queenstown Secondary — that promote group-based problem-solving. How common and accessible are such programmes? To what extent are these integrated into mainstream curriculum and practice?

The newspaper’s description of the problem-solving simulation sounds like some of the interactions that happen in role-playing games. How logical and fair is it to attribute our ranking only to what happens in schools? What contributions do other experiences make to students’ problem-solving abilities?

Test results do not guarantee transfer or wider-world impact. What are we doing to find out if these sociotechnical interventions are successful in the long run? What exactly are our measures for “success” — high test scores?

What is newsworthy should not be mistaken for critical information to be internalised as knowledge. The learning and problem-solving do not lie in provided answers; they stem from pursued questions.

I argue that we have more questions than answers, and that is not a bad thing. What is bad is that the current answers are inadequate. We should not be lulled into a collective sense of complacency just because we topped a test.

This video and blog entry may not be suitable for those with ultra-sensitive dispositions.


Video source

I found this video embedded in this LifeHacker article on preventing splashback. Some might find the topic gross, but it provides a solution to an everyday problem (if not every day, then depending on how regular you are). I also think that it illustrates principles of meaningful learning.

The splashback problem is something most of us with seated flush toilets would have faced. Bringing this video into, say, a physics lesson activates a learner's prior knowledge.

The problem, the suggested solution, and the rationale for the solution provide an authentic context for problem-solving. The illustration was not so authentic as to gross viewers out, but it was realistic and believable enough. It was certainly more authentic than problems or situations that learners cannot relate to, e.g., falling out of a plane.

The solution was derived by applying theoretical principles and by doing. This required the experimenters to think about which variables to keep constant and which to change. It might also have been fun to make the most realistic-looking poop.

The video obviously required the combined efforts of at least two people, so there are opportunities for collaborative learning during planning, implementation, production, and editing. Individuals in any group are likely to specialise in something, and when they do, these become opportunities for self-directed learning.

