Another dot in the blogosphere?

My reflection starts with an Apple Pay verification process and ends with lessons on teaching and assessment.

When Apple Pay launched in Singapore in May, I jumped on the bandwagon by verifying one of my credit cards. The process was quick and painless: Scan the card details into the Wallet app and verify the card by SMS.

I tried the process with another eligible card, but did not receive SMS verification. I put that down to early implementation issues.

However, I tried about ten times between the launch in May and this month and was still unsuccessful. The Wallet app provided the alternative verification process of calling the credit card issuing bank’s customer service.

I dread using such customer “service” because the process makes me feel like a rat being tested in a maze.

I had to get through several layers of number pressing before getting the option to speak with a human. Once there, I was informed that they were “experiencing a high call volume”.

I missed having an old phone with a receiver I could slam down.

This particular bank provided the option of leaving my contact number so that I would receive a call-back in four hours. That must have been some really high call volume!

I received one shortly before the four-hour mark and explained how I did not receive SMS verification for Apple Pay from that bank’s system. I also mentioned that I had done the verification for another bank’s card quickly and seamlessly with the same process.

The customer service representative (CSR) was puzzled, checked the messaging records, and told me that SMS had been sent to my phone. I wanted to reply that I was not an idiot, but I bit my tongue. I repeated that I did not receive any despite several attempts over two months.

The CSR then advised me not to use my bank-issued security dongle. I told him that the dongle was irrelevant because it was not a verification option in Apple’s Wallet app. So he said he needed to look into my case and asked if he could call me back in an hour.

As soon as we disconnected, something connected. A long time ago, I had blocked a few of the bank’s SMS numbers because I kept getting marketing messages despite telling them I did not want any. I wondered if the SMS verification was sent from one of those numbers.

I figured out how to unblock the numbers and tested the SMS verification for that bank card. It worked as quickly as my first card.

This was not the fault of the bank. It was mine for blocking numbers, irritating as their messages were.

I reminded myself of two lessons on teaching:

  1. You should not just stick to a script. It is important to first listen to the learner’s problem before suggesting a learning solution. The CSR’s advice not to use the dongle was obviously part of a recommended script, but it was irrelevant in this context. Mentioning the dongle not only failed to help; it added to my frustration.
  2. Thinking out loud is one of the best ways to learn. I knew what the symptom of my problem was (no SMS from the bank), but I did not know its root cause (I had blocked some SMS numbers). Speaking to someone helped me pull thoughts to the surface and helped me find my own solutions.

When the CSR called back, I explained how I had solved the problem myself. He was relieved. I was relieved.

Right after we disconnected, he triggered an SMS asking me to rate the customer service by text. It was like being pranked.

Bank SMS.

I did not respond to the SMS because the ratings were too coarse: Below, Meet, Exceed.

The phone service took place over more than one call and had multiple components. Averaging the experience was not meaningful. Detailed feedback on what was good or not good about the experience and analysing a recording of the exchanges are more tedious but better options.

I thought of two lessons on assessment:

  1. The administrative need to collect and collate data drives such bad practice. Collecting these data does not make them meaningful or help CSRs improve. Administrative needs should not drive assessment.
  2. The average rating approach is a hallmark of summative assessment. It is grading an experience. If the CSR received “Exceed”, did he get a pat on the back? If the feedback was “Meet”, would he just keep reading from scripts? If the grade was “Below”, what can he do with that information? Good assessment is based on quality feedback, not just grades.

It does not take special events, teacher observations, prescribed professional development, or even a personal learning network to learn how to teach or assess better. The lessons and reminders are everywhere, even in the Apple Pay card verification process. You just have to pay attention.

Before the new PSLE scoring system was announced two weeks ago, it was described by Acting Education Minister (Schools) Ng Chee Meng in May as “no silver bullet”.

That cliché aptly describes the changes. So do “a mixed bag” and “to have your cake and eat it too”. In my sixth and final reflection on PSLE2021, I explain why the restructuring does not go far enough.

As if to pre-empt this line of argument, Mr Ng said:

“Some things are best evolved and not revolutionalised,” he said noting that Singapore’s education system is a strong and robust one as educators have done very well over the last 50 years in building a strong system.

The PSLE2021 is an evolution, not a revolution. Again, very apt.

The most important but undersold change is the switch from norm-based testing to criterion-based testing (see Part 1 of my reflection). However, the PSLE retains its summative testing and sorting nature. These counteract the messages of the restructure being less stressful, not being a source of competition, and focusing on the learner and learning (Part 2).

Those who study change and are familiar with the literature will describe the proposed changes as piecemeal. This contrasts with systemic change.

Piecemeal change is often top-down and tacked on to an existing system. It might make incremental improvements, but it does not disrupt the status quo. That is why such change is evolutionary and not revolutionary.

Systemic change is often the opposite, although its leadership and sustainability can stem from a mix of top-down, bottom-up, and middle-up-and-down. Such change takes place by first identifying key leverage points of a system.

In schooling, one critical leverage point for systemic change is assessment. Change this and everything else has to change. It is the tail that wags the dog.

If the PSLE2021 were systemic, it could start with changes in the assessment at the end of Primary 6 — if there were one at all — and cascade changes to educational policies, curriculum, teaching methods, school support, stakeholder behaviours, and more.

Piecemeal change often leads to little appreciable change or no change at all.

The changes in PSLE2021 will not include curricula (see point 10 of this article). It is relatively easy to get used to the Achievement Levels (ALs) since we were all schooled to think that way — they are like O-Level grades!

Teachers can keep drilling in the latter stages and tutors can keep “enriching”. Enrichment tuition centres need only replace their trophy heads’ grades with AL1s instead of As or A*s. Parents can keep pushing their kids to compete and subject them to hothousing.

Consider another example. With regard to the PSLE2021 changes, a school principal said:

…this would reduce the previous “pressure points” of comparing against peers and chasing the last few marks. Instead, the focus can be on grasping and having a “mastery over content”, and striving towards one’s personal best.

The change from conventional grades to ALs will do little to stop the paper chase. The ALs are not actionable because they are products of a terminal activity (Part 2).

Trawl what leaders in education are saying online about grading and you will see something like this emerge.

Quantitative grading ends learning. Quality feedback sustains learning.

It is possible to do very well in a test or exam by drill, rote, and formulaic thinking. It matters little if you have “mastery of content” if you do not hone thinking skills.

The changes in PSLE2021 have not been accompanied by changes in curriculum to address student thinking and skills, or professional development for teachers to teach differently.

For example, the curriculum is still designed to be learning about Mathematics or Science. It is hardly about learning to be or think like a mathematician or scientist. Some teachers want critical, creative, and independent learners, but they either do not know how to model or nurture these traits, or are not willing to let go.

So I am critical of the piecemeal change. The vision for change is not met by its currently proposed implementation. Mr Ng’s vision was:

…to move this school system forward so that we reduce the competitiveness of it, and encourage creativity and collaboration of succeeding together.

How is retaining and polishing an old and increasingly outdated assessment and sorting system an attempt to “encourage creativity and collaboration of succeeding together”? Creative answers are neither encouraged nor appreciated in grading rubrics. There are names for “collaboration” and “succeeding together” in exams — they are cheating and colluding.

If we remain rooted in the domain of summative assessment, we operate by its rules and language.

Here is another cliché: Fortune favours the brave. Did Finland worry when it implemented keyboarding over handwriting? Did it wonder what others would think?

No. It focused on what its students need today and tomorrow. It takes care of its citizens so that they take care of themselves and their country.

I am not suggesting that we adopt Finland’s strategies wholesale because our schooling contexts might be different. Our schooling system already has so-called alternatives like DSA, e-portfolios, institutional entry tests, interviews, performances, and through-train routes like the Integrated Programmes. Why not empower and support these more?

I say we put our money where our collective mouth is. If we say we must value creativity, innovation, critical thinking, and collaboration, then we must implement processes that nurture and measure these things.

I poke and prod some quotes from various stakeholders of PSLE2021.

Quote 1

Currently, the T-score (short for transformed score) reflects how well a student did in relation to others in the cohort — using a mathematical formula. A student may have got high marks for a subject, but would receive a lower T-score if most of his peers performed better than him.

This encapsulates the main issue with the current PSLE. It is a sorting model based on the bell curve, or normal distribution.

There is nothing wrong with assuming a normal distribution for a large population. What is doubtful is whether a cohort of Primary 6 students is representative of that population.
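
As an aside, here is a minimal sketch of the standard statistical T-score transformation that the old aggregate is usually described as using. MOE has not published its exact formula, so the function and sample cohorts below are purely illustrative.

```python
# Illustrative sketch of the standard T-score transformation (T = 50 + 10z).
# MOE's exact formula is not public, so treat this as an assumption.
from statistics import mean, pstdev

def t_score(raw, cohort):
    """Rescale a raw mark against the cohort: the mean maps to 50, each SD to 10 points."""
    mu = mean(cohort)
    sigma = pstdev(cohort)
    return 50 + 10 * (raw - mu) / sigma

# The same raw mark of 80 earns a lower T-score when the cohort does better.
weak_cohort = [55, 60, 65, 70, 75]
strong_cohort = [75, 80, 85, 90, 95]
print(round(t_score(80, weak_cohort), 1))    # well above the cohort mean, so a high T-score
print(round(t_score(80, strong_cohort), 1))  # below the cohort mean, so a T-score under 50
```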

Quote 2

MOE said the changes are part of a larger shift to nurture well-rounded individuals and move away from an over-emphasis on academic results. They will reduce fine differentiation of students – a key complaint of the current scoring system; reflect a student’s level of achievement regardless of how his peers have done; and encourage families to choose schools based on their suitability for the child’s learning needs, talents and interests.

This quote ticks all the right boxes, but we need to read between the lines.

Student achievement as measured by standards or criteria instead of comparison with other students in the sample is a good move. I wonder if we have studied the USA’s implementation of Common Core and its testing regimes.

The pushback there is over how testing has affected curriculum, restricted teaching, influenced teacher appraisals, and increased the stress levels of stakeholders. The only ones that seem to have benefitted from the programme are the test companies.

As for the desired change in parental mindsets, read one example in the quote below.

Quote 3

This was also an issue raised by Jean Lim, a former teacher with more than 30 years’ experience. “In the past, if a student scored 75, we could tell parents that their child scored an A, and they were happy. But now, if they score an AL4, which is still considered an A in the old system, they will not be happy, because an AL4 just doesn’t sound as nice,” she said.

The message that the focus will be on the learner and learning will fall on deaf ears if PSLE2021 comes across as only about changes in scoring.

The current PSLE regime has created a cultural monster that feeds on kiasu-ism and is fueled by enrichment tuition for competition. Numbers like T-scores, aggregates, and cut-off points are the well-understood rules.

We wait with bated breath to see what MOE and schools will do to deal with these. If they take action, we do not need more dialogues on what PSLE2021 means. We can read and think. We need MOE and schools to listen and reflect first.

Quote 4

Time needed for parents to change mindset of chasing ‘good schools’

This was an awful title for a forum letter. It is not just time that will change mindsets, as if the influence is somehow automatic. It will take a lot of concerted effort.

There was a plan for the original PSLE in 1960. It changed over time, but it was people who communicated, forced policies through, and implemented the regime. Once enculturated, the PSLE took on a life of its own when schools and parents responded to the increased stakes and competition with hothousing and tuition.

Better than the headline was the conclusion of the letter:

The greatest change that this new system is supposed to elicit is a mindset change. With the clock ticking away from now until 2021, more things can be done by schools and the Education Ministry to alleviate the fear and uncertainty that parents feel, to help them have more confidence in the new system.

Quote 5

“It depends on what is best for her, not what the best school is,” said Ms Ho. “Ultimately, you want your child to grow up to be a good person with good character, good morals and if you’re always focusing on the academics, you will miss out on other things.”

If I were facilitating a change management effort, this quote would be an integral part of the visioning process. Change agents need to visualise what they want to achieve; otherwise, they will be running blindly.

We need more parents with this perspective. MOE needs more parents with this perspective.

There already are some who have this mindset. How many are there? What is MOE going to do to take advantage of this?

I am being realistic, not blindly optimistic, about the changes in and around PSLE2021. It is piecemeal change, not systemic change. It is evolutionary change, not revolutionary change. It is not enough. More thoughts on this in Part 6 tomorrow.

The first three parts of my reflections on PSLE2021 were like reviewing the good, the bad, and the ugly.

  • Part 1: The good change is the move to criterion-based testing
  • Part 2: The bad is that the assessment is still summative
  • Part 3: The ugly is how T-score differentiation goes away only to be replaced by coarse granularity


Most people know that the current A grade in the PSLE spans scores of 75 or more. They have pointed out that the new Achievement Levels 1 to 4 will be equivalent to the current A.

The concern seems to be that the old A was attainable at 75, while straight As (75s) under the new scheme result in four AL4s and an aggregate of 16. The new aggregate will not look and feel as pretty.

Others have focused on the disparity of score spans for each AL. I illustrate the score spans for each AL in the table below.

AL | Raw score range | Score span
1  | ≥90             | 11
2  | 85 to 89        | 5
3  | 80 to 84        | 5
4  | 75 to 79        | 5
5  | 65 to 74        | 10
6  | 45 to 64        | 20
7  | 20 to 44        | 25
8  | <20             | 20
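
To make those bands concrete, here is a small sketch that maps a raw score to its AL using the table above and totals four subjects into an aggregate. The band boundaries come from the table; the function name and sample scores are my own invention.

```python
# Minimal sketch of the AL bands from the table above.
# The function name and sample scores are invented for illustration.
AL_BANDS = [  # (minimum raw score, AL)
    (90, 1), (85, 2), (80, 3), (75, 4),
    (65, 5), (45, 6), (20, 7), (0, 8),
]

def achievement_level(raw_score):
    """Return the AL for a raw score out of 100."""
    for minimum, al in AL_BANDS:
        if raw_score >= minimum:
            return al
    return 8

# Four papers of 75 each: straight As under the old scheme...
papers = [75, 75, 75, 75]
print(sum(achievement_level(score) for score in papers))  # 16, i.e. four AL4s
```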

 
But those who think this way are missing the point.

The ALs do not just introduce some granularity to the grades. I speculate that they are also an attempt to 1) prevent grade inflation, and 2) insidiously reintroduce the bell curve.

Grade inflation is the ease with which students get an A or even an A* for each of their examinable subjects. It is more commonly discussed in the context of school, university, or workplace admission offices. The people who work there help decide which students get entry, and they struggle to distinguish between numerous diplomas filled with straight As.

This is the source of the snippet on grade inflation that I tweeted last year.

The “finer-grained” ALs help separate the good As from the not-so-good As. This punctures grade inflation and, very likely, egos and morale too.

In Part 1 of this series, I wrote about how the future standards- or criterion-based testing was better than the current norm-referenced testing. I described it as an important fundamental shift in the PSLE. Implemented well, it could shift the focus to the learner and learning instead of sorting.

However, administrators and policymakers like “God views” of their system. Reducing people to numbers, data points, statistics, and diagrams is their work (and could be their idea of fun). The bell curve is too sexy to let go because phenomena only seem normal if there is a normal distribution.

Things seem neater and safer under the umbrella of a bell curve. You can be sure that one or more groups have crunched numbers with existing data to see if the ALs might insidiously recreate a normal distribution.

With some logical guesswork, you might see how this pattern might emerge as well.

It is a fairly safe assumption that many kids taking the PSLE have been “tuitioned” and/or drilled in school. Quite a few will get As. The ALs 1 to 4 will spread them out: There will be fewer AL1s than AL4s. The curve draws itself with greater granularity.
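
As a back-of-the-envelope illustration of that guesswork, the sketch below pushes an invented, tuition-skewed spread of raw scores through the AL bands. The distribution and cohort size are entirely made up, so the output only shows the shape of the argument, not a prediction.

```python
# Back-of-the-envelope illustration only: the raw score distribution below is
# invented (a cohort bunched in the 70s and 80s by drilling and tuition).
import random
from collections import Counter

random.seed(1)

def achievement_level(raw_score):
    """Map a raw score (0 to 100) to an AL using the published bands."""
    for minimum, al in [(90, 1), (85, 2), (80, 3), (75, 4),
                        (65, 5), (45, 6), (20, 7), (0, 8)]:
        if raw_score >= minimum:
            return al
    return 8

cohort = [min(100, max(0, random.gauss(78, 10))) for _ in range(10_000)]
tally = Counter(achievement_level(score) for score in cohort)

for al in range(1, 9):
    print(f"AL{al}: {tally.get(al, 0)}")
# The uneven bands stretch the crowded A range back out into a curve:
# fewer AL1s than AL4s, as described above.
```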

TL;DR? The uneven AL bands in the PSLE might not just be for increasing the granularity of measuring achievement. They might actually help administrators and policymakers prevent A-grade inflation and recreate the bell curve.

I failed to get verified as @ashley on Twitter.

When I read that Twitter was opening up the Twitter verified account to anyone, I thought I would give it a go. Nothing ventured, nothing gained, right?

Apparently, something ventured also led to nothing gained.

I jumped through hoops and provided information in a form. It took a few days for me to receive this rejection email.

Twitter rejection email.

Twitter did not say why I could not get a blue check mark against my handle “at this time”. So, maybe much later?

I can guess why the verification was rejected though. I am not a celebrity Ashley or an otherwise famous one. My account is not of enough public interest.

It does not matter that I have had @ashley since January 2007 and that Twitter is the only social media platform I believe in and am active on.

It does not matter that I have ignored threats and monetary offers for my Twitter handle.

It does not matter that I promote edu-tweeting when I can.

A little over 2,000 followers does not pass muster. It is a drop in a celebrity ocean. This is my fault since I block between 30 and 50 people every day for assorted Twitter sins. Imagine how many I would have if I did not stand my ground.

I am taking a weekend break from ruminating on PSLE2021 [Part 1: An important undercurrent] [Part 2: The Dark Side] [Part 3: Differentiation vs granularity].

It is depressing to think about what we put kids through and to process the piecemeal change that is PSLE2021.


Video source

So I lighten my own mood with a YouTube video that carries important advice.

Instructional designers and teachers can learn something from the online fascination with Taylor Swift’s legs.

If more people seem to be interested in Tay-Tay’s legs than in climate change, what might carriers of the latter and more important message do?

The advice at the end of the video is this: Change tactics from persuasion by guilt to persuasion by charm. No one likes being nagged or told they are wrong.

This does not mean you cannot be critical or point out flaws. It does mean saying the same thing differently, e.g., with wit and charm.

In Part 1 of my analysis of the new PSLE assessment system, I highlighted the important fundamental switch from norm-referenced testing to criterion-referenced testing.

In Part 2, I described how the summative design and implementation of the high-stakes exam counters positive change.

In this part, I reflect on MOE’s and the public’s obsession with differentiation when we all should be more concerned about granularity.

The current PSLE uses transformed scores (T-scores) and their aggregates as outcome measures for the exam. There are at least three major problems with doing this.

  • The scores can be normalised (see Part 1) and this process is not transparent to the students or the public.
  • A score might indicate where a student stands relative to his or her peers, but it does not indicate what that number means. The student does not know what areas of learning need to be addressed because the exam papers are not returned and there is no feedback loop. This is typical of summative assessment (see Part 2).
  • The aggregate scores seem to finely differentiate students who are competing for places in Secondary schools. For example, a student with an aggregate score of 221 gets in, but another with 220 does not. Someone sets an entry score, but no one can really explain why that benchmark is set where it is or what it means. That is one way we play the cruel numbers game.

PSLE2021 is supposed to take away that differentiation because exam papers are graded with Achievement Levels (ALs). Students are assessed on four academic subjects and each subject can be graded from AL1 to AL8. This results in 29 discrete categories of aggregate AL scores from 4 to 32. This offers better, but still not adequate, granularity of scores.
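
A quick, purely illustrative check of that count:

```python
from itertools import product

# Four subjects, each graded from AL1 to AL8.
aggregates = {sum(combo) for combo in product(range(1, 9), repeat=4)}
print(min(aggregates), max(aggregates), len(aggregates))  # 4 32 29
```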

It is no surprise that other bloggers and their mothers have pointed out that the competition will now be for low aggregate scores. Tuition centres might tweak their marketing material to focus on lowering AL scores.

The schooling arms race used to be to get the highest aggregate T-score possible. The next battle is to get the lowest aggregate AL score possible. This is like a Pokémon Go game, just not as fun. You do not want to collect all the scores. You only want the very rare AL1 Pokémon.

The “new” PSLE does not change or break the summative assessment mould. It is still a sorting tool. It is being tweaked to be a decision-making tool.

If the focus is on student achievement and learning — as claimed by the MOE PSLE2021 microsite and echoed by the press — students need even greater granularity. By this I am not referring just to the exact scores of each paper, although these would provide coarse insight. I am also referring to feedback and remediation on areas of weakness.

At the risk of painting an overly simplistic dichotomy, we have these divergent paths:

  1. Summative assessment model, T-scores, fine differentiation for the purpose of sorting.
  2. Alternative assessment model, high granularity for the purpose of meaningful learning.

Our MOE seems to be designing a hybrid, at least on paper. On its wish list are summative assessment, some granularity, and a focus on learning. This is a very elusive Pokémon, if it exists.

What paths will we take? What game will we play? What is at stake?
