Another dot in the blogosphere?

Posts Tagged ‘ethics’

Video source

I’ll admit it. The title of this episode did not appeal to me from the get-go. Why use artificial intelligence (AI) to figure out if there are other forms of intelligent life in the galaxy?

Here is my bias: I would rather see the power of AI developed more for enabling better life on Earth. But I remain open-minded enough to learn something about the alien effort.

According to one scientist, the space data we have explored over the last 50 years is akin to a glass of water drawn from a total data set the size of the world’s oceans. So using AI makes sense.

I liked the honesty of another scientist who declared that he did not know exactly what he was looking for. He was simply looking for a blip of life against a sea of darkness. So again there is the counter narrative to the press and movies — we are not looking for aliens to battle.

So how might AI detect alien life? Pattern recognition: spotting rare needles in vast haystacks of data.
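A toy sketch of that needle-in-a-haystack idea: flagging a rare spike in otherwise featureless noise. The data, threshold, and method here are all hypothetical illustrations; real signal-search pipelines are far more sophisticated.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated "hay": 10,000 samples of Gaussian noise,
# with one injected "needle" (an artificial narrow spike).
signal = rng.normal(loc=0.0, scale=1.0, size=10_000)
signal[7_000] = 12.0

# Flag any sample whose z-score exceeds a (hypothetical) threshold.
z = (signal - signal.mean()) / signal.std()
candidates = np.flatnonzero(np.abs(z) > 6)

print(candidates)  # the injected spike at index 7000 stands out
```

The point of the sketch is that the algorithm does not know what an “alien signal” looks like; it only knows what ordinary noise looks like, and it surfaces whatever deviates from it — much like the scientist who admitted he did not know exactly what he was looking for.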

About halfway through the video, the content switched abruptly to synths — AI with bodies that mimic humans. Long story short, we are nowhere near what science fiction paints in books or movies. But the efforts to deconstruct and reconstruct the human body and mind are interesting (to put it mildly).

I liked how the video moved on to the ethics of synths. What rights would they have? Can they be taught good values? If they commit crimes, who is responsible? These proactive questions influence their design and development.

I think the episode was the final one. If it was, it was a good note to end on.

I have avoided reading and reviewing the opinion piece “Analytics can help universities better support students’ learning”. When I scanned the content earlier this month, my edtech Spidey sense was triggered. Why?

Take the oft-cited reason for leveraging the data: They “provide information for faculty members to formulate intervention strategies to support individual students in their learning.”

Nowhere in the op-ed was there any mention of students giving permission for their data to be used that way. Students are paying for an education and a diploma; they are not paying to be data-mined.

I am not against enhancing study or enabling the individualisation of learning. I am against the unethical or unsanctioned use of student data.

Consider the unfair use of student-generated data. Modern universities rely on learning management systems (LMS) for blended and online learning. These LMS are likely to integrate plagiarism-checking add-ons like Turnitin. Every time students submit their work, Turnitin’s database grows larger and more effective. Turnitin also charges its partner universities hefty subscription fees for the service.

Now take a step back: Students pay university fees while generating data for a university partner, and that partner makes money off the student-generated data. What do students get in return?

Students do not necessarily learn how to be more responsible academic writers. They might actually learn to game the system. Is that worth their data?

Back to the article. It highlighted two risks:

First, an overly aggressive use of such techniques can be overbearing for students. Second, there is a danger of adverse predictions/expectations leading to self-fulfilling prophecies.

These are real risks, but they sidestep the more fundamental issues of data permissions and fair use. What is done to protect students when they are not even aware of how and when their data is used?

This is not just about having a more stringent version of our PDPA*, say, an act that disallows any agency from sharing our data with third parties without our express consent.

It is about not telling students that their data is used for behavioural pattern recognition and to benefit a third party. While not on the scale of what Cambridge Analytica did to manipulate political elections, the principle is the same — unsanctioned and potentially unethical use of a population’s data.

*I wonder why polytechnics are included in the list of agencies (last updated 18 March 2013) responsible for personal data protection but universities are not.

The video embedded in the tweet below went viral recently.

Video source

Educators would readily repeat the message of grit or persistence or resilience.

I was about to add to the chant by pointing out that the life lesson was not foreseen, planned, or scaffolded. This does not help teachers who operate by standards and curricula, or who are otherwise told what to do.

As I hesitated over clicking the publish button in WordPress, this Atlantic news article provided insight into why the mother bear and her cub made such a desperate dash: they were startled by an aerial drone seeking to capture footage.

The lesson about struggling and failing in order to learn grit and persistence is important, but it is the obvious, low-hanging fruit. The bears would not have been put in that danger had the documentarians operated more ethically. Therein lies an equally important but less obvious lesson.

Despite the doubling of tweet length, this one (archived version) needs more context.

The sharing session might focus on WHAT the context is and HOW the supposed system auto-magically does this.

But I wonder if it will explore the WHY of doing this. Answering this question explores the ethics of incorporating such technology. This might include what data is collected and how algorithms run to make summary decisions.

Let us not forget where others have gone or are going before, i.e., how Facebook and Google are under the microscope for not being more careful with student data.

You just know that a blog entry with content like “plenty of educators, especially administrators, wouldn’t know a blog from their elbow, let alone have a clue how they might use Twitter or Ning in their districts” is fishing for something.

The author of that blog had a few points to make: 1) many educators don’t know what Web 2.0 is, much less how to integrate it, 2) Web 2.0 is important to learners now and in the future (even if Web 2.0 evolves to something else), and 3) we are not putting enough technology in the hands of learners so that they learn more authentically.

So my response to his question “Social media… dirty word or essential skill?” is obvious. It is an essential skill. But I’d add that students also need to be information literate and socially literate (I say more about this in a book chapter I have written).

They also need essential attitudes and values. Why? Web 2.0 is a sociotechnical phenomenon. The technology enables users to generate and publish content easily. But users must also want to do this, and as they do, they must do so ethically and responsibly.
