Another dot in the blogosphere?

Posts Tagged ‘literacy’

This week’s Crash Course episode on navigating digital information focused on evaluating evidence offered by online creators.


Video source

Anyone who says anything online needs to back up any claim with evidence. But not just any evidence.

Some might offer claims as evidence. Host John Green highlighted a claim about a new and supposedly deadly spider that had already killed five people in the USA. That claim (in all caps, no less) was made without reference to any other resource.

Others might offer the wrong evidence after making a claim. Green gave the example of a US senator who brought a snowball onto the Senate floor and offered it as evidence that there was no global warming. This was evidence of winter and short-term weather, but it said nothing about long-term climate change.

In Green’s own words, not all evidence is created equally. So what are we to do? Ask two questions:

  • Does the information make sense?
  • Does the information merely confirm my pre-existing worldview?

Answers to both questions require value judgements, and this can be a subjective process. To make things more objective, we could evaluate evidence by finding out how valid and reliable it is.

Validity is about how relevant and credible the information is; reliability is a measure of how much or how often that same evidence shows up.


Video source

Part 5 of the Crash Course series on digital literacy focused on using Wikipedia.

Host John Green pointed out that Wikipedia was almost 18 years old and, as it matured, was behaving more like a responsible adult.

Wikipedia has long policed itself with three guiding principles for editing articles:

  1. Content should be represented from a neutral point of view
  2. Cited research should come from published and reliable sources
  3. Readers and editors should be able to verify the sources of information

Despite these operating principles and research about the accuracy of Wikipedia [example], some still wrongly dissuade others from using it.

Green recommended that Wikipedia be relied on for breadth of information and links for fact-checking: Use it like “a launch pad, not a finish line”.

The depth of research and fact-checking could come from the hyperlinks from Wikipedia to other resources. One caveat: Resources are never perfect or objective because a) they were made by imperfect people, and b) they are used by imperfect people.

Wikipedia is not the problem; we and how we use it are.

We live in testing times. Not just politically or environmentally, but also in terms of actual tests.

So here is a basic tip for multiple-choice questions: Use LETTERS as options instead of numbers.

In this week’s episode of Crash Course’s video on information and digital literacies, host John Green focused on the authority and perspective of sources.


Video source

The authority of an author or a source might be determined by finding out about its:

  • Professional background
  • Processes used to create information
  • Systems in place to catch and correct mistakes

Authoritative sources do not guarantee that their information is correct all the time. When they make mistakes, they admit and correct them openly.

The perspective of an author or a source needs to be gleaned from its orientation, opinions, or analyses. Perspective colours the choice of words and the direction of influence.

This week’s episode on being literate today focused on reading laterally.


Video source

Reading laterally is not about reading articles from top to bottom; it is about reading sideways via other open tabs.

John Green recommended we do what teachers might still dissuade: When in doubt, check Wikipedia and its links to resources. The write-ups and hyperlinks can be corrected much faster than in other media.

Superficial consumption is not enough if we are to be critical readers, listeners, or watchers. It takes effort to go deep, but it pays off in the form of habits of critical literacy.

Here are some of my notes on the second part of Crash Course’s series on media and digital literacies.


Video source

This episode focused on fact-checking. To do this, presenter John Green outlined a Stanford University study on how a group of university professors and students evaluated information online.

The participants focused on superficial elements of source sites, e.g., how a site presented information, instead of looking more deeply at what information it shared.

On the other hand, professional fact checkers armed themselves with at least three questions to evaluate sources:

  1. Who is behind this information and why are they sharing it?
  2. What is the evidence for their claims?
  3. What do other sources say about the sharer and its claims?

Answering these questions is not as simple as ABC, but it does provide an easy-to-remember set of 1-2-3 to evaluate what we read, watch, or listen to.

Near the end of the video, Green highlighted the difference between being cynical and being skeptical. The former is being “generally distrustful of everyone else’s motives” while the latter is being “not easily convinced”.

All of us could use a healthy dose of skepticism every day. The problem is that our biases might raise this shield only when the information does not align with what we already know or believe. This is why asking the 1-2-3, regardless of the source or of our own leanings, helps keep us in check.

John Green and co have just released part 1 of their Crash Course series on navigating digital information.


Video source

If I had to sum up the takeaway from the video, it would be this: Just because it looks like a news article does not make it one. Appearances like layout, graphics, and slickness matter, but these should not distract from the quality and accuracy of the content. To determine those latter qualities, we need to investigate the sources of the article.

Sounds simple, doesn’t it?

However, Green mentioned a study by the Stanford History Education Group which highlighted how historians and university students focused on the superficial instead of digging deep.

Speaking of digging deep, I could find the Stanford group online, but not the documentation about the study from the Crash Course video page. Might Crash Course consider providing a link to such evidence and not just its main sponsors/collaborators?



