Algorithms gone wrong
Posted February 13, 2017
YouTube relies on algorithms to guess what videos you might be interested in and make recommendations.
While it is machine intelligence, it does not yet have human intuition, nuance, or idiosyncrasy.
All I need to do is search for or watch a YouTube video I do not look for regularly and it will appear in my “Recommended” list. For example, if I search for online timers for my workshop sites, YouTube will recommend other timers.
If I watch a clip of a talk show host that I normally do not follow, YouTube seems to think I have a new interest and will pepper my list with random clips of that person.
This happens so often that I have taken to visiting my YouTube history immediately after I watch anything out of the ordinary and deleting that item. If I do not, my carefully curated recommendations get contaminated.
Some might argue that the algorithms help me discover new content. I disagree. I can discover new content on my own, relying instead on the recommendations of a loose, wide, and diverse social network.
YouTube’s algorithms cannot yet distinguish between a one-time search or viewing and a regular pattern. They cannot determine context, intent, or purpose.
Until they can, I prefer to manage my timeline and recommendations myself, and I will show others how to do the same. This is just one item on a long list of digital literacies and fluencies all of us need in the age of YouTube.