The Machines are here already

I recently came across a critical analysis of the modern internet, specifically how its algorithms already shape and influence human behaviour, society, and the direction the world is taking.

It confirms my suspicion that we are already in the matrix in some way. It’s not that computers, robots, or AI have ‘taken over’ as a discrete entity external to us (as in, say, Terminator); instead, the influence is much more subtle and insidious.

[Image: the Terminator]
This is NOT the enemy – if only it were this easy.

Three ways in which the machines are in control

1. The smartphone reduces your Real Life (RL) experience

A smartphone sucks you into a small screen, where all your attention is focused. Have you watched people at a bus stop or on public transport recently? They just stare into the palm of their hand.

If all that time is now spent on screens, it is not spent in the real (3-dimensional) world involving all of your senses.

I don’t know whether the increase in mental health issues amongst young people is correlated with that, but arguably, if you spend less time practising ‘being yourself’ in the real world, you may find it more difficult. For that reason alone you should reduce your screen time – and drastically so.

Don’t let the machines reduce your RL experience. The real (3D) world is messy but it involves ALL your senses!

2. The algorithms reduce your understanding that other views exist

One of the beauties of the algorithms running the internet is that they’re so subtle. While most people are aware of some degree of personalisation (e.g. you see content similar to what you’ve previously liked or bought), on a meta-level they still assume everyone roughly shares the same reality (e.g. 62% of people in the UK don’t realise their social networks can affect the news they see – more on that later). This is the so-called filter bubble, which Wikipedia defines as

a state of intellectual isolation that can result from personalized searches when a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click-behavior and search history.

In addition, confirmation bias means we favour information that confirms our pre-existing beliefs and filter out the rest. Those beliefs also affect how we process and interpret new information (i.e. we slant it towards what we already believe).
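To see how little machinery this takes, here’s a minimal sketch of the feedback loop in Python. Everything in it is an illustrative assumption – the topics, the weighting, and the ‘user always clicks the top item’ behaviour – not any real platform’s algorithm:

    # A toy feedback loop: rank topics purely by how often they were
    # clicked before, then let the 'user' always click the top item.
    import random
    from collections import Counter

    TOPICS = ["politics-left", "politics-right", "sport", "science", "cats"]

    def recommend(click_history, n=5):
        """Weight each topic by past clicks (+1 so new users see everything)."""
        counts = Counter(click_history)
        weights = [counts[topic] + 1 for topic in TOPICS]
        return random.choices(TOPICS, weights=weights, k=n)

    history = ["politics-left"]        # a single initial click...
    for _ in range(20):                # ...then twenty rounds of feedback
        history.append(recommend(history)[0])

    print(Counter(history))
    # One topic soon dominates the feed - the 'intellectual
    # isolation' Wikipedia describes above.

Run it a few times: whichever topic gets an early lead tends to snowball, because every click makes the next recommendation of that topic a little more likely.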

This, together with other psychological research, goes some way to explaining why Trump got elected.

The algorithms (especially on social media) fuel our very human confirmation biases, so we think everyone thinks like us and agrees with us. All we see every day is confirmation of our own views. We no longer understand, in any deep way, that other views and realities exist.

This absence of a shared reality, some argue, is a threat to democracy.

3. The algorithms harm your child’s development by feeding them bad content

‘Suggested videos’ / ‘watch next’ is NOT a good thing if the algorithm is optimising for horrific Peppa Pig parody videos. Your baby could see the following:

 A dentist with a huge syringe appears. Peppa’s teeth get pulled out. Distressed crying can be heard on the soundtrack.

(From: The disturbing YouTube videos that are tricking children).

And that, apparently, is a relatively harmless example. This kind of stuff imprints on your baby’s memory. Thanks to algorithms and machine learning, your baby will get served up more of the same. You, on the other hand, don’t even know it’s going on in the first place.
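As a rough illustration of the failure mode – not YouTube’s actual system – here’s a sketch of a ‘watch next’ picker that optimises for keyword overlap and engagement. The catalogue, tags and scores are invented for the example:

    # A toy 'watch next' picker. It optimises for tag overlap and
    # engagement; nothing in it checks whether content is suitable.
    CATALOGUE = [
        {"title": "Peppa Pig - Official Episode",
         "tags": {"peppa", "pig", "kids"}, "engagement": 0.6},
        {"title": "Peppa Pig Dentist PARODY",
         "tags": {"peppa", "pig", "kids", "dentist"}, "engagement": 0.9},
        {"title": "Nursery Rhymes",
         "tags": {"kids", "songs"}, "engagement": 0.5},
    ]

    def watch_next(current, catalogue):
        """Pick the video sharing the most tags with the current one,
        breaking ties by raw engagement."""
        candidates = [v for v in catalogue if v["title"] != current["title"]]
        return max(candidates,
                   key=lambda v: (len(v["tags"] & current["tags"]),
                                  v["engagement"]))

    now_playing = CATALOGUE[0]   # the child starts on an official episode
    print(watch_next(now_playing, CATALOGUE)["title"])
    # -> 'Peppa Pig Dentist PARODY': it shares the most keywords and
    #    keeps viewers watching, so it gets served next.

Parody makers exploit exactly this: stuff the title and tags with familiar keywords and the algorithm does the rest.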

The article on Medium that inspired this blog post – Something is wrong on the internet – explains quite well how it all works.

[Image: Peppa Pig goes wild and scary]

Do you control your machines?

I believe we can and should live peacefully with our machines, but that it is we who should call the shots. I’d like to see some kind of digital enlightenment where we’re much more aware of what really goes on underneath!

It is shocking to think that, in 2018 in the UK, according to new research:

  • 62% of people don’t realise their social networks can affect the news they see
  • 45% of people are unaware that information they enter on websites and social media can help target ads
  • 83% of people are unaware that information can be collected about them that other people have shared

The full research, the aptly subtitled ‘2018 Digital Understanding Report’, was commissioned by Martha Lane Fox’s doteveryone think tank.

If it is true that the machines are here already, then digital education – and I don’t mean programming – is the only way to ensure we learn to keep them in their place.