Bicycles for the mind, not babysitters

In a film called ‘Memory & Imagination: New Pathways to the Library of Congress’, Steve Jobs talked about computers and the future of libraries. His quote ‘computers are like a bicycle for the mind’ comes from this excerpt in the film:

I think one of the things that really separates us from the high primates is that we are tool builders. I read a study that measured the efficiency of locomotion for various species on the planet. The condor used the least energy to move a kilometer. And humans came in with a rather unimpressive showing, about a third of the way down the list. It was not too proud a showing for the crown of creation. So, that did not look so good. But, then somebody at Scientific American had the insight to test the efficiency of locomotion for a man on a bicycle. And a man on a bicycle, a human on a bicycle, blew the condor away, completely off the top of the charts.

And that is what a computer is to me. What a computer is to me is it is the most remarkable tool that we have ever come up with, and it is the equivalent of a bicycle for our minds.

Steve Jobs

In other words, Jobs saw computers as tools that make us more capable, not as something that should displace or exploit us. Depending on the definition of computers, this may still be correct, but sadly it is not true for technology in general. The rise and normalization of artificial intelligence and machine learning have been replacing human activity to the point where we let Amazon, Facebook or Google Home decide for us.

There is a philosophical approach to technology, human-driven technology, which focuses on what Steve Jobs, Bill Gates and others preached: a tool that enables humans to reach complex endeavors like going to other planets, not one that exploits us into living like zombies, checking our phones for new notifications every five minutes.

There is also, in my opinion, a large issue with the current model of ‘show me more of what I like so I stay longer on the platform’. On one hand, allowing Netflix to auto-play the series we are watching can lead to binge watching and the loss of a lot of time and sleep. That has been reported multiple times, so it should come as no surprise. What will be extremely difficult to evaluate, however, is the loss of never finding content that does not fall into the categories we already know we like.

Let us think about this for a minute: it is probably clear to us what we like and what we don’t like, but it is impossible to know whether we would like something we have never encountered. If we allow an algorithm to pick content for us, we will never be exposed to new and different things unless they are in line with the algorithm’s purpose. Under a human-driven philosophy, that would not be an issue: technology is there to make us better, so it would make sense to at least suggest a wider range of options.

However, AI-powered businesses do not work that way. Their focus is to keep you on the platform for as long as possible, so there is no reason for the algorithm to show you content you might dislike and risk making you step away from their platform.

I can give an example from my personal experience: I used to read books about technology and business. Then I found that psychology was also an extremely interesting field to learn about, especially when applied to human relationships (business, persuasion, seduction…). More recently, following some recommendations online, I am really enjoying reading biographies. The interesting point here is that the only reason I started reading them is that I followed the advice of somebody I trust online. Without that, there was no way for me to try a topic that did not interest me much at first. Furthermore, if I had just let Amazon suggest titles to me, it would have been very unlikely for me to discover some of the great biographies out there.

This can be extrapolated to other arts like music, TV shows or movies. Even if you try to take a look at different topics, the algorithm will generally focus on giving you more of what you have watched recently, so the suggestions are there to keep you watching, not to let you explore possibilities.
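To make that behaviour concrete, here is a minimal, purely illustrative sketch in Python. It is not any platform’s actual system; the catalog, the tags and the `explore_ratio` knob are all invented for this example. It only shows why a ranker driven purely by similarity to recent history can never surface a category the viewer has not already touched, while reserving even a small slice of each page for unrelated items can.

```python
import random

def recommend(history_tags, catalog, n=10, explore_ratio=0.0):
    """Toy recommender: rank items by tag overlap with recent history.

    With explore_ratio=0.0 it behaves like the engagement-driven model
    described above: only 'more of the same' ever surfaces. With a small
    explore_ratio, part of every page is reserved for items that share
    nothing with the viewing history.
    """
    def similarity(item):
        return len(set(item["tags"]) & set(history_tags))

    familiar = sorted(catalog, key=similarity, reverse=True)
    n_explore = int(n * explore_ratio)
    picks = familiar[: n - n_explore]

    # Exploration slots: sample from items with zero overlap, if any exist.
    unseen = [item for item in catalog if similarity(item) == 0]
    picks += random.sample(unseen, min(n_explore, len(unseen)))
    return picks

# Hypothetical catalog: with explore_ratio=0.0, the biography never appears
# for a viewer whose history is all technology and business.
catalog = [
    {"title": "Startup Playbook", "tags": ["business", "technology"]},
    {"title": "Deep Learning 101", "tags": ["technology"]},
    {"title": "Persuasion at Work", "tags": ["psychology", "business"]},
    {"title": "A Scientist's Life", "tags": ["biography"]},
]
history = ["technology", "business"]
print(recommend(history, catalog, n=3, explore_ratio=0.0))
print(recommend(history, catalog, n=3, explore_ratio=0.34))
```

With the exploration slice set to zero, the biography can never show up no matter how good it is; with even one reserved slot per page, it gets a chance.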

It also does not help that, most of the time, the content that keeps people on a platform the longest is the content that leads to outrage and confrontation. I have heard multiple times that everybody can use the same tools online, and I tend to disagree. It is true that the tools are available for everybody to use, but the content different people publish will not get the same results. Anybody who wants to publish content that does not lead to large confrontations will have a much harder time making it reach a big audience. Many of the problems we see today with social media would be easy to solve if the only thing that mattered were the tool, and not the psychological reactions to the content itself.

Now, it can be argued that it is each person’s job to try to expand their knowledge into different topics. That is probably true, but it sounds very weak coming from a spokesperson of a company that uses persuasive technology to keep people on its platform for hours while it exploits their data and sells it to the highest bidder. Yes, the final responsibility always lies with the human, but it is quite unethical to build a huge ecosystem designed to prevent people from making choices. When creating filter bubbles is what makes economic sense for a company, it is shameful to suggest that people should simply know better.

Because humans require bicycles for the mind, not babysitters that let us explore only a part of the world.
