Do algorithms dream of electric humans?

Begoña Repiso
Aug 26, 2019
Los Angeles 2019

Rick Deckard is a “bounty hunter”: he collects bounties (extra pay supplements) to buy himself a new mechanical sheep. In his dystopian society, the people who still remain on Earth after a chemical war are the new social outcasts. Humanity is a mark of status among Earth's remaining residents, now living in a world where artificially constructed “humanoids” replace them in the hardest jobs.

How do you demonstrate Humanity? What differentiates these so-called androids from humans?

The answer is Empathy

Humanoids lack this feeling; ergo, they're not human. But empathy is hard to find: humans are becoming less human day by day in a world that offers no hope, while humanoids try to escape a life of slavery and, in that struggle, show more vital desire and initiative than humans do.

“Most androids I’ve known have more vitality and desire to live than my wife. She has nothing to give me.” (Deckard on p. 88)

A sign of empathy is caring for animals, most of which are rare and expensive after the world's destruction. Buying one is a considerable expense, though you can substitute an electric pet. Deckard needs money for that, so he “retires” humanoids (who are supposedly non-human) to get paid and gain social status. He isn't happy with the job; he becomes aware of his own existence, and uneasy.

Roy Batty isn't happy with his job either. He works as a slave in the Martian colonies, and he too is becoming aware of his existence: he feels strong and agile and yearns for a free life, but he senses that he has no time.

Does this Nexus-6 generation of androids really feel empathy, fear, love? Maybe they feel nothing and are just imitating. But who knows? I think that when Roy Batty smiles, or cries, or even when he is dying at the end, maybe he doesn't know what empathy is, but he's trying to provoke that feeling in Deckard. He transcends himself to recognise another similar soul, and feels the urge to communicate his impressions, his anxiety.

Maybe he doesn't feel empathy, but he already has curiosity, initiative, courage, vital spirit. This behaviour awakens a gaze of admiration in Deckard's soul, and Roy's death encourages Deckard to seek his own freedom too.

Our current (2019) world is about to fall into dystopia, making humans less human: depersonalizing them, absorbing their initiative so they more easily serve corporations. Yet there are no highly qualified “humanoids” doing this job; it's cheaper than that: there are algorithms. Algorithms think fast, so quickly that they can predict actions and get to know humans better than humans know themselves. Companies still need human brains to design and control these applications, but in the near future they may be capable of learning and redesigning themselves.

An algorithm is a step-by-step method for solving a problem. The term is commonly used in computer programming, building on Alan Turing's idea of a computing machine that consists of a memory, a set of instructions (a program), and some elementary procedures with input and output.
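As a small illustration of that idea (my own example, not from the novel), here is one of the oldest known algorithms, Euclid's method for the greatest common divisor, sketched in Python. The memory is two variables, the program is a short loop, and the elementary procedure is a remainder operation repeated until it produces the output:

```python
def gcd(a, b):
    """Euclid's algorithm: a step-by-step method for finding
    the greatest common divisor of two non-negative integers."""
    while b != 0:          # repeat the elementary procedure...
        a, b = b, a % b    # ...updating the "memory" at each step
    return a               # output

print(gcd(48, 18))  # -> 6
```

Nothing here predicts human behaviour, of course; the point is only that an algorithm is a mechanical recipe, and that everything from this toy up to a recommendation engine is built from the same ingredients.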

Could an algorithm, in the near future, be capable of being aware of itself?

As Yuval Noah Harari explains, emotions are really calculations performed by the human brain. Given that, I can't see why a machine couldn't think and sense. Self-consciousness takes the whole subject a step further.

Regarding the emotional matter, many engineers raise a more fine-tuned question: could a human notice that a machine has suddenly become aware of itself? That an application can feel emotions? For now, machines can only simulate feelings (like Roy Batty? ;)), and this could be probed by the Turing test, the original concept behind Blade Runner's Voight-Kampff empathy test, although the latter is based on empathy rather than intelligence.

But is empathy a sign of humanity, or precisely the lack of it? In the name of “progress”, industrial and colonial advantages have come at the cost of harm to people and the environment. Natives and animals have been treated as objects or property: if you want to destroy people, you only need to depersonalise them. That's what Deckard's boss was trying to do with the “skin-job” term. Fortunately, it doesn't work on everyone; some people can't disable compassion, even if they are “non-human”. Maybe Roy, or Deckard, or Rachael could tell us about that.

Originally published at https://repisoblog.wordpress.com on August 26, 2019.