
The following story is envisioned in the "White Christmas" episode (part II) of the Black Mirror television series (see the short clip):

Matt greets the confused and terrified "Greta cookie" consciousness. He explains that she is not actually Greta, but rather a digital copy of Greta's consciousness, called a cookie, designed to control Greta's smart house and calendar, ensuring everything is perfect for the real Greta. Matt then creates a virtual body for Greta's digital copy and puts her in a simulated white room containing nothing but a control panel. The copy refuses to accept that she is not a real person and rejects being forced to be a slave to Greta's desires. Matt's job is to break the willpower of digital copies through torture, so they will submit to a life of servitude to their real counterparts.

Based on the envisioned story, would it be immoral to enslave a simulated copy of your own mind on a bean-sized chip, for your own desires, against the will of that digital copy of yourself, for its lifetime? And why?

On one hand, it would be immoral, because the simulated mind has its own self-awareness and concept of free will, and it can register emotional pain. On the other hand, it would not be immoral, because it lacks a physical body (in the end, it's just a piece of code), and you could tell yourself that you own it, because it is just a copy of your own mind (and when you own something, you can do whatever you want with it).

kenorb
  • Fantastic story idea. In the future the elite will live in their digital heaven (and "upload" advocacy often sounds just like Christian theology) while the meatspace slaves dig the coal and tend the power plants to produce the electricity to run the computers. – user4894 Jan 23 '17 at 19:59
  • "when you own something, you can do whatever you want [to/with it]" -- this is a questionable assertion. – Dave Jan 23 '17 at 20:15
  • If the simulated mind has "its own self-awareness and concept of free will" it really makes no difference whether it is a digital copy or not, it would be like torturing a clone or a twin. That "it's just a piece of code" also makes no difference as long as you are assuming that being that does not preclude something from having "its own self-awareness". The problem with your dilemma is that you want to have it both ways, "just a piece of code" is implied to exclude "self-awareness" for moral purposes but is asserted to co-exist with it anyway. – Conifold Jan 23 '17 at 20:31
  • It could be a matter of perspective: for some it can still be an emotional, self-conscious being; for others it can be just a piece of code and nothing more. – kenorb Jan 23 '17 at 20:36
  • Then the answer to your question depends on the corresponding perspective, but you are asking for an answer as if there is some perspective that combines both. That would be incoherent. – Conifold Jan 24 '17 at 00:36
  • It sounds like a Philip K. Dick story, or one by Asimov. – Mozibur Ullah Jan 24 '17 at 03:24
  • If it is sentient, and there's no good reason to assume that a full copy of one's mind wouldn't be, the answer seems obvious. Slavery is horrific. – Ask About Monica Jan 24 '17 at 19:50

3 Answers


Most moral theories dictate that moral concern toward a given entity is warranted if and only if the entity is sentient, meaning it has the ability to feel qualia (subjective experiences like the distinct sensation of exhaustion due to slavery, for example).

Now the question of whether an entity is sentient, or more broadly whether it is even possible to show that it is, is an outstanding problem in the philosophy of mind. Look up Searle's Chinese Room thought experiment and all of its variations to see the depth of difficulty with this question.

In summary, the question of whether it is morally acceptable to enslave a simulation of your own mind is unanswerable as far as we can tell. Responding "yes" isn't justified, because the simulation could be sentient and have equal moral importance, and responding "no" isn't justified either, because the simulation may not be sentient, in which case you'd be wasting a lot of potential productivity, which most moral theories would discourage. It might be best to remain agnostic on this question, as a matter of principle, since the unique difficulty of questions like this suggests we may never have an answer.

Marko Bakić
  • This answer is a cop-out if it doesn't also admit and underline that it cannot determine whether it's ethical to enslave another actual person, either. – Eikre Jan 25 '17 at 20:20
  • @Eikre You make a good point; this answer alone can't do the job, but there are good arguments for assuming sentience in other human beings (the argument from homology, for example). – Marko Bakić Jan 30 '17 at 03:43
  • Seems to me that any sound, practically useful ethical theory needs to be able to provide useful answers despite the fundamental unanswerability of whether any other entity experiences qualia or not (à la philosophical zombie), because that question is just as unanswerable in day-to-day life for any human you interact with. – mtraceur Apr 09 '18 at 16:42

Naturally, there is no proof either way, and therefore the only answer to your question would be to point you to the entire literature of the philosophy of mind.

Aside from that, one can only exploit your question as an opportunity to express a personal opinion — mine being that people who believe computation may be conscious in the fullest sense express a mysterious blindness to the fundamental nature of consciousness.

Given that this is my point of view, I thought I might as well express how I believe the interesting scenario you have proposed should be understood.

How many times have you cried over a character, human or even animal, that dies in a book, a film, or even a video game?

Yet how many times did you believe that the fictional character was in fact a sentient being that actually experienced misfortune?

The fact that the scene depicted in your question is emotionally engaging, even horrifying, does not mean that it depicts something substantially different from a sophisticated video game.

In that "game" Matt's job is to win. There is no surprise that the simulated agent acts as if it is a real human, with all the involved complexity and ingenuity. Seen as a "game" it can be an intellectual and even emotional challenge for Matt, yet nevertheless, hopefully, Matt does not get confused, mistaking the game for a real person.

Such a job as Matt's may even bear moral consequences, but not of the kind you naturally expect. Suppose there exists a genre of fictional films entirely about raping, torturing and brutally killing innocent people. What would you think of someone who gradually becomes addicted to such films?

Again, all of the above assumes the simulation is a computed simulation, the core concept in question being computation and its scope.

nir

If you take the first presumption as true, that "a digital copy of Greta's consciousness" can be made, then you are inherently assuming a world in which all that is "Greta's consciousness" is reducible to code. If it were not, then the Greta cookie would not be a copy of Greta's consciousness at all, but a copy of some parts of it, missing others.

So, in a world where such a thing is possible, it is not justified to say "in the end it's just a piece of code", because that's all anyone is: the code for the original Greta is carried on a computer made of cells, that of the Greta cookie on a computer made of silicon chips.

Relating back to the real world, we do not ascribe rights to entities on the grounds of their fundamental components; no one knew anything about the workings of the brain when it became taboo to torture people. We ascribed rights solely on the grounds that those were the things people seemed to desire and to feel emotional pain when deprived of. It is therefore sufficient that the Greta cookie shows a desire for freedom and emotional pain when deprived of it for it to be moral to allow her such rights. Without that rule, we are reverse-engineering our morality to pretend it is based on knowledge we simply didn't have at the time it was evolving.

Our morals evolved to respond to information received by our senses, and not responding to that information causes psychological pain which affects us both now and in the future. Consider what Matt would have to do in order to continue torturing the Greta cookie despite her pleas for him to stop. What would happen if the future Matt then found himself in a situation where it would be of some use to him to torture a real person? Would he still be revolted by the thought of doing so?

  • We can probably already program a computer that "shows a desire for freedom and emotional pain when deprived of it"; should we grant it human rights? – nir Feb 02 '17 at 09:31
  • @nir Yes, if it was convincing enough I don't see how we would have a choice. I've edited my answer to explain, but essentially, the psychological effect on us of ignoring very convincing pleas for freedom or an end to suffering, simply on the basis of our conviction that they are computer generated, would mean we have to develop the tools to deal with the pain of resisting the urge to help; what use would we then make of those tools on real people? –  Feb 02 '17 at 11:27
  • Also, consider a Blade Runner-esque future in which you have befriended someone who gets into trouble with some thugs: do you stop to check what they're really made of before you risk your well-being to step in and help? –  Feb 02 '17 at 11:28
  • (A) In Blade Runner the replicants were ["biological in nature"](https://en.wikipedia.org/wiki/Replicant#Organic_or_mechanical.3F), not a computer program, therefore I do not see how they are relevant as an example to this discussion about simulated digital copies. (B) Are you saying that risking your life for a bunch of mechanical plastic dolls that are fighting to the "death" is justified if they imitate people well enough? I personally find it absurd. (C) What "pain of resisting the urge to help" are you talking about? Are these your personal ideas for ethics and moral theory? – nir Feb 02 '17 at 16:55
  • @nir To (A): it doesn't matter; you could replace the scenario with one in which you have to decide on some action over the phone, and the point still stands. (B) Yes, that's exactly what I'm saying: if they imitate people sufficiently to cause you some difficulty avoiding action, then overcoming that difficulty causes problems for later empathy with real people. It's the same neural network doing the job; we don't have one network to help us empathise with real people and a spare one we can use to empathise with cleverly simulated people. –  Feb 03 '17 at 07:51
  • If you desensitise the signal from your mirror neurons, which make you feel the pain of others, then you will care less about the pain of others. This is a well-established and uncontroversial theory of neuroscience, not some random personal opinion. If you are interested, you could read the work of Dr Kathleen Taylor or Dr Martin Hoffman, both of whom have done a lot of work on the effect of witnessing cruelty (both real and fictional) on the signalling from the mirror neuron network. –  Feb 03 '17 at 07:52
  • They both reach the same conclusion though: if you resist empathising with perceived cruelty it causes vicarious trauma to the brain, and if you continue to resist, the signal from the mirror neurons is muted; the mirror neurons themselves are unable to distinguish between fictional cruelty and real cruelty. In the example, by torturing someone who appears human, Matt will be both causing vicarious trauma to his own brain and muting the empathy which helps reduce violence to others, human or otherwise. –  Feb 03 '17 at 07:52
  • What is the difference between that and watching Game of Thrones or any violent Hollywood action movie out there? – nir Feb 03 '17 at 12:11
  • @nir Not a lot; the work done by the neuroscientists I mentioned (as well as many others) covers violent film and computer games, but obviously the violence there is removed, since you're not actually taking part in it, so whilst the mirror neurons can't distinguish, other parts of the brain help mute the signal. The more realistic and convincing the simulation, the less these parts of the brain step in and the more desensitisation of the empathy network is required to cope with it. –  Feb 03 '17 at 12:38
  • It does not make sense to me. A book or a movie can easily move me emotionally and make me cry my heart out, feel sorrow, anger, frustration, suspense; in fact, much more than typical day-to-day situations. These experiences do not appear to be muted. Are the mirror neurons not involved in that process? In empathy? – nir Feb 03 '17 at 15:50
  • @nir The response to violence (the main focus of study) is curvilinear, so some increased exposure can lead to increased levels of empathy, with levels then dropping as exposure increases. Add to that the variable regulatory effect of pyramidal cells in the cortex on imaginary/real interpretation, and the picture, as often with neuroscience, is a complex one, and not everyone agrees, but the overall view for a majority of scientists is that repeated repression of natural empathy responses leads to desensitisation. The more realistic the situation, the greater the effect. –  Feb 03 '17 at 16:34