Suppose there is a cloning device that can make exact copies of humans. It turns one person into two, each with exactly the same memory and other properties, and with equal status: neither is defined to be the original or the copy. One person is to be cloned 99 times for some good reason. The exact process is the following (a short code sketch of the naming scheme follows the list):
- The original person is named Copy 1'.
- Copy k' is cloned into Copy k and Copy (k+1)', for each k from 1 to 99.
- Copy 100' is also called Copy 100.
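For concreteness, here is a minimal sketch in Python of the naming scheme (the function name and variable names are just illustrative, not part of the question), showing that the process ends with copies numbered 1 through 100:

```python
def run_cloning(n=100):
    # Start from the original person, who is named Copy 1'.
    pending = "Copy 1'"
    copies = []
    for k in range(1, n):
        # Step k: Copy k' is cloned into Copy k and Copy (k+1)'.
        copies.append(f"Copy {k}")
        pending = f"Copy {k + 1}'"
    # Copy n' is finally also called Copy n.
    copies.append(f"Copy {n}")
    return copies

result = run_cloning()
print(len(result), result[0], result[-1])  # 100 Copy 1 Copy 100
```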
Following this process, one of the 100 copies has to face some unbearable mental experience, for some unavoidable reason unrelated to the cloning process. But the person may choose, before the whole cloning process begins, which copy will have the experience. How should the person choose?
There are two ways of thinking:
- Copy 100 (or, equally, Copy 99, since both have the same probability). At each step, in their own subjective experience, the person has a 1/2 chance of continuing as Copy k and a 1/2 chance of continuing as Copy (k+1)' in the next moment. Before the whole process, they therefore expect a 1/2 chance of being Copy 1, but only a 1/2^99 chance of being Copy 100, which is vanishingly small (see the simulation sketch after this list).
- There is no difference between the choices. Imagine the world is simulated by a computer: the computer could perform the whole process in a single step, and the program could be optimized to clone in any order, regardless of how the process was originally planned inside the world. Philosophically, there should be no difference one could feel between a "real" world and one simulated by such a computer, so the conclusion should still hold as long as the world is not known *not* to be such a computer. After all, the resulting world as a whole is the same in every possible way (see the second sketch after this list).
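To make the first view's arithmetic concrete, here is a sketch (Python; names are illustrative) that follows a single subjective thread through the process under that view's assumption of a fair coin flip at each split. The empirical frequencies match 1/2 for Copy 1, 1/4 for Copy 2, and so on, down to (1/2)^99 for Copy 99 and Copy 100:

```python
import random
from collections import Counter

def subjective_thread(n=100):
    # Follow one subjective experience through the process: at step k,
    # Copy k' continues as Copy k or as Copy (k+1)' with probability
    # 1/2 each (this is the first view's assumption).
    for k in range(1, n):
        if random.random() < 0.5:
            return k  # the thread ends up as Copy k
    return n  # survived all 99 splits: Copy 100

trials = 1_000_000
counts = Counter(subjective_thread() for _ in range(trials))
for k in range(1, 6):
    print(f"Copy {k}: simulated {counts[k] / trials:.4f}, exact {0.5 ** k:.4f}")
# Copy 99 and Copy 100 each have exact probability (1/2)**99, about 1.6e-30,
# far too small to ever show up in a million trials.
```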
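And for the second view, a sketch (again with illustrative names) of the order-invariance claim: the final world, taken as the set of resulting copies, is identical whether the simulator runs the 99 splits one at a time as planned or fans out all copies in a single optimized step:

```python
def planned_order(n=100):
    # The 99 splits performed one at a time, as planned in the world.
    world = set()
    for k in range(1, n):
        world.add(f"Copy {k}")
    world.add(f"Copy {n}")
    return world

def optimized_single_step(n=100):
    # An optimized simulator that produces all 100 copies at once.
    return {f"Copy {k}" for k in range(1, n + 1)}

# The resulting world, as a whole, is identical either way.
assert planned_order() == optimized_single_step()
print("identical final worlds")
```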
Which one is true? Is one of them fallacious? Are they just different perspectives for which no single answer need exist? Or can humans not know the answer, which would in turn seem to demand a clear definition of subjective feelings standing above the world itself? Or something else?
(Not part of the question.) One thing to note is that human instincts come from evolution, which approximates some higher value rather than being that value itself. But humans can reason about the original higher value, for example by wishing their instincts were different from what they are, even if they cannot actually change them. This way of cloning is like the reproduction of bacteria. One possible answer to the question is that human ideas about an "unbearable mental experience," and the will to avoid such experiences, are only defined within the scope of what humans and the world already are; if humans were like bacteria, they would subjectively think somewhat differently. And that scope is smaller than the computer outside the world, because copying people into different worlds would defeat the possibility of any such analysis. But then one still has to define the scope. This may have unexpected implications for comparing feelings across species, including aliens and AIs; that is, it raises the question of whether a higher value exists beyond this scope, such that restricting the discussion to what lies inside the scope is justified.
I would also appreciate it if someone could tell me whether this paradox already has a name, or point me to related problems. The closest thing I know of is the Sleeping Beauty problem.