
If it is possible to simulate consciousness using computer hardware and software, does that mean that computers are able to experience qualia?

abluezebra
  • print "Hi, I'm in here and I'm conscious." If you run that program, it claims to be conscious, but you probably don't think it is. On the other hand, if I tell you that a fancy neural-network learning algorithm running on a supercomputer printed the same sentence, you might feel differently. What do you think? Is there a difference? Second question: how do you know your next-door neighbor experiences qualia? – user4894 Nov 10 '16 at 01:56
  • Hi, welcome to Philosophy SE. Please visit our [Help Center](http://philosophy.stackexchange.com/help/dont-ask) to see how to ask questions that can be answered here. Your question is currently too broad; it is called the hard problem of consciousness, and [Wikipedia](https://en.wikipedia.org/wiki/Hard_problem_of_consciousness) and [IEP](http://www.iep.utm.edu/hard-con) have long articles on it. – Conifold Nov 10 '16 at 02:06
  • Does [this](http://philosophy.stackexchange.com/questions/6938/what-makes-humans-different-from-a-chemical-computer/6947#6947) answer your question? – commando Nov 10 '16 at 03:07
  • No. Change *simulate* for *replicate* and then maybe... – M. le Fou Nov 10 '16 at 04:01
  • You will forever be as unable to verify that a computer AI experiences qualia as you are to verify that other humans experience qualia. – Ask About Monica Nov 10 '16 at 22:09
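The contrast user4894 draws in the first comment can be made concrete with a toy sketch. This is only an illustration of the thought experiment, not real AI code: the `elaborate` function below is a hypothetical stand-in for the "fancy neural network," not an actual one.

```python
# Two "programs" that emit the same sentence. Their outputs are
# identical strings, which is the comment's point: the output alone
# cannot tell us whether anything is experienced inside.

def hardcoded():
    # The one-liner from the comment, in Python 3 syntax.
    return "Hi, I'm in here and I'm conscious."

def elaborate():
    # Hypothetical stand-in for a "fancy neural network": it merely
    # assembles the same sentence from parts instead of storing it whole.
    words = ["Hi,", "I'm", "in", "here", "and", "I'm", "conscious."]
    return " ".join(words)

print(hardcoded())
print(elaborate())
assert hardcoded() == elaborate()  # indistinguishable by output alone
```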

2 Answers


That consciousness is a phenomenon, and not a concept (or even a construct, a fiction of sorts), is already a philosophical position. A host of thinkers have distanced themselves from it; I'm actually having trouble remembering one who hasn't, at least partially (help me in the comments if you can).

Talking about having experiences is another matter entirely. Here, the key question is whether you take computers to be artificial merely because their hardware was assembled, not "grown". That is why many people are uncomfortable with the "artificial" in Artificial Intelligence.

The thing is, you don't have to "solve" the nature vs. artifice problem to assume that experience is possible in non-conventional settings. In fact, when thinking about these things, you may be forced to leave the problem open.

André Souza Lemos

No, computers are not able to experience qualia as long as they are only syntactical machines. Syntax is inadequate to achieve semantic content and a first-person, subjective ontological status. See Searle's Chinese Room. Also see Feynman's "Computer Heuristics" lecture. Lastly, see Searle's argument that, in addition to syntax being inadequate for semantics, physics is inadequate for syntax: "Is the Brain a Digital Computer?"

Could we "simulate", "emulate" or model consciousness? Of course, but a simulation of a stomach won't digest a pizza.

MmmHmm
  • The Chinese Room has been met with serious opposition in philosophy of mind. See e.g. Hofstadter, Blackburn, and others on how it begs the question and simply sets up an intuition as though it's fact. – commando Nov 10 '16 at 03:07
  • Newton's theory of gravity met with serious opposition too; that didn't in and of itself make it wrong. And how *does* syntax become semantics, then? – user4894 Nov 10 '16 at 03:36
  • @user4894 see e.g. [here](http://seop.illc.uva.nl/entries/chinese-room/#5.1) for suggestions of how certain kinds of representation suffice for semantic content. Moreover your analogy to Newtonian mechanics is not apt, as it was supported by a huge empirical body. Considering that the present discussion concerns subjective experience, which is not communicable, it's pretty much impossible (as far as anyone can tell) to answer the question with certainty. See, for example, Stoljar's Ignorance Hypothesis. Last, the syntax/semantics discussion is almost idiosyncratic to Searle's formulation. – commando Nov 10 '16 at 04:35
  • @commando, user4894's analogy is apt, see: neuroscience. Searle has refuted his critics and sorry, but a "suggestion" is not empirical verification, nor does Searle beg the question, see Baggini's assessment in your link. – MmmHmm Nov 10 '16 at 05:47
  • @Mr.Kennedy if you've managed to interpret neuroscience such that it's found anything more than *correlates* between *reported* subjective experiences and observed neural phenomena, you'd better write up a paper, because I assure you nobody in the field thinks it's achieved anything more than that. That's not even to mention that positing *sufficient* conditions is very different from them being necessary. As for Baggini: you cite one assessment out of many, and leave out e.g. most functionalist, reductionist, panpsychist, etc. positions. – commando Nov 10 '16 at 06:05
  • @Mr.Kennedy, Like you I believe that computers may never have qualia (or, as Dennett put it, be conscious in the fullest sense), but I think your two arguments are problematic. First, Searle's point that syntax is insufficient for semantics is intuitive but contested. In particular, we do not know what semantics in the metaphysical sense of meaning actually consists of. Second, imagine a multitude of syntactical, self-replicating silly machines flooding the universe. Some of them will be destroyed by mishaps and because they are not suited for survival, etc... – nir Nov 10 '16 at 06:39
  • but after a while some might remain: those syntactical machines that we might describe as having "evolved" to survive in nature. One might now argue that the universe, of which we know little, confers meaning on these syntactical machines. Yudkowsky once retorted to your second argument with "Can you have simulated information that is not really information? Can you have correct answers which are only simulated correct answers?" That is to say, a simulation of a pizza is not a pizza, but a simulation of a brain, embodied in the world, IS a brain — now argue whether there is anything left unaccounted for – nir Nov 10 '16 at 06:39
  • btw, I upvoted this answer: while I do not agree with it, it consists of famous arguments and proper references, so in my opinion it does not deserve the score of -1 that it had. – nir Nov 10 '16 at 06:43
  • @commando, you misread the point, and I presume you have not investigated Searle's responses to his critics. They are too numerous to list here, but feel free to cite a single instance of Searle begging the question. – MmmHmm Nov 10 '16 at 06:46
  • @nir Sure, computers as we know them now. A point Searle often makes is that *we* are biological machines and it's not out of the question that we could make conscious machines, but not according to our current neuroscience, not with current computers and not with a behaviorist model. As for information, in which sense? Does he mean in the sense of information per Claude Shannon or in the sense of "I have information in my head about how to post a comment on Stack Overflow"? See the quote [here](http://philosophy.stackexchange.com/questions/38868/could-the-internet-be-a-brain/38870#38870) – MmmHmm Nov 10 '16 at 06:56
  • @nir, thanks btw... funny that no one seems to have a problem with Feynman demonstrating many of the same points. heh. – MmmHmm Nov 10 '16 at 07:45
  • @Mr.Kennedy, If one calls the brain a machine, then it is a truism that machines may be conscious. However, machines and computers are not the same thing. Computation is a mechanical concept or phenomenon. Machines, on the other hand, need not be mechanical, or a mechanism. Modern computers as physical systems rely on non-mechanical phenomena (such as quantum mechanics, etc...), but as computation devices, whatever they do is reducible to a mechanism. Therefore I would be interested to know what you mean by "computers as we know them now". – nir Nov 11 '16 at 10:07
  • @nir [Computers](http://tinyurl.com/hlj8ot5) now = syntactical. When Guthrie commented "This machine kills fascists" was he commenting upon his guitar (not a machine) or the guitar player? Would the comment be as effective if said of a player piano? We are biological machines, i.e. there are causal mechanisms involved. "[The brain is a biological machine, and we might build an artificial machine that was conscious; just as the heart is a machine, and we have built artificial hearts.](http://socrates.berkeley.edu/~jsearle/BiologicalNaturalismOct04.doc)" – MmmHmm Nov 11 '16 at 12:10