4

Researchers, including those at Synthetic Genomics, have made incredible advances, but in doing so have also made a powerful philosophical contribution to how we (as ordinary people) view life itself. We now live in a world where DNA has been written by man, and Craig Venter himself has talked extensively in interviews about how his team "printed" DNA, removed and inserted DNA into cells, and "booted" the cells with the traits contained in the new DNA. This view is very similar to how we view, and have always viewed, computers.

  • DNA = the software
  • The cell = the hardware

Before I take this too far, I should offer a number of caveats. Obviously this is like a computer, but it is unique in that the software can both fundamentally change the hardware according to its program and cause the cell to reproduce. The similarity remains, however, that the software is portable: it can be placed in some appropriate hardware and run.

As humans, we each developed from a single cell. I have difficulty arguing about which traits constitute sentience, but surely we will all agree that one thing that fits any definition of sentience is the human mind, and we trivially agree that the combination of hardware and software in the form of a human embryo leads to a human, given the proper environment. Furthermore, we should recognize that embryonic DNA can in fact be transplanted, and that the "hardware" (the embryonic cell minus the DNA) does not contain meaningful information for the construction of a human brain.

If stored on a computer, the full human DNA data file is roughly 6 GB. Who knows how much of that is actually used for anything, but it doesn't matter. It is almost inarguable that somewhere within that data (which we have access to today) lies the blueprint for the human brain, which does a large fraction of its development in a rather uneventful chemical environment. Let's say that I isolate 100 MB of our DNA that truly codes for the construction of the brain.
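For concreteness, here is the back-of-envelope arithmetic behind that figure (a rough sketch only; the base-pair count and the one-character-per-base encoding are assumptions on my part, and only the order of magnitude matters):

```python
# Back-of-envelope check of the "roughly 6 GB" figure.
# Assumptions (mine, for illustration): ~3.1 billion base pairs per haploid
# genome, two copies per cell, and a naive one-character-per-base encoding.

HAPLOID_BASES = 3.1e9              # approx. base pairs in one copy of the genome
DIPLOID_BASES = 2 * HAPLOID_BASES  # we each carry two copies

naive_gb = DIPLOID_BASES * 1 / 1e9       # 1 byte per base ('A', 'C', 'G', 'T' as text)
packed_gb = DIPLOID_BASES * 2 / 8 / 1e9  # 2 bits per base (4 symbols need only 2 bits)

print(f"text encoding:  ~{naive_gb:.1f} GB")   # ~6.2 GB
print(f"2-bit encoding: ~{packed_gb:.1f} GB")  # ~1.6 GB
```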

Question: I make the claim that sentience just equals really good software. Is this right? Wrong? Absurd? Surely the right hardware is needed to run this software, but the cell performs nothing more than base physical processes, which we can emulate on supercomputers. In fact, by circumventing the metabolic necessities of neurons, we can simulate the core sentient functionality with supercomputers in fewer computations than the real world takes. How satisfactory is the view that, regardless of whatever it is that constitutes sentience, software creates it?

AlanSE
  • Any chance I could persuade you to provide more context here on the actual problem you are trying to solve? As many of the answers below indicate your question as framed may not be the most constructive or helpful way to look at the issue -- citing some sources on your terms may help with this. Also please try to communicate how the community here explaining this to you actually advances your study of philosophy. – Joseph Weissman Aug 31 '11 at 15:18
  • @Joseph Well that was surprising. I will hesitate to give you a "problem I am trying to solve" because I've asked a question, and changing this from a philosophical question to a problem that I want help with would not be in line with the purpose of the site. To refer to the official FAQ, I think this has significant impact on "the nature of reality and existence", and the concept of brain emulation is not something that I will do justice to by giving another reference. – AlanSE Aug 31 '11 at 15:51
  • I would encourage you to reformulate this to ask a specific question if you can -- rather than ask if your claim is right or wrong. Again, citing sources on your terms will help to provide context. – Joseph Weissman Aug 31 '11 at 15:56
  • @Joseph Reformulate to ask a specific question? Like "can a set of instructions contained in a finite number of bytes sufficiently specify the computational architecture needed for sentience?" I think the current title is most effective and it received the kinds of answers I was hoping for. What are your criteria for closing questions? – AlanSE Aug 31 '11 at 16:04
  • My objection is to the lack of theoretical context and the rather 'point-blank' expression of the problem. "In what sense might software be considered sentient?" might be a step towards a clearer formulation, though again it is problematic without defining your terms. (What does "satisfactory" mean?) This may well be a potentially helpful question but as formulated I'm not seeing how someone explaining this to you helps advance your study of philosophy. Please try to detail this, and reformulate to specify your context if you can. – Joseph Weissman Aug 31 '11 at 16:18
  • In passing, I might suggest reformulating to ask after references to *specific philosophers who have considered this problem* rather than simply stating that you claim "sentience is software" and polling for responses or refutations... – Joseph Weissman Aug 31 '11 at 16:39
  • 1
    I feel like there's nothing special about software, that you're really asking about 'deterministic process that follows a rule set'. The metaphor of DNA as some kind of machine language for programming life is, well, they're all metaphors for mechanistic processes. The concept of sentience then falls into the mass of writings about simulation/brains in vats/do robots feel pain. – Mitch Aug 31 '11 at 17:18
  • Wholly unsatisfactory. Syntactical manipulation is inadequate for semantic content. – MmmHmm Mar 16 '17 at 04:10

4 Answers

4

How satisfactory is the view that, regardless of whatever it is that constitutes sentience, software creates it?

I'd say it is pretty unsatisfactory, because it doesn't really tell us anything. In order for it to be useful, you'd have to come up with a definition of "software" that a) is adequate to the phenomena, and b) offers new explanatory insights.

Note that you're also going to need to come up with a clear definition of "sentience"-- this is a term that has widely differing meanings depending on the context. (For example, in Buddhist philosophy, "sentience" means "having sense organs", which thus implies "capable of suffering." Thus, all animals and insects are sentient, and some plants might conceivably be.)

Finally, you probably want to look into the growing literature on epigenetics; it appears that you are relying upon a reductionist view of DNA/genetics that is generally being left behind.

Michael Dorfman
  • I thought that the evidence for epigenetics was really, really weak. The one article I read tried to connect feast/famine times in a parent's generation to sickness in descendants, and it just sounded like really weak evidence, like almost junk science. Software is something that can be stored in bytes; I agree, however, that I haven't defined it. In fact, I don't know what software is; all I know is this quality about it. The implications of reductionism regarding the blueprints of life, at least, always seemed very powerful to me. – AlanSE Aug 31 '11 at 12:04
  • 2
    The Wikipedia article at http://en.wikipedia.org/wiki/Epigenetics has a good overview of epigenetics; the evidence for it is extremely strong, and most geneticists recognize that there is a lot more to genetics than simple expression of DNA. As for reductionism, you are attempting to reduce the functioning of a complex biological organism to a simple set of logic gates; it's difficult to imagine what the biological analog of "software" would be in this regard. In short, "software creates sentience" is a pretty meaningless sentence without some rigorous explanations of the key terms. – Michael Dorfman Aug 31 '11 at 13:16
3

the human brain, which does a large fraction of its development in a rather uneventful chemical environment

Development is critically dependent upon the chemical environment; many genes are devoted to making sure the environment is just right (in the parent as well as the offspring), and many genes actually determine the chemical environment (metabolism, hormones, etc.).

So the "it's just software" view is not really helpful. It's a complex interplay between environment (including physical laws, chemical composition, and all the rest) and encoding (in the form of genes, which themselves depend in complex ways on the environment).

Because of this complexity, I don't think we're afforded many philosophical insights (assuming that one accepts materialism at least for the sake of the argument so that we can pursue biology without worrying about outside interference). Sentience is implemented with a complex system that is unlike anything else with which we are deeply familiar. Until we know a lot more, I don't know what to do with that.

(Quantum mechanics and related physics gives us a sufficiently abstractable version of physical reality to get the "it could all be software" point; we don't need to know anything about DNA or brains except that they're all matter.)

Rex Kerr
  • Well, one option I have to argue against the non-triviality of the chemical environment is that the human womb is also encoded for in the 6 GB of DNA. I do think, however, that a complex environment (which is not subject to this reductionism) is likely a requisite for sentience. – AlanSE Aug 30 '11 at 20:06
  • 1
    @Zass: DNA encodes proteins. There's quite a bit more in the watery environment that manages DNA than just protein or is controlled by protein. – Mitch Aug 30 '11 at 20:47
  • 1
    @Rex how is the control of the chemical environment of the brain different than the control of my GPU fan going up when I run intensive games? Isn't that essentially the same thing? – corsiKa Aug 31 '11 at 06:59
  • 1
    @glowcoder - The GPU fan doesn't alter the circuitry of your GPU. – Rex Kerr Aug 31 '11 at 14:44
  • @Rex well, not yet it doesn't. If it were to evolve for a few million years it may yet. It does prevent the circuitry from being altered, though (due to meltdown). – corsiKa Aug 31 '11 at 14:56
  • 1
    @glowcoder - That is the key difference--self-modifying systems that interact with an external environment are unlike the software that we typically write. Thus, intuitions from software development transfer poorly. – Rex Kerr Aug 31 '11 at 15:17
  • 1
    @Rex But isn't that only a limitation of current hardware? Couldn't one comprehend of an instruction set that was vastly more powerful and truly recursive in the sense that logic gates could multiply and create different computational regions like the brain does? That could still be a formal system like current software, just vastly more powerful. – AlanSE Aug 31 '11 at 16:09
  • @Zassounotsukushi - Yes, one could create such programs, but since we don't routinely do that now, we have little intuition for how such a system behaves. And without deep intuition, all that this insight would tell us is that consciousness is Turing-computable to arbitrary accuracy, which we already know since we know from physics that the entire universe is Turing-computable to arbitrary accuracy (assuming that one is a materialist w.r.t. consciousness). – Rex Kerr Sep 01 '11 at 03:24
1

As suggested, you cannot claim that software alone is enough to simulate consciousness, because you also have to take into account the fact that human "software" (genes) is the leading contributor to developing the hardware. Our genes code for both the software and the growth of the "hardware" in our brains. Merely running code on a basic computer might not be enough to simulate or demonstrate (the problem of other minds arises here) consciousness. We can't say for sure either way, but you definitely can't conclude from your thought experiment that software alone is satisfactory. The experiment is not conclusive in that regard, because we don't know the extent of the interdependence of software and hardware when it comes to the human brain.

stoicfury
1

There is no contradiction in the possibility that specific physical characteristics are required for sentience. While we certainly don't know that this is true, it is entirely possible that you simply cannot have sentience without DNA.

David Schwartz