
If we grant that there is one consciousness per nervous system (or perhaps two, if we count the hemispheres separately), then it seems plausible to me that the nervous system (or brain, or body) must be unified in some way.

I want to distinguish "natural" unity from "observer-relative" unity (if there is already terminology for this distinction in philosophy, please post it). For example, there is a natural unity to my field of experience: many different things appear together, and I have no choice about this. Contrast this with the unity of an "organism". We human beings decide to categorize matter in certain ways, say "inanimate" vs. "animate", but we perform this categorization for convenience. Ultimately, all we have is matter and energy. An organism and a computer are both composed of matter/energy, so why do we take each to be one "unified" thing instead of millions of little bits? Primarily because of "our" interests: it helps us organize. It is conceivable that some other type of being would not perform this categorization and would see the world simply as points/particles of mass/energy.

This distinction is analogous to the one Searle makes between "intrinsic" intentionality and "as-if" intentionality.

So my guess is that if consciousness is naturally unified, then the corresponding matter involved might be naturally unified also. As far as I know, nobody has identified this unification. There is a "trivial" unification: all matter/energy in the universe interacts according to the laws of physics. But if consciousness is unified into separate, discrete bins, then my guess is that there is a type of natural unification of matter that also separates into discrete bins. If these bins correspond to brains, then my guess is that there is some "natural" unity to brains.

Has any philosopher attacked this issue of finding natural unities in matter? The only mechanism I can find that could conceivably produce this "natural" unity of the brain is quantum entanglement.

Ameet Sharma
  • I am not really following what "unified" means. A computer with a CPU is "unified" by the operation of the CPU. The unification is functional, and we similarly have a cerebral cortex. What more unification do we need? Centralization of information processing is not really the hard part of the problem of consciousness. Arguments from functionality to substance generally died with [Kant's critique](https://philosophy.stackexchange.com/a/39144/9148); there is no need for properties of one to reflect those of the other. "Simplicity of soul" or "unity of thought" need not be carved into their carriers. – Conifold Oct 21 '21 at 23:39
  • But "function" is dependent on the interest of an observer. The closest analogy is how Searle distinguishes "intrinsic" intentionality from "as-if" intentionality. A computer follows "functions" with respect to an organism with a certain language, interests etc... in an absolute sense a computer is just a set of particles like any other set of matter. Does a CPU have any function with respect to an amoeba? – Ameet Sharma Oct 22 '21 at 00:42
  • Maybe a better way to say it is "function" is "observer-relative". – Ameet Sharma Oct 22 '21 at 00:50
  • I think it is only relative in the most ethereal sense in which everything is relative, including "consciousness", to the conceptual framework in which it is expressed. We can track information flows and measures of its degree of integration in a system have been proposed, such as [Tononi's Phi](https://en.wikipedia.org/wiki/Integrated_information_theory#Postulates:_properties_required_of_the_physical_substrate). Maybe that is the sort of thing you are looking for? Tononi's particular measure turned out to be unsatisfactory, but it's a work in progress. – Conifold Oct 22 '21 at 05:13
  • @Conifold, I have the same basic objection to integrated information theory as Searle. "Information" is observer relative. How do we track information of a physical system in some "absolute" sense... the only absolute measure is all the literal physical data... the physical properties of all the mass/energy involved. But this isn't the same type of "information" that we deal with in a computer. Here's Searle's objection: https://iep.utm.edu/int-info/#SH5c – Ameet Sharma Oct 22 '21 at 07:29
  • The IEP author does not find Searle's objections compelling (Aaronson's are much more so); he confuses different kinds of information, and his view of "causal powers" is more Aristotelian than modern. And why are mass and energy any more "absolute" than information (which need not be Shannonian)? All three are concepts in our current physical theories, and all physical data is theory laden. There are interpretations that even make information into a basic ontological concept, [Deutsch's for example](https://turingchurch.net/thoughts-on-david-deutschs-constructor-theory-7be91dca4a92). – Conifold Oct 22 '21 at 07:58
  • Let us [continue this discussion in chat](https://chat.stackexchange.com/rooms/130757/discussion-between-ameet-sharma-and-conifold). – Ameet Sharma Oct 22 '21 at 08:01

0 Answers