Tuesday, April 22, 2008

Consciousness is not Physical: Primitive Version

What is consciousness?

Are you conscious? Did you evolve?

With the available evidence, we can now prove that consciousness isn't physical.

(Abstract: Consciousness isn't necessary to describe the physical world. Consciousness is necessary to fully describe a human being. Thus, consciousness must not be physical. Even if consciousness isn't necessary to fully describe a human being, it exists. Thus, consciousness must not be physical.)

I shall now go through the entire chain of logical possibilities. Any time I reach a possibility that proves physics isn't causally closed, I will say so. At the end of the essay, I will have shown that every twig of the possibility tree terminates in causally open physics.

That is, while we can't yet say exactly why consciousness isn't physical, we can rule out the possibility that it is. This is why I state this idea as a negation; I cannot describe consciousness. I can only determine a few things it is not.

I need a catchphrase so I can put it here. Something less lame than 'phear the logicz!' Let's begin.

First, let's assume you answered 'No, I'm not conscious. I do not experience qualia. Consciousness does not exist and the question is meaningless.' This is logically incoherent. I'm conscious. I can't be confused about my experience; it can't be some illusion; my experience of experience is consciousness itself. I think, therefore, I am.

So, are you conscious? You're welcome to answer no, but I'm taking all your stuff. If you're not conscious then you cannot suffer.

Next, did you evolve? If you didn't evolve, then this essay is moot anyway - we know that there's something beyond physics. If humans did not evolve, then biology is not causally closed under physics.

If you evolved and you are conscious, then consciousness is either free or confers some specific fitness advantage.

If consciousness is free, then that means it has no physical overhead. It appears spontaneously. It either has no physical effects in which case it is purely nonphysical, or it has physical effects and your brain is not causally closed under physics. (There would also be no constraints on the occurrence of consciousness.)

However, consciousness is not free. First, it requires specialized brain structures, and second consciousness has the property of 'choice' or 'decision' which allows you to, among other things, kill yourself.

Thus consciousness must confer a selective advantage that outweighs the costs. However, it is physically impossible for it to do so, as I'll now show.

First let me note that I'm distinguishing thoughts from actions; information processing from rearrangements of energy.

The relevant action, reproduction, is entirely physical, as are all the relevant prerequisites. However, all of these actions could be carried out without experiencing them. We could program, hard-script, a robot to do so without much trouble. (If computers are conscious then consciousness is free.)
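(A toy sketch to make 'hard-script' concrete; the action names and conditions below are invented for illustration, not a claim about any real robot. A scripted agent is just a fixed lookup from sensed condition to action, with no experience anywhere in the loop.)

```python
# A minimal hard-scripted "robot": a fixed condition-to-action table.
# Nothing in this loop experiences anything; it only matches and acts.

SCRIPT = [
    # (condition on sensed state, action to emit)
    (lambda s: s["energy"] < 0.2, "seek_food"),
    (lambda s: s["threat"], "flee"),
    (lambda s: s["mate_nearby"], "court"),
    (lambda s: True, "wander"),  # default action
]

def act(sensed_state):
    """Return the first scripted action whose condition matches."""
    for condition, action in SCRIPT:
        if condition(sensed_state):
            return action

if __name__ == "__main__":
    print(act({"energy": 0.1, "threat": False, "mate_nearby": False}))  # seek_food
    print(act({"energy": 0.9, "threat": False, "mate_nearby": True}))   # court
```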

Let me digress and consider some physical action that supposedly cannot be performed by a robot. What is preventing it? Are its hands not simply patterns of atoms, just as ours are? If we were to find such a test of consciousness, it would mean we had literally discovered magic. If there is an action robots cannot perform, then physics is not causally closed.

Similarly, things like 'creativity', which are strings of fully physical actions, cannot be unique to consciousness. Otherwise there would literally be an incantation you could perform to prove you're conscious. Which step, exactly, would a robot be unable to mimic, and why? How would it know to fail if you hard-code it to mimic the action?

Further, the relevant thoughts are similarly constrained. There is no calculation that can be performed by a brain that cannot be performed by a computer. To posit such a calculation would require it to not be a calculation at all - it would have to be non-mathematical, and thus consciousness would be performing nonphysical calculations.

As such, a robot can perform all the relevant physical actions of reproduction. Because computers have a much easier time staying consistent, it would likely perform the actions and calculations much more efficiently than a human would.

Thus, a computer is fully capable of emulating human action, in the physical world, without consciousness. There is no physical action or thought that consciousness is a prerequisite for.

What if all this is true, but consciousness is physical but just somehow does it better than a computer? That, sure, you don't have to experience it, but experience makes it more effective? That would literally mean consciousness is a magic ingredient. Two otherwise identical computers compete, one experiencing its calculations, one not, and the conscious one wins. Consciousness would affect the physical world without actually being a part of it, contradicting the premise.

Consciousness physically does nothing. Consciousness confers a selective advantage. Your brain is causally open under physics.

This isn't easy, so let's recap.
  1. You evolved.
  2. You are conscious.
  3. Consciousness costs resources.
  4. There is no action you must experience to perform.
  5. There is no action you could perform better by experiencing it.
  6. There is no computation you must experience to perform.
  7. Ergo, consciousness cannot physically do anything.
  8. Consciousness confers a selective advantage.
  9. Ergo, consciousness isn't physical.
  10. Your brain is causally open under physics.
  11. Ergo, the universe is causally open under physics.
And if any of the premises are untrue, the conclusion remains true: the failed premise simply puts us on a different branch of the possibility tree, and every branch terminates in causally open physics. Further and less obviously, there are no other possibilities to consider. It is logically impossible for me to be wrong.

Physics is best described as a consistent set of rules for the interaction of energy. Without such a set, interaction is in fact impossible, because contradictions do not exist.

Therefore, I propose that there is an orthogonal physics that uniquely defines the physical output of consciousness. I could call it the physics2 of vacuum2. (I won't.) This system must have a similar set of rules, though not necessarily mathematical ones. Also, if it is not to violate causality (which, you'll note, I used to discover it), the system must be incapable of violating any physical law.

The system presumably has a similar causal hole of its own. Together, it and the physical vacuum mutually close causality.

This is the set of rules that allows us to be conscious - to experience the world. It is the plane of the mind. Unfortunately, that's nearly everything I know about it.

Most likely it allows free will, but it is possible that it simply confirms determinism. [rant]So please, quit debating it. You have no idea. The concepts of free will and determinism aren't even consistent concepts. Believe whatever the hell you want and tell anyone who argues with you to stuff it.[/rant]

So, in physics, what's the hole? There's only one hole in physics that I know of. It's been rigorously proven, too. Quantum randomness is true stochastic behavior. There's no physical way to predict what any particular particle will do, ever.

Technically speaking, a second set of rules could easily interface at this level, without violating any law or indeed being detectable at all, except through logic.

But for now I'm done with the firm certainty of deductive logic. This tree has grown up and borne delicious philosophical fruit. In my next essay I'll discuss some of my minor hobbies, the consequences of this fruit, and develop some of the ways consciousness may be implemented in the brain.

15 comments:

Anonymous said...

Your example of either programming a robot to mimic the desired action *or* having consciousness is apt; however, since natural selection did not have, as an option, being programmed to mimic the desired action (say, planning in time, abstraction, etc.), the available option (evolving consciousness) fits a fitness criterion.

Alrenous said...

Abstraction could easily be modeled with a Bayesian fuzzy logic circuit. It cannot be reliably determined whether an information processor is using this analogue or 'true' consciousness, or even if such a difference exists.

Planning in time is similarly straightforward to script.
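(For concreteness only, a toy sketch of my own; nothing here is meant as a claim about how brains actually implement either ability. A naive Bayes update stands in for the 'Bayesian fuzzy logic circuit' doing abstraction, and a breadth-first search over states stands in for scripted planning in time; the categories, features, and transitions are invented.)

```python
from collections import deque

# "Abstraction" as graded category membership: a Bayesian posterior over
# abstract classes given observed features. Priors and likelihoods are made up.
PRIORS = {"bird": 0.5, "mammal": 0.5}
LIKELIHOOD = {  # P(feature | class)
    "bird":   {"flies": 0.8, "lays_eggs": 0.9},
    "mammal": {"flies": 0.05, "lays_eggs": 0.02},
}

def classify(features):
    """Return a normalized posterior over abstract classes."""
    scores = {}
    for cls, prior in PRIORS.items():
        p = prior
        for f in features:
            p *= LIKELIHOOD[cls].get(f, 0.5)
        scores[cls] = p
    total = sum(scores.values())
    return {cls: p / total for cls, p in scores.items()}

# "Planning in time" as ordinary graph search: find a sequence of actions
# leading from the current state to a goal state.
TRANSITIONS = {  # state -> {action: next_state}
    "hungry": {"hunt": "has_food"},
    "has_food": {"eat": "fed"},
    "fed": {},
}

def plan(start, goal):
    """Breadth-first search returning the list of actions from start to goal."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, actions = queue.popleft()
        if state == goal:
            return actions
        for action, nxt in TRANSITIONS[state].items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, actions + [action]))
    return None

if __name__ == "__main__":
    print(classify({"flies", "lays_eggs"}))  # heavily favours "bird"
    print(plan("hungry", "fed"))             # ['hunt', 'eat']
```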

I don't see why natural selection couldn't mimic these actions, nor do I see any evidence that it did not.

I avoided mentioning them on purpose, because of this.

Clark Goble said...

Consciousness isn't necessary to describe the physical world.

Some would debate that. Just because it's not necessary for current physics doesn't mean it isn't necessary.

Alrenous said...

There are only two ways consciousness can be enfolded into an extension of current physics.

First, a new force. Unfortunately this requires a new type of charge as well, and there's no way to confine this charge only to atoms in the brain - the particles cannot tell that they make up a brain.

Next, a new type of matter, which for some reason is confined to brains. While possible, it would be almost identical to a soul. It would be exactly like dark matter that can tell it's in a brain - which I suppose would make sense, since it's conscious.

It cannot be a new type of interaction of existing forces, because either that is exactly equivalent to a new force, or it isn't a necessary concept.

In both cases, what we basically have is a conscious equation, like the wave equation, but for thinking.

The problem is that not only can you put this equation into a pocket calculator and have a conscious calculator, but you can implement it with rocks rolling down a hill and have a conscious hill. In the end everything ends up being conscious because there's no discernible difference we can describe.
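(To make that concrete, a toy sketch of my own using the wave equation I just named: reduced to an explicit finite-difference scheme, it is nothing but local arithmetic on neighbouring values. Anything that can carry out this arithmetic, whether silicon, a pocket calculator worked by hand, or sufficiently patient rock-counting, implements the same equation.)

```python
# 1D wave equation u_tt = c^2 * u_xx as an explicit finite-difference update.
# Each new value is plain local arithmetic on neighbouring values.

c, dx, dt = 1.0, 0.1, 0.05            # wave speed and grid steps (stable: c*dt/dx <= 1)
n = 50
u_prev = [0.0] * n                    # u at time t - dt
u = [0.0] * n                         # u at time t
u[n // 2] = 1.0                       # a small initial bump in the middle

r2 = (c * dt / dx) ** 2
for _ in range(100):                  # step forward in time
    u_next = u[:]                     # boundaries held fixed at 0
    for i in range(1, n - 1):
        u_next[i] = 2 * u[i] - u_prev[i] + r2 * (u[i + 1] - 2 * u[i] + u[i - 1])
    u_prev, u = u, u_next

print(max(u), min(u))                 # the bump has spread outward along the line
```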

Looking at the mind node as an example, we can see this clearly. A mind node or something equivalent - something outside normal math - is also the only way to solve this dilemma.

Matt Norwood said...

This paragraph:

However, consciousness is not free. First, it requires specialized brain structures, and second consciousness has the property of 'choice' or 'decision' which allows you to, among other things, kill yourself.

contains two unsupported assertions that I find improbable.

First, I have no reason to think that consciousness requires specialized brain structures. The only evidence I have of the existence of consciousness is my own experience of my own consciousness. Other humans relate to me stories about their own conscious experience, and from this I might imagine that they are also conscious, even though I have no proof. But there's no reason to think that other objects in the world not gifted with the power of speech do not also experience the world consciously.

As for "choice": I see no reason to link consciousness with decision-making or choice. I experience this body and brain making certain decisions and executing them; this does not mean that those decisions originate in consciousness. Indeed, recent fMRI studies suggest that this is not the case; people's self-reported "conscious decisions" are significantly predated by neurological indicia of the outcome of those decisions.

Alrenous said...

First, I have no reason to think that consciousness requires specialized brain structures

At the very least, it requires synchronization with a specific brain wave, either beta or gamma (I don't remember which). We found this out by studying binocular rivalry. While not a spatial structure, it is at least a temporal structure.

(Oddly my binocular rivalry expresses itself differently than for people in these studies.)

Second, the computational streams have unconscious and conscious parts. Only after a certain point are all downstream computations conscious - this suggests that there's some additional element, perhaps a structure, that collates the sensations or somehow makes them conscious.

I will have to concede that I was reaching by mentioning choice. Still, I don't see how the simulation of choice is supposed to be less expensive than the actuality of choice.

Similarly, if we cannot meaningfully make choices, I fail to see how consciousness can have any effects at all, which would make it pretty ineffectual.

I have seen those decision studies, and I've half-written a post about them. I'm going to compare them to alien-hand syndrome.

Unknown said...

Consciouness is the third axiom of metaphysics. Existence and Identity are the first 2. In order to be conscious something has to exist, and have an identity. Please tell me you dont think Consciouness becuase its not physical can exist on its own!!!

Alrenous said...

I really should mention in the article that this is the third of four writeups of this idea.

I really did try to figure out how to say this without insult, but it didn't work.

Clean up your spelling and don't use multiple exclamation marks. Basically, it gives a negative impression which I doubt is replicated in person. You're not the dude from Portugal; your respect for your own tongue reflects your respect for your own ideas.

While I will attempt to correct for this, most people will not.

James said...

I'm still in the "I'm not sure consciousness isn't free" camp -- or at least the "I'm not sure experience isn't free, and consciousness is an emergent form of a complex experience" camp.

It may simply be that matter experiences things (a new force, in a sense, or perhaps it is simply the "inside" of an existing force) and we, in our anthropocentric tendencies, claim that we're "special" and the only creatures that get to have consciousness. It's essentially vitalism. Most people cling to a humanistic or vitalistic worldview, at least implicitly.

I'm not so sure we're that special in the universe. It may just be that experience is a fundamental property of space-time, and we are finally self-aware enough (through our reflective self-monitoring capabilities) to realize we are having experiences.

Either way, I'm not too concerned -- I'm fairly convinced that at the very least property dualism is true, and we are a long way off from figuring out whether brains are causally closed, but if they aren't, substance dualism is going to make a big comeback. I may or may not be alive to find out, but in the interim, I think I'm going to dance.

Unknown said...

Consciousness is awareness of qualia. It's not always present. One has to focus attention to become aware of qualia.

One possible evolutionary just-so explanation for the emergence of consciousness is the following: an ability to simulate others' (unconscious) experiences confers reproductive advantage in a social setting. Since other minds (of the same species) work similarly, monitoring, or perceiving the processes of one's own mind is the natural experience simulator.

Steps 4, 5, and 6 don't imply that consciousness does nothing physically. That a trait evolved doesn't imply that it has to either (1) be a prerequisite of a function or (2) be the most effective way of implementing that function. It simply means that the trait conferred a selective advantage and was there to be selected (as opposed to other possible things that weren't there to be selected).

Do you need to simulate others' experiences to predict them? Yes, because if you can predict them, you can simulate them. Do you need consciousness to simulate others' experiences? No. Is it the most efficient of all the possible ways of simulating? I don't know. Let's assume not; but it was there to be selected and it conferred a social advantage, so it was selected.

The assumption that consciousness is not necessary to describe the physical world begs the question. Consciousness is perception of perceptions and it is necessary to describe the physical world.

Alrenous said...

Monitoring others' experience doesn't require consciousness. We can already do that in simple ways in silicon. In the case of consciousness, it does have to be the most effective, because supporting a consciousness is very expensive in terms of bits/neuron.
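(A trivial example of what I mean by 'in simple ways in silicon', again purely my own toy: predicting another agent's next move by running a copy of its assumed decision rule on its observed state. The rule and the states are invented; no experience is involved anywhere.)

```python
# Predict another agent's behaviour by simulating its (assumed) decision rule
# on its observed state. Nothing here is aware of anything.

def decision_rule(state):
    """The rule we assume the other agent follows."""
    if state["hungry"]:
        return "forage"
    if state["threatened"]:
        return "hide"
    return "rest"

def predict_other(observed_state):
    # "Simulating the other's experience" is just running the same rule.
    return decision_rule(observed_state)

print(predict_other({"hungry": True, "threatened": False}))   # forage
print(predict_other({"hungry": False, "threatened": True}))   # hide
```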

Qualia don't exist if you're not aware of them.

Unknown said...

Yes, monitoring others' experience doesn't require consciousness, just as vision doesn't require a visual cortex. The visual cortex probably isn't the most efficient vision processor possible either (meaning it can be done more cheaply). However, both consciousness and the visual cortex evolved.

How do you know that there's a much cheaper experience-simulator alternative that's as realistic as consciousness and that at the same time has executive functions? Even if such a thing exists in the design space, my argument is that it isn't necessarily an evolutionary alternative to consciousness (it has to be present in the first place to be selected).

>Qualia don't exist if you're not aware of them.

Let's call it proto-qualia then. A person that doesn't have conceptions of qualia/consciousness is not aware of his subjectivity. He becomes conscious precisely when his perceptions become the object of his perception.

Alrenous said...

How do you know that there's a much cheaper experience-simulator alternative that's as realistic as consciousness and that at the same time has executive functions?

Because epiphenomenalism is very silly.


not aware of his subjectivity

"is not aware of his awareness."
"not thinking of what he's thinking of. He starts thinking of what he's thinking of precisely when what he's thinking of becomes the object of his thinking."

You should spend more time thinking about what you're saying before you say it.

TheDividualist said...

I would argue that consciousness is an advantage in PvP tasks, not in PvE tasks. In contests. The conscious chess player - ceteris paribus, same processing speed and so on, not fast computers - beats the non-conscious chess player. By being less predictable. Upsetting the other player's pattern-matching experience by deciding to pull a new move.

Alrenous said...

Human chess players still beat computers on a dollar-for-dollar standard. Similarly, jobs are not taken by robots unless the price of humans is being artificially inflated.