Beemer (dr_tectonic) wrote,

neuropsychological maunderings: debugging decisions

Had a thought about an idea I've run across now and again. The notion is that, when you do a brain scan of people making decisions, they make them much too rapidly to be using conscious reasoning, so what we usually think of as "reasoning" is actually just "rationalization" of a decision that's already been made by the rest of the brain.

I'm sure the research is sound, but that interpretation of it is just absurd.

The problem is that it assumes that "the rest of your brain" is a separate entity from "your self", which is stupid. Conscious thought is not the only part that matters; you are all of the things going on in your brain.

Sure, under normal circumstances, consciousness catches up to what's going on after the other parts of the brain have handled the decision-making process. That's easy to demonstrate. But that doesn't mean those other parts of the brain are necessarily using some different kind of thought process than they would if consciousness were coupled into the system.

The way it's usually described, it sounds like consciousness is the self, floating on a sea of mysterious irrationality: seeing pattern where none exists, manufacturing a logic to explain decisions that are actually totally arbitrary. On that account, "reasoning" is nothing more than self-deception.

[Aside: that's not, I'm sure, what researchers say when describing their results, but it's how it usually comes out when people restate it.]

I think this is a really lousy and inaccurate metaphor. Here's what I'd propose as an alternative (this may not make a lot of sense if you're not familiar with programming).

Consciousness is the run-time debugger. Your brain, your WHOLE brain, has all these different modules that do different kinds of thinking, and under normal operating conditions, you've got data flying back and forth between all of them in this computational dance. Consciousness is the introspection module: it keeps track of which modules are doing what (including itself).

Now, heavy-duty reasoning -- which is very difficult and resource-intensive -- is like running the program of your mind under a debugger. You get everything set up, and then you slowly move through the decision, step by step, stopping to check the accuracy of the results after each operation. You couple the introspection module (consciousness) in to do this.

Here's the important thing, though: unless there's a bug, you get the same answer either way. So most of the time, we don't bother with the debugger: consciousness just applies shorthand labels to the execution as it goes by, noting which modules got invoked and assuming that they operated correctly. The decision process doesn't differ radically just because you're paying attention to it.

(Normally. I'm sure that it's just as possible to get heisenbugs in your thinking as it is in your code, where you get a glitch that only occurs when you're not looking at it, because something about the looking suppresses it. I'm not claiming that your brain is a computer here, but both systems process information, and I think that these ways of functioning are probably characteristic of any information-processor, regardless of form.)
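Since I'm leaning on a programming metaphor anyway, here's a minimal sketch of the idea in Python. The module names, the toy decision, and the voting scheme are all invented for illustration; real mental modules presumably don't reduce to little functions like this. The point is just the shape of the thing: an optional introspection hook that can step through the process without changing its outcome.

    # Toy sketch only: two invented "modules" feed a decision, and an
    # optional introspection hook can observe each step. The hook
    # watches; it doesn't change the answer.

    def traffic_memory(state):
        # One module: was the highway jammed last time?
        return "back_roads" if state["highway_was_jammed"] else "highway"

    def weather_check(state):
        # Another module: back roads are unpleasant in the rain.
        return "highway" if state["raining"] else "back_roads"

    def decide_route(state, debugger=None):
        """Poll the modules and pick the majority answer.

        `debugger` is the consciousness/introspection hook: if supplied,
        it's called after every step -- slow and resource-intensive --
        but the decision comes out the same.
        """
        votes = []
        for module in (traffic_memory, weather_check):
            vote = module(state)
            votes.append(vote)
            if debugger:
                debugger(module.__name__, vote)  # stop, check the result
        decision = max(set(votes), key=votes.count)
        if debugger:
            debugger("final", decision)
        return decision

    state = {"highway_was_jammed": True, "raining": False}
    fast = decide_route(state)                                  # no introspection
    slow = decide_route(state, debugger=lambda s, v: print(s, v))
    assert fast == slow  # unless there's a bug: same answer either way

Same modules, same inputs, same decision; all the debugger buys you is a step-by-step account of how it happened.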

Logical deduction is a particular kind of decision-making that is easy to get wrong, so you often have to run it under the debugger. It's also a very small subset of the decisions you make, so it's misleading to think of the other kinds of reasoning or decision-making that you do as "irrational". Is deciding whether you want to eat a cookie a 'rational' decision? Not really. It's not like you can assign numeric values to the benefits and drawbacks of eating the cookie and then apply some mathematical algorithm. That doesn't mean you're being totally illogical when you make that decision, as the flawed metaphor implies; it's just not a problem well-suited to careful, methodical, conscious reasoning.

And you can run non-logic reasoning under the debugger as well. If you have a complicated emotional situation, and you're sitting there thinking hard and trying to sort out how you feel and what you want, that's just as much a slow, resource-intensive introspective operation as logicking your way carefully through an algebra problem. There's no "if-then-else" going on, but there's plenty of checking various mental modules against one another to try and make the outputs align into a coherent whole. No logic, plenty of debugging.

(And you can totally do algebra without introspection, if you practice it enough; you're just far more likely to drop minus signs and forget to divide by two...)
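To make that cross-checking idea concrete, here's another toy sketch in the same vein. Everything in it is made up (the "modules", the numbers, the reconciliation rule): it's just meant to show a deliberative process with no if-then-else about the situation itself, only repeated comparison of module outputs until they cohere.

    # Toy sketch: "sorting out how you feel" modeled as reconciling the
    # outputs of several modules until they roughly align. No branching
    # on the situation itself -- just cross-checking.

    def reconcile(modules, state, max_passes=10, tolerance=0.25):
        """Query each module, then repeatedly nudge any output that
        disagrees with the consensus until everything roughly aligns."""
        verdicts = {name: fn(state) for name, fn in modules.items()}
        consensus = sum(verdicts.values()) / len(verdicts)
        for _ in range(max_passes):
            disagreeing = [name for name, v in verdicts.items()
                           if abs(v - consensus) > tolerance]
            if not disagreeing:
                break  # the outputs align into a coherent whole
            for name in disagreeing:
                # "re-examine" the module in light of what the others say
                verdicts[name] = (verdicts[name] + consensus) / 2
            consensus = sum(verdicts.values()) / len(verdicts)
        return consensus

    # Invented modules for a made-up "do I want to go to the party?" feeling:
    modules = {
        "gut":     lambda s: s["gut_feeling"],
        "loyalty": lambda s: 0.9 if s["old_friend_hosting"] else 0.4,
        "fatigue": lambda s: 0.2 if s["exhausted"] else 0.6,
    }
    state = {"gut_feeling": 0.7, "old_friend_hosting": True, "exhausted": True}
    print(reconcile(modules, state))  # one roughly coherent inclination

Slow, iterative, introspective, and not a single logical deduction anywhere in it.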

Anyway, I just wanted to throw that out there. Consciousness provides us with a sense of self, but it's not a separate thing from the rest of our minds. Your mind may be composed of many individual things, but it is still an integrated whole, and that whole is what makes up you. Consciousness has a special function in that whole, but it is still a part of it. We should think of consciousness as the part that generates awareness of the decision-making processes -- of which there are many, operating in many different ways -- at varying levels of scrutiny, and NOT as a part that actually makes decisions. Both the decision-making element and the awareness element are vital parts of you, and it makes no sense to think of them as being separate from your self.
Tags: consciousness, mind, philosophy, programming, thought