neuropsychological maunderings: debugging decisions [May. 31st, 2006|03:26 pm]
Beemer

Had a thought about an idea I've run across now and again. The notion is that, when you do a brain scan of people making decisions, they make them much too rapidly to be using conscious reasoning, so what we usually think of as "reasoning" is actually just "rationalization" of a decision that's already been made by the rest of the brain.

I'm sure the research is sound, but that interpretation of it is just absurd.

The problem is that it assumes that "the rest of your brain" is a separate entity from "your self", which is stupid. Conscious thought is not the only part that matters; you are all of the things going on in your brain.

Sure, under normal circumstances, consciousness catches up to what's going on after the other parts of the brain have handled the decision-making process. That's easy to demonstrate. But that doesn't mean that those other parts of the brain are necessarily using some different kind of thought process than they would if consciousness were coupled in to the system.

The way it's usually described, it sounds like consciousness is the self, floating on a sea of mysterious irrationality: seeing patterns where none exist, manufacturing a logic to explain decisions that are actually totally arbitrary, so that "reasoning" is nothing more than self-deception.

[Aside: that's not, I'm sure, what researchers say when describing their results, but it's how it usually comes out when people restate it.]

I think this is a really lousy and inaccurate metaphor. Here's what I'd propose as an alternative (this may not make a lot of sense if you're not familiar with programming).

Consciousness is the run-time debugger. Your brain, your WHOLE brain, has all these different modules that do different kinds of thinking, and under normal operating conditions, you've got data flying back and forth between all of them in this computational dance. Consciousness is the introspection module: it keeps track of which modules are doing what (including itself).

Now, heavy-duty reasoning -- which is very difficult and resource-intensive -- is like running the program of your mind under a debugger. You get everything set up, and then you slowly move through the decision, step by step, stopping to check the accuracy of the results after each operation. You couple the introspection module (consciousness) in to do this.

Here's the important thing, though: unless there's a bug, you get the same answer either way. So most of the time, we don't bother with the debugger: consciousness just applies shorthand labels to the execution as it goes by, noting which modules got invoked and assuming that they operated correctly. The decision process doesn't differ radically just because you're paying attention to it.
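That "same answer either way" point can be sketched in code. This is purely illustrative of the metaphor; the module names and the toy pipeline are invented here, not anything from neuroscience:

```python
# A toy "mind" as a pipeline of modules. Running it with the tracer
# attached (the "debugger") is slower and noisier, but -- absent a
# bug -- it produces exactly the same decision.

def gut_reaction(x):
    return x * 2          # some fast, opaque module

def sanity_check(x):
    return x + 1          # another module downstream

MODULES = [gut_reaction, sanity_check]

def decide(x, introspect=False):
    for module in MODULES:
        x = module(x)
        if introspect:    # consciousness coupled in: note each step
            print(f"{module.__name__} -> {x}")
    return x

fast = decide(10)                    # normal operation, no trace
slow = decide(10, introspect=True)   # "under the debugger"
assert fast == slow                  # same decision either way
```

The trace changes the cost and the visibility of the computation, not its result, which is the whole point of the analogy.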

(Normally. I'm sure that it's just as possible to get heisenbugs in your thinking as it is in your code, where you get a glitch that only occurs when you're not looking at it, because something about the looking suppresses it. I'm not claiming that your brain is a computer here, but both systems process information, and I think that these ways of functioning are probably characteristic of any information-processor, regardless of form.)
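An observer effect of roughly this flavor is easy to manufacture in Python (a contrived example of my own, not from the post, and it runs in the opposite direction from a classic heisenbug: here the looking *creates* the glitch rather than suppressing it, but the mechanism is the same -- observation perturbs the computation):

```python
# A generator is lazy: its values exist only once, as you pull them.
def doubled(xs):
    return (2 * x for x in xs)

untouched = doubled([1, 2, 3])
assert sum(untouched) == 12   # left alone, the computation is correct

watched = doubled([1, 2, 3])
peek = list(watched)          # "debugging": peeking consumes the values
assert sum(watched) == 0      # ...so the real computation now sees nothing
```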

Logical deduction is a particular kind of decision-making that is easy to get wrong, so you often have to run it under the debugger. It's also a very small subset of the decisions you make, so it's misleading to think of the other kinds of reasoning or decision-making that you do as "irrational". Is deciding whether you want to eat a cookie a 'rational' decision? Not really. It's not like you can assign numeric values to the benefits and drawbacks of eating the cookie and then apply some mathematical algorithm. That doesn't mean you're being totally illogical when you make that decision, as is implied by the flawed metaphor, just that it's not a problem that is well-suited to solution by careful, methodical, conscious reasoning.

And you can run non-logic reasoning under the debugger as well. If you have a complicated emotional situation, and you're sitting there thinking hard and trying to sort out how you feel and what you want, that's just as much a slow, resource-intensive introspective operation as logicking your way carefully through an algebra problem. There's no "if-then-else" going on, but there's plenty of checking various mental modules against one another to try and make the outputs align into a coherent whole. No logic, plenty of debugging.

(And you can totally do algebra without introspection, if you practice it enough; you're just far more likely to drop minus signs and forget to divide by two...)

Anyway, I just wanted to throw that out there. Consciousness provides us with a sense of self, but it's not a separate thing from the rest of our minds. Your mind may be composed of many individual things, but it is still an integrated whole, and that whole is what makes up you. Consciousness has a special function in that whole, but it is still a part of it. We should think of consciousness as the part that generates awareness of the decision-making processes -- of which there are many, operating in many different ways -- at varying levels of scrutiny, and NOT as a part that actually makes decisions. Both the decision-making element and the awareness element are vital parts of you, and it makes no sense to think of them as being separate from your self.

Comments:
From: zalena
2006-05-31 10:09 pm (UTC)
At times like this I think Dr. Tectonic deserves his own television show.
From: finagler
2006-05-31 11:30 pm (UTC)
Your analogy sounds a lot like the B-Brain described in Minsky's Society of Mind.

One big wrinkle your metaphor still needs to deal with: your conscious rationalization is often dead wrong about the "actual" reason you performed an action a particular way. It's most noticeable in the famous split-brain experiments on people who have had their two hemispheres disconnected, but you can also see it in simple tests for whether someone was influenced by an advertisement or not (people almost always say no, even though the statistics clearly show most people are heavily influenced).

So sure, consciousness is part of a single entity, but often the role it takes is that of the press secretary who has to explain the actions of the entity without actually having been allowed to attend the meetings.

From: dr_tectonic
2006-06-01 03:08 am (UTC)
Nah, that's easy and makes perfect sense: introspection is very difficult, so it often gets only partially invoked, and there are plenty of modules that weren't compiled with debugging information.

Metaphorically speaking, anyway -- and I'm not claiming that what I've said is an accurate description/explanation, just that it's a more useful metaphor, a better (but still imperfect) way of thinking about it.

There are a lot of things about your own mind that you just can't figure out from inside, unless you're provided with extra information. When I took Intro to Linguistics, it was astonishing to discover all the things that I knew without being consciously aware that I knew them. Once I had the extra information, I could think about them (to a degree), but not until then. It makes sense that plenty of decision processes would be similar.
From: nehrlich
2006-06-01 11:42 am (UTC)
I'm going to be a pedant and pick on the "more useful" comment, because it begs the question, for me at least, of more useful for what? It's kind of funny for me to realize that all quality judgments now get red-flagged in my brain because I'm not sure I believe in absolutes any more, so I ask myself "better for what?" or "more useful to whom?"

Trying to figure out the deep reasoning of our brains is really a pain sometimes. It's interesting - I've absorbed a lot of information about management and business over the years at different companies, and yet I'm still not able to articulate it coherently. One of the reasons I've been reading lots of business books is that sometimes they say something and I go "Oh, duh, of course" - something that I knew, but hadn't condensed into coherence.

Blogging is another activity I need to do more of to try to map out some of what's going on underneath the hood. Hrm.
From: nehrlich
2006-06-01 03:02 am (UTC)
Heh. After asking for it, now I'm too brain-dead to actually read it.

Couple references that may be of interest. The User Illusion, by Tor Norretranders, who discusses the theory that the conscious is just an overlay on top of the real processing power going on in the brain.

There was also the recent paper that I think you're referring to called "On Making the Right Choice: The Deliberation-Without-Attention Effect" (PDF here). I've been meaning to blog about it, because I find it fascinating. Basically they demonstrated that our unconscious minds are _way_ better at making choices involving complex criteria than our conscious minds.

Oh, which reminds me, Sources of Power, by Gary Klein, has some fantastic stuff on describing how our unconscious brain is trained to sift through all sorts of complex criteria.

Okay, I just skimmed through your argument now that my brain has been jump-started. I definitely agree with you that consciousness is merely part of the brain, nothing special or separate. Partly because of the User Illusion, I'm less inclined to believe that it's a slower or more accurate process. If anything, I've come to distrust my conscious brain; if I find myself having to come up with reasons why I think I made a good decision, it's often the wrong decision. Except when I'm confusing my unconscious with a knee-jerk emotional reaction. Tricky that.

Okay, I'll stop babbling now.
From: dr_tectonic
2006-06-01 04:44 am (UTC)
This isn't even the one you asked for; it's a totally different one that randomly came up today.

One note, I think I'm probably using the word "decision" more broadly than you are. It sounds like you're thinking of "should I do A or B?" kinds of decisions, which are, indeed, often very hard to treat with logical reasoning and better left to unconscious evaluation. I'm also including 'decisions' like "does that argument make sense?", "what's the best route from here to there?", and "how should I get this set of numbers to add up properly?" (as well as things like "what do I want for lunch").

Conscious reasoning is absolutely a slower but more accurate process, if the problem is one that is well-suited to logical analysis. And that's a very big if.

Actually, come to think of it, probably part of the problem is that logical, reasoned analysis is an incredibly powerful tool, but we're naturally very bad at it. So we have to think hard and run it with lots of careful debugging or we get it wrong. And consequently, we pay a lot of attention to it when we discuss thought. We don't even notice the stuff that we're really good at, like determining whether an object is red or blue, or whether this object is bigger than that one.
From: nehrlich
2006-06-01 12:01 pm (UTC)
Hrm. I agree logical, reasoned analysis is powerful and that we're bad at it. I'm not sure it's necessarily more accurate, though, at least outside of a limited domain (I'm not sure as to what you are saying constitutes logical analysis). To take your example, I find that analyzing arguments is not a matter of logical analysis for me, but of experience - only when I've seen loads of arguments of a certain type do they get built into my unconscious pattern recognition so that they can alert my conscious brain "Hey, there's something fishy going on here" (e.g. my pedantry about quality in the other comment).

I do agree that logical analysis is a resource-intensive process, which is why we rarely call it into play. The phenomenon of satisficing is explained by this, I think, where we take the first acceptable choice rather than try to figure out the best one. It's too much effort to analyze all of the different choices, so we just take one that works. I think this applies to things like route-finding or grocery-shopping.

Getting numbers to add up is harder, cuz that's clearly logic. Except that, of course, I don't actually think when I do arithmetic. That sort of math just happens unconsciously because I've done it so many times that my unconscious has got it wired.

Hrm. I think I'm also being influenced by On Intelligence a lot, in the pattern recognition stuff. I think most people never kick the logical stuff into play, relying on the (immensely powerful) pattern recognition system to work for them. The inability of most people to estimate probabilities is an example of this.

Blab blab blab. I'll shut up now.