When I walked to the grocery store yesterday, the dregs of my cold had me feeling kinda wiped by the time I got there. So after I got my produce, I wandered over to the thrift store, looked for a shirt (no luck), and then sat in a chair and read the prologue and first chapter of a sci-fi book I found on the shelf while I recuperated for a few minutes. It was enticing, so I bought it.
The book is Permutation City, by Greg Egan. I'm currently about halfway through, and I can't decide whether I should finish it or not.
The problem is this: the writing is good, and I'm enjoying the various side-plots, and the setting explores a number of interesting ideas. But the main premise of the book is stupid. Beyond stupid, it's effing stupid.
Why is it fucking stupid?
Okay. So: the book revolves around the fact that people can have their brains scanned and then run software Copies of their minds on a computer. Hardware limitations mean the Copies can only run at 1/17th normal speed, but still, it works.
The protagonist wakes up as a Copy and discovers that he's stuck: his real self has disabled his ability to bail out of his simulated existence. So after some angst, he agrees to go through with the various experiments he/they had planned before the scan. (Aside: it's also really bugging me that he (or his editor, maybe) consistently misspells "bail out" as "bale out".)
And this is where it gets fucking stupid. They run the simulation non-sequentially and discover that, lo and behold, even though it was out of order, the Copy's consciousness threads things through properly and the subjective experience is perfectly normal.
Wait, what? My complaint isn't even with the consciousness thing. They ran the simulation non-sequentially? HOW? Math doesn't work like that! You can't calculate the fifth timestep first, then the third one, then 2, 4, and 1. You have to have the results of step 4 in order to be able to calculate step 5. You can't just skip ahead.
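To spell out why this bugs me, here's a toy sketch of what a step-by-step simulation looks like. The update rule and starting value are completely made up (nothing from the book); the point is the data dependency:

```python
# Toy illustration only: step() stands in for whatever a brain
# simulation would actually compute. The new state is a function
# of the previous state.

def step(state):
    return 3.7 * state * (1.0 - state)   # arbitrary made-up update rule

state = 0.5          # arbitrary starting condition
history = []
for t in range(5):
    state = step(state)          # step t is computed FROM step t-1
    history.append(state)

print(history)
# There's no way to produce history[4] before history[3] exists,
# because step() takes the previous state as its input.
```

Unless every step is somehow independent of the one before it (in which case, what exactly are you simulating?), out-of-order evaluation just isn't an option.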
Maybe this is specialist knowledge, but I don't think it ought to be. I mean, it comes up at least implicitly if you read anything, anything at all, about computer modeling and simulation. And the author gives an example of addition being associative (1+(2+3) = (1+2)+3) in discussing it, but surely he knows that that only works if all you're doing is adding; mix in any other operation and the order matters: 2*(3+4) = 14, but (2*3)+4 = 10. I can't imagine that a simulation of the human brain would involve only addition.
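If you want to see it for yourself, paste this into a Python prompt (my examples, not the book's):

```python
# Regrouping a pure sum is harmless:
print(1 + (2 + 3) == (1 + 2) + 3)   # True

# Mix in another operation and the grouping suddenly matters:
print(2 * (3 + 4))                  # 14
print((2 * 3) + 4)                  # 10

# And even pure addition stops being safely reorderable once you're in
# floating point, which is what any real simulation would be using:
print((0.1 + 0.2) + 0.3)            # 0.6000000000000001
print(0.1 + (0.2 + 0.3))            # 0.6
```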
Worse, they also discover -- in the lead-up to that experiment -- that they don't have to simulate all the steps: if they skip steps, and just simulate every other millisecond, or every tenth millisecond, or even every thousandth, the subjective experience for the Copy is unchanged. Okay, if you can skip 999 out of a thousand steps in your simulation without any loss of fidelity? That's a thousandfold speedup, which more than makes up for the 17x slowdown: you just solved the hardware limitation problem that was introduced for the sake of the plot! HELLO!
So I ask you, my friends:
What should I do about this?
Speaking of, Catz?