I have been busily hacking away at statistical analyses over the last few workdays. I'm working on bias-correcting our data, and there's this popular technique that I suspected of being overzealous, and I really wanted to be able to show that people were Doing It Wrong and needed to modify their methodology.
I think what I have instead proven is that BOTH approaches are overzealous, that my proposed fix is actually marginally worse, and that both of them are way, way worse than a nearly-equivalent procedure with a little more fancy math in it, which I thought would be kinda twitchy but is actually fine. (And which also turns out to be computationally cheaper.)
Which is a little bit annoying, because I've been touting (my fixed version of) the technique in talks, and now I'm going to have to change course. (Though I guess it will still work just fine for the larger point I'm making when I mention it, which is that this is a tricky subject.) On the plus side, it means I can proudly say that yes, I am really truly Doing Science: I had a belief, the evidence proved that belief wrong, and therefore I changed my mind. SCIENCE!
Also, I got to use the word "oracle" in my abstract, which pleases me.