[Jan. 31st, 2006|05:22 pm]
I'm making this post mostly so I can cut-and-paste it to a mailing list, but what the heck, maybe somebody reading this will know the answer, too.
My game has a crash-bug. In certain circumstances (not easily reproducible, ha-ha aren't THOSE the best kinds of bugs), it will use up enough memory to exceed the default JVM heap size, and the whole thing dies with a java.lang.OutOfMemoryError.
Now, this can be pretty easily prevented by running it from the command-line with the flag "-Xmx256M", which tells the JVM to use a bigger heap. But the problem with that solution is that our intended distribution method for the game is for the users to download it by right-clicking on the link and run it by double-clicking the icon. Any startup procedure more complex than that is Just Too Hard. Open up a command prompt and type things that have to be spelled correctly? No way.
I guess that means I need to get it to use less memory. Okay, run a profiler on it. Now, our game has some pretty big data objects that we send back and forth between client and server by serializing them to XML. I already knew that the serialization code (which is third-party, and which I'm not going to touch) is pretty slow. Now I'm pretty sure that it's also a memory hog. In particular, I discovered that I can hit the "garbage collect NOW" button in the profiler and the memory usage drops by HALF. And the objects that get collected appear to all be spawned by things in the serialization code.
So... I'm thinkin' that means that I want garbage-collection to run more aggressively.
If I put a "System.gc()" call in at the end of the procedure that serializes the entire gamestate to XML, that should help, right?
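For concreteness, here's roughly what that change would look like. This is just a sketch: GameState and ThirdPartySerializer are stand-ins for the real classes, and the serializer is mocked out since the actual one is third-party.

```java
// Sketch of hinting the GC right after the serialization step that
// churns through lots of short-lived objects. GameState and
// ThirdPartySerializer are hypothetical stand-ins for the real code.
public class GameStateSender {

    // Placeholder for the third-party XML serialization call.
    static String thirdPartySerialize(Object state) {
        return "<gamestate/>";
    }

    static String serializeGameState(Object state) {
        String xml = thirdPartySerialize(state);
        // Suggest a collection now, while we know the heap is full of
        // dead objects the serializer just abandoned. Note that
        // System.gc() is only a hint -- the JVM is free to ignore it.
        System.gc();
        return xml;
    }

    public static void main(String[] args) {
        System.out.println(serializeGameState(new Object()));
    }
}
```

Whether this actually helps depends on the JVM; most collectors would have reclaimed those objects on their own before the heap filled up, so the hint mainly changes *when* the pause happens, not how much memory survives.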
Any pitfalls to prompting the garbage-collector to run in the spots where I know a lot of dead objects are being created?
Or am I barking up entirely the wrong tree?
If a programmer told me he had made this change, here's what I would check.
I would make sure that universal variables are maintained -- unless all the data necessary for the game is serialized. But there's gotta be some server config data that doesn't change from game to game... I would think. So I would make sure any unserialized data isn't thrown out with the trash. :) Then I would check the serialized output and ensure that data integrity was maintained. I would run at least 5 cycles of serialization with different parameters to ensure nothing odd happens.
Reading up on how this gc function works, it is both cool and incredibly frightening how this memory management system operates. But then you're assuming that the programmers of this serialization code were good about releasing something when it should be released, and that they didn't just leave it on the heap assuming it would be available later. If your code passes those two tests, my belief would be that it would be a good fix.
You're on the wrong track with the universal/serialized variable stuff. There's no state preserved between games, and everything's getting piped through properly -- that's easy to tell. (And I've got kerjillions of successful serialization attempts.)
I'm pretty certain that it's just that the serialization code is kinda crappy in terms of how it handles memory. Unfortunately, it's not something I can muck around with; I need to fix it from the outside.