*laugh* Yeah, manipulating large datasets is a pain. Up until now, the largest data I'd worked with was ~200MB, and this one popped up unexpectedly in the middle of a bunch of tests.
I'm going to look into how to kick Python into doing the file read a bit more intelligently (perhaps breaking it up into pieces of a GB or so, making sure I don't split across a chunk I need to keep coherent during analysis; rough sketch below), but it's a great justification for a Dual G5 and 10.4! :D
"No, really, I *need* the big iron to do my research honey, I *swear*!" ;)