Monday, October 10, 2011

Memory Lapse

Ray and I are planning to go down to CLC Wednesday morning to re-establish that connection.  Hope someone there still remembers us!

Used 'wc -l' in Cygwin to count the lines in the data file ("C:\SHARED\Server Code\logs 2011-10-07\node0.uart-cut.trnscr"): 8,619,504.  Increased the array size to 8,620,000.  Now Scilab complains it needs 146,540,153 units of memory, and we had only set stacksize to 100 million previously.  Let's increase stacksize to 150,000,000, then.  Now Scilab complains it can't allocate that much memory!  OK, back to 100 million, and split the data file in half.
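For reference, a minimal sketch of that line-count step from a Cygwin shell (the Windows path is reached through Cygwin's standard /cygdrive mount):

    # Windows drive C: is mounted at /cygdrive/c under Cygwin
    cd "/cygdrive/c/SHARED/Server Code/logs 2011-10-07"

    # Count lines in the transcript; prints 8619504 followed by the file name
    wc -l node0.uart-cut.trnscr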

Splitting the data into before and after Thu Oct 06 15:56:00 2011, which is roughly the middle of the data file.  Now have two files:
  • node0.uart-cut.1st-half.trnscr - 4,143,039 lines
  • node0.uart-cut.2nd-half.trnscr - 4,476,465 lines
I'm processing the first half now.  I'm worried, though, that the script might still run out of memory partway through.   Yes, indeed it did.  Let's see if we can increase the stack size to 125,000,000.  Nope.
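For the record, one way the split could be done in the Cygwin shell, going by the line counts above rather than searching for the timestamp (a sketch, not necessarily how it was actually done):

    # First 4,143,039 lines (everything before the Oct 06 15:56:00 cut) -> 1st half
    head -n 4143039 node0.uart-cut.trnscr > node0.uart-cut.1st-half.trnscr

    # Remaining 4,476,465 lines, starting at line 4,143,040 -> 2nd half
    tail -n +4143040 node0.uart-cut.trnscr > node0.uart-cut.2nd-half.trnscr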

Let's now extract just the first quarter of the data, before Thu Oct 06 04:20:33 2011:
  • node0.uart-cut.1st-qrtr.trnscr - 1,984,492 lines
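Again just as a sketch of the same idea in the Cygwin shell (the quarter is a prefix of the original file, so a single head command does it):

    # First 1,984,492 lines = data before Thu Oct 06 04:20:33 2011
    head -n 1984492 node0.uart-cut.trnscr > node0.uart-cut.1st-qrtr.trnscr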
The analysis script anal-pulses.sce is running now and hasn't run out of memory yet, but it hasn't finished either.  Check on it tomorrow...
