Memory problems with PyNio

From: Pavel Jurus <jurus_at_nyahnyahspammersnyahnyah>
Date: Tue, 17 Apr 2007 14:42:13 +0200

Hi all,

I have problems when working with a large number of netCDF files: I can
easily run out of memory. While trying to optimize my PyNgl scripts to use
less memory, I found a potential memory leak. Here is a minimal example:

import Nio

fname = "wrfout_d01_2007-04-02_20:00:00"
f = Nio.open_file(fname + ".nc", 'r')

repeat = 100

for i in range(repeat):
  result = f.variables['P'][:]  # reads the whole variable into a new array each time


The shape of the float array 'result' is (1, 30, 114, 165), so one copy
should take about 30*114*165*4 bytes ~ 2.25 MB of memory. Repeating the
assignment 100 times drives the Python process above 225 MB, and that
memory is never reclaimed.
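For reference, the arithmetic checks out (a quick sanity check, not part of the original script):

```python
# Expected memory footprint of one float32 array of shape (1, 30, 114, 165).
shape = (1, 30, 114, 165)
bytes_per_float = 4  # 32-bit floats

count = 1
for dim in shape:
    count *= dim

total_bytes = count * bytes_per_float
print(total_bytes)        # 2257200 bytes, i.e. roughly 2.25 MB per copy
print(100 * total_bytes)  # ~225 MB if 100 copies are never released
```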

Just to be sure, I tried the same loop with a plain Numeric array:

import Numeric

repeat = 100

for i in range(repeat):
  result = Numeric.zeros((1, 30, 114, 165), 'f')[:]

This version uses far less memory, as expected.
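That control behaves as reference counting predicts: rebinding a name drops the previous array's last reference, and CPython frees it immediately. The same behaviour can be demonstrated with the standard-library array module (a stand-in sketch; it does not assume Numeric or Nio):

```python
import array

for i in range(100):
    # Rebinding 'result' drops the last reference to the previous buffer,
    # so CPython's reference counting frees it at once; only one ~2.25 MB
    # buffer is alive at any moment, no matter how many iterations run.
    result = array.array('f', [0.0]) * (30 * 114 * 165)

print(len(result))  # 564300 floats of 4 bytes each, about 2.25 MB
```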

I use the Numeric version of PyNgl. Nio.__version__ is 1.1.0, and uname -a gives:
"Linux pc110 2.6.18-4-amd64 #1 SMP Fri Feb 2 14:28:35 UTC 2007 x86_64
Is there any way to free the memory once a Nio variable is no longer used?

Pavel Jurus
pyngl-talk mailing list
Received on Tue Apr 17 2007 - 06:42:13 MDT

This archive was generated by hypermail 2.2.0 : Tue Apr 17 2007 - 15:42:04 MDT