Number of objects grows unbounded... Memory leak


ptb

Hello all,

I'm using Python 3.4 and am seeing the memory usage of my program grow unbounded. Here's a snippet of the loop driving the main computation:

import gc
from itertools import product, repeat

opt_dict = {'interior': cons_dict['int_eq'],
            'lboundary': cons_dict['lboundary'],
            'rboundary': cons_dict['rboundary'],
            'material_props': {'conv': 0.9, 'diff': 0.01},
            'file_ident': ident, 'numeric': True, 'file_set': files}

# this produces roughly 25,000 elements
args = product(zip(repeat(nx[-1]), ib_frac), nx, subs)

for i, arg in enumerate(args):
    my_func(a=arg[0], b=arg[1], c=arg[2], **opt_dict)
    gc.collect()
    print(i, len(gc.get_objects()))

A few lines of output:

progress....
0 84883
1 95842
2 106655
3 117576
4 128444
5 139309
6 150172
7 161015
8 171886
9 182739
10 193593
11 204455
12 215284
13 226102
14 236922
15 247804
16 258567
17 269386
18 280213
19 291032
20 301892
21 312701
22 323536
23 334391
24 345239
25 356076
26 366923
27 377701
28 388532
29 399321
30 410127
31 420917
32 431732
33 442489
34 453320
35 464147
36 475071
37 485593
38 496068
39 506568
40 517040
41 527531
42 538099
43 548658
44 559205
45 569732
46 580214
47 590655
48 601165
49 611656
50 622179
51 632645
52 643186
53 653654
54 664146
....

As you can see, the number of objects keeps growing, and my memory usage grows proportionately. Also, my_func doesn't return any values; it simply writes data to a file.
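
For what it's worth, here's a minimal sketch of how those counts can be broken down by type using only the standard library (the helper name top_object_types is made up for illustration):

import gc
from collections import Counter

def top_object_types(n=10):
    """Return the n most common live object types as (name, count) pairs."""
    gc.collect()
    counts = Counter(type(obj).__name__ for obj in gc.get_objects())
    return counts.most_common(n)

# Calling this every few iterations shows which types' counts climb
# steadily; those are the objects being retained somewhere.
for name, count in top_object_types():
    print(name, count)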

I was under the impression that this sort of thing couldn't happen in Python. Can someone explain (1) how this is possible, and (2) how I can fix it?

Hopefully that's enough information.

Thanks for your help,
Peter
 

ptb

Turns out one of the libraries I am using has a cache system. If I shut it off then my problem goes away...
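
For anyone who hits the same thing, here's a minimal sketch of the general pattern (expensive_transform is a made-up function for illustration, not the actual library code):

from functools import lru_cache

# An unbounded cache keeps a reference to every distinct argument and
# result it has ever seen, so memory grows with the number of distinct inputs.
@lru_cache(maxsize=None)
def expensive_transform(key):
    return key * 2  # stand-in for real work

for i in range(25000):
    expensive_transform(i)  # each new argument adds a cache entry

print(expensive_transform.cache_info())  # currsize grows with distinct inputs
expensive_transform.cache_clear()        # clearing the cache releases the objects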
 
