
I have a complicated Python server app that runs continuously. Below is a very simplified version of it.

When I run the app below using CPython ("python Main.py"), it uses 8 MB of RAM straight away and stays at 8 MB, as it should.

When I run it using PyPy ("pypy Main.py"), it starts out using 22 MB of RAM and the usage grows over time. After 30 seconds it's at 50 MB; after an hour it's at 60 MB.

If I change "b.something()" to "pass", it doesn't gobble up memory like that.

I'm using PyPy 1.9 on OS X 10.7.4. I'm okay with PyPy using more RAM than CPython.

Is there a way to stop PyPy from eating up memory over long periods of time?

import sys
import time
import traceback

class Box(object):
    def __init__(self):
        self.counter = 0
    def something(self):
        self.counter += 1
        if self.counter > 100:
            self.counter = 0

try:
    print 'starting...'
    boxes = []      
    for i in range(10000):
        boxes.append(Box())
    print 'running!'
    while True:
        for b in boxes:
            b.something()
        time.sleep(0.02)

except KeyboardInterrupt:
    print ''
    print '####################################'
    print 'KeyboardInterrupt Exception'
    sys.exit(1)

except Exception as e:
    print ''
    print '####################################'
    print 'Main Level Exception: %s' % e
    print traceback.format_exc()
    sys.exit(1)

Below is a list of times and the RAM usage at each time (I left it running overnight).

Wed Sep  5 22:57:54 2012, 22mb ram 
Wed Sep  5 22:57:54 2012, 23mb ram 
Wed Sep  5 22:57:56 2012, 24mb ram 
Wed Sep  5 22:57:56 2012, 25mb ram 
Wed Sep  5 22:57:58 2012, 26mb ram 
Wed Sep  5 22:57:58 2012, 27mb ram 
Wed Sep  5 22:57:59 2012, 29mb ram 
Wed Sep  5 22:57:59 2012, 30mb ram 
Wed Sep  5 22:58:00 2012, 31mb ram 
Wed Sep  5 22:58:02 2012, 32mb ram 
Wed Sep  5 22:58:03 2012, 33mb ram 
Wed Sep  5 22:58:05 2012, 34mb ram 
Wed Sep  5 22:58:08 2012, 35mb ram 
Wed Sep  5 22:58:10 2012, 36mb ram 
Wed Sep  5 22:58:12 2012, 38mb ram 
Wed Sep  5 22:58:13 2012, 39mb ram 
Wed Sep  5 22:58:16 2012, 40mb ram 
Wed Sep  5 22:58:19 2012, 41mb ram 
Wed Sep  5 22:58:21 2012, 42mb ram 
Wed Sep  5 22:58:23 2012, 43mb ram 
Wed Sep  5 22:58:26 2012, 44mb ram 
Wed Sep  5 22:58:28 2012, 45mb ram 
Wed Sep  5 22:58:31 2012, 46mb ram 
Wed Sep  5 22:58:33 2012, 47mb ram 
Wed Sep  5 22:58:35 2012, 49mb ram 
Wed Sep  5 22:58:35 2012, 50mb ram 
Wed Sep  5 22:58:36 2012, 51mb ram 
Wed Sep  5 22:58:36 2012, 52mb ram 
Wed Sep  5 22:58:37 2012, 54mb ram 
Wed Sep  5 22:59:41 2012, 55mb ram 
Wed Sep  5 22:59:45 2012, 56mb ram 
Wed Sep  5 22:59:45 2012, 57mb ram 
Wed Sep  5 23:00:58 2012, 58mb ram 
Wed Sep  5 23:02:20 2012, 59mb ram 
Wed Sep  5 23:02:20 2012, 60mb ram 
Wed Sep  5 23:02:27 2012, 61mb ram 
Thu Sep  6 00:18:00 2012, 62mb ram 

2 Answers


http://doc.pypy.org/en/latest/gc_info.html#minimark-environment-variables shows how to tweak the GC via environment variables.
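For example, the minimark GC can be capped and made to collect more eagerly before launching the server (a sketch; the values below are illustrative, not recommendations):

```shell
# Illustrative tuning of PyPy's minimark GC via environment variables.
export PYPY_GC_MAX=200MB           # hard cap on the GC-managed heap
export PYPY_GC_MAJOR_COLLECT=1.5   # run a major collection sooner (default 1.82)
# then launch the server under PyPy:
# pypy Main.py
```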

answered 2012-10-01T08:54:55.633

Compared to CPython, PyPy uses different garbage-collection strategies. If the memory increase is due to something in your program, you could try running a forced garbage collection every now and then, using the collect function from the gc module. In that case it may also help to explicitly del large objects that you no longer need and that don't go out of scope.
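A minimal sketch of that idea applied to the loop from the question (the collection interval of 100 passes is arbitrary, and the loop is bounded here where the real server runs forever):

```python
import gc

class Box(object):
    def __init__(self):
        self.counter = 0
    def something(self):
        self.counter += 1
        if self.counter > 100:
            self.counter = 0

boxes = [Box() for _ in range(10000)]

iterations = 0
while iterations < 300:           # bounded for the sketch; the server loops forever
    for b in boxes:
        b.something()
    iterations += 1
    if iterations % 100 == 0:     # force a full collection every 100 passes
        gc.collect()

print('done')
```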

If it is due to the internal workings of PyPy, it might be worth submitting a bug report, as Mark Dickinson suggested.

answered 2012-09-15T16:58:44.400