
I'm trying to do some calculations with the multiprocessing module in Python 2.7.2. My code looks like this:

from multiprocessing import Pool
import sys
sys.setrecursionlimit(10000)
partitions = []
class Partitions:
    parts = {} #My goal is to use this dict to speed
               #up calculations in every process that
               #uses it, without having to build it up
               #from nothing each time
    def __init__(self):
        pass
    def p1(self, k, n):
        if (k, n) in Partitions.parts:
            return Partitions.parts[(k, n)]
        if k > n:
            return 0
        if k == n:
            return 1
        Partitions.parts[(k, n)] = self.p1(k + 1, n) + self.p1(k, n - k)
        return Partitions.parts[(k, n)]

    def P(self, n):
        result = 0
        for k in xrange(1, n / 2 + 1):
            result += self.p1(k, n - k)
        return 1 + result

p = Partitions()

def log(results):
    if results:
        partitions.extend(results)
    return None

def partWorker(start, stop):
    ps = []
    for n in xrange(start, stop):
        ps.append(((1, n), p.P(n)))
    return ps

def main():
    pool = Pool()
    step = 150
    for i in xrange(0, 301, step):
        pool.apply_async(partWorker, (i, i + step), callback=log)

    pool.close()
    pool.join()

    return None

if __name__=="__main__":
    main()

I'm new to this, and I basically copied the structure of the main code from this page: python prime crunching: processing pool is slow? Can I have the processes running on each core all look at the same dictionary to help with their calculations? The way it behaves now, each process builds its own copy of the dictionary, and it eats RAM like crazy.


1 Answer


I'm not sure if this is what you're after... but take a look at multiprocessing.Manager (http://docs.python.org/library/multiprocessing.html#sharing-state-between-processes). A manager lets you share a dictionary between processes.

Answered 2012-04-25T00:54:53.787