I am having trouble with Pool.map_async() (and also Pool.map()) from the multiprocessing module.
I have implemented a parallel-for-loop function that works fine as long as the function passed to Pool.map_async
is a "regular" function. When the function is, for example, a method of a class, I get a PicklingError:
cPickle.PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.function failed
I use Python only for scientific computing, so I am not really familiar with the concept of pickling; I have just learned a bit about it today. I have looked at a couple of previous answers, like Can't pickle <type 'instancemethod'> when using multiprocessing Pool.map(), but I cannot figure out how to make it work, even when following the link provided in the answer.
My code, where the objective is to simulate a vector of Normal random variates using multiple cores. Note that this is just an example; it might not even pay off to run it on multiple cores.
import multiprocessing as mp
import scipy as sp
import scipy.stats as spstat

def parfor(func, args, static_arg = None, nWorkers = 8, chunksize = None):
    """
    Purpose: Evaluate function using multiple cores.

    Input:
        func       - Function to evaluate in parallel
        args       - Array of arguments to evaluate func(arg)
        static_arg - The "static" argument (if any), i.e. the variables that are constant in the evaluation of func.
        nWorkers   - Number of workers to process computations.
    Output:
        func(i, static_arg) for i in args.
    """
    # Prepare arguments for func: collect each argument with the static argument (if any)
    if static_arg != None:
        arguments = [[arg] + static_arg for arg in list(args)]
    else:
        arguments = args

    # Initialize workers
    pool = mp.Pool(processes = nWorkers)

    # Evaluate function
    result = pool.map_async(func, arguments, chunksize = chunksize)
    pool.close()
    pool.join()

    return sp.array(result.get()).flatten()
# First test-function. Freeze location and scale for the Normal random variates generator.
# This returns a nested function (a closure); such functions cannot be pickled,
# so this will give an error.
def genNorm(loc, scale):
    def subfunc(a):
        return spstat.norm.rvs(loc = loc, scale = scale, size = a)
    return subfunc
# Second test-function. The same as above, but defined at module level instead of being
# nested, so it is a "plain" function and can be pickled.
def test(fargs):
    x, a, b = fargs
    return spstat.norm.rvs(size = x, loc = a, scale = b)
# Try it out.
N = 1000000
# Set arguments to function. args1 = [1, 1, 1,... ,1], the purpose is just to generate a random variable of size 1 for each
# element in the output vector.
args1 = sp.ones(N)
static_arg = [0, 1] # standardized normal.
# This gives the PicklingError
func = genNorm(*static_arg)
sim = parfor(func, args1, static_arg = None, nWorkers = 12, chunksize = None)
# This is OK:
func = test
sim = parfor(func, args1, static_arg = static_arg, nWorkers = 12, chunksize = None)
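To see where the error comes from without involving multiprocessing at all, the two test functions can be pickled directly. This is just a quick check for illustration, and the exact error message may differ slightly from the traceback above, but it is the same kind of PicklingError:

import cPickle

cPickle.dumps(test)              # fine: test is a module-level function, pickled by reference to its name
try:
    cPickle.dumps(genNorm(0, 1)) # fails: the nested subfunc cannot be found by name in any module
except cPickle.PicklingError as err:
    print 'cannot pickle the function returned by genNorm:', err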
Following the link provided in the answer to Can't pickle <type 'instancemethod'> when using multiprocessing Pool.map(), Steven Bethard suggests (almost at the end) using the copy_reg
module. His code is:
def _pickle_method(method):
    func_name = method.im_func.__name__
    obj = method.im_self
    cls = method.im_class
    return _unpickle_method, (func_name, obj, cls)

def _unpickle_method(func_name, obj, cls):
    for cls in cls.mro():
        try:
            func = cls.__dict__[func_name]
        except KeyError:
            pass
        else:
            break
    return func.__get__(obj, cls)

import copy_reg
import types
copy_reg.pickle(types.MethodType, _pickle_method, _unpickle_method)
I don't really understand how to make use of this. The only thing I could come up with was putting it just before my code, but it did not help. A simple solution would of course be to just go with the version that works and avoid getting involved with copy_reg.
I am more interested in getting copy_reg
to work properly, so that I can take full advantage of multiprocessing without having to work around the problem each time.
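For completeness, here is a minimal, self-contained sketch of how I understand the registration is meant to be used: the copy_reg.pickle(...) call has to run at module level, before the Pool is created, and after that bound methods of a class can be pickled and passed to Pool.map. The class MyNormal and its method rvs are just hypothetical names for illustration. Note that this registration only covers bound methods (types.MethodType), so it would not by itself help with genNorm, which returns a nested function rather than a method.

# Sketch only: hypothetical MyNormal class, registration must run before the Pool exists.
import multiprocessing as mp
import copy_reg
import types
import scipy.stats as spstat

def _pickle_method(method):
    return _unpickle_method, (method.im_func.__name__, method.im_self, method.im_class)

def _unpickle_method(func_name, obj, cls):
    for cls in cls.mro():
        try:
            func = cls.__dict__[func_name]
        except KeyError:
            pass
        else:
            break
    return func.__get__(obj, cls)

# Register the reducer for bound methods at import time, in the main module.
copy_reg.pickle(types.MethodType, _pickle_method, _unpickle_method)

class MyNormal(object):            # hypothetical example class
    def __init__(self, loc, scale):
        self.loc = loc
        self.scale = scale
    def rvs(self, size):           # a bound method, normally not picklable
        return spstat.norm.rvs(loc = self.loc, scale = self.scale, size = size)

if __name__ == '__main__':
    gen = MyNormal(0, 1)
    pool = mp.Pool(processes = 4)
    # gen.rvs is a bound method; with the registration above it can be pickled
    # and shipped to the worker processes.
    out = pool.map(gen.rvs, [1] * 10)
    pool.close()
    pool.join()
    print out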