I am trying to create a class that can launch a separate process to do some long-running work, start a bunch of these from a main module, and then wait for them all to finish. I want to launch the processes once and then keep feeding them things to do rather than creating and destroying processes. For example, maybe I have 10 servers running the dd command, and then I want them all to scp a file, etc.
My ultimate goal is to create a class for each system that keeps track of the information of the system it is tied to, such as IP address, logs, runtime, etc. But that class must be able to launch a system command and then return execution to the caller while that system command runs, so it can follow up on the result of the system command later.
My attempt is failing because I cannot send an instance method of a class over the pipe to the subprocess via pickle: instance methods are not picklable. I tried fixing that in various ways, but I can't figure it out. How can my code be patched to do this? What good is multiprocessing if you can't send over anything useful?
Is there any good documentation of multiprocessing being used with class instances? The only way I can get the multiprocessing module to work is on simple functions. Every attempt to use it within a class instance has failed. Maybe I should pass events instead? I don't understand how to do that yet.
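To illustrate the failure in isolation, here is a minimal standalone reproduction (Foo and bar are throwaway names, not part of my code):

import pickle

class Foo(object):
    def bar(self):
        return 42

foo = Foo()
# Under Python 2 this raises a pickling error: bound instance
# methods are not picklable by default
pickle.dumps(foo.bar)

Here is my actual attempt: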
import multiprocessing
import sys
import re
class ProcessWorker(multiprocessing.Process):
"""
This class runs as a separate process to execute worker's commands in parallel
Once launched, it remains running, monitoring the task queue, until "None" is sent
"""
def __init__(self, task_q, result_q):
multiprocessing.Process.__init__(self)
self.task_q = task_q
self.result_q = result_q
return
def run(self):
"""
        Overrides multiprocessing.Process.run(); called when start() is invoked
"""
proc_name = self.name
print '%s: Launched' % (proc_name)
while True:
next_task_list = self.task_q.get()
            if next_task_list is None:
# Poison pill means shutdown
print '%s: Exiting' % (proc_name)
self.task_q.task_done()
break
next_task = next_task_list[0]
print '%s: %s' % (proc_name, next_task)
args = next_task_list[1]
kwargs = next_task_list[2]
answer = next_task(*args, **kwargs)
self.task_q.task_done()
self.result_q.put(answer)
return
# End of ProcessWorker class
class Worker(object):
"""
Launches a child process to run commands from derived classes in separate processes,
which sit and listen for something to do
This base class is called by each derived worker
"""
def __init__(self, config, index=None):
self.config = config
self.index = index
        # Launch the ProcessWorker for anything that has an index value
if self.index is not None:
self.task_q = multiprocessing.JoinableQueue()
self.result_q = multiprocessing.Queue()
self.process_worker = ProcessWorker(self.task_q, self.result_q)
self.process_worker.start()
print "Got here"
# Process should be running and listening for functions to execute
return
def enqueue_process(target): # No self, since it is a decorator
"""
        Used to place a command target from this class into the task_q
NOTE: Any function decorated with this must use fetch_results() to get the
target task's result value
"""
def wrapper(self, *args, **kwargs):
self.task_q.put([target, args, kwargs]) # FAIL: target is a class instance method and can't be pickled!
return wrapper
def fetch_results(self):
"""
After all processes have been spawned by multiple modules, this command
        is called on each one to retrieve the results of the call.
This blocks until the execution of the item in the queue is complete
"""
        self.task_q.join()  # Wait for the task to finish
return self.result_q.get() # Return the result
@enqueue_process
def run_long_command(self, command):
print "I am running number % as process "%number, self.name
# In here, I will launch a subprocess to run a long-running system command
# p = Popen(command), etc
# p.wait(), etc
return
def close(self):
self.task_q.put(None)
self.task_q.join()
if __name__ == '__main__':
config = ["some value", "something else"]
index = 7
workers = []
for i in range(5):
worker = Worker(config, index)
worker.run_long_command("ls /")
workers.append(worker)
for worker in workers:
worker.fetch_results()
# Do more work... (this would actually be done in a distributor in another class)
for worker in workers:
worker.close()
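To show the shape of what I am after, here is a sketch of a pattern I imagine could work instead: sending only picklable tokens (a command-name string plus an argument tuple) through the queue and dispatching on them inside the child process. Everything in it (TokenWorker, do_work, the token format) is hypothetical and only illustrates the idea, not my real command handling:

import multiprocessing

def do_work(command):
    # Stand-in for launching the long-running system command
    return "ran: %s" % command

class TokenWorker(multiprocessing.Process):
    """Runs until it receives None; executes picklable tokens from task_q"""
    def __init__(self, task_q, result_q):
        multiprocessing.Process.__init__(self)
        self.task_q = task_q
        self.result_q = result_q

    def run(self):
        while True:
            token = self.task_q.get()
            if token is None:               # poison pill
                self.task_q.task_done()
                break
            name, args = token              # (command-name string, argument tuple)
            if name == "run_long_command":
                self.result_q.put(do_work(*args))
            self.task_q.task_done()

if __name__ == '__main__':
    task_q = multiprocessing.JoinableQueue()
    result_q = multiprocessing.Queue()
    TokenWorker(task_q, result_q).start()
    task_q.put(("run_long_command", ("ls /",)))
    task_q.join()                           # wait for the task to complete
    print result_q.get()
    task_q.put(None)                        # shut the child down
    task_q.join()

If this is the right direction, I would still like to know how to tie it back to the per-system state held in the Worker class.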
EDIT: I tried moving the creation of the ProcessWorker class and of the multiprocessing queues outside of the Worker class and then manually pickling the worker instance. Even that doesn't work, and I get the error
RuntimeError: Queue objects should only be shared between processes through inheritance
But I am only passing references of those queues into the worker instance?? I am missing something fundamental. Here is the modified code of the main section:
import pickle

if __name__ == '__main__':
config = ["some value", "something else"]
index = 7
workers = []
for i in range(1):
task_q = multiprocessing.JoinableQueue()
result_q = multiprocessing.Queue()
process_worker = ProcessWorker(task_q, result_q)
worker = Worker(config, index, process_worker, task_q, result_q)
something_to_look_at = pickle.dumps(worker) # FAIL: Doesn't like queues??
process_worker.start()
worker.run_long_command("ls /")