I'm using multiprocessing in my project. I have a worker function which puts its results into a queue. Everything works fine, but as the size of x increases (in my case x is an array) something goes wrong. Here is a simplified version of my code:
from multiprocessing import Process, Queue

def do_work(queue, x):
    result = heavy_computation_function(x)
    queue.put(result)  # PROBLEM HERE

def parallel_something():
    queue = Queue()
    procs = [Process(target=do_work, args=(queue, i)) for i in xrange(20)]
    for p in procs: p.start()
    for p in procs: p.join()
    results = []
    while not queue.empty():
        results.append(queue.get())
    return results
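heavy_computation_function is not shown here; for the purposes of this question, just imagine it does a lot of work and returns a large result. The stand-in below is only an illustration (the name is mine from the real code, but the body and the size are made up):

def heavy_computation_function(x):
    # placeholder: the real function does heavy work; the important
    # detail is that the returned result can get quite large
    return [float(x)] * 1000000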
In the system monitor I can see the Python processes working, but then at some point all of the processes are still alive yet doing nothing. This is what I get when I interrupt the program with Ctrl-C:
pid, sts = os.waitpid(self.pid, flag)
KeyboardInterrupt
I did some tests, and the problem seems to be in putting the results into the queue: if I don't put the results, everything works, but then of course there is no point in running the workers.
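In other words, the version that does not hang is just the same worker with the put skipped:

def do_work(queue, x):
    result = heavy_computation_function(x)
    # queue.put(result)  # with this line commented out, all processes finish

So the computation itself completes; it only gets stuck once the result has to go through the queue.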