I need to run a long-running foobar.py process with Popen and parse its output with a multiprocessing process.
My problem is that sometimes I cannot wait for the parser to finish, so I need to daemonize the parser using the multiprocessing daemon property. I need the parser to be usable in both daemonic and non-daemonic mode. The docs also say that a daemonic process is not allowed to create child processes, so in the daemonic case I launch the Popen process before the parser is forked (see the overridden start method below).
import multiprocessing
import subprocess

class Parser(multiprocessing.Process):
    def __init__(self, daemon, output):
        super(Parser, self).__init__()
        self.daemon = daemon
        self.output = output

    def start(self):
        if self.daemon:
            self.launchFoobar()  # foobar is launched before forking
        super(Parser, self).start()

    def run(self):
        if not self.daemon:
            self.launchFoobar()  # foobar is launched after forking
        self.parseFoobar()

    def launchFoobar(self):
        self.process = subprocess.Popen(["python", "foobar.py"],
                                        stdin=subprocess.PIPE,
                                        stdout=subprocess.PIPE,
                                        stderr=subprocess.STDOUT)

    def parseFoobar(self):
        with open(self.output, "w+") as f:
            for line in iter(self.process.stdout):
                f.write(line)
                f.flush()
        self.process.wait()
Let's say here that foobar.py just waits a few seconds and prints something, and the parseFoobar method just writes that output to a file. In my case both functions are a lot more complex than this.
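For reference, a minimal stand-in for foobar.py could look like this (the real script is more complex; the line count, delay, and message format here are arbitrary placeholders):

```python
# foobar.py -- minimal stand-in: emit a few lines, one per second,
# flushing after each write so the parent's pipe sees them promptly.
import sys
import time

def main(n=5, delay=1.0, out=sys.stdout):
    """Write n lines, pausing `delay` seconds before each one."""
    for i in range(n):
        time.sleep(delay)
        out.write("line %d\n" % i)
        out.flush()

if __name__ == "__main__":
    main()
```

With zero delay and an in-memory stream, main(3, 0.0, buf) leaves "line 0\nline 1\nline 2\n" in buf.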
Running Parser(daemon=False, output="sync.txt").start()
works fine and there is some output in sync.txt. But running Parser(daemon=True, output="async.txt").start()
does not produce anything in async.txt and seems to be blocked at the line for line in iter(self.process.stdout):
because the file is created, but it stays empty.
Why doesn't it work? How can I fix it?
You can find gists for parser.py and foobar.py for testing. Just run python parser.py
and look at output files.
Edit: There are some tips in Django daemonize methods.