
I'm using subprocess.Popen to create an automated build script for the SciPy stack.

My current process is shown below.

mathbuild.json:

{"suitesparse": {"version": "4.2.1",
         "dependencies": ["metis"],
         "downloads": ["http://www.cise.ufl.edu/research/sparse/SuiteSparse/SuiteSparse-4.2.1.tar.gz"],
         "build": ["cd $DL_DIR",
               "tar xvfz SuiteSparse-4.2.1.tar.gz",
               "cd SuiteSparse",
               "cp -r $DL_DIR/metis-4.0.3 metis-4.0.3"]},

 "metis": {"version": "4.0.3",
       "dependencies": [],
       "downloads": ["http://glaros.dtc.umn.edu/gkhome/fetch/sw/metis/OLD/metis-4.0.3.tar.gz"],
       "build": ["cd $DL_DIR",
             "tar xvfz metis-4.0.3.tar.gz",
             "cd metis-4.0.3",
             "make"]}}

mathbuild.py:

import argparse
import json
import os
import subprocess

# (create_venv and download_package are defined elsewhere in the script; omitted here)

def package_list(package, config):
    for dependency in config[package]['dependencies']:
        yield from package_list(dependency, config)
    yield package

def build_package(package, config):
    command = '; '.join(config[package]['build'])
    build = subprocess.Popen(command, shell=True)


def process_package(package, config, env_dir, dl_dir):
    print('INSTALLING {0}'.format(package))
    print('Downloading...')
    download_package(package, config, dl_dir)
    print('Building...')
    build_package(package, config)


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='Install Pylab in a new venv.')
    parser.add_argument('env_dir', help='target directory for new environment')
    args = parser.parse_args()
    os.environ['ENV_DIR'], os.environ['DL_DIR'] = create_venv(args.env_dir)
    with open('mathbuild.json') as f:
        cfg = json.load(f)
    processed = []
    for package in package_list('suitesparse', cfg):
        if package not in processed:
            process_package(package, cfg,
                            os.environ['ENV_DIR'],
                            os.environ['DL_DIR'])
            processed += [package]

It creates a list of dependencies (such that later items depend on earlier ones) and then processes each one (downloading and then building based on the commands in the JSON file).
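
For the configuration above, package_list yields each package's dependencies before the package itself, so the loop sees metis first and suitesparse second. A quick standalone check (a sketch that just re-uses the generator and the JSON file):

import json

def package_list(package, config):
    # same generator as in mathbuild.py: dependencies come out first
    for dependency in config[package]['dependencies']:
        yield from package_list(dependency, config)
    yield package

with open('mathbuild.json') as f:
    cfg = json.load(f)

print(list(package_list('suitesparse', cfg)))  # ['metis', 'suitesparse']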

The issue is that packages are being built via new subprocess.Popen calls before their dependencies have finished building. In the above example, the suitesparse build begins even before the metis build is done. I assume that's because I'm opening a new subprocess each time through the for package in package_list('suitesparse', cfg) loop without checking whether the previous subprocesses have finished.

The question: What's the best way to synchronize the loop's Popen calls so that each call starts only after the previous call to Popen (i.e. the previous item in the list) has finished?

What I've tried: I've tried changing the loop so that it builds one combined Popen call (with both package builds joined into a single command), but that seems hackish.
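
Roughly, the combined version looks something like this (just a sketch of that workaround, with a hypothetical build_all helper in place of build_package):

import subprocess

def build_all(packages, config):
    # Concatenate every package's build steps into one shell command so a
    # single Popen runs them in order -- it works, but it feels hackish.
    commands = []
    for package in packages:
        commands.extend(config[package]['build'])
    subprocess.Popen('; '.join(commands), shell=True)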


1 Answer


It looks like you want subprocess.check_call() instead of Popen. From the docs:

Run command with arguments. Wait for command to complete. If the return code was zero then return, otherwise raise CalledProcessError.

Your build function would then look like:

def build_package(package, config):
    command = '; '.join(config[package]['build'])
    subprocess.check_call(command, shell=True)
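
Because check_call blocks until the command exits and raises CalledProcessError on a non-zero return code, your main loop will now wait for each build before moving on to the next package. If you also want to report which build failed, something along these lines inside the loop would work (a sketch; the variable names match your script):

try:
    build_package(package, cfg)
except subprocess.CalledProcessError as e:
    print('Build of {0} failed with exit code {1}'.format(package, e.returncode))
    raise  # stop processing; later packages depend on this one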

If you're actually using the Popen object, you can call its wait() method to wait for the child command to finish:

def build_package(package, config):
    command = '; '.join(config[package]['build'])
    build = subprocess.Popen(command, shell=True)
    # do something with the build object
    build.wait()
    # command is done
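
Note that, unlike check_call(), wait() doesn't raise on failure; it just returns the command's exit status, so check its return value (or build.returncode) if you want the script to stop when a build fails.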
answered 2013-06-24T03:14:22.240