I have a personal Python library consisting of several modules of scientific programs that I use. These live in a directory with the following structure:
root/__init__.py
root/module1/__init__.py
root/module1/someprog.py
root/module1/ (...)
root/module2/__init__.py
root/module2/someprog2.py
root/module2/somecython.pyx
root/module2/somecython.so
root/module2/somefortran.f
root/module2/somefortran.so
(...)
I am constantly making changes to these programs and adding new files. With my current setup at work, I share the same directory across several machines of different architectures. What I want is a way to use these packages from Python on the different architectures. If the packages were all pure Python, this would be no problem. But the issue is that I have several compiled binaries (as shown in the example) from Cython and from f2py.
Is there a clever way to repackage these binaries so that Python on each system imports only the relevant binaries? I'd like to keep the code organised in the same directory.
Obviously the simplest way would be to duplicate the directory or create another directory of symlinks. But this would mean that when new files are created, I'd have to update the symlinks manually.
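For reference, the symlink fallback I have in mind would look roughly like the sketch below. This is only an illustration, not something I'm running: the paths and the idea of naming the mirror directory after platform.machine() are made up for the example, and the compiled .so files would have to be rebuilt separately inside each mirror. It also still has to be rerun by hand whenever I add files, which is exactly the maintenance step I'd like to avoid.

import os
import platform

# Hypothetical paths: the shared source tree and a per-architecture mirror
# named after platform.machine() (e.g. x86_64, ppc64).
SHARED = os.path.expanduser("~/shared/root")
MIRROR = os.path.expanduser(os.path.join("~/local", platform.machine(), "root"))

for dirpath, dirnames, filenames in os.walk(SHARED):
    rel = os.path.relpath(dirpath, SHARED)
    target_dir = os.path.normpath(os.path.join(MIRROR, rel))
    if not os.path.isdir(target_dir):
        os.makedirs(target_dir)
    for name in filenames:
        # Link only the architecture-independent sources; the .so files
        # would be built per machine (Cython/f2py) inside each mirror.
        if name.endswith((".py", ".pyx", ".f")):
            src = os.path.join(dirpath, name)
            dst = os.path.join(target_dir, name)
            if not os.path.exists(dst):
                os.symlink(src, dst)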
Has anyone bumped into a similar problem, or can anyone suggest a more Pythonic approach to this organisation problem?