I realize this is an old question, but in case you end up here like I did, here is what worked for me.
I have a package on GitHub (not registered with PyPI) that depends on other GitHub (non-PyPI) packages. I spent an inordinate amount of time trying to figure out how to get pip to handle this correctly, so I'll describe the fix here.
Putting dependencies in a requirements.txt file is the preferred way of listing them, but you also need to populate install_requires in setup(). That is the stage where I hit a roadblock: pip refused to install my dependencies from GitHub.
Most places, including answers to this question, tell you to populate the dependency_links section of setup(). What they tend to leave out is that install_requires must also contain an entry whose name and version match the #egg fragment of the URL in dependency_links.
For example, if your requirements.txt contains the following:
somepackage==1.2.0
https://github.com/user/repo/tarball/master#egg=repo-1.0.0
anotherpackage==4.2.1
Then, your setup call should look like this:
setup(
    name='yourpackage',
    version='1.7.5',
    packages=[],
    url='',
    license='',
    author='',
    author_email='',
    description='',
    install_requires=[
        'somepackage==1.2.0',
        'repo==1.0.0',
        'anotherpackage==4.2.1'
    ],
    dependency_links=[
        'https://github.com/user/repo/tarball/master#egg=repo-1.0.0'
    ]
)
Ok, so now we've got our package configured; installing it is the next task, and this is where I spent most of my time. I could not figure out why specifying dependency_links apparently did nothing. The trick is that pip needs the --process-dependency-links flag, and in some cases the --allow-all-external flag as well (--allow-external <package> is the more targeted variant). For example:
pip install git+https://github.com/user/anotherrepo.git --process-dependency-links --allow-all-external
You're done and it works!
DISCLAIMER: dependency_links and the --process-dependency-links and --allow-all-external flags are deprecated and slated for removal. In the time I spent on this, I could not find a better, preferred method that still let pip function properly.
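If you are on a newer pip and setuptools than I was, the intended replacement for dependency_links is a PEP 508 direct URL reference placed straight into install_requires. The sketch below reuses the placeholder repo URL from above and assumes a pip/setuptools combination recent enough to understand the name @ URL syntax; I have not verified it against the versions that were current when this answer was written:
setup(
    name='yourpackage',
    version='1.7.5',
    install_requires=[
        'somepackage==1.2.0',
        # PEP 508 direct reference; replaces the dependency_links entry above
        'repo @ https://github.com/user/repo/tarball/master',
        'anotherpackage==4.2.1'
    ]
)
Note that PyPI rejects uploads whose metadata declares direct URL dependencies, but since these packages are not on PyPI anyway, that should not matter here.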