A key idea in pip is that package versions listed in requirement files (or as pip install arguments) take precedence over those located during the normal dependency resolution process, which uses “install_requires” metadata.
This allows users to be in control of specifying an environment of packages that are known to work together.
Instead of running something like pip install MyApp and getting whatever libraries come along, you’d run pip install -r requirements.txt where “requirements.txt” contains something like:
MyApp
Framework==0.9.4
Library>=0.2
Regardless of what MyApp lists in setup.py, you’ll get a specific version of Framework (0.9.4) and at least the 0.2 version of Library. Additionally, you can add optional libraries and support tools that MyApp doesn’t strictly require, giving people a set of recommended libraries.
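The behavior of these version specifiers can be sketched as a toy comparison in Python. This is a simplified illustration only — real pip implements the full version-matching rules, and the versions below are the hypothetical ones from the example:

```python
# Toy sketch of how "==" and ">=" specifiers constrain candidate versions.
# Simplified: assumes purely numeric dotted versions (real pip handles much more).
def ver(s):
    return tuple(int(part) for part in s.split("."))

def satisfies(candidate, op, target):
    if op == "==":
        return ver(candidate) == ver(target)
    if op == ">=":
        return ver(candidate) >= ver(target)
    raise ValueError("unsupported operator: " + op)

print(satisfies("0.9.4", "==", "0.9.4"))  # Framework pin -> True
print(satisfies("0.2.1", ">=", "0.2"))    # Library minimum -> True
print(satisfies("0.1.9", ">=", "0.2"))    # too old -> False
```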
Requirement files are intended to be exhaustive and flat: they should list every package in the environment. Maybe MyApp requires Framework, and Framework requires Library; it is still encouraged to list all three in a single requirements file. It is the nature of Python programs that there are often implicit bindings directly between MyApp and Library. For instance, Framework might expose one of Library’s objects, so if Library is updated it might directly break MyApp. If that happens, you can update the requirements file to force an earlier version of Library, without having to re-release MyApp at all.
To create a new requirements file from a known working environment, use:
$ pip freeze > stable-req.txt
This will write a listing of all installed libraries to stable-req.txt with exact versions for every library.
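The freeze output format can be illustrated with a small sketch (the installed set below is hypothetical — pip freeze reads the real metadata of the current environment):

```python
# Sketch of the `pip freeze` output format: one "name==version" line per
# installed distribution, suitable for use as a pinned requirements file.
installed = {"Framework": "0.9.4", "Library": "0.2.5", "MyApp": "1.0"}

def freeze(dists):
    # Exact "==" pins, sorted by name, one per line.
    return [f"{name}=={version}" for name, version in sorted(dists.items())]

print("\n".join(freeze(installed)))
```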
Often, you will want a fast install from local archives, without probing PyPI.
First, download the archives that fulfill your requirements:
$ pip install --download <DIR> -r requirements.txt
Then, install using only the downloaded archives, without contacting PyPI:
$ pip install --no-index --find-links=[file://]<DIR> -r requirements.txt
To install from wheels on PyPI, if they were to exist (which is not likely for the short term):
pip install --use-wheel SomePackage
To install directly from a wheel archive:
pip install SomePackage-1.0-py2.py3-none-any.whl
Since wheels won’t be pervasive on PyPI for a while, pip additionally offers pip wheel as a convenience, to build wheels for your requirements and dependencies.
To build wheels for your requirements and all their dependencies to a local directory:
pip install wheel
pip wheel --wheel-dir=/local/wheels -r requirements.txt
And then to install those requirements just using your local directory of wheels (and not from PyPI):
pip install --use-wheel --no-index --find-links=/local/wheels -r requirements.txt
The wheel format can eliminate a lot of redundant compilation, but it’s generally not advisable to upload a pre-compiled linux-x86-64 library binding to PyPI. Wheel’s tags are designed to express only the most important Python-specific compatibility concerns (Python version, ABI, and architecture), but do not represent other important binary compatibility factors such as the OS release, patch level, and the versions of all the shared library dependencies of any extensions inside the package.
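Those tags are encoded directly in the wheel filename, in the form name-version-pythontag-abitag-platformtag.whl. A simplified parser sketch — it assumes no build tag and no extra hyphens in the name or version, which the wheel spec does allow:

```python
def parse_wheel_filename(filename):
    # Simplified: assumes exactly five hyphen-separated fields and no build tag.
    stem = filename[: -len(".whl")]
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python": python_tag,      # e.g. py2.py3
        "abi": abi_tag,            # e.g. none
        "platform": platform_tag,  # e.g. any, or linux_x86_64
    }

tags = parse_wheel_filename("SomePackage-1.0-py2.py3-none-any.whl")
print(tags["python"], tags["abi"], tags["platform"])  # py2.py3 none any
```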
Rather than representing all possible compatibility information in the wheel itself, the wheel design suggests distribution-specific build services (e.g. a separate index for Fedora Linux binary wheels, compiled by the index maintainer). This is the same solution taken by Linux distributions which all re-compile their own packages instead of installing each other’s binary packages.
Some kinds of precompiled C extension modules can make sense on PyPI, even for Linux. Good examples include things that can be sensibly statically linked (a cryptographic hash function, or an accelerator module that is not a binding for an external library); the best example of something that shouldn’t be statically linked is a library like OpenSSL that needs to be constantly kept up to date for security. Regardless of whether a compatible pre-built package is available, many Linux users will prefer to compile their own anyway.
On Windows the case for binary wheels on PyPI is stronger, both because Windows machines are much more uniform than Linux machines and because it’s harder for the end user to compile their own. Windows-compatible wheels uploaded to PyPI should be compatible with the Python distributions downloaded from http://python.org/. If you already upload other binary formats to PyPI, upload wheels as well. Unlike the older formats, wheels are compatible with virtual environments.
pip allows you to just download the source archives for your requirements, without installing anything and without regard to what’s already installed.
$ pip install --download <DIR> -r requirements.txt
or, for a specific package:
$ pip install --download <DIR> SomePackage
pip allows you to just unpack archives to a build directory without installing them to site-packages. This can be useful to troubleshoot install errors or to inspect what is being installed.
$ pip install --no-install SomePackage
If you’re in a virtualenv, the build dir is <virtualenv path>/build. Otherwise, it’s <OS temp dir>/pip-build-<username>.
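The rule above can be sketched as follows. This is an illustrative sketch only, not pip’s actual code; it relies on the sys.real_prefix attribute that classic virtualenv sets inside an activated environment:

```python
import getpass
import os
import sys
import tempfile

def default_build_dir():
    # Classic virtualenv sets sys.real_prefix inside an activated environment.
    if hasattr(sys, "real_prefix"):
        return os.path.join(sys.prefix, "build")
    # Outside a virtualenv: <OS temp dir>/pip-build-<username>
    return os.path.join(tempfile.gettempdir(), "pip-build-" + getpass.getuser())

print(default_build_dir())
```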
Afterwards, to finish the job of installing unpacked archives, run:
$ pip install --no-download SomePackage
pip install --upgrade is currently written to perform a recursive upgrade.
For example, running pip install --upgrade SomePackage would upgrade SomePackage and its dependency AnotherPackage, despite AnotherPackage already being satisfied.
If you would like to perform a non-recursive upgrade, follow these two steps:
pip install --upgrade --no-deps SomePackage
pip install SomePackage
The first command will upgrade SomePackage, but not dependencies like AnotherPackage. The second command will fill in any newly required dependencies (e.g. OneMorePackage).
Three things are required to fully guarantee a repeatable installation using requirements files: the requirements file itself must have every package pinned to an exact version (as produced by pip freeze); the install must be run with --no-deps, so that only what is listed in the file gets installed; and the set of archives being installed from must not change (e.g. a local directory of downloaded archives used with --no-index --find-links, rather than the live PyPI index).
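One necessary condition for repeatability is that every requirement in the file is pinned to an exact version. That property can be checked mechanically; a small sketch (the helper name is ours, not pip’s):

```python
def fully_pinned(requirement_lines):
    # Ignore blank lines and comments; require an exact "==" pin on the rest.
    reqs = [line.strip() for line in requirement_lines
            if line.strip() and not line.strip().startswith("#")]
    return all("==" in req for req in reqs)

print(fully_pinned(["MyApp==1.0", "Framework==0.9.4"]))  # True
print(fully_pinned(["MyApp==1.0", "Library>=0.2"]))      # False: ">=" is a range
```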
With Python 2.6 came the “user scheme” for installation, which means that all Python distributions support an alternative install location specific to a user. The default location for each OS is explained in the Python documentation for the site.USER_BASE variable. This mode of installation can be turned on by specifying the --user option to pip install.
Moreover, the “user scheme” can be customized by setting the PYTHONUSERBASE environment variable, which updates the value of site.USER_BASE.
To install “SomePackage” into an environment with site.USER_BASE customized to ‘/myappenv’, do the following:
export PYTHONUSERBASE=/myappenv
pip install --user SomePackage
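The effect of PYTHONUSERBASE on site.USER_BASE can be observed by asking a fresh interpreter for its user base (a sketch; /myappenv is the hypothetical path from the example above):

```python
import os
import subprocess
import sys

# Spawn a fresh interpreter with PYTHONUSERBASE set and ask it for the
# resulting user base via site.getuserbase().
env = dict(os.environ, PYTHONUSERBASE="/myappenv")
out = subprocess.check_output(
    [sys.executable, "-c", "import site; print(site.getuserbase())"],
    env=env,
)
print(out.decode().strip())  # /myappenv
```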
Setuptools offers the setup_requires setup() keyword for specifying dependencies that need to be present in order for the setup.py script to run. Internally, Setuptools uses easy_install to fulfill these dependencies.
pip has no way to control how these dependencies are located. None of the Package Index Options have an effect.
The solution is to configure a “system” or “personal” Distutils configuration file to manage the fulfillment.
For example, to have the dependency located at an alternate index, add this:
[easy_install]
index_url = https://my.index-mirror.com
To have the dependency located from a local directory and not crawl PyPI, add this:
[easy_install]
allow_hosts = ''
find_links = file:///path/to/local/archives
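Such a section can also be generated programmatically with the standard configparser module. A sketch — the target filename (e.g. ~/.pydistutils.cfg for the personal config file on Unix) depends on the platform, so writing to disk is left out here:

```python
import configparser
import io

# Build the [easy_install] section shown above in memory.
cfg = configparser.ConfigParser()
cfg["easy_install"] = {
    "allow_hosts": "''",
    "find_links": "file:///path/to/local/archives",
}

buf = io.StringIO()
cfg.write(buf)
print(buf.getvalue())
```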