This is a partial revert of 6afc718307
to restore the ability to overwrite distutils-installed packages.
It is not elegant, but some projects (such as OpenStack's devstack)
rely on overwriting packages installed via the system package manager.
These packages can't be removed because they are dependencies of
parts of the base system, but many of the things devstack needs to run
require newer versions. For historical reasons it's not easy to
move all of this into a virtualenv, etc., at once.
If distributions move to setuptools-based packages, this problem might
fix itself.
Since pip 7, `pip freeze` has been producing such mismatching #egg
fragments, so forbidding them in pip 8 would be too strongly
backward-incompatible.
This is a partial rollback of 1a012bb6.
Compare extras when checking if a requirement has already been
specified, and take a union of the extras before installation.
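A minimal sketch of the union behaviour, using pkg_resources for parsing (illustrative only; pip's internal representation differs):
```
from pkg_resources import Requirement

# Two specifications of the same requirement with different extras...
first = Requirement.parse("raven[flask]")
second = Requirement.parse("raven[tests]")

# ...are merged, so both sets of extras get installed.
merged_extras = set(first.extras) | set(second.extras)
print(sorted(merged_extras))  # ['flask', 'tests']
```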
Co-Authored-By: Sachi King <nakato@nakato.io>
Closes #3046, #3189
A lot of existing tarballs will successfully build a wheel, but the
wheel will be implicitly broken because they will have dynamically
adjusted the install_requires inside their setup.py. Typically
this is done based on the Python version, the implementation, or the
OS being installed on. We don't consider cache directories to be
OS-agnostic, but we do consider them to be Python version and
implementation agnostic. To solve this, we'll force the cached
wheel to use a more specific Python tag that includes the major
version and the implementation.
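A sketch of deriving such a tag (illustrative; pip's real logic lives in its pep425tags module and differs in detail):
```
import platform
import sys

# Map implementation names to the abbreviations used in wheel tags.
IMPL_CODES = {"CPython": "cp", "PyPy": "pp", "Jython": "jy", "IronPython": "ip"}

impl = IMPL_CODES.get(platform.python_implementation(), "py")
# e.g. "cp27" on CPython 2.7 -- specific enough that a wheel cached
# under one Python won't be reused by a different version or
# implementation.
tag = "{0}{1}{2}".format(impl, sys.version_info[0], sys.version_info[1])
print(tag)
```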
format_exc takes only one argument, `limit`, which should be an integer.
Python 2 seems more lenient than Python 3 on that point.
Mistake introduced in commit 3148b967a.
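For illustration, the valid call shape (a minimal sketch):
```
import traceback

try:
    1 / 0
except ZeroDivisionError:
    # `limit` must be an integer (max number of stack frames to show);
    # Python 3 is stricter about this than Python 2 was.
    print(traceback.format_exc(limit=1))
```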
This would occur when, for example, installing from a requirements file that references a certain hashed sdist, a common situation.
As of pip 7, pip always tries to build a wheel for each requirement (if one wasn't provided directly) and installs from that. The way this was implemented, InstallRequirement.link pointed to the cached wheel, which obviously had a different hash than the index-sourced archive, so spurious mismatch errors would result.
Now we no longer read from the wheel cache in hash-checking mode.
Make populate_link(), rather than the `link` setter, responsible for mapping InstallRequirement.link to a cached wheel. populate_link() isn't called until prepare_files(). At that point, when we've examined all InstallRequirements and their potential --hash options, we know whether we should be requiring hashes and thus whether to use the wheel cache at all.
The only place that sets InstallRequirement.link other than InstallRequirement itself is pip.wheel, which does so long after hashes have been checked, when it's unpacking the wheel it just built, so it won't cause spurious hash mismatches.
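A hypothetical sketch of that decision (names are illustrative; pip's real signatures differ):
```
def populate_link(req, finder, wheel_cache, require_hashes):
    """Point req.link at a source, consulting the wheel cache only
    when hash-checking mode is off, so hashes are always computed
    against the index-sourced archive."""
    if req.link is None:
        req.link = finder.find_requirement(req)
    if not require_hashes and wheel_cache is not None:
        cached = wheel_cache.cached_wheel(req.link, req.name)
        if cached is not None:
            req.link = cached
```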
* Add --require-hashes option. This is handy in deployment scripts to force application authors to hash their requirements. It is also a convenient way to get pip to show computed hashes for a virgin, unhashed requirements file. Eventually, additions to `pip freeze` should fill a superset of this use case. (A usage sketch follows this list.)
* In --require-hashes mode, at least one hash is required to match for each requirement.
* Option-based requirements (--sha256=...) turn on --require-hashes mode implicitly.
* Internet-derived URL-based hashes are "necessary but not sufficient": they do not satisfy --require-hashes mode when they match, but they are still used to guard against transmission errors.
* Other URL-based requirements (#md5=...) are treated just like flag-based ones, except they don't turn on --require-hashes.
* Complain informatively, with the most devastating errors first so you don't chase your tail all day only to run up against a brick wall at the end. This also means we don't complain that a hash is missing, only for the user to find, after fixing it, that we have no idea how to even compute a hash for that type of requirement.
* Complain about unpinned requirements when hash-checking mode is on, lest they cause the user surprise later.
* Complain about missing hashes.
* Complain about requirement types we don't know how to hash (like VCS ones and local dirs).
* Have InstallRequirement keep its original Link around (original_link) so we can differentiate between URL hashes from requirements files and ones downloaded from the (untrustworthy) internet.
* Remove test_download_hashes, which is obsolete. Similar coverage is provided in test_utils.TestHashes and the various hash cases in test_req.py.
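The usage sketch referenced above, using the flag form described in this list (the digest value is a placeholder):
```
# requirements.txt -- every requirement is pinned and carries at least
# one hash; an option-based hash like this turns on --require-hashes
# mode implicitly.
FooProject==1.2 --sha256=<hex digest of the expected archive>

# Force hash-checking explicitly, e.g. from a deployment script:
$ pip install --require-hashes -r requirements.txt
```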
- unmark test_multiple_search for failure
- search for nonexistentpackage instead of non-existent-package since PyPI
  seems to fall back to searching for "package" and manages to find
  things...
`pip download` has the same functionality as `pip install --download`,
and the behavior of `pip install --download` is preserved with a deprecation
warning. `pip install --download` will be removed in pip version 10.
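For example (package name illustrative):
```
# Deprecated spelling, removed in pip 10:
$ pip install --download ./wheelhouse SomePackage

# Equivalent new command:
$ pip download --dest ./wheelhouse SomePackage
```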
This saves a network hop when using git and passing an explicit sha
as a ref, by comparing it against the version that's already checked out.
Yields a ~4x speedup on my local machine.
Before:
```
$ /usr/local/bin/pip --version
pip 7.1.0 from /usr/local/lib/python2.7/site-packages (python 2.7)
$ time /usr/local/bin/pip install --disable-pip-version-check -e git+https://github.com/getsentry/raven-python.git@56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e#egg=raven_dev
Obtaining raven-dev from git+https://github.com/getsentry/raven-python.git@56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e#egg=raven_dev
Updating ./src/raven-dev clone (to 56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e)
Could not find a tag or branch '56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e', assuming commit.
Installing collected packages: raven-dev
Running setup.py develop for raven-dev
Successfully installed raven-dev
/usr/local/bin/pip install --disable-pip-version-check -e 0.84s user 0.48s system 39% cpu 3.300 total
```
After:
```
$ /Users/matt/.virtualenvs/pip/bin/pip --version
pip 7.2.0.dev0 from /Users/matt/code/pip (python 2.7)
$ time /Users/matt/.virtualenvs/pip/bin/pip install --disable-pip-version-check -e git+https://github.com/getsentry/raven-python.git@56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e#egg=raven_dev
Obtaining raven-dev from git+https://github.com/getsentry/raven-python.git@56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e#egg=raven_dev
checking version
Skipping because already up-to-date.
Installing collected packages: raven-dev
Running setup.py develop for raven-dev
Successfully installed raven-dev
/Users/matt/.virtualenvs/pip/bin/pip install --disable-pip-version-check -e 0.59s user 0.22s system 98% cpu 0.824 total
```
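A minimal sketch of the check (illustrative; pip's Git backend wraps these calls differently):
```
import subprocess

def is_exact_sha_checked_out(repo_dir, requested_sha):
    # Compare the requested ref against the commit already checked out;
    # if they match, the network fetch can be skipped entirely.
    head = subprocess.check_output(
        ["git", "rev-parse", "HEAD"], cwd=repo_dir
    ).decode().strip()
    return head == requested_sha
```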
When installing an editable from file:///path/to/file, pip currently does
not attempt to determine the name from #egg=NAME, just passing back
None. This causes the constraints code to ignore this line entirely,
resulting in unexpected installation behaviour.
This patch makes '-e file:///path#egg=name' function similarly to
'file:///path#egg=name' and '-e git+URL#egg=name'. If #egg=name is not
defined, it returns None and the package becomes an unnamed requirement,
which the constraints code will not parse, but which, in the case of a
requirement, will later be processed and its name determined.
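An illustrative take on the fragment handling (not pip's actual parser):
```
try:
    from urllib.parse import urlparse, parse_qs  # Python 3
except ImportError:
    from urlparse import urlparse, parse_qs  # Python 2

def egg_name_from_url(url):
    # Return the #egg=NAME value, or None so the requirement stays
    # unnamed, as described above.
    fragment = urlparse(url).fragment
    return parse_qs(fragment).get("egg", [None])[0]

print(egg_name_from_url("file:///path/to/pkg#egg=name"))  # name
print(egg_name_from_url("file:///path/to/pkg"))           # None
```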
Closes #3026
Currently, if the local package you are trying to install is listed in
the constraints file, pip will silently fail to install the local package
while installing its dependencies.
This was caused by has_requirement returning true when a
constraint with that name was defined, resulting in the local package
not being processed.
Closes #2928
Fixes bug #2683
There are two changes here; one to fix the using-wheels codepath and one
to fix the no-wheels codepath. Two tests are introduced, one to test
each codepath.
If a single package is listed as a constraint, is a dependency of a
package being installed, *and* is already installed, we end up
processing it multiple times. This change adds a new "prepared" flag
which we set the first time the package is processed, to prevent
multiple handling.
Fixes bug #2888
This adds constraints files. Like requirements files, constraints files
control what version of a package is installed, but unlike
requirements files they don't themselves cause the package to be installed.
This allows things that aren't explicitly desired to be constrained if
and only if they are installed.
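For example (file contents illustrative):
```
# constraints.txt pins versions without requesting installation:
$ cat constraints.txt
requests==2.7.0

# requests is only installed (and then only at 2.7.0) if something in
# requirements.txt actually depends on it:
$ pip install -c constraints.txt -r requirements.txt
```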
Importing it prevents debugging other packages with `-W error`, as the
deprecation warning will raise.
There is still imp imported from a few vendored packages, though,
and for purposes other than cache_from_source.
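Demonstration of the failure mode: on Python 3, imp warns at import time, so under `-W error` merely importing it raises, and any program that transitively imports imp dies before it can be debugged (the warning class and wording vary slightly across versions):
```
$ python -W error -c "import imp"
```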
Using --install-option, --build-option, or --global-option changes
the way that setup.py behaves, and isn't honoured by the wheel code.
The new wheel autobuilding code made this very obvious, so disable
the use of wheels when these options are supplied.
With wheel autobuilding in place, a release blocker is a granular
way to opt out of wheels for known-bad packages. This patch introduces
two new options, --no-binary and --only-binary, to control what
archives we are willing to use, on both a global and a per-package basis.
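Example usage of the new options (package name illustrative):
```
# Build lxml from sdist even if a wheel exists; wheels OK elsewhere:
$ pip install --no-binary lxml -r requirements.txt

# Refuse sdists entirely; install from wheels only:
$ pip install --only-binary :all: -r requirements.txt
```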
This also closes #2084.
Building wheels before installing eliminates a cause of broken environments:
install failing after we've already installed one or more packages.
If a package fails to build a wheel, we run setup.py install as normal.
This is needed for setup-requires, since without it it's possible
for installation to fail in short-circuit scenarios such as the one
the added functional test case demonstrates.
Without this, I was getting:
$ pip install -U 'sentry[lol]'
...
UnknownExtra: Unknown 7.4.1 has no such extra feature 'lol'
With this, I get:
$ pip install -U 'sentry[lol]'
...
UnknownExtra: sentry 7.4.1 has no such extra feature 'lol'