Many existing tarballs will successfully build a wheel, but the
wheel will be implicitly broken because the setup.py dynamically
adjusts the install_requires. Typically this is done based on things
like the Python version, the implementation, or the OS it is being
installed on. We don't consider cache directories to be OS agnostic,
but we do consider them to be Python version and implementation
agnostic. To solve this, we'll force the cached wheel to use a more
specific Python tag that includes the major version and the
implementation.
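For illustration, such a tag could be computed along these lines (a rough sketch; the helper name and mapping here are assumptions, not pip's actual code):

```python
import platform
import sys

# Abbreviations in the style of PEP 425 implementation tags.
ABBREVS = {'CPython': 'cp', 'PyPy': 'pp', 'Jython': 'jy', 'IronPython': 'ip'}

def specific_python_tag():
    # e.g. "cp2" on CPython 2, "pp3" on PyPy 3 -- instead of the
    # generic "py2"/"py3" tag an sdist-built wheel would get.
    impl = ABBREVS.get(platform.python_implementation(), 'py')
    return impl + str(sys.version_info[0])
```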
format_exc takes only one argument, `limit`, which should be an integer.
Python 2 seems more lenient than Python 3 on this point.
The mistake was introduced in commit 3148b967a.
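For reference, a call that satisfies that constraint on both versions:

```python
import traceback

try:
    1 / 0
except ZeroDivisionError:
    # `limit` bounds how many stack frames are included; per the note
    # above it is the only argument that should be passed here.
    print(traceback.format_exc(limit=1))
```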
This change resolves a condition that can lead to a stack trace due to
the use of constraint files. There are several conditions where the
result object may be left undefined, which causes the problem.
``` traceback
Exception:
Traceback (most recent call last):
File "/usr/local/lib/python2.7/dist-packages/pip/basecommand.py", line 211, in main
status = self.run(options, args)
File "/usr/local/lib/python2.7/dist-packages/pip/commands/wheel.py", line 180, in run
wheel_cache
File "/usr/local/lib/python2.7/dist-packages/pip/basecommand.py", line 266, in populate_requirement_set
requirement_set.add_requirement(req)
File "/usr/local/lib/python2.7/dist-packages/pip/req/req_set.py", line 267, in add_requirement
return result
UnboundLocalError: local variable 'result' referenced before assignment
```
This change simply ensures that the 'result' object is defined
when the method returns.
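A simplified sketch of the shape of the fix (hypothetical names, not the actual pip code):

```python
def add_requirement(install_reqs, req, skip=False):
    # Bind `result` before any branching, so every return path has a
    # value and the UnboundLocalError above can no longer occur.
    result = []
    if not skip:
        install_reqs.append(req)
        result = [req]
    return result
```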
Signed-off-by: Kevin Carter <kevin.carter@rackspace.com>
Removed the mention of "package index options" in the docs, because they don't all fit that category anymore. Not even --no-binary and --only-binary do; they're "install options".
This would occur when, for example, installing from a requirements file that references a certain hashed sdist, a common situation.
As of pip 7, pip always tries to build a wheel for each requirement (if one wasn't provided directly) and installs from that. The way this was implemented, InstallRequirement.link pointed to the cached wheel, which obviously had a different hash than the index-sourced archive, so spurious mismatch errors would result.
Now we no longer read from the wheel cache in hash-checking mode.
Make populate_link(), rather than the `link` setter, responsible for mapping InstallRequirement.link to a cached wheel. populate_link() isn't called until prepare_files(). At that point, when we've examined all InstallRequirements and their potential --hash options, we know whether we should be requiring hashes and thus whether to use the wheel cache at all.
The only place that sets InstallRequirement.link other than InstallRequirement itself is pip.wheel, which does so long after hashes have been checked, when it's unpacking the wheel it just built, so it won't cause spurious hash mismatches.
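A condensed sketch of that ordering (simplified from the description above; this is a method on InstallRequirement, and the attribute names are approximations):

```python
def populate_link(self, finder, upgrade, require_hashes):
    # Called from prepare_files(), i.e. after all --hash options are known.
    if self.link is None:
        self.link = finder.find_requirement(self, upgrade)
    # Only consult the wheel cache when we are not verifying hashes
    # against the original, index-sourced archive.
    if self._wheel_cache is not None and not require_hashes:
        self.link = self._wheel_cache.cached_wheel(self.link, self.name)
```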
are unavailable, but issue a warning if this is used.
2. Explicitly handle the case where the unicode detection finds wide
unicode but this is a 3.3+ build (necessary due to #1)
3. Fix tests broken due to #2.
Pythons with PEP 393 Flexible String Representation (so >= 3.3).
Granted, on these Pythons, the SOABI config var should always be set,
but the manual SOABI code path should still try to do the right thing.
Additionally, fix the version portion of the Python tag on wheels built
with PyPy that use the Python API. It will now be the Python major
version concatenated with the PyPy major and minor versions.
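A rough sketch of that version computation (hypothetical helper; `sys.pypy_version_info` exists only on PyPy):

```python
import sys

def pypy_tag_version():
    # Python major version concatenated with the PyPy major and minor
    # versions, e.g. "226" for PyPy 2.6 running Python 2.
    pypy = sys.pypy_version_info  # present only on PyPy
    return '%d%d%d' % (sys.version_info[0], pypy.major, pypy.minor)
```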
Fixes #2671, #2882.
Renaming "gots" didn't go well. I think the current naming is the most concise way to put it. If we rename it to "got", then the loop iterator can't be called "got", and the simple relationship between the iterator and collection names is lost. "Actual" and "actuals" are the other names that occurred to me, but they look so much like "allowed" that the code becomes harder to read.
This guards against the possibility of a weaker hash being added to hashlib in the future. Also give _good_hashes() a more descriptive name, and describe what we mean by "strong".
We can get away with returning a static list because those algorithms are guaranteed present in hashlib.
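For illustration, a sketch of such a static list (the names are assumptions):

```python
import hashlib

# A static list works because hashlib guarantees these algorithms are
# present in every build (see hashlib.algorithms_guaranteed).
STRONG_HASHES = ['sha256', 'sha384', 'sha512']

assert set(STRONG_HASHES) <= hashlib.algorithms_guaranteed
```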
An info-level message for each package might be too intense. And it might give a false sense of security as well: it doesn't confirm that the virtualenv is non-empty; it merely notices when a package we're installing is already there.
Those commands already checked hashes, since they use RequirementSet, where the hash-checking is done.
Reorder some options so pre, no-clean, and require-hashes are always in the same order.
On versions of CPython affected by <http://bugs.python.org/issue14768>
(Python 2.6, some versions of Python 2.7 and 3.3),
`os.path.expanduser('~/path')` returns `//path` rather than `/path` when
`HOME=/`. This affects pip when `os.path.expanduser('~/.cache/pip')` is
expanded to `//.cache/pip`. Although `HOME=/` is probably uncommon on
most Linux systems, it is extremely common in Docker images.
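A sketch of the kind of guard that works around this (an assumed helper, similar in spirit to the fix):

```python
import os

def expanduser(path):
    # When HOME=/, the affected CPython versions make
    # expanduser('~/x') return '//x'; strip the duplicated slash.
    expanded = os.path.expanduser(path)
    if path.startswith('~/') and expanded.startswith('//'):
        expanded = expanded[1:]
    return expanded
```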
Fixes #2996.
For dependencies that are properly pinned and hashed (not really dependencies at all, if you like, since they're explicit, root-level requirements), we install them as normal. For ones that are not pinned and hashed, we raise the errors typical of any unhashed requirement in --require-hashes mode.
Since the stanza under "if not ignore_dependencies" doesn't actually add anything if it's already in the RequirementSet, not much has to be done in the way of code: the unhashed deps don't have any hashes, so we complain about them as per usual.
Also...
* Revise wording of HashUnpinned errors. They can be raised even if no hash is specified, so the previous wording was misleading.
* Make wording of HashMissing less awkward.
dstufft is nervous about blowing a single-char option on something that will usually be copied and pasted anyway. We can always put it back later if it proves to be a pain.
* Add --require-hashes option. This is handy in deployment scripts to force application authors to hash their requirements. It is also a convenient way to get pip to show computed hashes for a virgin, unhashed requirements file. Eventually, additions to `pip freeze` should fill a superset of this use case.
* In --require-hashes mode, at least one hash is required to match for each requirement (see the sketch after this list).
* Option-based requirements (--sha256=...) turn on --require-hashes mode implicitly.
* Internet-derived URL-based hashes are "necessary but not sufficient": they do not satisfy --require-hashes mode when they match, but they are still used to guard against transmission errors.
* Other URL-based requirements (#md5=...) are treated just like flag-based ones, except they don't turn on --require-hashes.
* Complain informatively, with the most devastating errors first so you don't chase your tail all day only to run up against a brick wall at the end. This also means we don't complain that a hash is missing, only for the user to find, after fixing it, that we have no idea how to even compute a hash for that type of requirement.
* Complain about unpinned requirements when hash-checking mode is on, lest they cause the user surprise later.
* Complain about missing hashes.
* Complain about requirement types we don't know how to hash (like VCS ones and local dirs).
* Have InstallRequirement keep its original Link around (original_link) so we can differentiate between URL hashes from requirements files and ones downloaded from the (untrustworthy) internet.
* Remove test_download_hashes, which is obsolete. Similar coverage is provided in test_utils.TestHashes and the various hash cases in test_req.py.
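To make the "at least one hash must match" rule concrete, here is a minimal, self-contained sketch (a hypothetical helper, not pip's actual API):

```python
import hashlib

class HashMismatch(Exception):
    pass

def check_against_file(fobj, allowed):
    # `allowed` maps a hash name to the list of acceptable hex digests.
    gots = dict((name, hashlib.new(name)) for name in allowed)
    for chunk in iter(lambda: fobj.read(8192), b''):
        for got in gots.values():
            got.update(chunk)
    # At least one of the allowed digests must match.
    if not any(got.hexdigest() in allowed[name]
               for name, got in gots.items()):
        raise HashMismatch('none of the expected hashes matched')
```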
We purposely keep it off the CLI for now. optparse isn't really geared to expose interspersed args and options, so a more heavy-handed approach will be necessary to support things like `pip install SomePackage --sha256=abcdef... OtherPackage --sha256=012345...`.
`pip download` has the same functionality as `pip install --download`,
and the behavior of `pip install --download` is preserved with a deprecation
warning. `pip install --download` will be removed in pip version 10.
When pip tries to build a wheel but fails, setup.py install is run in
the same directory. However, some build files are not compatible
between 'install' and 'bdist_wheel', particularly scripts - the shebang
lines are different. Pip now cleans the build dir after a failed
bdist_wheel so that pip doesn't install broken scripts.
Closes #3006
This saves a network hop when using git and passing an explicit sha
as a ref by comparing the version that's already checked out.
Yields a ~4x speedup on my local machine
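The check amounts to something like this sketch (the helper name is hypothetical):

```python
import subprocess

def is_commit_checked_out(dest, sha):
    # If the requested ref is an exact sha that is already checked out
    # in `dest`, there is no need to talk to the remote at all.
    head = subprocess.check_output(
        ['git', 'rev-parse', 'HEAD'], cwd=dest).strip().decode()
    return head == sha
```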
Before:
```
$ /usr/local/bin/pip --version
pip 7.1.0 from /usr/local/lib/python2.7/site-packages (python 2.7)
$ time /usr/local/bin/pip install --disable-pip-version-check -e git+https://github.com/getsentry/raven-python.git@56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e#egg=raven_dev
Obtaining raven-dev from git+https://github.com/getsentry/raven-python.git@56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e#egg=raven_dev
Updating ./src/raven-dev clone (to 56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e)
Could not find a tag or branch '56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e', assuming commit.
Installing collected packages: raven-dev
Running setup.py develop for raven-dev
Successfully installed raven-dev
/usr/local/bin/pip install --disable-pip-version-check -e 0.84s user 0.48s system 39% cpu 3.300 total
```
After:
```
$ /Users/matt/.virtualenvs/pip/bin/pip --version
pip 7.2.0.dev0 from /Users/matt/code/pip (python 2.7)
$ time /Users/matt/.virtualenvs/pip/bin/pip install --disable-pip-version-check -e git+https://github.com/getsentry/raven-python.git@56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e#egg=raven_dev
Obtaining raven-dev from git+https://github.com/getsentry/raven-python.git@56fc6f7beecf445843d0ec7052bb8c6f0ea80a2e#egg=raven_dev
checking version
Skipping because already up-to-date.
Installing collected packages: raven-dev
Running setup.py develop for raven-dev
Successfully installed raven-dev
/Users/matt/.virtualenvs/pip/bin/pip install --disable-pip-version-check -e 0.59s user 0.22s system 98% cpu 0.824 total
```
Change 3affcaa2b8 attempts to reset
purelib & platlib to any "install-dir" specified by the user in
setup.cfg. This code is used when we are installing wheels.
The problem with this is that distutils is *always* setting
"i.install_lib" -- even when the user specifies nothing. This has the
result of unconditionally setting purelib == platlib.
On some systems this results in .so's from the wheel getting installed
into /usr/lib/python2.7 (purelib) rather than /usr/lib64/python
(platlib). Because distribution-packaged libraries have installed
their .so into platlib, we can now have a situation where the new
pip-installed library is picking up an old .so from the distro package
... with predictably bad results.
This takes the approach of checking the configuration to see if the
user has overridden install-dir and only resetting the paths if they
have. The override case is covered by existing test-cases.
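A rough sketch of that check (approximate, not pip's exact code):

```python
from distutils.dist import Distribution

def user_install_lib_override():
    # Consult the parsed config for an explicit install-lib setting
    # instead of trusting the value distutils always fills in on the
    # finalized command object.
    d = Distribution()
    d.parse_config_files()
    if 'install_lib' in d.get_option_dict('install'):
        i = d.get_command_obj('install', create=True)
        i.ensure_finalized()
        # Only now is it safe to point purelib and platlib here.
        return {'purelib': i.install_lib, 'platlib': i.install_lib}
    return {}
```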
Closes #2940
When installing an editable from file:///path/to/file, pip currently does
not attempt to determine the name from #egg=NAME, just passing back
None. This causes the constraints code to completely ignore this line,
resulting in unexpected installation behaviour.
This patch makes '-e file:///path#egg=name' function similarly to
'file:///path#egg=name' and '-e git+URL#egg=name'. If #egg=name is not
defined, it returns None and the package becomes an unnamed requirement,
which the constraints code will not parse, but which, in the case of a
requirement, will later be processed and its name determined.
Closes #3026
Currently, if the local package you are trying to install is listed in
the constraint file, pip will silently fail to install the local package
while installing its dependencies.
This problem was caused by has_requirement returning True when a
constraint with that name was defined, resulting in the local package
not being processed.
Closes #2928
before building wheels. The warning is copied from the HTTP cache check.
The HTTP cache has a similar check and warning,
but wheels would still try, and fail, to build.
Fixes bug #2683
There are two changes here; one to fix the using-wheels codepath and one
to fix the no-wheels codepath. Two tests are introduced, one to test
each codepath.
Previously we attempted to do some crazy things with import hooks
in order to automatically alias normally installed
dependencies as our vendored dependencies. This turned out to be
fairly fragile, so instead we'll manually patch sys.modules to
trigger the aliasing.
As part of this, we also drop support for the
PIP_NO_VENDOR_FOR_DOWNSTREAM environment variable because it was
never fully supported and now that we have wheel caching, actually
using it could possibly trigger a bad wheel to be cached. The new
mechanism requires some light patching by downstream to opt into
the mechanism, so they can also easily remove all of the files
in pip/_vendor/ except for __init__.py.
If a single package is listed as a constraint; is a dependency of a
package being installed; *and* is already installed, we end up
processing it multiple times. This change adds a new "prepared" flag
which we set the first time the package is processed, to prevent
multiple handling.
Fixes bug #2888
This adds constraints files. Like requirements files, constraints files
control what version of a package is installed; but unlike
requirements files, they don't themselves cause the package to be
installed. This allows things that aren't explicitly desired to be
constrained if and only if they are installed.
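A heavily simplified sketch of the distinction (hypothetical, not pip's real RequirementSet):

```python
class RequirementSet(object):
    def __init__(self):
        self.requirements = {}

    def add(self, name, specifier, constraint=False):
        existing = self.requirements.get(name)
        if existing is None:
            # A constraint on its own never schedules an install; it
            # only records the pin in case the package shows up later.
            self.requirements[name] = {
                'specifier': specifier,
                'install': not constraint,
            }
        elif not constraint:
            # A real requirement promotes a previously seen constraint
            # into something that will actually be installed.
            existing['install'] = True
```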
We only want to prompt people to upgrade their pip version if the
newer version is neither a post release of the version they already
have nor a pre-release.
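As a sketch, using the `packaging` library that pip vendors (the helper name is an assumption):

```python
from packaging.version import parse

def should_prompt(installed, newest):
    installed, newest = parse(installed), parse(newest)
    if newest <= installed or newest.is_prerelease:
        return False
    # A post release of the installed version (e.g. 7.1.0.post1 over
    # 7.1.0) shares its base version; don't prompt for those either.
    return newest.base_version != installed.base_version
```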
os.rename causes errors to happen whenever the temporary location
is on a different device than the wheel cache directory. Using
shutil.move instead will cause it to use os.rename when it can and
fallback to copying otherwise.
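For reference, shutil.move's fallback behaviour in a self-contained example (the paths here are throwaway temp dirs):

```python
import os
import shutil
import tempfile

# shutil.move tries os.rename first and falls back to copy-and-delete
# when source and destination are on different filesystems, where a
# bare os.rename would raise OSError (EXDEV).
src = tempfile.mkdtemp()
dst = os.path.join(tempfile.mkdtemp(), 'wheel-cache-entry')
shutil.move(src, dst)
```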
Importing it prevents debugging other packages with `-W error`, since
the deprecation warning will raise.
There is still `imp` imported from a few vendored packages,
and for purposes other than cache_from_source.
and adjust the logic to match; the result is simpler.
2) Due to #1, we can remove some hairy "format_control" hacks
3) Due to #1, we have to relax the parsing and allow:
- multiple options per line
- any supported option on a line with a requirement (not just
--install-option/--global-option, although they are the only
options that are passed into a requirement)
6.1.1 and before used git clones when installing a git+ URL. 7.0.0.dev
temporarily didn't, because the internal flag for controlling whether
to do a clone or an export was tied into the behaviour for whether
something will be deleted or not - which is arguably different. Remove
the conflated behaviour and just unpack always, using the same marker
flag to delete directories as is used for file and non-VCS URLs.
Using --install-options, --build-options, or --global-options changes
the way that setup.py behaves, and these options aren't honoured by
the wheel code. The new wheel autobuilding code made this very
obvious - disable the use of wheels when these options are supplied.
With wheel autobuilding in place a release blocker is some granular
way to opt-out of wheels for known-bad packages. This patch introduces
two new options: --no-binary and --only-binary to control what
archives we are willing to use on both a global and per-package basis.
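A rough sketch of the per-package decision such options could drive (approximate semantics, not pip's exact code):

```python
class FormatControl(object):
    def __init__(self, no_binary=(), only_binary=()):
        self.no_binary = set(no_binary)
        self.only_binary = set(only_binary)

    def allows_binary(self, name):
        # Explicit package names win over the ':all:' wildcard.
        if name in self.no_binary:
            return False
        if name in self.only_binary:
            return True
        return ':all:' not in self.no_binary
```

For example, `FormatControl(no_binary={':all:'}, only_binary={'numpy'}).allows_binary('numpy')` is True: the global opt-out applies to everything except the explicitly whitelisted package.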
This also closes #2084
Wheel cache lookups become more complex when we wish to allow binary
blacklisting. Rather than passing more parameters around, replace
cache_root with wheel_cache, and create a wheel cache in all the
relevant command entry points.
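A minimal sketch of what such a bundled object looks like (simplified; pip's real WheelCache carries more than this):

```python
class WheelCache(object):
    # The cache location and the format-control settings travel
    # together, so call sites pass one value around instead of a
    # growing parameter list.
    def __init__(self, cache_dir, format_control):
        self.cache_dir = cache_dir
        self.format_control = format_control
```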