mirror of https://github.com/pypa/pip synced 2023-12-13 21:30:23 +01:00

Merge branch 'master' of https://github.com/pypa/pip into issue-6222

Sebastian Jordan 2019-10-03 08:36:19 +02:00
commit 6a2d2dbb81
152 changed files with 3239 additions and 2284 deletions


@@ -15,7 +15,7 @@ jobs:
inputs:
versionSpec: '3'
- bash: pip install tox nox setuptools wheel
- bash: pip install twine nox setuptools wheel
displayName: Install dependencies
- bash: nox -s generate_authors
@@ -24,12 +24,12 @@ jobs:
- bash: nox -s generate_news -- --yes
displayName: Generate NEWS.rst
- bash: tox -e packaging
displayName: Run Tox packaging
- bash: python setup.py sdist bdist_wheel
displayName: Create sdist and wheel
- bash: twine check dist/*
displayName: Check distributions with twine
- task: PublishBuildArtifacts@1
displayName: 'Publish Artifact: dist'
inputs:


@@ -22,4 +22,4 @@ steps:
inputs:
testResultsFiles: junit/*.xml
testRunTitle: 'Python $(python.version)'
condition: succeededOrFailed()
condition: succeededOrFailed()


@@ -20,10 +20,6 @@ jobs:
env:
- TOXENV: docs
- TOXENV: lint
- TOXENV: lint-py2
PYTHON_VERSION: 2.7
- TOXENV: mypy
- TOXENV: packaging
steps:
- uses: actions/checkout@master
- name: Set up Python ${{ matrix.env.PYTHON_VERSION || 3.7 }}
@@ -44,17 +40,3 @@ jobs:
run: >-
python -m tox
env: ${{ matrix.env }}
news_format:
name: Check NEWS format
runs-on: ubuntu-18.04
steps:
- uses: actions/checkout@master
- name: Set up Python
uses: actions/setup-python@v1
with:
version: 3.7
- name: Install nox
run: pip install nox
- name: Check NEWS format
run: nox -s validate_news

.pre-commit-config.yaml Normal file

@@ -0,0 +1,53 @@
exclude: 'src/pip/_vendor/'
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v2.3.0
hooks:
- id: check-builtin-literals
- id: check-added-large-files
- id: check-case-conflict
- id: check-toml
- id: check-yaml
- id: debug-statements
- id: end-of-file-fixer
exclude: WHEEL
- id: flake8
exclude: tests/data
- id: forbid-new-submodules
- id: trailing-whitespace
exclude: .patch
- repo: https://github.com/timothycrosley/isort
rev: 4.3.21
hooks:
- id: isort
files: \.py$
- repo: https://github.com/pre-commit/mirrors-mypy
rev: v0.730
hooks:
- id: mypy
exclude: docs|tests
args: []
- id: mypy
name: mypy, for Py2
exclude: docs|tests
args: ["-2"]
- repo: https://github.com/pre-commit/pygrep-hooks
rev: v1.4.1
hooks:
- id: python-no-log-warn
- id: python-no-eval
- id: rst-backticks
# Validate existing ReST files and NEWS fragments.
files: .*\.rst$|^news/.*
types: [file]
# The errors flagged in NEWS.rst are old.
exclude: NEWS.rst
- repo: https://github.com/mgedmin/check-manifest
rev: '0.39'
hooks:
- id: check-manifest
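The hooks configured above can be run locally before pushing. A minimal sketch, assuming ``pre-commit`` is installable from PyPI and Python is on ``PATH``:

```shell
# Install the hook runner and exercise every configured hook once.
python -m pip install pre-commit
pre-commit run --all-files

# Optionally register it as a git hook so the checks run on each commit:
pre-commit install
```

The ``rev`` pins in the config keep hook versions reproducible across contributors' machines.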


@@ -17,10 +17,6 @@ jobs:
- stage: primary
env: TOXENV=docs
- env: TOXENV=lint
- env: TOXENV=lint-py2
python: 2.7
- env: TOXENV=mypy
- env: TOXENV=packaging
# Latest CPython
- env: GROUP=1
python: 2.7


@@ -16,6 +16,7 @@ exclude .mailmap
exclude .appveyor.yml
exclude .travis.yml
exclude .readthedocs.yml
exclude .pre-commit-config.yaml
exclude tox.ini
exclude noxfile.py


@@ -39,7 +39,7 @@ The ``README``, license, ``pyproject.toml``, ``setup.py``, and so on are in the
* ``news/`` *[pip stores news fragments… Every time pip makes a user-facing change, a file is added to this directory (usually a short note referring to a GitHub issue) with the right extension & name so it gets included in release notes…. So every release the maintainers will be deleting old files in this directory? Yes - we use the towncrier automation to generate a NEWS file, and auto-delete old stuff. There's more about this in the contributor documentation!]*
* ``template.rst`` *[template for release notes -- this is a file towncrier uses…. Is this jinja? I don't know, check towncrier docs]*
* ``template.rst`` *[template for release notes -- this is a file towncrier uses…. Is this jinja? I don't know, check towncrier docs]*
* ``src/`` *[source; see below]*
* ``tasks/`` *[invoke is a PyPI library which uses files in this directory to define automation commands that are used in pips development processes -- not discussing further right now. For instance, automating the release.]*


@@ -140,4 +140,3 @@ files on PyPI. It's for getting all files of Flask.)
.. _`tracking issue`: https://github.com/pypa/pip/issues/6831
.. _PyPI: https://pypi.org/
.. _PyPI Simple API: https://warehouse.readthedocs.io/api-reference/legacy/#simple-project-api


@@ -85,8 +85,8 @@ difference may simply be historical and may not actually be necessary.)
Each of these commands also uses the ``PackageFinder`` class for pip's
"self-check," (i.e. to check whether a pip upgrade is available). In this
case, the ``PackageFinder`` instance is created by the ``outdated.py``
module's ``pip_version_check()`` function.
case, the ``PackageFinder`` instance is created by the
``self_outdated_check.py`` module's ``pip_self_version_check()`` function.
The ``PackageFinder`` class is responsible for doing all of the things listed
in the :ref:`Overview <index-py-overview>` section like fetching and parsing


@@ -89,6 +89,8 @@ from a description of the feature/change in one or more paragraphs, each wrapped
at 80 characters. Remember that a news entry is meant for end users and should
only contain details relevant to an end user.
.. _`choosing-news-entry-type`:
Choosing the type of NEWS entry
-------------------------------


@@ -11,11 +11,11 @@ process, please `open an issue`_ about it on the issue tracker.
Development Environment
-----------------------
pip uses :pypi:`tox` for testing against multiple different Python environments
and ensuring reproducible environments for linting and building documentation.
pip is a command line application written in Python. For developing pip,
you should `install Python`_ on your computer.
For developing pip, you need to install ``tox`` on your system. Often, you can
just do ``python -m pip install tox`` to install and use it.
For developing pip, you need to install :pypi:`tox`. Often, you can run
``python -m pip install tox`` to install and use it.
Running pip From Source Tree
----------------------------
@@ -30,8 +30,9 @@ from the ``src`` directory:
Running Tests
-------------
pip uses the :pypi:`pytest` test framework, :pypi:`mock` and :pypi:`pretend`
for testing. These are automatically installed by tox for running the tests.
pip's tests are written using the :pypi:`pytest` test framework, :pypi:`mock`
and :pypi:`pretend`. :pypi:`tox` is used to automate the setup and execution of
pip's tests.
To run tests locally, run:
@@ -42,7 +43,7 @@ To run tests locally, run:
The example above runs tests against Python 3.6. You can also use other
versions like ``py27`` and ``pypy3``.
``tox`` has been configured to any additional arguments it is given to
``tox`` has been configured to forward any additional arguments it is given to
``pytest``. This enables the use of pytest's `rich CLI`_. As an example, you
can select tests using the various ways that pytest provides:
@@ -67,35 +68,15 @@ tools, you can tell pip to skip those tests:
Running Linters
---------------
pip uses :pypi:`flake8` and :pypi:`isort` for linting the codebase. These
ensure that the codebase is in compliance with :pep:`8` and the imports are
consistently ordered and styled.
pip uses :pypi:`pre-commit` for managing linting of the codebase.
``pre-commit`` performs various checks on all files in pip and uses tools that
help follow a consistent code style within the codebase.
To use linters locally, run:
.. code-block:: console
$ tox -e lint
$ tox -e lint-py2
The above commands run the linters on Python 3 followed by Python 2.
.. note::
Do not silence errors from flake8 with ``# noqa`` comments or otherwise.
Running mypy
------------
pip uses :pypi:`mypy` to run static type analysis, which helps catch certain
kinds of bugs. The codebase uses `PEP 484 type-comments`_ due to compatibility
requirements with Python 2.7.
To run the ``mypy`` type checker, run:
.. code-block:: console
$ tox -e mypy
Building Documentation
----------------------
@@ -112,5 +93,6 @@ To build it locally, run:
The built documentation can be found in the ``docs/build`` folder.
.. _`open an issue`: https://github.com/pypa/pip/issues/new?title=Trouble+with+pip+development+environment
.. _`install Python`: https://realpython.com/installing-python/
.. _`PEP 484 type-comments`: https://www.python.org/dev/peps/pep-0484/#suggested-syntax-for-python-2-7-and-straddling-code
.. _`rich CLI`: https://docs.pytest.org/en/latest/usage.html#specifying-tests-selecting-tests


@@ -23,3 +23,290 @@ user support.
In the pip issue tracker, we make use of labels and milestones to organize and
track work.
Labels
------
Issue labels are used to:
#. Categorize issues
#. Provide status information for contributors and reporters
#. Help contributors find tasks to work on
The current set of labels are divided into several categories identified by
prefix:
**C - Category**
which area of ``pip`` functionality a feature request or issue is related to
**K - Kind**
**O - Operating System**
for issues that are OS-specific
**P - Project/Platform**
related to something external to ``pip``
**R - Resolution**
no more discussion is really needed, an action has been identified and the
issue is waiting or closed
**S - State**
for some automatic labels and other indicators that work is needed
**type**
the role or flavor of an issue
The specific labels falling into each category have a description that can be
seen on the `Labels <https://github.com/pypa/pip/labels>`__ page.
In addition, there are several standalone labels:
**good first issue**
this label marks an issue as beginner-friendly and shows up in banners that
GitHub displays for first-time visitors to the repository
**triage**
default label given to issues when they are created
**trivial**
special label for pull requests that removes the
:ref:`news file requirement <choosing-news-entry-type>`
**needs rebase or merge**
this is a special label used by BrownTruck to mark PRs that have merge
conflicts
Automation
----------
There are several helpers to manage issues and pull requests.
Issues created on the issue tracker are automatically given the
``triage`` label by the
`triage-new-issues <https://github.com/apps/triage-new-issues>`__
bot. The label is automatically removed when another label is added.
When an issue needs feedback from the author we can label it with
``S: awaiting response``. When the author responds, the
`no-response <https://github.com/apps/no-response>`__ bot removes the label.
After an issue has been closed for 30 days, the
`lock <https://github.com/apps/lock>`__ bot locks the issue and adds the
``S: auto-locked`` label. This allows us to avoid monitoring existing closed
issues, but unfortunately prevents any references to issues from showing up as
links on the closed issue.
Triage Issues
*************
Users can make issues for a number of reasons:
#. Suggestions about pip features that could be added or improved
#. Problems using pip
#. Concerns about pip usability
#. General packaging problems to be solved with pip
#. Problems installing or using Python packages
#. Problems managing virtual environments
#. Problems managing Python installations
To triage an issue means to identify what kind of issue it is, and then to
* confirm bugs
* provide support
* discuss and design around the uses of the tool
Specifically, to address an issue:
#. Read issue title
#. Scan issue description
#. Ask questions
#. If time is available, try to reproduce
#. Search for or remember related issues and link to them
#. Identify an appropriate area of concern (if applicable)
Keep in mind that all communication is happening with other people and
should be done with respect per the
`Code of Conduct <https://www.pypa.io/en/latest/code-of-conduct/>`__.
The lifecycle of an issue (bug or support) generally looks like:
#. waiting for triage (marked with label ``triage``)
#. confirming issue - some discussion with the user, gathering
details, trying to reproduce the issue (may be marked with a specific
category, ``S: awaiting response``, ``S: discussion-needed``, or
``S: need-repro``)
#. confirmed - the issue is pretty consistently reproducible in a
straightforward way, or a mechanism that could be causing the issue has been
identified
#. awaiting fix - the fix is identified and no real discussion on the issue
is needed, should be marked ``R: awaiting PR``
#. closed - can be for several reasons
* fixed
* could not be reproduced, no more details could be obtained, and no
progress can be made
* actual issue was with another project or related to system
configuration and pip cannot (or will not) be adapted for it
Requesting information
----------------------
Request more information to better understand the context and environment
that led to the issue. Examples of specific information that may be useful,
depending on the situation:
* pip debug: ``pip debug``
* pip version: ``pip -V``
* Python version: ``python -VV``
* Python path: ``python -c 'import sys; print(sys.executable)'``
* ``python`` on ``PATH``: Unix: ``which python``; Windows: ``where python``
* Python as resolved by the shell: ``type python``
* Origin of pip (get-pip.py, OS-level package manager, ensurepip, manual
installation)
* Using a virtual environment (with ``--system-site-packages``?)
* Using a conda environment
* ``PATH`` environment variable
* Network situation (e.g. airgapped environment, firewalls)
* ``--verbose`` output of a failing command
* (Unix) ``strace`` output from a failing command (be careful not to output
into the same directory as a package that's being installed, otherwise pip
will loop forever copying the log file...)
* (Windows)
`procmon <https://docs.microsoft.com/en-us/sysinternals/downloads/procmon>`__
output during a failing command
(`example request <https://github.com/pypa/pip/issues/6814#issuecomment-516611389>`__)
* Listing of files relevant to the issue (e.g. ``ls -l venv/lib/pythonX.Y/problem-package.dist-info/``)
* whether the unexpected behavior ever worked as expected - if so then what
were the details of the setup (same information as above)
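Several of the items above can be collected in one pass; a sketch for a Unix shell (on some systems ``python`` may need to be ``python3``):

```shell
# Gather the environment details most commonly requested during triage.
python -m pip -V                                 # pip version and install location
python -VV                                       # exact Python version and build
python -c 'import sys; print(sys.executable)'    # which interpreter is running
which python                                     # which "python" the shell resolves
echo "$PATH"                                     # PATH as the shell sees it
```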
Generally, information is good to request if it can help confirm or rule out
possible sources of error. We shouldn't request information that does not
improve our understanding of the situation.
Reproducing issues
------------------
Whenever an issue happens and the cause isn't obvious, it is important
that we be able to reproduce it independently. This serves several purposes:
#. If it is a pip bug, then any fix will need tests - a good reproducer
is most of the way towards that.
#. If it is not reproducible using the provided instructions, that helps
rule out a lot of possible causes.
#. A clear set of instructions is an easy way to get on the same page as
someone reporting an issue.
The best way to reproduce an issue is with a script.
A script can be copied into a file and executed, whereas shell output
has to be manually copied a line at a time.
Scripts to reproduce issues should be:
- portable (few/no assumptions about the system, other than it being Unix or Windows as applicable)
- non-destructive
- convenient
- require little/no setup on the part of the runner
Examples:
- creating and installing multiple wheels with different versions
(`link <https://github.com/pypa/pip/issues/4331#issuecomment-520156471>`__)
- using a small web server for authentication errors
(`link <https://github.com/pypa/pip/issues/2920#issuecomment-508953118>`__)
- using docker to test system or global configuration-related issues
(`link <https://github.com/pypa/pip/issues/5533#issuecomment-520159896>`__)
- using docker to test special filesystem permission/configurations
(`link <https://github.com/pypa/pip/issues/6364#issuecomment-507074729>`__)
- using docker for global installation with get-pip
(`link <https://github.com/pypa/pip/issues/6498#issuecomment-513501112>`__)
- get-pip on system with no ``/usr/lib64``
(`link <https://github.com/pypa/pip/issues/5379#issuecomment-515270576>`__)
- reproducing with ``pip`` from master branch
(`link <https://github.com/pypa/pip/issues/6707#issue-467770959>`__)
Reaching resolution
-------------------
Some user support questions are more related to system configuration than pip.
It's important to treat these issues with the same care and attention as
others, specifically:
#. Unless the issue is very old and the user doesn't seem active, wait for
confirmation before closing the issue
#. Direct the user to the most appropriate forum for their questions:
* For Ubuntu, `askubuntu <https://askubuntu.com/>`__
* For other Linux/Unix systems, `serverfault <https://serverfault.com/>`__
* For network connectivity issues,
`serverfault <https://serverfault.com/>`__
#. Just because a user support question is best solved using some other forum
doesn't mean that we can't make things easier. Try to extract and
understand from the user query how things could have been made easier for
them or you, for example with better warning or error messages. If an issue
does not exist covering that case then create one. If an issue does exist then
make sure to reference that issue before closing this one.
#. A user may be having trouble installing a package, where the package
``setup.py`` or build-backend configuration is non-trivial. In these cases we
can help to troubleshoot but the best advice is going to be to direct them
to the support channels for the related projects.
#. Do not be hasty to assume it is one cause or another. What looks like
someone else's problem may still be an issue in pip or at least something
that could be improved.
#. For general discussion on Python packaging:
* `pypa/packaging <https://github.com/pypa/packaging-problems>`__
* `discuss.python.org/packaging <https://discuss.python.org/c/packaging>`__
Closing issues
--------------
An issue may be considered resolved and closed when:
- for each possible improvement or problem represented in the issue
discussion:
- Consensus has been reached on a specific action and the actions
appear to be external to the project, with no follow up needed
in the project afterwards.
- PEP updates (with a corresponding issue in
`python/peps <https://github.com/python/peps>`__)
- already tracked by another issue
- A project-specific issue has been identified and the issue no
longer occurs as of the latest commit on the master branch.
- An enhancement or feature request no longer has a proponent and the maintainers
don't think it's worth keeping open.
- An issue has been identified as a duplicate, and it is clearly a duplicate (i.e. the
original report was very good and points directly to the issue)
- The issue has been fixed, and can be independently validated as no longer
being an issue. If the fix involved code then the specific change/PR that
led to it should be identified and posted for tracking.
Common issues
*************
#. network-related issues - any issue involving retries, address lookup, or
similar behavior is typically a network issue.
#. issues related to having multiple Python versions, or an OS package
manager-managed pip/python installation (specifically with Debian/Ubuntu).
These typically present themselves as:
#. Not being able to find installed packages
#. Basic libraries not being found, or fundamental OS components missing
#. In these situations, make sure to find out how the user obtained their
Python and pip. Knowing the relevant package manager commands can help,
e.g. ``dpkg -S``.


@@ -120,7 +120,7 @@ pip works on Unix/Linux, macOS, and Windows.
----
.. [1] "Secure" in this context means using a modern browser or a
tool like `curl` that verifies SSL certificates when downloading from
tool like ``curl`` that verifies SSL certificates when downloading from
https URLs.
.. [2] Beginning with pip v1.5.1, ``get-pip.py`` stopped requiring setuptools to

View file

@@ -5,4 +5,3 @@ Internal Details
================
This content is now covered in the :doc:`Reference Guide <reference/index>`


@@ -234,4 +234,3 @@ General Options
***************
.. pip-general-options::


@@ -572,7 +572,7 @@ each sdist that wheels are built from and places the resulting wheels inside.
Pip attempts to choose the best wheels from those built in preference to
building a new wheel. Note that this means when a package has both optional
C extensions and builds `py` tagged wheels when the C extension can't be built
C extensions and builds ``py`` tagged wheels when the C extension can't be built
that pip will not attempt to build a better wheel for Pythons that would have
supported it, once any generic wheel is built. To correct this, make sure that
the wheels are built with Python specific tags - e.g. pp on PyPy.
@@ -826,7 +826,7 @@ Options
Examples
********
#. Install `SomePackage` and its dependencies from `PyPI`_ using :ref:`Requirement Specifiers`
#. Install ``SomePackage`` and its dependencies from `PyPI`_ using :ref:`Requirement Specifiers`
::
@@ -842,7 +842,7 @@ Examples
$ pip install -r requirements.txt
#. Upgrade an already installed `SomePackage` to the latest from PyPI.
#. Upgrade an already installed ``SomePackage`` to the latest from PyPI.
::


@@ -34,4 +34,3 @@ Examples
/home/me/env/lib/python2.7/site-packages/simplejson-2.2.1-py2.7.egg-info
Proceed (y/n)? y
Successfully uninstalled simplejson


@@ -71,3 +71,9 @@ Examples
$ pip wheel --wheel-dir=/tmp/wheelhouse SomePackage
$ pip install --no-index --find-links=/tmp/wheelhouse SomePackage
#. Build a wheel for a package from source
::
$ pip wheel --no-binary SomePackage SomePackage


@@ -90,7 +90,7 @@ In practice, there are 4 common uses of Requirements files:
1. Requirements files are used to hold the result from :ref:`pip freeze` for the
purpose of achieving :ref:`repeatable installations <Repeatability>`. In
this case, your requirement file contains a pinned version of everything that
was installed when `pip freeze` was run.
was installed when ``pip freeze`` was run.
::
@@ -100,49 +100,45 @@ In practice, there are 4 common uses of Requirements files:
2. Requirements files are used to force pip to properly resolve dependencies.
As it is now, pip `doesn't have true dependency resolution
<https://github.com/pypa/pip/issues/988>`_, but instead simply uses the first
specification it finds for a project. E.g. if `pkg1` requires `pkg3>=1.0` and
`pkg2` requires `pkg3>=1.0,<=2.0`, and if `pkg1` is resolved first, pip will
only use `pkg3>=1.0`, and could easily end up installing a version of `pkg3`
that conflicts with the needs of `pkg2`. To solve this problem, you can
place `pkg3>=1.0,<=2.0` (i.e. the correct specification) into your
requirements file directly along with the other top level requirements. Like
so:
::
specification it finds for a project. E.g. if ``pkg1`` requires
``pkg3>=1.0`` and ``pkg2`` requires ``pkg3>=1.0,<=2.0``, and if ``pkg1`` is
resolved first, pip will only use ``pkg3>=1.0``, and could easily end up
installing a version of ``pkg3`` that conflicts with the needs of ``pkg2``.
To solve this problem, you can place ``pkg3>=1.0,<=2.0`` (i.e. the correct
specification) into your requirements file directly along with the other top
level requirements. Like so::
pkg1
pkg2
pkg3>=1.0,<=2.0
3. Requirements files are used to force pip to install an alternate version of a
sub-dependency. For example, suppose `ProjectA` in your requirements file
requires `ProjectB`, but the latest version (v1.3) has a bug, you can force
pip to accept earlier versions like so:
::
sub-dependency. For example, suppose ``ProjectA`` in your requirements file
requires ``ProjectB``, but the latest version (v1.3) has a bug, you can force
pip to accept earlier versions like so::
ProjectA
ProjectB<1.3
4. Requirements files are used to override a dependency with a local patch that
lives in version control. For example, suppose a dependency,
`SomeDependency` from PyPI has a bug, and you can't wait for an upstream fix.
lives in version control. For example, suppose a dependency
``SomeDependency`` from PyPI has a bug, and you can't wait for an upstream
fix.
You could clone/copy the src, make the fix, and place it in VCS with the tag
`sometag`. You'd reference it in your requirements file with a line like so:
::
``sometag``. You'd reference it in your requirements file with a line like
so::
git+https://myvcs.com/some_dependency@sometag#egg=SomeDependency
If `SomeDependency` was previously a top-level requirement in your
If ``SomeDependency`` was previously a top-level requirement in your
requirements file, then **replace** that line with the new line. If
`SomeDependency` is a sub-dependency, then **add** the new line.
``SomeDependency`` is a sub-dependency, then **add** the new line.
It's important to be clear that pip determines package dependencies using
`install_requires metadata
<https://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-dependencies>`_,
not by discovering `requirements.txt` files embedded in projects.
not by discovering ``requirements.txt`` files embedded in projects.
See also:
@@ -198,7 +194,8 @@ to building and installing from source archives. For more information, see the
Pip prefers Wheels where they are available. To disable this, use the
:ref:`--no-binary <install_--no-binary>` flag for :ref:`pip install`.
If no satisfactory wheels are found, pip will default to finding source archives.
If no satisfactory wheels are found, pip will default to finding source
archives.
To install directly from a wheel archive:
@@ -215,15 +212,16 @@ convenience, to build wheels for all your requirements and dependencies.
<https://pypi.org/project/wheel/>`_ to be installed, which provides the
"bdist_wheel" setuptools extension that it uses.
To build wheels for your requirements and all their dependencies to a local directory:
To build wheels for your requirements and all their dependencies to a local
directory:
::
pip install wheel
pip wheel --wheel-dir=/local/wheels -r requirements.txt
And *then* to install those requirements just using your local directory of wheels (and not from PyPI):
And *then* to install those requirements just using your local directory of
wheels (and not from PyPI):
::
@@ -374,7 +372,7 @@ look like this:
Each subcommand can be configured optionally in its own section so that every
global setting with the same name will be overridden; e.g. decreasing the
``timeout`` to ``10`` seconds when running the `freeze`
``timeout`` to ``10`` seconds when running the ``freeze``
(`Freezing Requirements <./#freezing-requirements>`_) command and using
``60`` seconds for all other commands is possible with:
@@ -447,16 +445,18 @@ is the same as calling::
.. note::
Environment variables set to be empty string will not be treated as false. Please use ``no``,
``false`` or ``0`` instead.
Environment variables set to be empty string will not be treated as false.
Please use ``no``, ``false`` or ``0`` instead.
Config Precedence
-----------------
Command line options have precedence over environment variables, which have precedence over the config file.
Command line options have precedence over environment variables, which have
precedence over the config file.
Within the config file, command specific sections have precedence over the global section.
Within the config file, command specific sections have precedence over the
global section.
Examples:
@@ -483,8 +483,9 @@ To setup for fish::
$ pip completion --fish > ~/.config/fish/completions/pip.fish
Alternatively, you can use the result of the ``completion`` command
directly with the eval function of your shell, e.g. by adding the following to your startup file::
Alternatively, you can use the result of the ``completion`` command directly
with the eval function of your shell, e.g. by adding the following to your
startup file::
eval "`pip completion --bash`"
@@ -551,14 +552,16 @@ With Python 2.6 came the `"user scheme" for installation
which means that all Python distributions support an alternative install
location that is specific to a user. The default location for each OS is
explained in the python documentation for the `site.USER_BASE
<https://docs.python.org/3/library/site.html#site.USER_BASE>`_ variable. This mode
of installation can be turned on by specifying the :ref:`--user
<https://docs.python.org/3/library/site.html#site.USER_BASE>`_ variable.
This mode of installation can be turned on by specifying the :ref:`--user
<install_--user>` option to ``pip install``.
Moreover, the "user scheme" can be customized by setting the
``PYTHONUSERBASE`` environment variable, which updates the value of ``site.USER_BASE``.
``PYTHONUSERBASE`` environment variable, which updates the value of
``site.USER_BASE``.
To install "SomePackage" into an environment with site.USER_BASE customized to '/myappenv', do the following::
To install "SomePackage" into an environment with site.USER_BASE customized to
'/myappenv', do the following::
export PYTHONUSERBASE=/myappenv
pip install --user SomePackage
@@ -591,7 +594,8 @@ From within a ``--no-site-packages`` virtualenv (i.e. the default kind)::
Can not perform a '--user' install. User site-packages are not visible in this virtualenv.
From within a ``--system-site-packages`` virtualenv where ``SomePackage==0.3`` is already installed in the virtualenv::
From within a ``--system-site-packages`` virtualenv where ``SomePackage==0.3``
is already installed in the virtualenv::
$ pip install --user SomePackage==0.4
Will not install to the user site because it will lack sys.path precedence
@@ -604,7 +608,8 @@ From within a real python, where ``SomePackage`` is *not* installed globally::
Successfully installed SomePackage
From within a real python, where ``SomePackage`` *is* installed globally, but is *not* the latest version::
From within a real python, where ``SomePackage`` *is* installed globally, but
is *not* the latest version::
$ pip install --user SomePackage
[...]
@@ -615,7 +620,8 @@ From within a real python, where ``SomePackage`` *is* installed globally, but is
Successfully installed SomePackage
From within a real python, where ``SomePackage`` *is* installed globally, and is the latest version::
From within a real python, where ``SomePackage`` *is* installed globally, and
is the latest version::
$ pip install --user SomePackage
[...]
@@ -679,7 +685,8 @@ requirements file for free). It can also substitute for a vendor library,
providing easier upgrades and less VCS noise. It does not, of course,
provide the availability benefits of a private index or a vendor library.
For more, see :ref:`pip install\'s discussion of hash-checking mode <hash-checking mode>`.
For more, see
:ref:`pip install\'s discussion of hash-checking mode <hash-checking mode>`.
.. _`Installation Bundle`:
@@ -720,50 +727,54 @@ archives are built with identical packages.
Using pip from your program
***************************
As noted previously, pip is a command line program. While it is implemented in Python,
and so is available from your Python code via ``import pip``, you must not use pip's
internal APIs in this way. There are a number of reasons for this:
As noted previously, pip is a command line program. While it is implemented in
Python, and so is available from your Python code via ``import pip``, you must
not use pip's internal APIs in this way. There are a number of reasons for this:
#. The pip code assumes that it is in sole control of the global state of the program.
Pip manages things like the logging system configuration, or the values of the
standard IO streams, without considering the possibility that user code might be
affected.
#. The pip code assumes that it is in sole control of the global state of the
program.
pip manages things like the logging system configuration, or the values of
the standard IO streams, without considering the possibility that user code
might be affected.
#. Pip's code is *not* thread safe. If you were to run pip in a thread, there is no
guarantee that either your code or pip's would work as you expect.
#. pip's code is *not* thread safe. If you were to run pip in a thread, there
is no guarantee that either your code or pip's would work as you expect.
#. Pip assumes that once it has finished its work, the process will terminate. It
doesn't need to handle the possibility that other code will continue to run
after that point, so (for example) calling pip twice in the same process is
likely to have issues.
#. pip assumes that once it has finished its work, the process will terminate.
It doesn't need to handle the possibility that other code will continue to
run after that point, so (for example) calling pip twice in the same process
is likely to have issues.
This does not mean that the pip developers are opposed in principle to the idea that
pip could be used as a library - it's just that this isn't how it was written, and it
would be a lot of work to redesign the internals for use as a library, handling all
of the above issues, and designing a usable, robust and stable API that we could
guarantee would remain available across multiple releases of pip. And we simply don't
currently have the resources to even consider such a task.
This does not mean that the pip developers are opposed in principle to the idea
that pip could be used as a library - it's just that this isn't how it was
written, and it would be a lot of work to redesign the internals for use as a
library, handling all of the above issues, and designing a usable, robust and
stable API that we could guarantee would remain available across multiple
releases of pip. And we simply don't currently have the resources to even
consider such a task.
What this means in practice is that everything inside of pip is considered an
implementation detail. Even the fact that the import name is ``pip`` is subject to
change without notice. While we do try not to break things as much as possible, all
the internal APIs can change at any time, for any reason. It also means that we
generally *won't* fix issues that are a result of using pip in an unsupported way.
implementation detail. Even the fact that the import name is ``pip`` is subject
to change without notice. While we do try not to break things as much as
possible, all the internal APIs can change at any time, for any reason. It also
means that we generally *won't* fix issues that are a result of using pip in an
unsupported way.
It should also be noted that installing packages into ``sys.path`` in a running Python
process is something that should only be done with care. The import system caches
certain data, and installing new packages while a program is running may not always
behave as expected. In practice, there is rarely an issue, but it is something to be
aware of.
It should also be noted that installing packages into ``sys.path`` in a running
Python process is something that should only be done with care. The import
system caches certain data, and installing new packages while a program is
running may not always behave as expected. In practice, there is rarely an
issue, but it is something to be aware of.
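As a minimal illustration of that caching behaviour (the directory and module name here are hypothetical, not part of pip), a module that appears on ``sys.path`` while the program is running can be made reliably importable by invalidating the import system's caches first:

```python
import importlib
import os
import sys
import tempfile

# Hypothetical scenario: a "freshly installed" module appears on sys.path
# while the program is already running.
target = tempfile.mkdtemp()
sys.path.insert(0, target)
with open(os.path.join(target, "freshly_installed.py"), "w") as f:
    f.write("VALUE = 42\n")

# The import system caches directory listings; invalidating the caches
# ensures the new module is discoverable.
importlib.invalidate_caches()
import freshly_installed

print(freshly_installed.VALUE)  # prints 42
```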
Having said all of the above, it is worth covering the options available if you
decide that you do want to run pip from within your program. The most reliable
approach, and the one that is fully supported, is to run pip in a subprocess. This
is easily done using the standard ``subprocess`` module::
approach, and the one that is fully supported, is to run pip in a subprocess.
This is easily done using the standard ``subprocess`` module::
subprocess.check_call([sys.executable, '-m', 'pip', 'install', 'my_package'])
If you want to process the output further, use one of the other APIs in the module::
If you want to process the output further, use one of the other APIs in the
module::
reqs = subprocess.check_output([sys.executable, '-m', 'pip', 'freeze'])
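For example, the captured bytes can be decoded and split into one requirement string per installed distribution (a sketch; the variable names are illustrative):

```python
import subprocess
import sys

# Run "pip freeze" in a subprocess and capture its output as bytes.
reqs = subprocess.check_output([sys.executable, "-m", "pip", "freeze"])

# Decode and split into individual "name==version" requirement strings,
# dropping any empty lines.
installed = [line for line in reqs.decode("utf-8").splitlines() if line]
for requirement in installed:
    print(requirement)
```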
news/3907.bugfix Normal file
@ -0,0 +1,2 @@
Abort installation if any archive contains a file which would be placed
outside the extraction location.
news/6532.trivial Normal file
@ -0,0 +1,3 @@
Rename ``pip._internal.utils.outdated`` to
``pip._internal.self_outdated_check`` and rename ``pip_version_check``
to ``pip_self_version_check``.
news/6653.trivial Normal file
@ -0,0 +1 @@
Add functional tests for "yanked" files.
@ -1 +1 @@
Clarify WheelBuilder.build() a bit
Clarify WheelBuilder.build() a bit
@ -1 +1 @@
replace is_vcs_url function by is_vcs Link property
replace is_vcs_url function by is_vcs Link property
@ -1,2 +1,2 @@
Correctly uninstall symlinks that were installed in a virtualenv,
by tools such as ``flit install --symlink``.
by tools such as ``flit install --symlink``.
@ -1 +1 @@
Ignore "require_virtualenv" in `pip config`
Ignore "require_virtualenv" in ``pip config``
@ -1,2 +1,3 @@
Remove undocumented support for http:// requirements pointing to SVN
repositories.
Remove undocumented support for un-prefixed URL requirements pointing
to SVN repositories. Users relying on this can get the original behavior
by prefixing their URL with ``svn+`` (which is backwards-compatible).
news/7090.trivial Normal file
@ -0,0 +1,2 @@
Move PipXmlrpcTransport from pip._internal.download to pip._internal.network.xmlrpc
and move associated tests to tests.unit.test_network_xmlrpc
news/7094.trivial Normal file
@ -0,0 +1 @@
Remove DependencyWarning warning from pip._internal
news/7118.bugfix Normal file
@ -0,0 +1 @@
Fix a crash when ``sys.stdin`` is set to ``None``, such as on AWS Lambda.
news/7119.bugfix Normal file
@ -0,0 +1 @@
Fix a crash when ``sys.stdin`` is set to ``None``, such as on AWS Lambda.
@ -1 +1 @@
Fix copy-paste issue in `test_pep518_forkbombs`.
Fix copy-paste issue in ``test_pep518_forkbombs``.
@ -0,0 +1 @@
Remove unused assignment.
@ -1,12 +1,28 @@
"""Release time helpers, executed using nox.
"""Automation using nox.
"""
import glob
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import io
import os
import shutil
import subprocess
import nox
nox.options.reuse_existing_virtualenvs = True
nox.options.sessions = ["lint"]
LOCATIONS = {
"common-wheels": "tests/data/common_wheels",
"protected-pip": "tools/tox_pip.py",
}
REQUIREMENTS = {
"tests": "tools/requirements/tests.txt",
"common-wheels": "tools/requirements/tests-common_wheels.txt",
}
def get_author_list():
"""Get the list of authors from Git commits.
@ -31,22 +47,102 @@ def get_author_list():
return sorted(authors, key=lambda x: x.lower())
def protected_pip(*arguments):
"""Get arguments for session.run, that use a "protected" pip.
This invokes a wrapper script, that forwards calls to original virtualenv
(stable) version, and not the code being tested. This ensures pip being
used is not the code being tested.
"""
return ("python", LOCATIONS["protected-pip"]) + arguments
def should_update_common_wheels():
# If the cache hasn't been created, create it.
if not os.path.exists(LOCATIONS["common-wheels"]):
return True
# If the requirements file was updated after the cache, we'll repopulate it.
cache_last_populated_at = os.path.getmtime(LOCATIONS["common-wheels"])
requirements_updated_at = os.path.getmtime(REQUIREMENTS["common-wheels"])
need_to_repopulate = requirements_updated_at > cache_last_populated_at
# Clear the stale cache.
if need_to_repopulate:
shutil.rmtree(LOCATIONS["common-wheels"], ignore_errors=True)
return need_to_repopulate
# -----------------------------------------------------------------------------
# Ad-hoc commands
# Development Commands
# These are currently prototypes to evaluate whether we want to switch over
# completely to nox for all our automation. Contributors should prefer using
# `tox -e ...` until this note is removed.
# -----------------------------------------------------------------------------
@nox.session(python=["2.7", "3.5", "3.6", "3.7", "pypy"])
def test(session):
# Get the common wheels.
if should_update_common_wheels():
session.run(*protected_pip(
"wheel",
"-w", LOCATIONS["common-wheels"],
"-r", REQUIREMENTS["common-wheels"],
))
# Install sources and dependencies
session.run(*protected_pip("install", "."))
session.run(*protected_pip("install", "-r", REQUIREMENTS["tests"]))
# Parallelize tests as much as possible, by default.
arguments = session.posargs or ["-n", "auto"]
# Run the tests
# LC_CTYPE is set to get UTF-8 output inside of the subprocesses that our
# tests use.
session.run("pytest", *arguments, env={"LC_CTYPE": "en_US.UTF-8"})
@nox.session
def validate_news(session):
session.install("rstcheck")
def docs(session):
session.install(".")
session.install("-r", REQUIREMENTS["docs"])
news_files = sorted(glob.glob("news/*"))
def get_sphinx_build_command(kind):
# Having the conf.py in the docs/html is weird but needed because we
# can not use a different configuration directory vs source directory
# on RTD currently. So, we'll pass "-c docs/html" here.
# See https://github.com/rtfd/readthedocs.org/issues/1543.
return [
"sphinx-build",
"-W",
"-c", "docs/html", # see note above
"-d", "docs/build/doctrees/" + kind,
"-b", kind,
"docs/" + kind,
"docs/build/" + kind,
]
session.run("rstcheck", *news_files)
session.run(*get_sphinx_build_command("html"))
session.run(*get_sphinx_build_command("man"))
# -----------------------------------------------------------------------------
# Commands used during the release process
# -----------------------------------------------------------------------------
@nox.session
def lint(session):
session.install("pre-commit")
if session.posargs:
args = session.posargs + ["--all-files"]
else:
args = ["--all-files", "--show-diff-on-failure"]
session.run("pre-commit", "run", *args)
# -----------------------------------------------------------------------------
# Release Commands
# -----------------------------------------------------------------------------
@nox.session(python=False)
def generate_authors(session):
# Get our list of authors
session.log("Collecting author names")
@ -29,6 +29,7 @@ ignore = W504
[mypy]
follow_imports = silent
ignore_missing_imports = True
disallow_untyped_defs = True
[mypy-pip/_vendor/*]
follow_imports = skip
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import codecs
import os
import re
@ -71,9 +74,9 @@ setup(
},
entry_points={
"console_scripts": [
"pip=pip._internal:main",
"pip%s=pip._internal:main" % sys.version_info[:1],
"pip%s.%s=pip._internal:main" % sys.version_info[:2],
"pip=pip._internal.main:main",
"pip%s=pip._internal.main:main" % sys.version_info[:1],
"pip%s.%s=pip._internal.main:main" % sys.version_info[:2],
],
},
@ -13,7 +13,7 @@ if __package__ == '':
path = os.path.dirname(os.path.dirname(__file__))
sys.path.insert(0, path)
from pip._internal import main as _main # isort:skip # noqa
from pip._internal.main import main as _main # isort:skip # noqa
if __name__ == '__main__':
sys.exit(_main())
@ -1,59 +1,13 @@
#!/usr/bin/env python
from __future__ import absolute_import
import locale
import logging
import os
import sys
import warnings
# We ignore certain warnings from urllib3, since they are not relevant to pip's
# usecases.
from pip._vendor.urllib3.exceptions import (
DependencyWarning,
InsecureRequestWarning,
)
from pip._vendor.urllib3.exceptions import InsecureRequestWarning
import pip._internal.utils.inject_securetransport # noqa
from pip._internal.cli.autocompletion import autocomplete
from pip._internal.cli.main_parser import parse_command
from pip._internal.commands import create_command
from pip._internal.exceptions import PipError
from pip._internal.utils import deprecation
# Raised when using --trusted-host.
warnings.filterwarnings("ignore", category=InsecureRequestWarning)
# Raised since socks support depends on PySocks, which may not be installed.
# Barry Warsaw noted (on 2016-06-17) that this should be done before
# importing pip.vcs, which has since moved to pip._internal.vcs.
warnings.filterwarnings("ignore", category=DependencyWarning)
logger = logging.getLogger(__name__)
def main(args=None):
if args is None:
args = sys.argv[1:]
# Configure our deprecation warnings to be sent through loggers
deprecation.install_warning_logger()
autocomplete()
try:
cmd_name, cmd_args = parse_command(args)
except PipError as exc:
sys.stderr.write("ERROR: %s" % exc)
sys.stderr.write(os.linesep)
sys.exit(1)
# Needed for locale.getpreferredencoding(False) to work
# in pip._internal.utils.encoding.auto_decode
try:
locale.setlocale(locale.LC_ALL, '')
except locale.Error as e:
# setlocale can apparently crash if locale are uninitialized
logger.debug("Ignoring error %s when setting locale", e)
command = create_command(cmd_name, isolated=("--isolated" in cmd_args))
return command.main(cmd_args)
@ -3,6 +3,7 @@
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
import logging
import os
@ -15,7 +16,7 @@ from sysconfig import get_paths
from pip._vendor.pkg_resources import Requirement, VersionConflict, WorkingSet
from pip import __file__ as pip_location
from pip._internal.utils.misc import call_subprocess
from pip._internal.utils.subprocess import call_subprocess
from pip._internal.utils.temp_dir import TempDirectory
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.ui import open_spinner
@ -54,7 +55,6 @@ class BuildEnvironment(object):
def __init__(self):
# type: () -> None
self._temp_dir = TempDirectory(kind="build-env")
self._temp_dir.create()
self._prefixes = OrderedDict((
(name, _Prefix(os.path.join(self._temp_dir.path, name)))
@ -13,9 +13,9 @@ from pip._vendor.packaging.utils import canonicalize_name
from pip._internal.models.link import Link
from pip._internal.utils.compat import expanduser
from pip._internal.utils.misc import path_to_url
from pip._internal.utils.temp_dir import TempDirectory
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.urls import path_to_url
from pip._internal.wheel import InvalidWheelFilename, Wheel
if MYPY_CHECK_RUNNING:
@ -193,7 +193,6 @@ class EphemWheelCache(SimpleWheelCache):
def __init__(self, format_control):
# type: (FormatControl) -> None
self._temp_dir = TempDirectory(kind="ephem-wheel-cache")
self._temp_dir.create()
super(EphemWheelCache, self).__init__(
self._temp_dir.path, format_control
@ -1,6 +1,9 @@
"""Logic that powers autocompletion installed by ``pip completion``.
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import optparse
import os
import sys
@ -9,6 +9,7 @@ pass on state. To be consistent, all options will follow this design.
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
@ -453,9 +454,9 @@ def no_binary():
help="Do not use binary packages. Can be supplied multiple times, and "
"each time adds to the existing value. Accepts either :all: to "
"disable all binary packages, :none: to empty the set, or one or "
"more package names with commas between them. Note that some "
"packages are tricky to compile and may fail to install when "
"this option is used on them.",
"more package names with commas between them (no colons). Note "
"that some packages are tricky to compile and may fail to "
"install when this option is used on them.",
)
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from contextlib import contextmanager
from pip._vendor.contextlib2 import ExitStack
@ -1,4 +1,8 @@
"""Base option parser setup"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import logging
@ -5,16 +5,19 @@ needing download / PackageFinder capability don't unnecessarily import the
PackageFinder machinery and all its vendored dependencies, etc.
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import os
from functools import partial
from pip._internal.cli.base_command import Command
from pip._internal.cli.command_context import CommandContextMixIn
from pip._internal.download import PipSession
from pip._internal.exceptions import CommandError
from pip._internal.index import PackageFinder
from pip._internal.legacy_resolve import Resolver
from pip._internal.models.selection_prefs import SelectionPreferences
from pip._internal.network.session import PipSession
from pip._internal.operations.prepare import RequirementPreparer
from pip._internal.req.constructors import (
install_req_from_editable,
@ -22,8 +25,11 @@ from pip._internal.req.constructors import (
install_req_from_req_string,
)
from pip._internal.req.req_file import parse_requirements
from pip._internal.self_outdated_check import (
make_link_collector,
pip_self_version_check,
)
from pip._internal.utils.misc import normalize_path
from pip._internal.utils.outdated import make_link_collector, pip_version_check
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
@ -133,7 +139,7 @@ class IndexGroupCommand(Command, SessionCommandMixin):
timeout=min(5, options.timeout)
)
with session:
pip_version_check(session, options)
pip_self_version_check(session, options)
class RequirementCommand(IndexGroupCommand):
@ -2,6 +2,9 @@
The main purpose of this module is to expose LinkCollector.collect_links().
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import cgi
import itertools
import logging
@ -17,9 +20,9 @@ from pip._vendor.six.moves.urllib import request as urllib_request
from pip._internal.models.link import Link
from pip._internal.utils.filetypes import ARCHIVE_EXTENSIONS
from pip._internal.utils.misc import path_to_url, redact_auth_from_url
from pip._internal.utils.misc import redact_auth_from_url
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.urls import url_to_path
from pip._internal.utils.urls import path_to_url, url_to_path
from pip._internal.vcs import is_url, vcs
if MYPY_CHECK_RUNNING:
@ -32,7 +35,7 @@ if MYPY_CHECK_RUNNING:
from pip._vendor.requests import Response
from pip._internal.models.search_scope import SearchScope
from pip._internal.download import PipSession
from pip._internal.network.session import PipSession
HTMLElement = xml.etree.ElementTree.Element
ResponseHeaders = MutableMapping[str, str]
@ -156,7 +159,7 @@ def _get_html_response(url, session):
def _get_encoding_from_headers(headers):
# type: (Optional[ResponseHeaders]) -> Optional[str]
# type: (ResponseHeaders) -> Optional[str]
"""Determine if we have any encoding information in our headers.
"""
if headers and "Content-Type" in headers:
@ -244,22 +247,18 @@ def _create_link_from_element(
return link
def parse_links(
html, # type: bytes
encoding, # type: Optional[str]
url, # type: str
):
# type: (...) -> Iterable[Link]
def parse_links(page):
# type: (HTMLPage) -> Iterable[Link]
"""
Parse an HTML document, and yield its anchor elements as Link objects.
:param url: the URL from which the HTML was downloaded.
"""
document = html5lib.parse(
html,
transport_encoding=encoding,
page.content,
transport_encoding=page.encoding,
namespaceHTMLElements=False,
)
url = page.url
base_url = _determine_base_url(document, url)
for anchor in document.findall(".//a"):
link = _create_link_from_element(
@ -275,22 +274,24 @@ def parse_links(
class HTMLPage(object):
"""Represents one page, along with its URL"""
def __init__(self, content, url, headers=None):
# type: (bytes, str, ResponseHeaders) -> None
def __init__(
self,
content, # type: bytes
encoding, # type: Optional[str]
url, # type: str
):
# type: (...) -> None
"""
:param encoding: the encoding to decode the given content.
:param url: the URL from which the HTML was downloaded.
"""
self.content = content
self.encoding = encoding
self.url = url
self.headers = headers
def __str__(self):
return redact_auth_from_url(self.url)
def iter_links(self):
# type: () -> Iterable[Link]
"""Yields all links in the page"""
encoding = _get_encoding_from_headers(self.headers)
for link in parse_links(self.content, encoding=encoding, url=self.url):
yield link
def _handle_get_page_fail(
link, # type: Link
@ -303,6 +304,12 @@ def _handle_get_page_fail(
meth("Could not fetch URL %s: %s - skipping", link, reason)
def _make_html_page(response):
# type: (Response) -> HTMLPage
encoding = _get_encoding_from_headers(response.headers)
return HTMLPage(response.content, encoding=encoding, url=response.url)
def _get_html_page(link, session=None):
# type: (Link, Optional[PipSession]) -> Optional[HTMLPage]
if session is None:
@ -353,7 +360,7 @@ def _get_html_page(link, session=None):
except requests.Timeout:
_handle_get_page_fail(link, "timed out")
else:
return HTMLPage(resp.content, resp.url, resp.headers)
return _make_html_page(resp)
return None
@ -532,7 +539,7 @@ class LinkCollector(object):
pages_links = {}
for page in self._get_pages(url_locations):
pages_links[page.url] = list(page.iter_links())
pages_links[page.url] = list(parse_links(page))
return CollectedLinks(
files=file_links,
@ -1,10 +1,14 @@
"""
Package containing all pip commands
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import importlib
from collections import namedtuple, OrderedDict
from collections import OrderedDict, namedtuple
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import logging
from pip._internal.cli.base_command import Command
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import sys
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import logging
import os
import subprocess
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import locale
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import logging
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import sys
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import hashlib
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
from pip._internal.cli.base_command import Command
@ -3,6 +3,7 @@
# couple errors where we have to know req.name is str rather than
# Optional[str] for the InstallRequirement req.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
@ -319,7 +320,6 @@ class InstallCommand(RequirementCommand):
# Create a target directory for using with the target option
target_temp_dir = TempDirectory(kind="target")
target_temp_dir.create()
target_temp_dir_path = target_temp_dir.path
install_options.append('--home=' + target_temp_dir_path)
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import json
@ -11,12 +14,12 @@ from pip._internal.cli.req_command import IndexGroupCommand
from pip._internal.exceptions import CommandError
from pip._internal.index import PackageFinder
from pip._internal.models.selection_prefs import SelectionPreferences
from pip._internal.self_outdated_check import make_link_collector
from pip._internal.utils.misc import (
dist_is_editable,
get_installed_distributions,
write_output,
)
from pip._internal.utils.outdated import make_link_collector
from pip._internal.utils.packaging import get_installer
logger = logging.getLogger(__name__)
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import logging
@ -14,9 +17,9 @@ from pip._vendor.six.moves import xmlrpc_client # type: ignore
from pip._internal.cli.base_command import Command
from pip._internal.cli.req_command import SessionCommandMixin
from pip._internal.cli.status_codes import NO_MATCHES_FOUND, SUCCESS
from pip._internal.download import PipXmlrpcTransport
from pip._internal.exceptions import CommandError
from pip._internal.models.index import PyPI
from pip._internal.network.xmlrpc import PipXmlrpcTransport
from pip._internal.utils.compat import get_terminal_size
from pip._internal.utils.logging import indent_log
from pip._internal.utils.misc import write_output
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import logging
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
from pip._vendor.packaging.utils import canonicalize_name
@ -1,4 +1,8 @@
# -*- coding: utf-8 -*-
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import logging
@ -13,6 +13,7 @@ Some terminology:
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
import locale
import logging
@ -1,6 +1,5 @@
from pip._internal.distributions.source.legacy import SourceDistribution
from pip._internal.distributions.wheel import WheelDistribution
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import abc
from pip._vendor.six import add_metaclass
@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from pip._internal.distributions.base import AbstractDistribution
@ -1,8 +1,12 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import logging
from pip._internal.build_env import BuildEnvironment
from pip._internal.distributions.base import AbstractDistribution
from pip._internal.exceptions import InstallationError
from pip._internal.utils.subprocess import runner_with_spinner_message
logger = logging.getLogger(__name__)
@ -78,9 +82,13 @@ class SourceDistribution(AbstractDistribution):
# This must be done in a second pass, as the pyproject.toml
# dependencies must be installed before we can call the backend.
with self.req.build_env:
# We need to have the env active when calling the hook.
self.req.spin_message = "Getting requirements to build wheel"
reqs = self.req.pep517_backend.get_requires_for_build_wheel()
runner = runner_with_spinner_message(
"Getting requirements to build wheel"
)
backend = self.req.pep517_backend
with backend.subprocess_runner(runner):
reqs = backend.get_requires_for_build_wheel()
conflicting, missing = self.req.build_env.check_requirements(reqs)
if conflicting:

@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from pip._vendor import pkg_resources
from pip._internal.distributions.base import AbstractDistribution
@ -1,57 +1,34 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import cgi
import email.utils
import json
import logging
import mimetypes
import os
import platform
import re
import shutil
import sys
from contextlib import contextmanager
from pip._vendor import requests, six, urllib3
from pip._vendor.cachecontrol import CacheControlAdapter
from pip._vendor.cachecontrol.cache import BaseCache
from pip._vendor.cachecontrol.caches import FileCache
from pip._vendor.requests.adapters import BaseAdapter, HTTPAdapter
from pip._vendor import requests
from pip._vendor.requests.models import CONTENT_CHUNK_SIZE, Response
from pip._vendor.requests.structures import CaseInsensitiveDict
from pip._vendor.six import PY2
# NOTE: XMLRPC Client is not annotated in typeshed as on 2017-07-17, which is
# why we ignore the type on this import
from pip._vendor.six.moves import xmlrpc_client # type: ignore
from pip._vendor.six.moves.urllib import parse as urllib_parse
import pip
from pip._internal.exceptions import HashMismatch, InstallationError
from pip._internal.models.index import PyPI
from pip._internal.network.auth import MultiDomainBasicAuth
# Import ssl from compat so the initial import occurs in only one place.
from pip._internal.utils.compat import HAS_TLS, ipaddress, ssl
from pip._internal.network.session import PipSession
from pip._internal.utils.encoding import auto_decode
from pip._internal.utils.filesystem import (
adjacent_tmp_file,
check_path_owner,
copy2_fixed,
replace,
)
from pip._internal.utils.glibc import libc_ver
from pip._internal.utils.filesystem import copy2_fixed
from pip._internal.utils.misc import (
ask_path_exists,
backup_dir,
build_url_from_netloc,
consume,
display_path,
ensure_dir,
format_size,
get_installed_version,
hide_url,
parse_netloc,
path_to_display,
path_to_url,
rmtree,
splitext,
)
@ -59,12 +36,12 @@ from pip._internal.utils.temp_dir import TempDirectory
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.ui import DownloadProgressProvider
from pip._internal.utils.unpacking import unpack_file
from pip._internal.utils.urls import get_url_scheme, url_to_path
from pip._internal.utils.urls import get_url_scheme
from pip._internal.vcs import vcs
if MYPY_CHECK_RUNNING:
from typing import (
IO, Callable, Iterator, List, Optional, Text, Tuple, Union,
IO, Callable, List, Optional, Text, Tuple,
)
from mypy_extensions import TypedDict
@ -73,8 +50,6 @@ if MYPY_CHECK_RUNNING:
from pip._internal.utils.hashes import Hashes
from pip._internal.vcs.versioncontrol import VersionControl
SecureOrigin = Tuple[str, str, Optional[Union[int, str]]]
if PY2:
CopytreeKwargs = TypedDict(
'CopytreeKwargs',
@ -98,9 +73,8 @@ if MYPY_CHECK_RUNNING:
__all__ = ['get_file_content',
'path_to_url',
'unpack_vcs_link',
'unpack_file_url', 'is_file_url',
'unpack_file_url',
'unpack_http_url', 'unpack_url',
'parse_content_disposition', 'sanitize_content_filename']
@ -108,431 +82,6 @@ __all__ = ['get_file_content',
logger = logging.getLogger(__name__)
SECURE_ORIGINS = [
# protocol, hostname, port
# Taken from Chrome's list of secure origins (See: http://bit.ly/1qrySKC)
("https", "*", "*"),
("*", "localhost", "*"),
("*", "127.0.0.0/8", "*"),
("*", "::1/128", "*"),
("file", "*", None),
# ssh is always secure.
("ssh", "*", "*"),
] # type: List[SecureOrigin]
# These are environment variables present when running under various
# CI systems. For each variable, some CI systems that use the variable
# are indicated. The collection was chosen so that for each of a number
# of popular systems, at least one of the environment variables is used.
# This list is used to provide some indication of and lower bound for
# CI traffic to PyPI. Thus, it is okay if the list is not comprehensive.
# For more background, see: https://github.com/pypa/pip/issues/5499
CI_ENVIRONMENT_VARIABLES = (
# Azure Pipelines
'BUILD_BUILDID',
# Jenkins
'BUILD_ID',
# AppVeyor, CircleCI, Codeship, Gitlab CI, Shippable, Travis CI
'CI',
# Explicit environment variable.
'PIP_IS_CI',
)
def looks_like_ci():
# type: () -> bool
"""
Return whether it looks like pip is running under CI.
"""
# We don't use the method of checking for a tty (e.g. using isatty())
# because some CI systems mimic a tty (e.g. Travis CI). Thus that
# method doesn't provide definitive information in either direction.
return any(name in os.environ for name in CI_ENVIRONMENT_VARIABLES)
def user_agent():
"""
Return a string representing the user agent.
"""
data = {
"installer": {"name": "pip", "version": pip.__version__},
"python": platform.python_version(),
"implementation": {
"name": platform.python_implementation(),
},
}
if data["implementation"]["name"] == 'CPython':
data["implementation"]["version"] = platform.python_version()
elif data["implementation"]["name"] == 'PyPy':
if sys.pypy_version_info.releaselevel == 'final':
pypy_version_info = sys.pypy_version_info[:3]
else:
pypy_version_info = sys.pypy_version_info
data["implementation"]["version"] = ".".join(
[str(x) for x in pypy_version_info]
)
elif data["implementation"]["name"] == 'Jython':
# Complete Guess
data["implementation"]["version"] = platform.python_version()
elif data["implementation"]["name"] == 'IronPython':
# Complete Guess
data["implementation"]["version"] = platform.python_version()
if sys.platform.startswith("linux"):
from pip._vendor import distro
distro_infos = dict(filter(
lambda x: x[1],
zip(["name", "version", "id"], distro.linux_distribution()),
))
libc = dict(filter(
lambda x: x[1],
zip(["lib", "version"], libc_ver()),
))
if libc:
distro_infos["libc"] = libc
if distro_infos:
data["distro"] = distro_infos
if sys.platform.startswith("darwin") and platform.mac_ver()[0]:
data["distro"] = {"name": "macOS", "version": platform.mac_ver()[0]}
if platform.system():
data.setdefault("system", {})["name"] = platform.system()
if platform.release():
data.setdefault("system", {})["release"] = platform.release()
if platform.machine():
data["cpu"] = platform.machine()
if HAS_TLS:
data["openssl_version"] = ssl.OPENSSL_VERSION
setuptools_version = get_installed_version("setuptools")
if setuptools_version is not None:
data["setuptools_version"] = setuptools_version
# Use None rather than False so as not to give the impression that
# pip knows it is not being run under CI. Rather, it is a null or
# inconclusive result. Also, we include some value rather than no
# value to make it easier to know that the check has been run.
data["ci"] = True if looks_like_ci() else None
user_data = os.environ.get("PIP_USER_AGENT_USER_DATA")
if user_data is not None:
data["user_data"] = user_data
return "{data[installer][name]}/{data[installer][version]} {json}".format(
data=data,
json=json.dumps(data, separators=(",", ":"), sort_keys=True),
)
class LocalFSAdapter(BaseAdapter):
def send(self, request, stream=None, timeout=None, verify=None, cert=None,
proxies=None):
pathname = url_to_path(request.url)
resp = Response()
resp.status_code = 200
resp.url = request.url
try:
stats = os.stat(pathname)
except OSError as exc:
resp.status_code = 404
resp.raw = exc
else:
modified = email.utils.formatdate(stats.st_mtime, usegmt=True)
content_type = mimetypes.guess_type(pathname)[0] or "text/plain"
resp.headers = CaseInsensitiveDict({
"Content-Type": content_type,
"Content-Length": stats.st_size,
"Last-Modified": modified,
})
resp.raw = open(pathname, "rb")
resp.close = resp.raw.close
return resp
def close(self):
pass
@contextmanager
def suppressed_cache_errors():
"""If we can't access the cache then we can just skip caching and process
requests as if caching wasn't enabled.
"""
try:
yield
except (OSError, IOError):
pass
class SafeFileCache(BaseCache):
"""
A file based cache which is safe to use even when the target directory may
not be accessible or writable.
"""
def __init__(self, directory):
# type: (str) -> None
assert directory is not None, "Cache directory must not be None."
super(SafeFileCache, self).__init__()
self.directory = directory
def _get_cache_path(self, name):
# type: (str) -> str
# From cachecontrol.caches.file_cache.FileCache._fn, brought into our
# class for backwards-compatibility and to avoid using a non-public
# method.
hashed = FileCache.encode(name)
parts = list(hashed[:5]) + [hashed]
return os.path.join(self.directory, *parts)
def get(self, key):
# type: (str) -> Optional[bytes]
path = self._get_cache_path(key)
with suppressed_cache_errors():
with open(path, 'rb') as f:
return f.read()
def set(self, key, value):
# type: (str, bytes) -> None
path = self._get_cache_path(key)
with suppressed_cache_errors():
ensure_dir(os.path.dirname(path))
with adjacent_tmp_file(path) as f:
f.write(value)
replace(f.name, path)
def delete(self, key):
# type: (str) -> None
path = self._get_cache_path(key)
with suppressed_cache_errors():
os.remove(path)
class InsecureHTTPAdapter(HTTPAdapter):
def cert_verify(self, conn, url, verify, cert):
conn.cert_reqs = 'CERT_NONE'
conn.ca_certs = None
class PipSession(requests.Session):
timeout = None # type: Optional[int]
def __init__(self, *args, **kwargs):
"""
:param trusted_hosts: Domains not to emit warnings for when not using
HTTPS.
"""
retries = kwargs.pop("retries", 0)
cache = kwargs.pop("cache", None)
trusted_hosts = kwargs.pop("trusted_hosts", []) # type: List[str]
index_urls = kwargs.pop("index_urls", None)
super(PipSession, self).__init__(*args, **kwargs)
# Namespace the attribute with "pip_" just in case to prevent
# possible conflicts with the base class.
self.pip_trusted_origins = [] # type: List[Tuple[str, Optional[int]]]
# Attach our User Agent to the request
self.headers["User-Agent"] = user_agent()
# Attach our Authentication handler to the session
self.auth = MultiDomainBasicAuth(index_urls=index_urls)
# Create our urllib3.Retry instance which will allow us to customize
# how we handle retries.
retries = urllib3.Retry(
# Set the total number of retries that a particular request can
# have.
total=retries,
# A 503 error from PyPI typically means that the Fastly -> Origin
# connection got interrupted in some way. A 503 error in general
# is typically considered a transient error so we'll go ahead and
# retry it.
# A 500 may indicate transient error in Amazon S3
# A 520 or 527 - may indicate transient error in CloudFlare
status_forcelist=[500, 503, 520, 527],
# Add a small amount of back off between failed requests in
# order to prevent hammering the service.
backoff_factor=0.25,
)
# Check to ensure that the directory containing our cache directory
# is owned by the user currently executing pip. If it does not exist
# we will check the parent directory until we find one that does exist.
if cache and not check_path_owner(cache):
logger.warning(
"The directory '%s' or its parent directory is not owned by "
"the current user and the cache has been disabled. Please "
"check the permissions and owner of that directory. If "
"executing pip with sudo, you may want sudo's -H flag.",
cache,
)
cache = None
# We want to _only_ cache responses on securely fetched origins. We do
# this because we can't validate the response of an insecurely fetched
# origin, and we don't want someone to be able to poison the cache and
# require manual eviction from the cache to fix it.
if cache:
secure_adapter = CacheControlAdapter(
cache=SafeFileCache(cache),
max_retries=retries,
)
else:
secure_adapter = HTTPAdapter(max_retries=retries)
# Our Insecure HTTPAdapter disables HTTPS validation. It does not
# support caching (see above) so we'll use it for all http:// URLs as
# well as any https:// host that we've marked as ignoring TLS errors
# for.
insecure_adapter = InsecureHTTPAdapter(max_retries=retries)
# Save this for later use in add_insecure_host().
self._insecure_adapter = insecure_adapter
self.mount("https://", secure_adapter)
self.mount("http://", insecure_adapter)
# Enable file:// urls
self.mount("file://", LocalFSAdapter())
for host in trusted_hosts:
self.add_trusted_host(host, suppress_logging=True)
def add_trusted_host(self, host, source=None, suppress_logging=False):
# type: (str, Optional[str], bool) -> None
"""
:param host: It is okay to provide a host that has previously been
added.
:param source: An optional source string, for logging where the host
string came from.
"""
if not suppress_logging:
msg = 'adding trusted host: {!r}'.format(host)
if source is not None:
msg += ' (from {})'.format(source)
logger.info(msg)
host_port = parse_netloc(host)
if host_port not in self.pip_trusted_origins:
self.pip_trusted_origins.append(host_port)
self.mount(build_url_from_netloc(host) + '/', self._insecure_adapter)
if not host_port[1]:
# Mount wildcard ports for the same host.
self.mount(
build_url_from_netloc(host) + ':',
self._insecure_adapter
)
def iter_secure_origins(self):
# type: () -> Iterator[SecureOrigin]
for secure_origin in SECURE_ORIGINS:
yield secure_origin
for host, port in self.pip_trusted_origins:
yield ('*', host, '*' if port is None else port)
def is_secure_origin(self, location):
# type: (Link) -> bool
# Determine if this url used a secure transport mechanism
parsed = urllib_parse.urlparse(str(location))
origin_protocol, origin_host, origin_port = (
parsed.scheme, parsed.hostname, parsed.port,
)
# The protocol to use to see if the protocol matches.
# Don't count the repository type as part of the protocol: in
# cases such as "git+ssh", only use "ssh". (I.e., Only verify against
# the last scheme.)
origin_protocol = origin_protocol.rsplit('+', 1)[-1]
# Determine if our origin is a secure origin by looking through our
# hardcoded list of secure origins, as well as any additional ones
# configured on this PackageFinder instance.
for secure_origin in self.iter_secure_origins():
secure_protocol, secure_host, secure_port = secure_origin
if origin_protocol != secure_protocol and secure_protocol != "*":
continue
try:
# We need to do this decode dance to ensure that we have a
# unicode object, even on Python 2.x.
addr = ipaddress.ip_address(
origin_host
if (
isinstance(origin_host, six.text_type) or
origin_host is None
)
else origin_host.decode("utf8")
)
network = ipaddress.ip_network(
secure_host
if isinstance(secure_host, six.text_type)
# setting secure_host to proper Union[bytes, str]
# creates problems in other places
else secure_host.decode("utf8") # type: ignore
)
except ValueError:
# We don't have both a valid address or a valid network, so
# we'll check this origin against hostnames.
if (origin_host and
origin_host.lower() != secure_host.lower() and
secure_host != "*"):
continue
else:
# We have a valid address and network, so see if the address
# is contained within the network.
if addr not in network:
continue
# Check to see if the port matches.
if (origin_port != secure_port and
secure_port != "*" and
secure_port is not None):
continue
# If we've gotten here, then this origin matches the current
# secure origin and we should return True
return True
# If we've gotten to this point, then the origin isn't secure and we
# will not accept it as a valid location to search. We will however
# log a warning that we are ignoring it.
logger.warning(
"The repository located at %s is not a trusted or secure host and "
"is being ignored. If this repository is available via HTTPS we "
"recommend you use HTTPS instead, otherwise you may silence "
"this warning and allow it anyway with '--trusted-host %s'.",
origin_host,
origin_host,
)
return False
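The address-vs-network fallback above can be exercised in isolation. The helper below is a hypothetical standalone sketch of the matching rules (wildcard, case-insensitive hostname, CIDR network), not pip's API:

```python
import ipaddress

def host_matches(origin_host, secure_host):
    """Return True if origin_host falls inside secure_host.

    secure_host may be a wildcard, a hostname, or a CIDR network,
    mirroring the fallback logic in is_secure_origin above.
    """
    if secure_host == "*":
        return True
    try:
        addr = ipaddress.ip_address(origin_host)
        network = ipaddress.ip_network(secure_host)
    except ValueError:
        # Not a valid address/network pair: compare as hostnames.
        return origin_host.lower() == secure_host.lower()
    # Both parsed: membership test against the network.
    return addr in network

print(host_matches("127.0.0.1", "127.0.0.0/8"))  # True
print(host_matches("localhost", "LOCALHOST"))    # True
print(host_matches("8.8.8.8", "127.0.0.0/8"))    # False
```

The real method additionally strips VCS prefixes (`git+ssh` becomes `ssh`) and checks the port before declaring an origin secure.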
def request(self, method, url, *args, **kwargs):
# Allow setting a default timeout on a session
kwargs.setdefault("timeout", self.timeout)
# Dispatch the actual request
return super(PipSession, self).request(method, url, *args, **kwargs)
def get_file_content(url, comes_from=None, session=None):
# type: (str, Optional[str], Optional[PipSession]) -> Tuple[str, Text]
"""Gets the content of a file; it may be a filename, file: URL, or
@@ -602,23 +151,6 @@ def _get_used_vcs_backend(link):
return None
def is_file_url(link):
# type: (Link) -> bool
return link.url.lower().startswith('file:')
def is_dir_url(link):
# type: (Link) -> bool
"""Return whether a file:// Link points to a directory.
``link`` must not have any other scheme but file://. Call is_file_url()
first.
"""
link_path = link.file_path
return os.path.isdir(link_path)
def _progress_indicator(iterable, *args, **kwargs):
return iterable
@@ -849,7 +381,7 @@ def unpack_file_url(
"""
link_path = link.file_path
# If it's a url to a local directory
if is_dir_url(link):
if link.is_existing_dir():
if os.path.isdir(location):
rmtree(location)
_copy_source_tree(link_path, location)
@@ -888,35 +420,6 @@ def unpack_file_url(
_copy_file(from_path, download_dir, link)
class PipXmlrpcTransport(xmlrpc_client.Transport):
"""Provide a `xmlrpclib.Transport` implementation via a `PipSession`
object.
"""
def __init__(self, index_url, session, use_datetime=False):
xmlrpc_client.Transport.__init__(self, use_datetime)
index_parts = urllib_parse.urlparse(index_url)
self._scheme = index_parts.scheme
self._session = session
def request(self, host, handler, request_body, verbose=False):
parts = (self._scheme, host, handler, None, None, None)
url = urllib_parse.urlunparse(parts)
try:
headers = {'Content-Type': 'text/xml'}
response = self._session.post(url, data=request_body,
headers=headers, stream=True)
response.raise_for_status()
self.verbose = verbose
return self.parse_response(response.raw)
except requests.HTTPError as exc:
logger.critical(
"HTTP error %s while getting %s",
exc.response.status_code, url,
)
raise
def unpack_url(
link, # type: Link
location, # type: str
@@ -945,7 +448,7 @@ def unpack_url(
unpack_vcs_link(link, location)
# file urls
elif is_file_url(link):
elif link.is_file:
unpack_file_url(link, location, download_dir, hashes=hashes)
# http urls
@@ -1055,19 +558,21 @@ def _check_download_dir(link, download_dir, hashes):
If a correct file is found return its path else None
"""
download_path = os.path.join(download_dir, link.filename)
if os.path.exists(download_path):
# If already downloaded, does its hash match?
logger.info('File was already downloaded %s', download_path)
if hashes:
try:
hashes.check_against_path(download_path)
except HashMismatch:
logger.warning(
'Previously-downloaded file %s has bad hash. '
'Re-downloading.',
download_path
)
os.unlink(download_path)
return None
return download_path
return None
if not os.path.exists(download_path):
return None
# If already downloaded, does its hash match?
logger.info('File was already downloaded %s', download_path)
if hashes:
try:
hashes.check_against_path(download_path)
except HashMismatch:
logger.warning(
'Previously-downloaded file %s has bad hash. '
'Re-downloading.',
download_path
)
os.unlink(download_path)
return None
return download_path
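The guard-clause rewrite of `_check_download_dir` can be sketched with a self-contained stand-in; the hash check here uses a plain SHA-256 digest instead of pip's `Hashes` object, and all names are illustrative:

```python
import hashlib
import os
import tempfile

def check_download_dir(download_dir, filename, expected_sha256=None):
    """Return the cached file's path if present (and hash-valid), else None."""
    download_path = os.path.join(download_dir, filename)
    if not os.path.exists(download_path):
        return None
    if expected_sha256 is not None:
        with open(download_path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        if digest != expected_sha256:
            # Bad cached file: discard it so the caller re-downloads.
            os.unlink(download_path)
            return None
    return download_path

tmp = tempfile.mkdtemp()
path = os.path.join(tmp, "pkg.tar.gz")
with open(path, "wb") as f:
    f.write(b"payload")
good = hashlib.sha256(b"payload").hexdigest()
print(check_download_dir(tmp, "pkg.tar.gz", good) == path)  # True
print(check_download_dir(tmp, "pkg.tar.gz", "0" * 64))      # None (file removed)
```

Flipping the `os.path.exists` test into an early return is what lets the happy path dedent, as the new version above shows.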


@@ -1,4 +1,8 @@
"""Exceptions used throughout package"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
from itertools import chain, groupby, repeat
@@ -277,7 +281,6 @@ class HashMismatch(HashError):
for e in expecteds)
lines.append(' Got %s\n' %
self.gots[hash_name].hexdigest())
prefix = ' or'
return '\n'.join(lines)


@@ -2,6 +2,7 @@
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
@@ -34,7 +35,7 @@ from pip._internal.wheel import Wheel
if MYPY_CHECK_RUNNING:
from typing import (
Any, FrozenSet, Iterable, List, Optional, Set, Text, Tuple,
FrozenSet, Iterable, List, Optional, Set, Text, Tuple, Union,
)
from pip._vendor.packaging.version import _BaseVersion
from pip._internal.collector import LinkCollector
@@ -43,7 +44,7 @@ if MYPY_CHECK_RUNNING:
from pip._internal.pep425tags import Pep425Tag
from pip._internal.utils.hashes import Hashes
BuildTag = Tuple[Any, ...] # either empty tuple or Tuple[int, str]
BuildTag = Union[Tuple[()], Tuple[int, str]]
CandidateSortingKey = (
Tuple[int, int, int, _BaseVersion, BuildTag, Optional[int]]
)
@@ -511,7 +512,7 @@ class CandidateEvaluator(object):
"""
valid_tags = self._supported_tags
support_num = len(valid_tags)
build_tag = tuple() # type: BuildTag
build_tag = () # type: BuildTag
binary_preference = 0
link = candidate.link
if link.is_wheel:


@@ -12,6 +12,7 @@ for sub-dependencies
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
import logging
import sys
@@ -44,7 +45,7 @@ if MYPY_CHECK_RUNNING:
from pip._vendor import pkg_resources
from pip._internal.distributions import AbstractDistribution
from pip._internal.download import PipSession
from pip._internal.network.session import PipSession
from pip._internal.index import PackageFinder
from pip._internal.operations.prepare import RequirementPreparer
from pip._internal.req.req_install import InstallRequirement


@@ -2,6 +2,7 @@
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import

src/pip/_internal/main.py (new file, 47 lines)

@@ -0,0 +1,47 @@
"""Primary application entrypoint.
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import locale
import logging
import os
import sys
from pip._internal.cli.autocompletion import autocomplete
from pip._internal.cli.main_parser import parse_command
from pip._internal.commands import create_command
from pip._internal.exceptions import PipError
from pip._internal.utils import deprecation
logger = logging.getLogger(__name__)
def main(args=None):
if args is None:
args = sys.argv[1:]
# Configure our deprecation warnings to be sent through loggers
deprecation.install_warning_logger()
autocomplete()
try:
cmd_name, cmd_args = parse_command(args)
except PipError as exc:
sys.stderr.write("ERROR: %s" % exc)
sys.stderr.write(os.linesep)
sys.exit(1)
# Needed for locale.getpreferredencoding(False) to work
# in pip._internal.utils.encoding.auto_decode
try:
locale.setlocale(locale.LC_ALL, '')
except locale.Error as e:
# setlocale can apparently crash if locales are uninitialized
logger.debug("Ignoring error %s when setting locale", e)
command = create_command(cmd_name, isolated=("--isolated" in cmd_args))
return command.main(cmd_args)


@@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from pip._vendor.packaging.version import parse as parse_version
from pip._internal.utils.models import KeyBasedCompareMixin


@@ -1,5 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
from pip._vendor.packaging.utils import canonicalize_name


@@ -1,3 +1,7 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import os
import posixpath
import re
@@ -5,14 +9,13 @@ from pip._vendor.six.moves.urllib import parse as urllib_parse
from pip._internal.utils.filetypes import WHEEL_EXTENSION
from pip._internal.utils.misc import (
path_to_url,
redact_auth_from_url,
split_auth_from_netloc,
splitext,
)
from pip._internal.utils.models import KeyBasedCompareMixin
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.urls import url_to_path
from pip._internal.utils.urls import path_to_url, url_to_path
if MYPY_CHECK_RUNNING:
from typing import Optional, Text, Tuple, Union
@@ -180,6 +183,15 @@ class Link(KeyBasedCompareMixin):
# type: () -> Optional[str]
return posixpath.basename(self._url.split('#', 1)[0].split('?', 1)[0])
@property
def is_file(self):
# type: () -> bool
return self.scheme == 'file'
def is_existing_dir(self):
# type: () -> bool
return self.is_file and os.path.isdir(self.file_path)
@property
def is_wheel(self):
# type: () -> bool


@@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import itertools
import logging
import os


@@ -4,6 +4,9 @@ Contains interface (MultiDomainBasicAuth) and associated glue code for
providing credentials in the context of network requests.
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import logging
from pip._vendor.requests.auth import AuthBase, HTTPBasicAuth


@@ -0,0 +1,75 @@
"""HTTP cache implementation.
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import os
from contextlib import contextmanager
from pip._vendor.cachecontrol.cache import BaseCache
from pip._vendor.cachecontrol.caches import FileCache
from pip._internal.utils.filesystem import adjacent_tmp_file, replace
from pip._internal.utils.misc import ensure_dir
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
from typing import Optional
@contextmanager
def suppressed_cache_errors():
"""If we can't access the cache then we can just skip caching and process
requests as if caching wasn't enabled.
"""
try:
yield
except (OSError, IOError):
pass
class SafeFileCache(BaseCache):
"""
A file based cache which is safe to use even when the target directory may
not be accessible or writable.
"""
def __init__(self, directory):
# type: (str) -> None
assert directory is not None, "Cache directory must not be None."
super(SafeFileCache, self).__init__()
self.directory = directory
def _get_cache_path(self, name):
# type: (str) -> str
# From cachecontrol.caches.file_cache.FileCache._fn, brought into our
# class for backwards-compatibility and to avoid using a non-public
# method.
hashed = FileCache.encode(name)
parts = list(hashed[:5]) + [hashed]
return os.path.join(self.directory, *parts)
def get(self, key):
# type: (str) -> Optional[bytes]
path = self._get_cache_path(key)
with suppressed_cache_errors():
with open(path, 'rb') as f:
return f.read()
def set(self, key, value):
# type: (str, bytes) -> None
path = self._get_cache_path(key)
with suppressed_cache_errors():
ensure_dir(os.path.dirname(path))
with adjacent_tmp_file(path) as f:
f.write(value)
replace(f.name, path)
def delete(self, key):
# type: (str) -> None
path = self._get_cache_path(key)
with suppressed_cache_errors():
os.remove(path)
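The sharded layout produced by `_get_cache_path` can be reproduced standalone. The SHA-224 digest below mirrors what cachecontrol's `FileCache.encode` computes (an assumption to verify against the vendored copy):

```python
import hashlib
import os

def cache_path(directory, name):
    """Shard a cache key into five one-character subdirectories plus the
    full digest, keeping any single directory from growing too large."""
    hashed = hashlib.sha224(name.encode()).hexdigest()
    parts = list(hashed[:5]) + [hashed]
    return os.path.join(directory, *parts)

print(cache_path("/tmp/http-cache", "https://pypi.org/simple/pip/"))
```

Spreading entries across `hashed[:5]` levels is the same fan-out trick many on-disk caches use to avoid directories with thousands of files.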


@@ -0,0 +1,420 @@
"""PipSession and supporting code, containing all pip-specific
network request configuration and behavior.
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import email.utils
import json
import logging
import mimetypes
import os
import platform
import sys
from pip._vendor import requests, six, urllib3
from pip._vendor.cachecontrol import CacheControlAdapter
from pip._vendor.requests.adapters import BaseAdapter, HTTPAdapter
from pip._vendor.requests.models import Response
from pip._vendor.requests.structures import CaseInsensitiveDict
from pip._vendor.six.moves.urllib import parse as urllib_parse
from pip import __version__
from pip._internal.network.auth import MultiDomainBasicAuth
from pip._internal.network.cache import SafeFileCache
# Import ssl from compat so the initial import occurs in only one place.
from pip._internal.utils.compat import HAS_TLS, ipaddress, ssl
from pip._internal.utils.filesystem import check_path_owner
from pip._internal.utils.glibc import libc_ver
from pip._internal.utils.misc import (
build_url_from_netloc,
get_installed_version,
parse_netloc,
)
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.urls import url_to_path
if MYPY_CHECK_RUNNING:
from typing import (
Iterator, List, Optional, Tuple, Union,
)
from pip._internal.models.link import Link
SecureOrigin = Tuple[str, str, Optional[Union[int, str]]]
logger = logging.getLogger(__name__)
SECURE_ORIGINS = [
# protocol, hostname, port
# Taken from Chrome's list of secure origins (See: http://bit.ly/1qrySKC)
("https", "*", "*"),
("*", "localhost", "*"),
("*", "127.0.0.0/8", "*"),
("*", "::1/128", "*"),
("file", "*", None),
# ssh is always secure.
("ssh", "*", "*"),
] # type: List[SecureOrigin]
# These are environment variables present when running under various
# CI systems. For each variable, some CI systems that use the variable
# are indicated. The collection was chosen so that for each of a number
# of popular systems, at least one of the environment variables is used.
# This list is used to provide some indication of and lower bound for
# CI traffic to PyPI. Thus, it is okay if the list is not comprehensive.
# For more background, see: https://github.com/pypa/pip/issues/5499
CI_ENVIRONMENT_VARIABLES = (
# Azure Pipelines
'BUILD_BUILDID',
# Jenkins
'BUILD_ID',
# AppVeyor, CircleCI, Codeship, Gitlab CI, Shippable, Travis CI
'CI',
# Explicit environment variable.
'PIP_IS_CI',
)
def looks_like_ci():
# type: () -> bool
"""
Return whether it looks like pip is running under CI.
"""
# We don't use the method of checking for a tty (e.g. using isatty())
# because some CI systems mimic a tty (e.g. Travis CI). Thus that
# method doesn't provide definitive information in either direction.
return any(name in os.environ for name in CI_ENVIRONMENT_VARIABLES)
def user_agent():
"""
Return a string representing the user agent.
"""
data = {
"installer": {"name": "pip", "version": __version__},
"python": platform.python_version(),
"implementation": {
"name": platform.python_implementation(),
},
}
if data["implementation"]["name"] == 'CPython':
data["implementation"]["version"] = platform.python_version()
elif data["implementation"]["name"] == 'PyPy':
if sys.pypy_version_info.releaselevel == 'final':
pypy_version_info = sys.pypy_version_info[:3]
else:
pypy_version_info = sys.pypy_version_info
data["implementation"]["version"] = ".".join(
[str(x) for x in pypy_version_info]
)
elif data["implementation"]["name"] == 'Jython':
# Complete Guess
data["implementation"]["version"] = platform.python_version()
elif data["implementation"]["name"] == 'IronPython':
# Complete Guess
data["implementation"]["version"] = platform.python_version()
if sys.platform.startswith("linux"):
from pip._vendor import distro
distro_infos = dict(filter(
lambda x: x[1],
zip(["name", "version", "id"], distro.linux_distribution()),
))
libc = dict(filter(
lambda x: x[1],
zip(["lib", "version"], libc_ver()),
))
if libc:
distro_infos["libc"] = libc
if distro_infos:
data["distro"] = distro_infos
if sys.platform.startswith("darwin") and platform.mac_ver()[0]:
data["distro"] = {"name": "macOS", "version": platform.mac_ver()[0]}
if platform.system():
data.setdefault("system", {})["name"] = platform.system()
if platform.release():
data.setdefault("system", {})["release"] = platform.release()
if platform.machine():
data["cpu"] = platform.machine()
if HAS_TLS:
data["openssl_version"] = ssl.OPENSSL_VERSION
setuptools_version = get_installed_version("setuptools")
if setuptools_version is not None:
data["setuptools_version"] = setuptools_version
# Use None rather than False so as not to give the impression that
# pip knows it is not being run under CI. Rather, it is a null or
# inconclusive result. Also, we include some value rather than no
# value to make it easier to know that the check has been run.
data["ci"] = True if looks_like_ci() else None
user_data = os.environ.get("PIP_USER_AGENT_USER_DATA")
if user_data is not None:
data["user_data"] = user_data
return "{data[installer][name]}/{data[installer][version]} {json}".format(
data=data,
json=json.dumps(data, separators=(",", ":"), sort_keys=True),
)
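The compact, sorted JSON encoding used for the user-agent payload can be shown in miniature (the version string here is a placeholder, not pip's real value):

```python
import json
import platform

data = {
    "installer": {"name": "pip", "version": "19.3"},
    "python": platform.python_version(),
}
ua = "{name}/{version} {payload}".format(
    name=data["installer"]["name"],
    version=data["installer"]["version"],
    # Compact separators and sorted keys keep the header small and stable.
    payload=json.dumps(data, separators=(",", ":"), sort_keys=True),
)
print(ua)
```

The receiving side can split on the first space and `json.loads` the remainder, which is how PyPI's download statistics recover this metadata.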
class LocalFSAdapter(BaseAdapter):
def send(self, request, stream=None, timeout=None, verify=None, cert=None,
proxies=None):
pathname = url_to_path(request.url)
resp = Response()
resp.status_code = 200
resp.url = request.url
try:
stats = os.stat(pathname)
except OSError as exc:
resp.status_code = 404
resp.raw = exc
else:
modified = email.utils.formatdate(stats.st_mtime, usegmt=True)
content_type = mimetypes.guess_type(pathname)[0] or "text/plain"
resp.headers = CaseInsensitiveDict({
"Content-Type": content_type,
"Content-Length": stats.st_size,
"Last-Modified": modified,
})
resp.raw = open(pathname, "rb")
resp.close = resp.raw.close
return resp
def close(self):
pass
class InsecureHTTPAdapter(HTTPAdapter):
def cert_verify(self, conn, url, verify, cert):
conn.cert_reqs = 'CERT_NONE'
conn.ca_certs = None
class PipSession(requests.Session):
timeout = None # type: Optional[int]
def __init__(self, *args, **kwargs):
"""
:param trusted_hosts: Domains not to emit warnings for when not using
HTTPS.
"""
retries = kwargs.pop("retries", 0)
cache = kwargs.pop("cache", None)
trusted_hosts = kwargs.pop("trusted_hosts", []) # type: List[str]
index_urls = kwargs.pop("index_urls", None)
super(PipSession, self).__init__(*args, **kwargs)
# Namespace the attribute with "pip_" just in case to prevent
# possible conflicts with the base class.
self.pip_trusted_origins = [] # type: List[Tuple[str, Optional[int]]]
# Attach our User Agent to the request
self.headers["User-Agent"] = user_agent()
# Attach our Authentication handler to the session
self.auth = MultiDomainBasicAuth(index_urls=index_urls)
# Create our urllib3.Retry instance which will allow us to customize
# how we handle retries.
retries = urllib3.Retry(
# Set the total number of retries that a particular request can
# have.
total=retries,
# A 503 error from PyPI typically means that the Fastly -> Origin
# connection got interrupted in some way. A 503 error in general
# is typically considered a transient error so we'll go ahead and
# retry it.
# A 500 may indicate transient error in Amazon S3
# A 520 or 527 - may indicate transient error in CloudFlare
status_forcelist=[500, 503, 520, 527],
# Add a small amount of back off between failed requests in
# order to prevent hammering the service.
backoff_factor=0.25,
)
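The `backoff_factor` above feeds urllib3's documented delay formula, `backoff_factor * 2**(n - 1)` for the n-th consecutive failure (exact first-retry behavior varies across urllib3 versions); a quick sketch of the resulting delays:

```python
def backoff_delays(backoff_factor, retries):
    """Illustrative sleep intervals for consecutive failed requests,
    following urllib3.Retry's documented exponential-backoff formula."""
    return [backoff_factor * (2 ** (n - 1)) for n in range(1, retries + 1)]

print(backoff_delays(0.25, 4))  # [0.25, 0.5, 1.0, 2.0]
```

With `backoff_factor=0.25` the delays stay well under a second for the first few retries, which matches the comment's goal of easing off without stalling the install.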
# Check to ensure that the directory containing our cache directory
# is owned by the user currently executing pip. If it does not exist
# we will check the parent directory until we find one that does exist.
if cache and not check_path_owner(cache):
logger.warning(
"The directory '%s' or its parent directory is not owned by "
"the current user and the cache has been disabled. Please "
"check the permissions and owner of that directory. If "
"executing pip with sudo, you may want sudo's -H flag.",
cache,
)
cache = None
# We want to _only_ cache responses on securely fetched origins. We do
# this because we can't validate the response of an insecurely fetched
# origin, and we don't want someone to be able to poison the cache and
# require manual eviction from the cache to fix it.
if cache:
secure_adapter = CacheControlAdapter(
cache=SafeFileCache(cache),
max_retries=retries,
)
else:
secure_adapter = HTTPAdapter(max_retries=retries)
# Our Insecure HTTPAdapter disables HTTPS validation. It does not
# support caching (see above) so we'll use it for all http:// URLs as
# well as any https:// host that we've marked as ignoring TLS errors
# for.
insecure_adapter = InsecureHTTPAdapter(max_retries=retries)
# Save this for later use in add_insecure_host().
self._insecure_adapter = insecure_adapter
self.mount("https://", secure_adapter)
self.mount("http://", insecure_adapter)
# Enable file:// urls
self.mount("file://", LocalFSAdapter())
for host in trusted_hosts:
self.add_trusted_host(host, suppress_logging=True)
def add_trusted_host(self, host, source=None, suppress_logging=False):
# type: (str, Optional[str], bool) -> None
"""
:param host: It is okay to provide a host that has previously been
added.
:param source: An optional source string, for logging where the host
string came from.
"""
if not suppress_logging:
msg = 'adding trusted host: {!r}'.format(host)
if source is not None:
msg += ' (from {})'.format(source)
logger.info(msg)
host_port = parse_netloc(host)
if host_port not in self.pip_trusted_origins:
self.pip_trusted_origins.append(host_port)
self.mount(build_url_from_netloc(host) + '/', self._insecure_adapter)
if not host_port[1]:
# Mount wildcard ports for the same host.
self.mount(
build_url_from_netloc(host) + ':',
self._insecure_adapter
)
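Mounting both `build_url_from_netloc(host) + '/'` and `... + ':'` works because requests resolves a URL to the mounted adapter with the longest matching prefix, so the trailing-colon mount catches explicit ports while the trailing-slash mount catches the default port. A rough stdlib-only sketch of that lookup (illustrative names, not the requests internals):

```python
def resolve_adapter(adapters, url):
    # requests scans mounted prefixes longest-first, so the specific
    # "https://example.com:" mount beats the generic "https://" one.
    for prefix in sorted(adapters, key=len, reverse=True):
        if url.lower().startswith(prefix.lower()):
            return adapters[prefix]
    return None

mounts = {"https://": "secure", "https://example.com:": "insecure"}
```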
def iter_secure_origins(self):
# type: () -> Iterator[SecureOrigin]
for secure_origin in SECURE_ORIGINS:
yield secure_origin
for host, port in self.pip_trusted_origins:
yield ('*', host, '*' if port is None else port)
def is_secure_origin(self, location):
# type: (Link) -> bool
# Determine if this url used a secure transport mechanism
parsed = urllib_parse.urlparse(str(location))
origin_protocol, origin_host, origin_port = (
parsed.scheme, parsed.hostname, parsed.port,
)
# Determine the protocol to match against.
# Don't count the repository type as part of the protocol: in
# cases such as "git+ssh", only use "ssh". (I.e., only verify against
# the last scheme.)
origin_protocol = origin_protocol.rsplit('+', 1)[-1]
# Determine if our origin is a secure origin by looking through our
# hardcoded list of secure origins, as well as any additional ones
# configured on this PackageFinder instance.
for secure_origin in self.iter_secure_origins():
secure_protocol, secure_host, secure_port = secure_origin
if origin_protocol != secure_protocol and secure_protocol != "*":
continue
try:
# We need to do this decode dance to ensure that we have a
# unicode object, even on Python 2.x.
addr = ipaddress.ip_address(
origin_host
if (
isinstance(origin_host, six.text_type) or
origin_host is None
)
else origin_host.decode("utf8")
)
network = ipaddress.ip_network(
secure_host
if isinstance(secure_host, six.text_type)
# setting secure_host to proper Union[bytes, str]
# creates problems in other places
else secure_host.decode("utf8") # type: ignore
)
except ValueError:
# We don't have both a valid address and a valid network, so
# we'll check this origin against hostnames.
if (
origin_host and
origin_host.lower() != secure_host.lower() and
secure_host != "*"
):
continue
else:
# We have a valid address and network, so see if the address
# is contained within the network.
if addr not in network:
continue
# Check to see if the port matches.
if (
origin_port != secure_port and
secure_port != "*" and
secure_port is not None
):
continue
# If we've gotten here, then this origin matches the current
# secure origin and we should return True.
return True
# If we've gotten to this point, then the origin isn't secure and we
# will not accept it as a valid location to search. We will however
# log a warning that we are ignoring it.
logger.warning(
"The repository located at %s is not a trusted or secure host and "
"is being ignored. If this repository is available via HTTPS we "
"recommend you use HTTPS instead, otherwise you may silence "
"this warning and allow it anyway with '--trusted-host %s'.",
origin_host,
origin_host,
)
return False
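The address-versus-network branch above can be illustrated with a small standalone check: a trusted origin given as a network such as `127.0.0.0/8` should match any address inside it, while non-IP values fall back to plain hostname comparison. This is a simplified sketch of the same idea, not pip's actual helper:

```python
import ipaddress

def host_matches(origin_host, secure_host):
    # Try IP-literal matching first: a trusted "127.0.0.0/8" network
    # should match an origin of "127.0.0.1".
    try:
        addr = ipaddress.ip_address(origin_host)
        network = ipaddress.ip_network(secure_host)
    except ValueError:
        # Not both valid IP forms, so compare as hostnames instead.
        return secure_host == "*" or origin_host.lower() == secure_host.lower()
    return addr in network
```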
def request(self, method, url, *args, **kwargs):
# Allow setting a default timeout on a session
kwargs.setdefault("timeout", self.timeout)
# Dispatch the actual request
return super(PipSession, self).request(method, url, *args, **kwargs)

View file

@ -0,0 +1,44 @@
"""xmlrpclib.Transport implementation
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import logging
from pip._vendor import requests
# NOTE: XMLRPC Client is not annotated in typeshed as of 2017-07-17, which is
# why we ignore the type on this import
from pip._vendor.six.moves import xmlrpc_client # type: ignore
from pip._vendor.six.moves.urllib import parse as urllib_parse
logger = logging.getLogger(__name__)
class PipXmlrpcTransport(xmlrpc_client.Transport):
"""Provide a `xmlrpclib.Transport` implementation via a `PipSession`
object.
"""
def __init__(self, index_url, session, use_datetime=False):
xmlrpc_client.Transport.__init__(self, use_datetime)
index_parts = urllib_parse.urlparse(index_url)
self._scheme = index_parts.scheme
self._session = session
def request(self, host, handler, request_body, verbose=False):
parts = (self._scheme, host, handler, None, None, None)
url = urllib_parse.urlunparse(parts)
try:
headers = {'Content-Type': 'text/xml'}
response = self._session.post(url, data=request_body,
headers=headers, stream=True)
response.raise_for_status()
self.verbose = verbose
return self.parse_response(response.raw)
except requests.HTTPError as exc:
logger.critical(
"HTTP error %s while getting %s",
exc.response.status_code, url,
)
raise
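The transport's `request()` rebuilds its target URL from the index URL's scheme plus the host and handler that the xmlrpc client passes in per call. That reconstruction can be sketched on its own with the stdlib; `build_xmlrpc_url` is an illustrative name:

```python
from urllib.parse import urlparse, urlunparse

def build_xmlrpc_url(index_url, host, handler):
    # Reuse the scheme of the configured index, but take host and
    # handler from the per-request arguments, as the transport does.
    scheme = urlparse(index_url).scheme
    return urlunparse((scheme, host, handler, None, None, None))
```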

View file

@ -3,6 +3,7 @@
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
import logging
from collections import namedtuple
@ -68,8 +69,8 @@ def check_package_set(package_set, should_ignore=None):
def should_ignore(name):
return False
missing = dict()
conflicting = dict()
missing = {}
conflicting = {}
for package_name in package_set:
# Info about dependencies of package_name

View file

@ -1,5 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import

View file

@ -4,8 +4,9 @@
import logging
import os
from pip._internal.utils.misc import call_subprocess, ensure_dir
from pip._internal.utils.misc import ensure_dir
from pip._internal.utils.setuptools_build import make_setuptools_shim_args
from pip._internal.utils.subprocess import call_subprocess
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
@ -16,7 +17,13 @@ logger = logging.getLogger(__name__)
def get_metadata_generator(install_req):
# type: (InstallRequirement) -> Callable[[InstallRequirement], None]
# type: (InstallRequirement) -> Callable[[InstallRequirement], str]
"""Return a callable metadata generator for this InstallRequirement.
A metadata generator takes an InstallRequirement (install_req) as an input,
generates metadata via the appropriate process for that install_req and
returns the generated metadata directory.
"""
if not install_req.use_pep517:
return _generate_metadata_legacy
@ -24,7 +31,7 @@ def get_metadata_generator(install_req):
def _generate_metadata_legacy(install_req):
# type: (InstallRequirement) -> None
# type: (InstallRequirement) -> str
req_details_str = install_req.name or "from {}".format(install_req.link)
logger.debug(
'Running setup.py (path:%s) egg_info for package %s',
@ -41,7 +48,9 @@ def _generate_metadata_legacy(install_req):
# egg.
egg_base_option = [] # type: List[str]
if not install_req.editable:
egg_info_dir = os.path.join(install_req.setup_py_dir, 'pip-egg-info')
egg_info_dir = os.path.join(
install_req.unpacked_source_directory, 'pip-egg-info',
)
egg_base_option = ['--egg-base', egg_info_dir]
# setuptools complains if the target directory does not exist.
@ -50,11 +59,14 @@ def _generate_metadata_legacy(install_req):
with install_req.build_env:
call_subprocess(
base_cmd + ["egg_info"] + egg_base_option,
cwd=install_req.setup_py_dir,
cwd=install_req.unpacked_source_directory,
command_desc='python setup.py egg_info',
)
# Return the metadata directory.
return install_req.find_egg_info()
def _generate_metadata(install_req):
# type: (InstallRequirement) -> None
install_req.prepare_pep517_metadata()
# type: (InstallRequirement) -> str
return install_req.prepare_pep517_metadata()

View file

@ -3,6 +3,7 @@
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
import logging
import os
@ -13,7 +14,7 @@ from pip._internal.distributions import (
make_distribution_for_install_requirement,
)
from pip._internal.distributions.installed import InstalledDistribution
from pip._internal.download import is_dir_url, is_file_url, unpack_url
from pip._internal.download import unpack_url
from pip._internal.exceptions import (
DirectoryUrlHashUnsupported,
HashUnpinned,
@ -32,8 +33,8 @@ if MYPY_CHECK_RUNNING:
from typing import Optional
from pip._internal.distributions import AbstractDistribution
from pip._internal.download import PipSession
from pip._internal.index import PackageFinder
from pip._internal.network.session import PipSession
from pip._internal.req.req_install import InstallRequirement
from pip._internal.req.req_tracker import RequirementTracker
@ -160,7 +161,7 @@ class RequirementPreparer(object):
# hash provided.
if link.is_vcs:
raise VcsHashUnsupported()
elif is_file_url(link) and is_dir_url(link):
elif link.is_existing_dir():
raise DirectoryUrlHashUnsupported()
if not req.original_link and not req.is_pinned:
# Unpinned packages are asking for trouble when a new

View file

@ -21,9 +21,9 @@ def _is_list_of_str(obj):
)
def make_pyproject_path(setup_py_dir):
def make_pyproject_path(unpacked_source_directory):
# type: (str) -> str
path = os.path.join(setup_py_dir, 'pyproject.toml')
path = os.path.join(unpacked_source_directory, 'pyproject.toml')
# Python2 __file__ should not be unicode
if six.PY2 and isinstance(path, six.text_type):

View file

@ -5,12 +5,13 @@ from __future__ import absolute_import
import logging
from .req_install import InstallRequirement
from .req_set import RequirementSet
from .req_file import parse_requirements
from pip._internal.utils.logging import indent_log
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from .req_file import parse_requirements
from .req_install import InstallRequirement
from .req_set import RequirementSet
if MYPY_CHECK_RUNNING:
from typing import Any, List, Sequence

View file

@ -10,6 +10,7 @@ InstallRequirement.
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
import logging
import os
@ -26,9 +27,9 @@ from pip._internal.models.link import Link
from pip._internal.pyproject import make_pyproject_path
from pip._internal.req.req_install import InstallRequirement
from pip._internal.utils.filetypes import ARCHIVE_EXTENSIONS
from pip._internal.utils.misc import is_installable_dir, path_to_url, splitext
from pip._internal.utils.misc import is_installable_dir, splitext
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.urls import url_to_path
from pip._internal.utils.urls import path_to_url
from pip._internal.vcs import is_url, vcs
from pip._internal.wheel import Wheel
@ -179,6 +180,37 @@ def deduce_helpful_msg(req):
return msg
class RequirementParts(object):
def __init__(
self,
requirement, # type: Optional[Requirement]
link, # type: Optional[Link]
markers, # type: Optional[Marker]
extras, # type: Set[str]
):
self.requirement = requirement
self.link = link
self.markers = markers
self.extras = extras
def parse_req_from_editable(editable_req):
# type: (str) -> RequirementParts
name, url, extras_override = parse_editable(editable_req)
if name is not None:
try:
req = Requirement(name)
except InvalidRequirement:
raise InstallationError("Invalid requirement: '%s'" % name)
else:
req = None
link = Link(url)
return RequirementParts(req, link, None, extras_override)
# ---- The actual constructors follow ----
@ -192,29 +224,21 @@ def install_req_from_editable(
constraint=False # type: bool
):
# type: (...) -> InstallRequirement
name, url, extras_override = parse_editable(editable_req)
if url.startswith('file:'):
source_dir = url_to_path(url)
else:
source_dir = None
if name is not None:
try:
req = Requirement(name)
except InvalidRequirement:
raise InstallationError("Invalid requirement: '%s'" % name)
else:
req = None
parts = parse_req_from_editable(editable_req)
source_dir = parts.link.file_path if parts.link.scheme == 'file' else None
return InstallRequirement(
req, comes_from, source_dir=source_dir,
parts.requirement, comes_from, source_dir=source_dir,
editable=True,
link=Link(url),
link=parts.link,
constraint=constraint,
use_pep517=use_pep517,
isolated=isolated,
options=options if options else {},
wheel_cache=wheel_cache,
extras=extras_override or (),
extras=parts.extras,
)
@ -272,20 +296,6 @@ def _get_url_from_path(path, name):
return path_to_url(path)
class RequirementParts(object):
def __init__(
self,
requirement, # type: Optional[Requirement]
link, # type: Optional[Link]
markers, # type: Optional[Marker]
extras, # type: Set[str]
):
self.requirement = requirement
self.link = link
self.markers = markers
self.extras = extras
def parse_req_from_line(name, line_source):
# type: (str, Optional[str]) -> RequirementParts
if is_url(name):

View file

@ -33,7 +33,7 @@ if MYPY_CHECK_RUNNING:
from pip._internal.req import InstallRequirement
from pip._internal.cache import WheelCache
from pip._internal.index import PackageFinder
from pip._internal.download import PipSession
from pip._internal.network.session import PipSession
ReqFileLines = Iterator[Tuple[int, Text]]

View file

@ -1,5 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: strict-optional=False
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
@ -37,7 +38,6 @@ from pip._internal.utils.misc import (
_make_build_dir,
ask_path_exists,
backup_dir,
call_subprocess,
display_path,
dist_in_site_packages,
dist_in_usersite,
@ -49,15 +49,18 @@ from pip._internal.utils.misc import (
)
from pip._internal.utils.packaging import get_metadata
from pip._internal.utils.setuptools_build import make_setuptools_shim_args
from pip._internal.utils.subprocess import (
call_subprocess,
runner_with_spinner_message,
)
from pip._internal.utils.temp_dir import TempDirectory
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
from pip._internal.utils.ui import open_spinner
from pip._internal.utils.virtualenv import running_under_virtualenv
from pip._internal.vcs import vcs
if MYPY_CHECK_RUNNING:
from typing import (
Any, Dict, Iterable, List, Mapping, Optional, Sequence, Union,
Any, Dict, Iterable, List, Optional, Sequence, Union,
)
from pip._internal.build_env import BuildEnvironment
from pip._internal.cache import WheelCache
@ -121,7 +124,6 @@ class InstallRequirement(object):
markers = req.marker
self.markers = markers
self._egg_info_path = None # type: Optional[str]
# This holds the pkg_resources.Distribution object if this requirement
# is already available:
self.satisfied_by = None
@ -338,7 +340,6 @@ class InstallRequirement(object):
# builds (such as numpy). Thus, we ensure that the real path
# is returned.
self._temp_build_dir = TempDirectory(kind="req-build")
self._temp_build_dir.create()
self._ideal_build_dir = build_dir
return self._temp_build_dir.path
@ -368,29 +369,34 @@ class InstallRequirement(object):
return
assert self.req is not None
assert self._temp_build_dir
assert (self._ideal_build_dir is not None and
self._ideal_build_dir.path) # type: ignore
assert (
self._ideal_build_dir is not None and
self._ideal_build_dir.path # type: ignore
)
old_location = self._temp_build_dir
self._temp_build_dir = None
self._temp_build_dir = None # checked inside ensure_build_location
# Figure out the correct place to put the files.
new_location = self.ensure_build_location(self._ideal_build_dir)
if os.path.exists(new_location):
raise InstallationError(
'A package already exists in %s; please remove it to continue'
% display_path(new_location))
% display_path(new_location)
)
# Move the files to the correct location.
logger.debug(
'Moving package %s from %s to new location %s',
self, display_path(old_location.path), display_path(new_location),
)
shutil.move(old_location.path, new_location)
# Update directory-tracking variables, to be in line with new_location
self.source_dir = os.path.normpath(os.path.abspath(new_location))
self._temp_build_dir = TempDirectory(
path=new_location, kind="req-install",
)
self._ideal_build_dir = None
self.source_dir = os.path.normpath(os.path.abspath(new_location))
self._egg_info_path = None
# Correct the metadata directory, if it exists
if self.metadata_directory:
old_meta = self.metadata_directory
@ -399,6 +405,11 @@ class InstallRequirement(object):
new_meta = os.path.normpath(os.path.abspath(new_meta))
self.metadata_directory = new_meta
# Done with any "move built files" work, since we have moved files to
# the "ideal" build location. Setting to None clearly flags that no
# more moves are needed.
self._ideal_build_dir = None
def remove_temporary_source(self):
# type: () -> None
"""Remove the source files from this requirement, if they are marked
@ -486,7 +497,7 @@ class InstallRequirement(object):
# Things valid for sdists
@property
def setup_py_dir(self):
def unpacked_source_directory(self):
# type: () -> str
return os.path.join(
self.source_dir,
@ -496,8 +507,7 @@ class InstallRequirement(object):
def setup_py_path(self):
# type: () -> str
assert self.source_dir, "No source dir for %s" % self
setup_py = os.path.join(self.setup_py_dir, 'setup.py')
setup_py = os.path.join(self.unpacked_source_directory, 'setup.py')
# Python2 __file__ should not be unicode
if six.PY2 and isinstance(setup_py, six.text_type):
@ -509,8 +519,7 @@ class InstallRequirement(object):
def pyproject_toml_path(self):
# type: () -> str
assert self.source_dir, "No source dir for %s" % self
return make_pyproject_path(self.setup_py_dir)
return make_pyproject_path(self.unpacked_source_directory)
def load_pyproject_toml(self):
# type: () -> None
@ -536,27 +545,9 @@ class InstallRequirement(object):
requires, backend, check = pyproject_toml_data
self.requirements_to_check = check
self.pyproject_requires = requires
self.pep517_backend = Pep517HookCaller(self.setup_py_dir, backend)
# Use a custom function to call subprocesses
self.spin_message = ""
def runner(
cmd, # type: List[str]
cwd=None, # type: Optional[str]
extra_environ=None # type: Optional[Mapping[str, Any]]
):
# type: (...) -> None
with open_spinner(self.spin_message) as spinner:
call_subprocess(
cmd,
cwd=cwd,
extra_environ=extra_environ,
spinner=spinner
)
self.spin_message = ""
self.pep517_backend._subprocess_runner = runner
self.pep517_backend = Pep517HookCaller(
self.unpacked_source_directory, backend
)
def prepare_metadata(self):
# type: () -> None
@ -569,7 +560,7 @@ class InstallRequirement(object):
metadata_generator = get_metadata_generator(self)
with indent_log():
metadata_generator(self)
self.metadata_directory = metadata_generator(self)
if not self.req:
if isinstance(parse_version(self.metadata["Version"]), Version):
@ -595,40 +586,34 @@ class InstallRequirement(object):
)
self.req = Requirement(metadata_name)
def cleanup(self):
# type: () -> None
if self._temp_dir is not None:
self._temp_dir.cleanup()
def prepare_pep517_metadata(self):
# type: () -> None
# type: () -> str
assert self.pep517_backend is not None
# NOTE: This needs to be refactored to stop using atexit
self._temp_dir = TempDirectory(delete=False, kind="req-install")
self._temp_dir.create()
temp_dir = TempDirectory(kind="modern-metadata")
atexit.register(temp_dir.cleanup)
metadata_dir = os.path.join(
self._temp_dir.path,
temp_dir.path,
'pip-wheel-metadata',
)
atexit.register(self.cleanup)
ensure_dir(metadata_dir)
with self.build_env:
# Note that Pep517HookCaller implements a fallback for
# prepare_metadata_for_build_wheel, so we don't have to
# consider the possibility that this hook doesn't exist.
runner = runner_with_spinner_message("Preparing wheel metadata")
backend = self.pep517_backend
self.spin_message = "Preparing wheel metadata"
distinfo_dir = backend.prepare_metadata_for_build_wheel(
metadata_dir
)
with backend.subprocess_runner(runner):
distinfo_dir = backend.prepare_metadata_for_build_wheel(
metadata_dir
)
self.metadata_directory = os.path.join(metadata_dir, distinfo_dir)
return os.path.join(metadata_dir, distinfo_dir)
@property
def egg_info_path(self):
def find_egg_info(self):
# type: () -> str
def looks_like_virtual_env(path):
return (
@ -636,44 +621,49 @@ class InstallRequirement(object):
os.path.exists(os.path.join(path, 'Scripts', 'Python.exe'))
)
if self._egg_info_path is None:
if self.editable:
base = self.source_dir
filenames = []
for root, dirs, files in os.walk(base):
for dir in vcs.dirnames:
if dir in dirs:
dirs.remove(dir)
# Iterate over a copy of ``dirs``, since mutating
# a list while iterating over it can cause trouble.
# (See https://github.com/pypa/pip/pull/462.)
for dir in list(dirs):
if looks_like_virtual_env(os.path.join(root, dir)):
dirs.remove(dir)
# Also don't search through tests
elif dir == 'test' or dir == 'tests':
dirs.remove(dir)
filenames.extend([os.path.join(root, dir)
for dir in dirs])
filenames = [f for f in filenames if f.endswith('.egg-info')]
else:
base = os.path.join(self.setup_py_dir, 'pip-egg-info')
filenames = os.listdir(base)
def locate_editable_egg_info(base):
candidates = []
for root, dirs, files in os.walk(base):
for dir_ in vcs.dirnames:
if dir_ in dirs:
dirs.remove(dir_)
# Iterate over a copy of ``dirs``, since mutating
# a list while iterating over it can cause trouble.
# (See https://github.com/pypa/pip/pull/462.)
for dir_ in list(dirs):
if looks_like_virtual_env(os.path.join(root, dir_)):
dirs.remove(dir_)
# Also don't search through tests
elif dir_ == 'test' or dir_ == 'tests':
dirs.remove(dir_)
candidates.extend(os.path.join(root, dir_) for dir_ in dirs)
return [f for f in candidates if f.endswith('.egg-info')]
if not filenames:
raise InstallationError(
"Files/directories not found in %s" % base
)
# if we have more than one match, we pick the toplevel one. This
# can easily be the case if there is a dist folder which contains
# an extracted tarball for testing purposes.
if len(filenames) > 1:
filenames.sort(
key=lambda x: x.count(os.path.sep) +
(os.path.altsep and x.count(os.path.altsep) or 0)
)
self._egg_info_path = os.path.join(base, filenames[0])
return self._egg_info_path
def depth_of_directory(dir_):
return (
dir_.count(os.path.sep) +
(os.path.altsep and dir_.count(os.path.altsep) or 0)
)
if self.editable:
base = self.source_dir
filenames = locate_editable_egg_info(base)
else:
base = os.path.join(self.unpacked_source_directory, 'pip-egg-info')
filenames = os.listdir(base)
if not filenames:
raise InstallationError(
"Files/directories not found in %s" % base
)
# If we have more than one match, we pick the toplevel one. This
# can easily be the case if there is a dist folder which contains
# an extracted tarball for testing purposes.
if len(filenames) > 1:
filenames.sort(key=depth_of_directory)
return os.path.join(base, filenames[0])
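The toplevel-first ordering produced by the `depth_of_directory` key above can be exercised standalone; this sketch reimplements the key for illustration only:

```python
import os

def depth_of_directory(path):
    # Fewer separators means closer to the top of the tree, so a stray
    # .egg-info inside dist/ sorts after the toplevel one.
    return path.count(os.path.sep) + (
        path.count(os.path.altsep) if os.path.altsep else 0
    )

candidates = [
    os.path.join("dist", "pkg-1.0", "pkg.egg-info"),
    "pkg.egg-info",
]
candidates.sort(key=depth_of_directory)
```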
@property
def metadata(self):
@ -686,16 +676,16 @@ class InstallRequirement(object):
def get_dist(self):
# type: () -> Distribution
"""Return a pkg_resources.Distribution for this requirement"""
if self.metadata_directory:
dist_dir = self.metadata_directory
dist_cls = pkg_resources.DistInfoDistribution
else:
dist_dir = self.egg_info_path.rstrip(os.path.sep)
# https://github.com/python/mypy/issues/1174
dist_cls = pkg_resources.Distribution # type: ignore
dist_dir = self.metadata_directory.rstrip(os.sep)
# dist_dir_name can be of the form "<project>.dist-info" or
# e.g. "<project>.egg-info".
# Determine the correct Distribution object type.
if dist_dir.endswith(".egg-info"):
dist_cls = pkg_resources.Distribution
else:
assert dist_dir.endswith(".dist-info")
dist_cls = pkg_resources.DistInfoDistribution
# Build a PathMetadata object, from path to metadata. :wink:
base_dir, dist_dir_name = os.path.split(dist_dir)
dist_name = os.path.splitext(dist_dir_name)[0]
metadata = pkg_resources.PathMetadata(base_dir, dist_dir)
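The dispatch on the metadata directory suffix above distinguishes modern from legacy metadata: `.dist-info` directories come from wheels and PEP 517 builds, `.egg-info` from legacy `setup.py egg_info` runs. A simplified standalone version of that classification (illustrative, not pip's code):

```python
import os

def metadata_kind(metadata_directory):
    # "<name>.dist-info" -> DistInfoDistribution in pkg_resources terms,
    # "<name>.egg-info"  -> plain Distribution.
    dist_dir = metadata_directory.rstrip(os.sep)
    if dist_dir.endswith(".egg-info"):
        return "egg-info"
    if dist_dir.endswith(".dist-info"):
        return "dist-info"
    raise ValueError("unexpected metadata directory: %s" % dist_dir)
```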
@ -763,8 +753,7 @@ class InstallRequirement(object):
base_cmd +
['develop', '--no-deps'] +
list(install_options),
cwd=self.setup_py_dir,
cwd=self.unpacked_source_directory,
)
self.install_succeeded = True
@ -876,7 +865,9 @@ class InstallRequirement(object):
archive_path, 'w', zipfile.ZIP_DEFLATED, allowZip64=True,
)
with zip_output:
dir = os.path.normcase(os.path.abspath(self.setup_py_dir))
dir = os.path.normcase(
os.path.abspath(self.unpacked_source_directory)
)
for dirpath, dirnames, filenames in os.walk(dir):
if 'pip-egg-info' in dirnames:
dirnames.remove('pip-egg-info')
@ -943,15 +934,15 @@ class InstallRequirement(object):
install_args = self.get_install_args(
global_options, record_filename, root, prefix, pycompile,
)
msg = 'Running setup.py install for %s' % (self.name,)
with open_spinner(msg) as spinner:
with indent_log():
with self.build_env:
call_subprocess(
install_args + install_options,
cwd=self.setup_py_dir,
spinner=spinner,
)
runner = runner_with_spinner_message(
"Running setup.py install for {}".format(self.name)
)
with indent_log(), self.build_env:
runner(
cmd=install_args + install_options,
cwd=self.unpacked_source_directory,
)
if not os.path.exists(record_filename):
logger.debug('Record file %s not found', record_filename)

View file

@ -28,7 +28,6 @@ class RequirementTracker(object):
self._root = os.environ.get('PIP_REQ_TRACKER')
if self._root is None:
self._temp_dir = TempDirectory(delete=False, kind='req-tracker')
self._temp_dir.create()
self._root = os.environ['PIP_REQ_TRACKER'] = self._temp_dir.path
logger.debug('Created requirements tracker %r', self._root)
else:

View file

@ -227,10 +227,8 @@ class StashedUninstallPathSet(object):
try:
save_dir = AdjacentTempDirectory(path) # type: TempDirectory
save_dir.create()
except OSError:
save_dir = TempDirectory(kind="uninstall")
save_dir.create()
self._save_dirs[os.path.normcase(path)] = save_dir
return save_dir.path
@ -256,7 +254,6 @@ class StashedUninstallPathSet(object):
# Did not find any suitable root
head = os.path.dirname(path)
save_dir = TempDirectory(kind='uninstall')
save_dir.create()
self._save_dirs[head] = save_dir
relpath = os.path.relpath(path, head)

View file

@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import datetime
@ -33,7 +36,8 @@ if MYPY_CHECK_RUNNING:
import optparse
from optparse import Values
from typing import Any, Dict, Text, Union
from pip._internal.download import PipSession
from pip._internal.network.session import PipSession
SELFCHECK_DATE_FMT = "%Y-%m-%dT%H:%M:%SZ"
@ -153,7 +157,7 @@ def was_installed_by_pip(pkg):
return False
def pip_version_check(session, options):
def pip_self_version_check(session, options):
# type: (PipSession, optparse.Values) -> None
"""Check for an update for pip.

View file

@ -2,6 +2,10 @@
This code was taken from https://github.com/ActiveState/appdirs and modified
to suit our purposes.
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import os
@ -227,7 +231,8 @@ def _get_win_folder_with_ctypes(csidl_name):
}[csidl_name]
buf = ctypes.create_unicode_buffer(1024)
ctypes.windll.shell32.SHGetFolderPathW(None, csidl_const, None, 0, buf)
windll = ctypes.windll # type: ignore
windll.shell32.SHGetFolderPathW(None, csidl_const, None, 0, buf)
# Downgrade to short path name if have highbit chars. See
# <http://bugs.activestate.com/show_bug.cgi?id=85099>.
@ -238,7 +243,7 @@ def _get_win_folder_with_ctypes(csidl_name):
break
if has_high_char:
buf2 = ctypes.create_unicode_buffer(1024)
if ctypes.windll.kernel32.GetShortPathNameW(buf.value, buf2, 1024):
if windll.kernel32.GetShortPathNameW(buf.value, buf2, 1024):
buf = buf2
# The type: ignore is explained under the type annotation for this function

View file

@ -1,5 +1,9 @@
"""Stuff that differs in different Python versions and platform
distributions."""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import, division
import codecs

View file

@ -1,6 +1,10 @@
"""
A module that implements tooling to enable easy warnings about deprecations.
"""
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import logging

View file

@ -1,11 +1,16 @@
"""Filetype information.
"""
from pip._internal.utils.typing import MYPY_CHECK_RUNNING
if MYPY_CHECK_RUNNING:
from typing import Tuple
WHEEL_EXTENSION = '.whl'
BZ2_EXTENSIONS = ('.tar.bz2', '.tbz')
XZ_EXTENSIONS = ('.tar.xz', '.txz', '.tlz', '.tar.lz', '.tar.lzma')
ZIP_EXTENSIONS = ('.zip', WHEEL_EXTENSION)
TAR_EXTENSIONS = ('.tar.gz', '.tgz', '.tar')
BZ2_EXTENSIONS = ('.tar.bz2', '.tbz') # type: Tuple[str, ...]
XZ_EXTENSIONS = ('.tar.xz', '.txz', '.tlz',
'.tar.lz', '.tar.lzma') # type: Tuple[str, ...]
ZIP_EXTENSIONS = ('.zip', WHEEL_EXTENSION) # type: Tuple[str, ...]
TAR_EXTENSIONS = ('.tar.gz', '.tgz', '.tar') # type: Tuple[str, ...]
ARCHIVE_EXTENSIONS = (
ZIP_EXTENSIONS + BZ2_EXTENSIONS + TAR_EXTENSIONS + XZ_EXTENSIONS
)
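Annotating these constants as `Tuple[str, ...]` keeps them usable directly with `str.endswith`, which accepts a tuple of suffixes. A quick sketch of how such tuples are typically consumed (`is_archive_file` here is an illustrative helper, with the constants inlined for self-containment):

```python
WHEEL_EXTENSION = '.whl'
BZ2_EXTENSIONS = ('.tar.bz2', '.tbz')
XZ_EXTENSIONS = ('.tar.xz', '.txz', '.tlz', '.tar.lz', '.tar.lzma')
ZIP_EXTENSIONS = ('.zip', WHEEL_EXTENSION)
TAR_EXTENSIONS = ('.tar.gz', '.tgz', '.tar')
ARCHIVE_EXTENSIONS = (
    ZIP_EXTENSIONS + BZ2_EXTENSIONS + TAR_EXTENSIONS + XZ_EXTENSIONS
)

def is_archive_file(name):
    # str.endswith accepts a tuple of suffixes, so no loop is needed.
    return name.lower().endswith(ARCHIVE_EXTENSIONS)
```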

View file

@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import hashlib

View file

@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
from __future__ import absolute_import
import contextlib
@ -6,13 +9,13 @@ import logging
import logging.handlers
import os
import sys
from logging import Filter
from logging import Filter, getLogger
from pip._vendor.six import PY2
from pip._internal.utils.compat import WINDOWS
from pip._internal.utils.deprecation import DEPRECATION_MSG_PREFIX
from pip._internal.utils.misc import ensure_dir, subprocess_logger
from pip._internal.utils.misc import ensure_dir
try:
import threading
@ -50,6 +53,7 @@ else:
_log_state = threading.local()
_log_state.indentation = 0
subprocess_logger = getLogger('pip.subprocessor')
class BrokenStdoutLoggingError(Exception):

View file

@ -1,3 +1,6 @@
# The following comment should be removed at some point in the future.
# mypy: disallow-untyped-defs=False
import os.path
DELETE_MARKER_MESSAGE = '''\

Some files were not shown because too many files have changed in this diff.