
Merge branch 'main' into fix/8418-permission-denied-hg

Tzu-ping Chung, 2021-04-03 16:17:44 +08:00
commit 2aac9f233e
131 changed files with 3878 additions and 6439 deletions

@@ -1,36 +0,0 @@
parameters:
vmImage:
jobs:
- job: Package
dependsOn:
- Test_Primary
- Test_Secondary
pool:
vmImage: ${{ parameters.vmImage }}
steps:
- task: UsePythonVersion@0
displayName: Use Python 3 latest
inputs:
versionSpec: '3'
- bash: |
git config --global user.email "distutils-sig@python.org"
git config --global user.name "pip"
displayName: Setup Git credentials
- bash: pip install nox
displayName: Install dependencies
- bash: nox -s prepare-release -- 99.9
displayName: Prepare dummy release
- bash: nox -s build-release -- 99.9
displayName: Generate distributions for the dummy release
- task: PublishBuildArtifacts@1
displayName: 'Publish Artifact: dist'
inputs:
pathtoPublish: dist
artifactName: dist

@@ -1,53 +0,0 @@
parameters:
vmImage:
jobs:
- job: Test_Primary
displayName: Tests /
pool:
vmImage: ${{ parameters.vmImage }}
strategy:
matrix:
"3.6": # lowest Python version
python.version: '3.6'
python.architecture: x64
"3.8": # current
python.version: '3.8'
python.architecture: x64
maxParallel: 6
steps:
- template: ../steps/run-tests-windows.yml
parameters:
runIntegrationTests: true
- job: Test_Secondary
displayName: Tests /
# Don't run integration tests for these runs
# Run after Test_Primary so we don't devour time and jobs if tests are going to fail
dependsOn: Test_Primary
pool:
vmImage: ${{ parameters.vmImage }}
strategy:
matrix:
"3.7":
python.version: '3.7'
python.architecture: x64
# This is for Windows, so test x86 builds
"3.6-x86":
python.version: '3.6'
python.architecture: x86
"3.7-x86":
python.version: '3.7'
python.architecture: x86
"3.8-x86":
python.version: '3.8'
python.architecture: x86
maxParallel: 6
steps:
- template: ../steps/run-tests-windows.yml
parameters:
runIntegrationTests: false

@@ -1,38 +0,0 @@
parameters:
vmImage:
jobs:
- job: Test_Primary
displayName: Tests /
pool:
vmImage: ${{ parameters.vmImage }}
strategy:
matrix:
"3.6": # lowest Python version
python.version: '3.6'
python.architecture: x64
"3.8":
python.version: '3.8'
python.architecture: x64
maxParallel: 2
steps:
- template: ../steps/run-tests.yml
- job: Test_Secondary
displayName: Tests /
# Run after Test_Primary so we don't devour time and jobs if tests are going to fail
dependsOn: Test_Primary
pool:
vmImage: ${{ parameters.vmImage }}
strategy:
matrix:
"3.7":
python.version: '3.7'
python.architecture: x64
maxParallel: 4
steps:
- template: ../steps/run-tests.yml

@@ -1,11 +0,0 @@
variables:
CI: true
jobs:
- template: jobs/test.yml
parameters:
vmImage: ubuntu-16.04
- template: jobs/package.yml
parameters:
vmImage: ubuntu-16.04

@@ -1,54 +0,0 @@
parameters:
runIntegrationTests:
steps:
- task: UsePythonVersion@0
displayName: Use Python $(python.version)
inputs:
versionSpec: '$(python.version)'
architecture: '$(python.architecture)'
- task: PowerShell@2
inputs:
filePath: .azure-pipelines/scripts/New-RAMDisk.ps1
arguments: "-Drive R -Size 1GB"
displayName: Setup RAMDisk
- powershell: |
mkdir R:\Temp
$acl = Get-Acl "R:\Temp"
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
"Everyone", "FullControl", "ContainerInherit,ObjectInherit", "None", "Allow"
)
$acl.AddAccessRule($rule)
Set-Acl "R:\Temp" $acl
displayName: Set RAMDisk Permissions
- bash: pip install --upgrade 'virtualenv<20' setuptools tox
displayName: Install Tox
- script: tox -e py -- -m unit -n auto --junit-xml=junit/unit-test.xml
env:
TEMP: "R:\\Temp"
displayName: Tox run unit tests
- ${{ if eq(parameters.runIntegrationTests, 'true') }}:
- powershell: |
# Fix Git SSL errors
pip install certifi tox
python -m certifi > cacert.txt
$env:GIT_SSL_CAINFO = $(Get-Content cacert.txt)
# Shorten paths to get under MAX_PATH or else integration tests will fail
# https://bugs.python.org/issue18199
$env:TEMP = "R:\Temp"
tox -e py -- -m integration -n auto --durations=5 --junit-xml=junit/integration-test.xml
displayName: Tox run integration tests
- task: PublishTestResults@2
displayName: Publish Test Results
inputs:
testResultsFiles: junit/*.xml
testRunTitle: 'Python $(python.version)'
condition: succeededOrFailed()

@@ -1,25 +0,0 @@
steps:
- task: UsePythonVersion@0
displayName: Use Python $(python.version)
inputs:
versionSpec: '$(python.version)'
- bash: pip install --upgrade 'virtualenv<20' setuptools tox
displayName: Install Tox
- script: tox -e py -- -m unit -n auto --junit-xml=junit/unit-test.xml
displayName: Tox run unit tests
# Run integration tests in two groups so we will fail faster if there is a failure in the first group
- script: tox -e py -- -m integration -n auto --durations=5 -k "not test_install" --junit-xml=junit/integration-test-group0.xml
displayName: Tox run Group 0 integration tests
- script: tox -e py -- -m integration -n auto --durations=5 -k "test_install" --junit-xml=junit/integration-test-group1.xml
displayName: Tox run Group 1 integration tests
- task: PublishTestResults@2
displayName: Publish Test Results
inputs:
testResultsFiles: junit/*.xml
testRunTitle: 'Python $(python.version)'
condition: succeededOrFailed()

@@ -1,11 +0,0 @@
variables:
CI: true
jobs:
- template: jobs/test-windows.yml
parameters:
vmImage: vs2017-win2016
- template: jobs/package.yml
parameters:
vmImage: vs2017-win2016

.gitattributes
@@ -1,4 +1,4 @@
# Patches must have Unix-style line endings, even on Windows
tools/automation/vendoring/patches/* eol=lf
tools/vendoring/patches/* eol=lf
# The CA Bundle should always use Unix-style line endings, even on Windows
src/pip/_vendor/certifi/*.pem eol=lf

@@ -1,81 +1,62 @@
---
name: Bug report
about: Something is not working correctly.
description: Something is not working correctly.
title: ""
labels: "S: needs triage, type: bug"
issue_body: true # default: true, adds a classic WYSIWYG textarea, if on
body:
- type: markdown
attributes:
value: |
If you're reporting an issue for `--use-feature=2020-resolver`,
use the "Dependency resolver failures / errors" template instead.
- type: markdown
attributes:
value: "**Environment**"
- type: input
attributes:
label: pip version
validations:
required: true
- type: input
attributes:
label: Python version
validations:
required: true
- type: input
attributes:
label: OS
validations:
required: true
- type: textarea
attributes:
label: Additional information
description: >-
Feel free to add more information about your environment here.
- type: textarea
body:
- type: textarea
attributes:
label: Description
description: >-
A clear and concise description of what the bug is.
validations:
required: true
- type: textarea
- type: textarea
attributes:
label: Expected behavior
description: >-
A clear and concise description of what you expected to happen.
- type: textarea
- type: input
attributes:
label: pip version
validations:
required: true
- type: input
attributes:
label: Python version
validations:
required: true
- type: input
attributes:
label: OS
validations:
required: true
- type: textarea
attributes:
label: How to Reproduce
description: >-
Describe the steps to reproduce this bug.
description: Please provide steps to reproduce this bug.
value: |
1. Get package from '...'
2. Then run '...'
3. An error occurs.
validations:
required: true
- type: textarea
- type: textarea
attributes:
label: Output
description: >-
Paste the output of the steps above, including the commands
Provide the output of the steps above, including the commands
themselves and pip's output/traceback etc.
value: |
```console
render: sh-session
```
- type: checkboxes
- type: checkboxes
attributes:
label: Code of Conduct
description: |
Read the [PSF Code of Conduct][CoC] first.
[CoC]: https://github.com/pypa/.github/blob/main/CODE_OF_CONDUCT.md
options:
- label: I agree to follow the PSF Code of Conduct
- label: >-
I agree to follow the [PSF Code of Conduct](https://www.python.org/psf/conduct/).
required: true
...

@@ -1,34 +0,0 @@
---
name: Dependency resolver failures / errors
about: Report when the pip dependency resolver fails
labels: ["K: UX", "K: crash", "C: new resolver", "C: dependency resolution"]
---
<!--
Please provide as much information as you can about your failure, so that we can understand the root cause.
Try if your issue has been fixed in the in-development version of pip. Use the following command to install pip from master:
python -m pip install -U "pip @ https://github.com/pypa/pip/archive/master.zip"
-->
**What did you want to do?**
<!-- Include any inputs you gave to pip, for example:
* Package requirements: any CLI arguments and/or your requirements.txt file
* Already installed packages, outputted via `pip freeze`
-->
**Output**
```
Paste what pip outputted in a code block. https://github.github.com/gfm/#fenced-code-blocks
```
**Additional information**
<!--
It would be great if you could also include your dependency tree. For this you can use pipdeptree: https://pypi.org/project/pipdeptree/
For users installing packages from a private repository or local directory, please try your best to describe your setup. We'd like to understand how to reproduce the error locally, so would need (at a minimum) a description of the packages you are trying to install, and a list of dependencies for each package.
-->

.github/workflows/ci.yml (new file)
@@ -0,0 +1,187 @@
name: CI
on:
push:
branches: [master]
tags:
# Tags for all potential release numbers till 2030.
- "2[0-9].[0-3]" # 20.0 -> 29.3
- "2[0-9].[0-3].[0-9]+" # 20.0.0 -> 29.3.[0-9]+
pull_request:
schedule:
- cron: 0 0 * * MON # Run every Monday at 00:00 UTC
jobs:
determine-changes:
runs-on: ubuntu-latest
outputs:
tests: ${{ steps.filter.outputs.tests }}
vendoring: ${{ steps.filter.outputs.vendoring }}
steps:
# For pull requests it's not necessary to checkout the code
- uses: dorny/paths-filter@v2
id: filter
with:
filters: |
vendoring:
# Anything that's touching "vendored code"
- "src/pip/_vendor/**"
- "pyproject.toml"
tests:
# Anything that's touching testable stuff
- ".github/workflows/ci.yml"
- "tools/requirements/tests.txt"
- "src/**"
- "tests/**"
pre-commit:
name: pre-commit
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: pre-commit/action@v2.0.0
with:
extra_args: --hook-stage=manual
packaging:
name: packaging
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- name: Set up git credentials
run: |
git config --global user.email "pypa-dev@googlegroups.com"
git config --global user.name "pip"
- run: pip install nox
- run: nox -s prepare-release -- 99.9
- run: nox -s build-release -- 99.9
vendoring:
name: vendoring
runs-on: ubuntu-latest
needs: [determine-changes]
if: ${{ needs.determine-changes.outputs.vendoring == 'true' }}
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- run: pip install vendoring
- run: vendoring sync . --verbose
- run: git diff --exit-code
tests-unix:
name: tests / ${{ matrix.python }} / ${{ matrix.os }}
runs-on: ${{ matrix.os }}-latest
needs: [pre-commit, packaging, determine-changes]
if: ${{ needs.determine-changes.outputs.tests == 'true' }}
strategy:
fail-fast: true
matrix:
os: [Ubuntu, MacOS]
python:
- 3.6
- 3.7
- 3.8
- 3.9
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python }}
- run: pip install tox 'virtualenv<20'
# Main check
- name: Run unit tests
run: >-
tox -e py --
-m unit
--verbose --numprocesses auto --showlocals
- name: Run integration tests
run: >-
tox -e py --
-m integration
--verbose --numprocesses auto --showlocals
--durations=5
tests-windows:
name: tests / ${{ matrix.python }} / ${{ matrix.os }} / ${{ matrix.group }}
runs-on: ${{ matrix.os }}-latest
needs: [pre-commit, packaging, determine-changes]
if: ${{ needs.determine-changes.outputs.tests == 'true' }}
strategy:
fail-fast: true
matrix:
os: [Windows]
python:
- 3.6
# Commented out, since Windows tests are expensively slow.
# - 3.7
# - 3.8
- 3.9
group: [1, 2]
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python }}
# We use a RAMDisk on Windows, since filesystem IO is a big slowdown
# for our tests.
- name: Create a RAMDisk
run: ./tools/ci/New-RAMDisk.ps1 -Drive R -Size 1GB
- name: Setup RAMDisk permissions
run: |
mkdir R:\Temp
$acl = Get-Acl "R:\Temp"
$rule = New-Object System.Security.AccessControl.FileSystemAccessRule(
"Everyone", "FullControl", "ContainerInherit,ObjectInherit", "None", "Allow"
)
$acl.AddAccessRule($rule)
Set-Acl "R:\Temp" $acl
- run: pip install tox 'virtualenv<20'
env:
TEMP: "R:\\Temp"
# Main check
- name: Run unit tests
if: matrix.group == 1
run: >-
tox -e py --
-m unit
--verbose --numprocesses auto --showlocals
env:
TEMP: "R:\\Temp"
- name: Run integration tests (group 1)
if: matrix.group == 1
run: >-
tox -e py --
-m integration -k "not test_install"
--verbose --numprocesses auto --showlocals
env:
TEMP: "R:\\Temp"
- name: Run integration tests (group 2)
if: matrix.group == 2
run: >-
tox -e py --
-m integration -k "test_install"
--verbose --numprocesses auto --showlocals
env:
TEMP: "R:\\Temp"

@@ -1,53 +0,0 @@
name: Linting
on:
push:
pull_request:
schedule:
# Run every Friday at 18:02 UTC
- cron: 2 18 * * 5
jobs:
lint:
name: ${{ matrix.os }}
runs-on: ${{ matrix.os }}-latest
env:
TOXENV: lint,docs,vendoring
strategy:
matrix:
os:
- Ubuntu
- Windows
steps:
- uses: actions/checkout@v2
- name: Set up Python 3.9
uses: actions/setup-python@v2
with:
python-version: 3.9
# Setup Caching
- name: pip cache
uses: actions/cache@v1
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('tools/requirements/tests.txt') }}-${{ hashFiles('tools/requirements/docs.txt') }}-${{ hashFiles('tox.ini') }}
restore-keys: |
${{ runner.os }}-pip-
${{ runner.os }}-
- name: Set PY (for pre-commit cache)
run: echo "PY=$(python -c 'import hashlib, sys;print(hashlib.sha256(sys.version.encode()+sys.executable.encode()).hexdigest())')" >> $GITHUB_ENV
- name: pre-commit cache
uses: actions/cache@v1
with:
path: ~/.cache/pre-commit
key: pre-commit|2020-02-14|${{ env.PY }}|${{ hashFiles('.pre-commit-config.yaml') }}
# Get the latest tox
- name: Install tox
run: python -m pip install tox
# Main check
- run: python -m tox

@@ -1,127 +0,0 @@
name: MacOS
on:
push:
pull_request:
schedule:
# Run every Friday at 18:02 UTC
- cron: 2 18 * * 5
jobs:
dev-tools:
name: Quality Check
runs-on: macos-latest
steps:
# Caches
- name: pip cache
uses: actions/cache@v1
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('tools/requirements/tests.txt') }}-${{ hashFiles('tools/requirements/docs.txt') }}-${{ hashFiles('tox.ini') }}
restore-keys: |
${{ runner.os }}-pip-
${{ runner.os }}-
- name: Set PY (for pre-commit cache)
run: echo "PY=$(python -c 'import hashlib, sys;print(hashlib.sha256(sys.version.encode()+sys.executable.encode()).hexdigest())')" >> $GITHUB_ENV
- name: pre-commit cache
uses: actions/cache@v1
with:
path: ~/.cache/pre-commit
key: pre-commit|2020-02-14|${{ env.PY }}|${{ hashFiles('.pre-commit-config.yaml') }}
# Setup
- uses: actions/checkout@v2
- name: Set up Python 3.8
uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install tox
run: python -m pip install tox
# Main check
- run: python -m tox -e "lint,docs"
packaging:
name: Packaging
runs-on: macos-latest
steps:
# Caches
- name: pip cache
uses: actions/cache@v1
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('tools/requirements/tests.txt') }}-${{ hashFiles('tools/requirements/docs.txt') }}-${{ hashFiles('tox.ini') }}
restore-keys: |
${{ runner.os }}-pip-
${{ runner.os }}-
# Setup
- name: Set up git credentials
run: |
git config --global user.email "pypa-dev@googlegroups.com"
git config --global user.name "pip"
- uses: actions/checkout@v2
- name: Set up Python 3.8
uses: actions/setup-python@v2
with:
python-version: 3.8
- name: Install tox and nox
run: python -m pip install tox nox
# Main check
- name: Check vendored packages
run: python -m tox -e "vendoring"
- name: Prepare dummy release
run: nox -s prepare-release -- 99.9
- name: Generate distributions for the dummy release
run: nox -s build-release -- 99.9
tests:
name: Tests / ${{ matrix.python }}
runs-on: macos-latest
needs: dev-tools
strategy:
fail-fast: false
matrix:
python: [3.6, 3.7, 3.8, 3.9]
steps:
# Caches
- name: pip cache
uses: actions/cache@v1
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('tools/requirements/tests.txt') }}-${{ hashFiles('tools/requirements/docs.txt') }}-${{ hashFiles('tox.ini') }}
restore-keys: |
${{ runner.os }}-pip-
${{ runner.os }}-
# Setup
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
with:
python-version: ${{ matrix.python }}
- name: Install tox
run: python -m pip install tox 'virtualenv<20'
# Main check
- name: Run unit tests
run: >-
python -m tox -e py --
-m unit
--verbose
--numprocesses auto
- name: Run integration tests
run: >-
python -m tox -e py --
-m integration
--verbose
--numprocesses auto
--durations=5

@@ -30,22 +30,15 @@ repos:
^src/pip/_internal/req|
^src/pip/_internal/vcs|
^src/pip/_internal/\w+\.py$|
^src/pip/__main__.py$|
^tools/|
# Tests
^tests/conftest.py|
^tests/yaml|
^tests/lib|
^tests/data|
^tests/unit|
^tests/functional/(?!test_install)|
^tests/functional/test_install|
# Files in the root of the repository
^setup.py|
# A blank ignore, to avoid merge conflicts later.
^$
- repo: https://gitlab.com/pycqa/flake8
rev: 3.8.4
hooks:
@@ -66,7 +59,7 @@ repos:
rev: v0.800
hooks:
- id: mypy
exclude: docs|tests
exclude: tests
args: ["--pretty"]
additional_dependencies: ['nox==2020.12.31']

@@ -1,32 +0,0 @@
language: python
cache: pip
dist: xenial
python: 3.9
addons:
apt:
packages:
- bzr
stages:
- primary
- secondary
jobs:
include:
# Basic Checks
- stage: primary
env: TOXENV=docs
- env: TOXENV=lint
- env: TOXENV=vendoring
# Complete checking for ensuring compatibility
# PyPy
- stage: secondary
env: GROUP=1
python: pypy3.6-7.3.1
- env: GROUP=2
python: pypy3.6-7.3.1
before_install: tools/travis/setup.sh
install: travis_retry tools/travis/install.sh
script: tools/travis/run.sh

docs/html/cli/index.md (new file)
@@ -0,0 +1,48 @@
# Commands
The general options that apply to all the commands listed below can be
found [under the `pip` page in this section](pip).
```{toctree}
:maxdepth: 1
:hidden:
pip
```
```{toctree}
:maxdepth: 1
:caption: Environment Management and Introspection
pip_install
pip_uninstall
pip_list
pip_freeze
pip_check
```
```{toctree}
:maxdepth: 1
:caption: Handling Distribution Files
pip_download
pip_wheel
pip_hash
```
```{toctree}
:maxdepth: 1
:caption: Package Index information
pip_show
pip_search
```
```{toctree}
:maxdepth: 1
:caption: Managing pip itself
pip_cache
pip_config
pip_debug
```

docs/html/cli/pip.rst (new file)
@@ -0,0 +1,255 @@
===
pip
===
Usage
*****
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip <command> [options]
.. tab:: Windows
.. code-block:: shell
py -m pip <command> [options]
Description
***********
.. _`Logging`:
Logging
=======
Console logging
~~~~~~~~~~~~~~~
pip offers :ref:`-v, --verbose <--verbose>` and :ref:`-q, --quiet <--quiet>`
to control the console log level. By default, some messages (error and warnings)
are colored in the terminal. If you want to suppress the colored output use
:ref:`--no-color <--no-color>`.
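As a minimal illustration (``SomePackage`` is a placeholder name), increasing verbosity or silencing colored output looks like this:

.. code-block:: shell

   python -m pip install --verbose SomePackage
   python -m pip install --quiet --no-color SomePackage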
.. _`FileLogging`:
File logging
~~~~~~~~~~~~
pip offers the :ref:`--log <--log>` option for specifying a file where a maximum
verbosity log will be kept. This option is empty by default. This log appends
to previous logging.
Like all pip options, ``--log`` can also be set as an environment variable, or
placed into the pip config file. See the :ref:`Configuration` section.
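A short sketch (the path and package name are placeholders) showing the option on the command line and through the corresponding environment variable:

.. code-block:: shell

   python -m pip install --log ~/pip-install.log SomePackage
   # the same setting supplied via the environment
   PIP_LOG=~/pip-install.log python -m pip install SomePackage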
.. _`exists-action`:
--exists-action option
======================
This option specifies default behavior when path already exists.
Possible cases: downloading files or checking out repositories for installation,
creating archives. If ``--exists-action`` is not defined, pip will prompt
when decision is needed.
*(s)witch*
Only relevant to VCS checkout. Attempt to switch the checkout
to the appropriate URL and/or revision.
*(i)gnore*
Abort current operation (e.g. don't copy file, don't create archive,
don't modify a checkout).
*(w)ipe*
Delete the file or VCS checkout before trying to create, download, or checkout a new one.
*(b)ackup*
Rename the file or checkout to ``{name}{'.bak' * n}``, where n is some number
of ``.bak`` extensions, such that the file didn't exist at some point.
So the most recent backup will be the one with the largest number after ``.bak``.
*(a)bort*
Abort pip and return non-zero exit status.
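For example, a non-interactive run that wipes an existing VCS checkout before recreating it (the repository URL is hypothetical):

.. code-block:: shell

   python -m pip install --exists-action=w -e "git+https://github.com/example/SomeProject.git#egg=SomeProject"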
.. _`build-interface`:
Build System Interface
======================
pip builds packages by invoking the build system. By default, builds will use
``setuptools``, but if a project specifies a different build system using a
``pyproject.toml`` file, as per :pep:`517`, pip will use that instead. As well
as package building, the build system is also invoked to install packages
direct from source. This is handled by invoking the build system to build a
wheel, and then installing from that wheel. The built wheel is cached locally
by pip to avoid repeated identical builds.
The current interface to the build system is via the ``setup.py`` command line
script - all build actions are defined in terms of the specific ``setup.py``
command line that will be run to invoke the required action.
Setuptools Injection
~~~~~~~~~~~~~~~~~~~~
When :pep:`517` is not used, the supported build system is ``setuptools``.
However, not all packages use ``setuptools`` in their build scripts. To support
projects that use "pure ``distutils``", pip injects ``setuptools`` into
``sys.modules`` before invoking ``setup.py``. The injection should be
transparent to ``distutils``-based projects, but 3rd party build tools wishing
to provide a ``setup.py`` emulating the commands pip requires may need to be
aware that it takes place.
Projects using :pep:`517` *must* explicitly use setuptools - pip does not do
the above injection process in this case.
Build System Output
~~~~~~~~~~~~~~~~~~~
Any output produced by the build system will be read by pip (for display to the
user if requested). In order to correctly read the build system output, pip
requires that the output is written in a well-defined encoding, specifically
the encoding the user has configured for text output (which can be obtained in
Python using ``locale.getpreferredencoding``). If the configured encoding is
ASCII, pip assumes UTF-8 (to account for the behaviour of some Unix systems).
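As a quick, pip-agnostic check, the encoding pip will expect can be inspected from a shell:

.. code-block:: shell

   python -c "import locale; print(locale.getpreferredencoding())"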
Build systems should ensure that any tools they invoke (compilers, etc) produce
output in the correct encoding. In practice - and in particular on Windows,
where tools are inconsistent in their use of the "OEM" and "ANSI" codepages -
this may not always be possible. pip will therefore attempt to recover cleanly
if presented with incorrectly encoded build tool output, by translating
unexpected byte sequences to Python-style hexadecimal escape sequences
(``"\x80\xff"``, etc). However, it is still possible for output to be displayed
using an incorrect encoding (mojibake).
Under :pep:`517`, handling of build tool output is the backend's responsibility,
and pip simply displays the output produced by the backend. (Backends, however,
will likely still have to address the issues described above).
PEP 517 and 518 Support
~~~~~~~~~~~~~~~~~~~~~~~
As of version 10.0, pip supports projects declaring dependencies that are
required at install time using a ``pyproject.toml`` file, in the form described
in :pep:`518`. When building a project, pip will install the required
dependencies locally, and make them available to the build process.
Furthermore, from version 19.0 onwards, pip supports projects specifying the
build backend they use in ``pyproject.toml``, in the form described in
:pep:`517`.
When making build requirements available, pip does so in an *isolated
environment*. That is, pip does not install those requirements into the user's
``site-packages``, but rather installs them in a temporary directory which it
adds to the user's ``sys.path`` for the duration of the build. This ensures
that build requirements are handled independently of the user's runtime
environment. For example, a project that needs a recent version of setuptools
to build can still be installed, even if the user has an older version
installed (and without silently replacing that version).
In certain cases, projects (or redistributors) may have workflows that
explicitly manage the build environment. For such workflows, build isolation
can be problematic. If this is the case, pip provides a
``--no-build-isolation`` flag to disable build isolation. Users supplying this
flag are responsible for ensuring the build environment is managed
appropriately (including ensuring that all required build dependencies are
installed).
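A sketch of such a workflow (``./SomeProject`` is a placeholder path), where the caller manages the build requirements explicitly before installing without isolation:

.. code-block:: shell

   # build requirements installed up front by the caller
   python -m pip install setuptools wheel
   python -m pip install --no-build-isolation ./SomeProject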
By default, pip will continue to use the legacy (direct ``setup.py`` execution
based) build processing for projects that do not have a ``pyproject.toml`` file.
Projects with a ``pyproject.toml`` file will use a :pep:`517` backend. Projects
with a ``pyproject.toml`` file, but which don't have a ``build-system`` section,
will be assumed to have the following backend settings::
[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta:__legacy__"
.. note::
``setuptools`` 40.8.0 is the first version of setuptools that offers a
:pep:`517` backend that closely mimics directly executing ``setup.py``.
If a project has ``[build-system]``, but no ``build-backend``, pip will also use
``setuptools.build_meta:__legacy__``, but will expect the project requirements
to include ``setuptools`` and ``wheel`` (and will report an error if the
installed version of ``setuptools`` is not recent enough).
If a user wants to explicitly request :pep:`517` handling even though a project
doesn't have a ``pyproject.toml`` file, this can be done using the
``--use-pep517`` command line option. Similarly, to request legacy processing
even though ``pyproject.toml`` is present, the ``--no-use-pep517`` option is
available (although obviously it is an error to choose ``--no-use-pep517`` if
the project has no ``setup.py``, or explicitly requests a build backend). As
with other command line flags, pip recognises the ``PIP_USE_PEP517``
environment variable and a ``use-pep517`` config file option (set to true or
false) to set this option globally. Note that overriding pip's choice of
whether to use :pep:`517` processing in this way does *not* affect whether pip
will use an isolated build environment (which is controlled via
``--no-build-isolation`` as noted above).
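A brief sketch (``./legacy-project`` is a hypothetical path) of requesting :pep:`517` processing on the command line and through the environment:

.. code-block:: shell

   python -m pip install --use-pep517 ./legacy-project
   # the same setting via the environment variable recognised by pip
   PIP_USE_PEP517=1 python -m pip install ./legacy-project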
Except in the case noted above (projects with no :pep:`518` ``[build-system]``
section in ``pyproject.toml``), pip will never implicitly install a build
system. Projects **must** ensure that the correct build system is listed in
their ``requires`` list (this applies even if pip assumes that the
``setuptools`` backend is being used, as noted above).
.. _pep-518-limitations:
**Historical Limitations**:
* ``pip<18.0``: only supports installing build requirements from wheels, and
does not support the use of environment markers and extras (only version
specifiers are respected).
* ``pip<18.1``: build dependencies using .pth files are not properly supported;
as a result namespace packages do not work under Python 3.2 and earlier.
Future Developments
~~~~~~~~~~~~~~~~~~~
:pep:`426` notes that the intention is to add hooks to project metadata in
version 2.1 of the metadata spec, to explicitly define how to build a project
from its source. Once this version of the metadata spec is final, pip will
migrate to using that interface. At that point, the ``setup.py`` interface
documented here will be retained solely for legacy purposes, until projects
have migrated.
Specifically, applications should *not* expect to rely on there being any form
of backward compatibility guarantees around the ``setup.py`` interface.
Build Options
~~~~~~~~~~~~~
The ``--global-option`` and ``--build-option`` arguments to the ``pip install``
and ``pip wheel`` inject additional arguments into the ``setup.py`` command
(``--build-option`` is only available in ``pip wheel``). These arguments are
included in the command as follows:
.. tab:: Unix/macOS
.. code-block:: console
python setup.py <global_options> BUILD COMMAND <build_options>
.. tab:: Windows
.. code-block:: shell
py setup.py <global_options> BUILD COMMAND <build_options>
The options are passed unmodified, and presently offer direct access to the
distutils command line. Use of ``--global-option`` and ``--build-option``
should be considered as build system dependent, and may not be supported in the
current form if support for alternative build systems is added to pip.
.. _`General Options`:
General Options
***************
.. pip-general-options::

@@ -0,0 +1,27 @@
.. _`pip cache`:
pip cache
---------
Usage
*****
.. tab:: Unix/macOS
.. pip-command-usage:: cache "python -m pip"
.. tab:: Windows
.. pip-command-usage:: cache "py -m pip"
Description
***********
.. pip-command-description:: cache
Options
*******
.. pip-command-options:: cache

@@ -0,0 +1,87 @@
.. _`pip check`:
=========
pip check
=========
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: check "python -m pip"
.. tab:: Windows
.. pip-command-usage:: check "py -m pip"
Description
===========
.. pip-command-description:: check
Examples
========
#. If all dependencies are compatible:
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip check
No broken requirements found.
$ echo $?
0
.. tab:: Windows
.. code-block:: console
C:\> py -m pip check
No broken requirements found.
C:\> echo %errorlevel%
0
#. If a package is missing:
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip check
pyramid 1.5.2 requires WebOb, which is not installed.
$ echo $?
1
.. tab:: Windows
.. code-block:: console
C:\> py -m pip check
pyramid 1.5.2 requires WebOb, which is not installed.
C:\> echo %errorlevel%
1
#. If a package has the wrong version:
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip check
pyramid 1.5.2 has requirement WebOb>=1.3.1, but you have WebOb 0.8.
$ echo $?
1
.. tab:: Windows
.. code-block:: console
C:\> py -m pip check
pyramid 1.5.2 has requirement WebOb>=1.3.1, but you have WebOb 0.8.
C:\> echo %errorlevel%
1

@@ -0,0 +1,30 @@
.. _`pip config`:
==========
pip config
==========
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: config "python -m pip"
.. tab:: Windows
.. pip-command-usage:: config "py -m pip"
Description
===========
.. pip-command-description:: config
Options
=======
.. pip-command-options:: config

@@ -0,0 +1,35 @@
.. _`pip debug`:
=========
pip debug
=========
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: debug "python -m pip"
.. tab:: Windows
.. pip-command-usage:: debug "py -m pip"
.. warning::
This command is only meant for debugging.
Its options and outputs are provisional and may change without notice.
Description
===========
.. pip-command-description:: debug
Options
=======
.. pip-command-options:: debug

@@ -0,0 +1,226 @@
.. _`pip download`:
============
pip download
============
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: download "python -m pip"
.. tab:: Windows
.. pip-command-usage:: download "py -m pip"
Description
===========
.. pip-command-description:: download
Overview
--------
``pip download`` does the same resolution and downloading as ``pip install``,
but instead of installing the dependencies, it collects the downloaded
distributions into the directory provided (defaulting to the current
directory). This directory can later be passed as the value to ``pip install
--find-links`` to facilitate offline or locked down package installation.
``pip download`` with the ``--platform``, ``--python-version``,
``--implementation``, and ``--abi`` options provides the ability to fetch
dependencies for an interpreter and system other than the ones that pip is
running on. ``--only-binary=:all:`` or ``--no-deps`` is required when using any
of these options. It is important to note that these options all default to the
current system/interpreter, and not to the most restrictive constraints (e.g.
platform any, abi none, etc). To avoid fetching dependencies that happen to
match the constraint of the current interpreter (but not your target one), it
is recommended to specify all of these options if you are specifying one of
them. Generic dependencies (e.g. universal wheels, or dependencies with no
platform, abi, or implementation constraints) will still match an over-
constrained download requirement.
Options
=======
.. pip-command-options:: download
.. pip-index-options:: download
Examples
========
#. Download a package and all of its dependencies
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip download SomePackage
python -m pip download -d . SomePackage # equivalent to above
python -m pip download --no-index --find-links=/tmp/wheelhouse -d /tmp/otherwheelhouse SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip download SomePackage
py -m pip download -d . SomePackage # equivalent to above
py -m pip download --no-index --find-links=/tmp/wheelhouse -d /tmp/otherwheelhouse SomePackage
#. Download a package and all of its dependencies with OSX specific interpreter constraints.
This forces OSX 10.10 or lower compatibility. Since OSX deps are forward compatible,
this will also match ``macosx-10_9_x86_64``, ``macosx-10_8_x86_64``, ``macosx-10_8_intel``,
etc.
It will also match deps with platform ``any``. Also force the interpreter version to ``27``
(or more generic, i.e. ``2``) and implementation to ``cp`` (or more generic, i.e. ``py``).
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip download \
--only-binary=:all: \
--platform macosx-10_10_x86_64 \
--python-version 27 \
--implementation cp \
SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip download ^
--only-binary=:all: ^
--platform macosx-10_10_x86_64 ^
--python-version 27 ^
--implementation cp ^
SomePackage
#. Download a package and its dependencies with linux specific constraints.
Force the interpreter to be any minor version of py3k, and only accept
``cp34m`` or ``none`` as the abi.
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip download \
--only-binary=:all: \
--platform linux_x86_64 \
--python-version 3 \
--implementation cp \
--abi cp34m \
SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip download ^
--only-binary=:all: ^
--platform linux_x86_64 ^
--python-version 3 ^
--implementation cp ^
--abi cp34m ^
SomePackage
#. Force platform, implementation, and abi agnostic deps.
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip download \
--only-binary=:all: \
--platform any \
--python-version 3 \
--implementation py \
--abi none \
SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip download ^
--only-binary=:all: ^
--platform any ^
--python-version 3 ^
--implementation py ^
--abi none ^
SomePackage
#. Even when overconstrained, this will still correctly fetch the pip universal wheel.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip download \
--only-binary=:all: \
--platform linux_x86_64 \
--python-version 33 \
--implementation cp \
--abi cp34m \
pip>=8
.. code-block:: console
$ ls pip-8.1.1-py2.py3-none-any.whl
pip-8.1.1-py2.py3-none-any.whl
.. tab:: Windows
.. code-block:: console
C:\> py -m pip download ^
--only-binary=:all: ^
--platform linux_x86_64 ^
--python-version 33 ^
--implementation cp ^
--abi cp34m ^
pip>=8
.. code-block:: console
C:\> dir pip-8.1.1-py2.py3-none-any.whl
pip-8.1.1-py2.py3-none-any.whl
#. Download a package supporting one of several ABIs and platforms.
This is useful when fetching wheels for a well-defined interpreter, whose
supported ABIs and platforms are known and fixed, different than the one pip is
running under.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip download \
--only-binary=:all: \
--platform manylinux1_x86_64 --platform linux_x86_64 --platform any \
--python-version 36 \
--implementation cp \
--abi cp36m --abi cp36 --abi abi3 --abi none \
SomePackage
.. tab:: Windows
.. code-block:: console
C:> py -m pip download ^
--only-binary=:all: ^
--platform manylinux1_x86_64 --platform linux_x86_64 --platform any ^
--python-version 36 ^
--implementation cp ^
--abi cp36m --abi cp36 --abi abi3 --abi none ^
SomePackage

@@ -0,0 +1,74 @@
.. _`pip freeze`:
==========
pip freeze
==========
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: freeze "python -m pip"
.. tab:: Windows
.. pip-command-usage:: freeze "py -m pip"
Description
===========
.. pip-command-description:: freeze
Options
=======
.. pip-command-options:: freeze
Examples
========
#. Generate output suitable for a requirements file.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip freeze
docutils==0.11
Jinja2==2.7.2
MarkupSafe==0.19
Pygments==1.6
Sphinx==1.2.2
.. tab:: Windows
.. code-block:: console
C:\> py -m pip freeze
docutils==0.11
Jinja2==2.7.2
MarkupSafe==0.19
Pygments==1.6
Sphinx==1.2.2
#. Generate a requirements file and then install from it in another environment.
.. tab:: Unix/macOS
.. code-block:: shell
env1/bin/python -m pip freeze > requirements.txt
env2/bin/python -m pip install -r requirements.txt
.. tab:: Windows
.. code-block:: shell
env1\bin\python -m pip freeze > requirements.txt
env2\bin\python -m pip install -r requirements.txt

@@ -0,0 +1,72 @@
.. _`pip hash`:
========
pip hash
========
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: hash "python -m pip"
.. tab:: Windows
.. pip-command-usage:: hash "py -m pip"
Description
===========
.. pip-command-description:: hash
Overview
--------
``pip hash`` is a convenient way to get a hash digest for use with
:ref:`hash-checking mode`, especially for packages with multiple archives. The
error message from ``pip install --require-hashes ...`` will give you one
hash, but, if there are multiple archives (like source and binary ones), you
will need to manually download and compute a hash for the others. Otherwise, a
spurious hash mismatch could occur when :ref:`pip install` is passed a
different set of options, like :ref:`--no-binary <install_--no-binary>`.
Options
=======
.. pip-command-options:: hash
Example
=======
Compute the hash of a downloaded archive:
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip download SomePackage
Collecting SomePackage
Downloading SomePackage-2.2.tar.gz
Saved ./pip_downloads/SomePackage-2.2.tar.gz
Successfully downloaded SomePackage
$ python -m pip hash ./pip_downloads/SomePackage-2.2.tar.gz
./pip_downloads/SomePackage-2.2.tar.gz:
--hash=sha256:93e62e05c7ad3da1a233def6731e8285156701e3419a5fe279017c429ec67ce0
.. tab:: Windows
.. code-block:: console
C:\> py -m pip download SomePackage
Collecting SomePackage
Downloading SomePackage-2.2.tar.gz
Saved ./pip_downloads/SomePackage-2.2.tar.gz
Successfully downloaded SomePackage
C:\> py -m pip hash ./pip_downloads/SomePackage-2.2.tar.gz
./pip_downloads/SomePackage-2.2.tar.gz:
--hash=sha256:93e62e05c7ad3da1a233def6731e8285156701e3419a5fe279017c429ec67ce0

(File diff suppressed because it is too large.)

docs/html/cli/pip_list.rst (new file)
@@ -0,0 +1,201 @@
.. _`pip list`:
========
pip list
========
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: list "python -m pip"
.. tab:: Windows
.. pip-command-usage:: list "py -m pip"
Description
===========
.. pip-command-description:: list
Options
=======
.. pip-command-options:: list
.. pip-index-options:: list
Examples
========
#. List installed packages.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list
docutils (0.10)
Jinja2 (2.7.2)
MarkupSafe (0.18)
Pygments (1.6)
Sphinx (1.2.1)
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list
docutils (0.10)
Jinja2 (2.7.2)
MarkupSafe (0.18)
Pygments (1.6)
Sphinx (1.2.1)
#. List outdated packages (excluding editables), and the latest version available.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --outdated
docutils (Current: 0.10 Latest: 0.11)
Sphinx (Current: 1.2.1 Latest: 1.2.2)
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --outdated
docutils (Current: 0.10 Latest: 0.11)
Sphinx (Current: 1.2.1 Latest: 1.2.2)
#. List installed packages with column formatting.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --format columns
Package Version
------- -------
docopt 0.6.2
idlex 1.13
jedi 0.9.0
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --format columns
Package Version
------- -------
docopt 0.6.2
idlex 1.13
jedi 0.9.0
#. List outdated packages with column formatting.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list -o --format columns
Package Version Latest Type
---------- ------- ------ -----
retry 0.8.1 0.9.1 wheel
setuptools 20.6.7 21.0.0 wheel
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list -o --format columns
Package Version Latest Type
---------- ------- ------ -----
retry 0.8.1 0.9.1 wheel
setuptools 20.6.7 21.0.0 wheel
#. List packages that are not dependencies of other packages. Can be combined with
other options.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --outdated --not-required
docutils (Current: 0.10 Latest: 0.11)
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --outdated --not-required
docutils (Current: 0.10 Latest: 0.11)
#. Use legacy formatting
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --format=legacy
colorama (0.3.7)
docopt (0.6.2)
idlex (1.13)
jedi (0.9.0)
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --format=legacy
colorama (0.3.7)
docopt (0.6.2)
idlex (1.13)
jedi (0.9.0)
#. Use json formatting
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --format=json
[{'name': 'colorama', 'version': '0.3.7'}, {'name': 'docopt', 'version': '0.6.2'}, ...
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --format=json
[{'name': 'colorama', 'version': '0.3.7'}, {'name': 'docopt', 'version': '0.6.2'}, ...
#. Use freeze formatting
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --format=freeze
colorama==0.3.7
docopt==0.6.2
idlex==1.13
jedi==0.9.0
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --format=freeze
colorama==0.3.7
docopt==0.6.2
idlex==1.13
jedi==0.9.0

@@ -0,0 +1,52 @@
.. _`pip search`:
==========
pip search
==========
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: search "python -m pip"
.. tab:: Windows
.. pip-command-usage:: search "py -m pip"
Description
===========
.. pip-command-description:: search
Options
=======
.. pip-command-options:: search
Examples
========
#. Search for "peppercorn"
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip search peppercorn
pepperedform - Helpers for using peppercorn with formprocess.
peppercorn - A library for converting a token stream into [...]
.. tab:: Windows
.. code-block:: console
C:\> py -m pip search peppercorn
pepperedform - Helpers for using peppercorn with formprocess.
peppercorn - A library for converting a token stream into [...]

docs/html/cli/pip_show.rst (new file)
@@ -0,0 +1,154 @@
.. _`pip show`:
========
pip show
========
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: show "python -m pip"
.. tab:: Windows
.. pip-command-usage:: show "py -m pip"
Description
===========
.. pip-command-description:: show
Options
=======
.. pip-command-options:: show
Examples
========
#. Show information about a package:
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip show sphinx
Name: Sphinx
Version: 1.4.5
Summary: Python documentation generator
Home-page: http://sphinx-doc.org/
Author: Georg Brandl
Author-email: georg@python.org
License: BSD
Location: /my/env/lib/python2.7/site-packages
Requires: docutils, snowballstemmer, alabaster, Pygments, imagesize, Jinja2, babel, six
.. tab:: Windows
.. code-block:: console
C:\> py -m pip show sphinx
Name: Sphinx
Version: 1.4.5
Summary: Python documentation generator
Home-page: http://sphinx-doc.org/
Author: Georg Brandl
Author-email: georg@python.org
License: BSD
Location: /my/env/lib/python2.7/site-packages
Requires: docutils, snowballstemmer, alabaster, Pygments, imagesize, Jinja2, babel, six
#. Show all information about a package
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip show --verbose sphinx
Name: Sphinx
Version: 1.4.5
Summary: Python documentation generator
Home-page: http://sphinx-doc.org/
Author: Georg Brandl
Author-email: georg@python.org
License: BSD
Location: /my/env/lib/python2.7/site-packages
Requires: docutils, snowballstemmer, alabaster, Pygments, imagesize, Jinja2, babel, six
Metadata-Version: 2.0
Installer:
Classifiers:
Development Status :: 5 - Production/Stable
Environment :: Console
Environment :: Web Environment
Intended Audience :: Developers
Intended Audience :: Education
License :: OSI Approved :: BSD License
Operating System :: OS Independent
Programming Language :: Python
Programming Language :: Python :: 2
Programming Language :: Python :: 3
Framework :: Sphinx
Framework :: Sphinx :: Extension
Framework :: Sphinx :: Theme
Topic :: Documentation
Topic :: Documentation :: Sphinx
Topic :: Text Processing
Topic :: Utilities
Entry-points:
[console_scripts]
sphinx-apidoc = sphinx.apidoc:main
sphinx-autogen = sphinx.ext.autosummary.generate:main
sphinx-build = sphinx:main
sphinx-quickstart = sphinx.quickstart:main
[distutils.commands]
build_sphinx = sphinx.setup_command:BuildDoc
.. tab:: Windows
.. code-block:: console
C:\> py -m pip show --verbose sphinx
Name: Sphinx
Version: 1.4.5
Summary: Python documentation generator
Home-page: http://sphinx-doc.org/
Author: Georg Brandl
Author-email: georg@python.org
License: BSD
Location: /my/env/lib/python2.7/site-packages
Requires: docutils, snowballstemmer, alabaster, Pygments, imagesize, Jinja2, babel, six
Metadata-Version: 2.0
Installer:
Classifiers:
Development Status :: 5 - Production/Stable
Environment :: Console
Environment :: Web Environment
Intended Audience :: Developers
Intended Audience :: Education
License :: OSI Approved :: BSD License
Operating System :: OS Independent
Programming Language :: Python
Programming Language :: Python :: 2
Programming Language :: Python :: 3
Framework :: Sphinx
Framework :: Sphinx :: Extension
Framework :: Sphinx :: Theme
Topic :: Documentation
Topic :: Documentation :: Sphinx
Topic :: Text Processing
Topic :: Utilities
Entry-points:
[console_scripts]
sphinx-apidoc = sphinx.apidoc:main
sphinx-autogen = sphinx.ext.autosummary.generate:main
sphinx-build = sphinx:main
sphinx-quickstart = sphinx.quickstart:main
[distutils.commands]
build_sphinx = sphinx.setup_command:BuildDoc

@@ -0,0 +1,58 @@
.. _`pip uninstall`:
=============
pip uninstall
=============
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: uninstall "python -m pip"
.. tab:: Windows
.. pip-command-usage:: uninstall "py -m pip"
Description
===========
.. pip-command-description:: uninstall
Options
=======
.. pip-command-options:: uninstall
Examples
========
#. Uninstall a package.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip uninstall simplejson
Uninstalling simplejson:
/home/me/env/lib/python3.9/site-packages/simplejson
/home/me/env/lib/python3.9/site-packages/simplejson-2.2.1-py3.9.egg-info
Proceed (y/n)? y
Successfully uninstalled simplejson
.. tab:: Windows
.. code-block:: console
C:\> py -m pip uninstall simplejson
Uninstalling simplejson:
/home/me/env/lib/python3.9/site-packages/simplejson
/home/me/env/lib/python3.9/site-packages/simplejson-2.2.1-py3.9.egg-info
Proceed (y/n)? y
Successfully uninstalled simplejson

docs/html/cli/pip_wheel.rst (new file)
@@ -0,0 +1,125 @@
.. _`pip wheel`:
=========
pip wheel
=========
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: wheel "python -m pip"
.. tab:: Windows
.. pip-command-usage:: wheel "py -m pip"
Description
===========
.. pip-command-description:: wheel
Build System Interface
----------------------
In order for pip to build a wheel, ``setup.py`` must implement the
``bdist_wheel`` command with the following syntax:
.. tab:: Unix/macOS
.. code-block:: shell
python setup.py bdist_wheel -d TARGET
.. tab:: Windows
.. code-block:: shell
py setup.py bdist_wheel -d TARGET
This command must create a wheel compatible with the invoking Python
interpreter, and save that wheel in the directory TARGET.
No other build system commands are invoked by the ``pip wheel`` command.
Customising the build
^^^^^^^^^^^^^^^^^^^^^
It is possible using ``--global-option`` to include additional build commands
with their arguments in the ``setup.py`` command. This is currently the only
way to influence the building of C extensions from the command line. For
example:
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip wheel --global-option bdist_ext --global-option -DFOO wheel
.. tab:: Windows
.. code-block:: shell
py -m pip wheel --global-option bdist_ext --global-option -DFOO wheel
will result in a build command of
::
setup.py bdist_ext -DFOO bdist_wheel -d TARGET
which passes a preprocessor symbol to the extension build.
Such usage is considered highly build-system specific and more an accident of
the current implementation than a supported interface.
Options
=======
.. pip-command-options:: wheel
.. pip-index-options:: wheel
Examples
========
#. Build wheels for a requirement (and all its dependencies), and then install
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip wheel --wheel-dir=/tmp/wheelhouse SomePackage
python -m pip install --no-index --find-links=/tmp/wheelhouse SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip wheel --wheel-dir=/tmp/wheelhouse SomePackage
py -m pip install --no-index --find-links=/tmp/wheelhouse SomePackage
#. Build a wheel for a package from source
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip wheel --no-binary SomePackage SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip wheel --no-binary SomePackage SomePackage

@@ -5,6 +5,7 @@ import os
import pathlib
import re
import sys
from typing import List, Tuple
# Add the docs/ directory to sys.path, because pip_sphinxext.py is there.
docs_dir = os.path.dirname(os.path.dirname(__file__))
@@ -93,10 +94,10 @@ html_use_index = False
# List of manual pages generated
def determine_man_pages():
def determine_man_pages() -> List[Tuple[str, str, str, str, int]]:
"""Determine which man pages need to be generated."""
def to_document_name(path, base_dir):
def to_document_name(path: str, base_dir: str) -> str:
"""Convert a provided path to a Sphinx "document name"."""
relative_path = os.path.relpath(path, base_dir)
root, _ = os.path.splitext(relative_path)

@@ -6,4 +6,4 @@ Copyright
pip and this documentation is:
Copyright © 2008-2020 The pip developers (see `AUTHORS.txt <https://github.com/pypa/pip/blob/master/AUTHORS.txt>`_ file). All rights reserved.
Copyright © 2008-2020 The pip developers (see `AUTHORS.txt <https://github.com/pypa/pip/blob/main/AUTHORS.txt>`_ file). All rights reserved.

@@ -51,7 +51,6 @@ The ``README``, license, ``pyproject.toml``, ``setup.py``, and so on are in the
* ``functional/`` *[functional tests of pip's CLI -- end-to-end, invoke pip in subprocess & check results of execution against desired result. This also is what makes test suite slow]*
* ``lib/`` *[helpers for tests]*
* ``unit/`` *[unit tests -- fast and small and nice!]*
* ``yaml/`` *[resolver tests! They're written in YAML. This folder just contains .yaml files -- actual code for reading/running them is in lib/yaml.py . This is fine!]*
* ``tools`` *[misc development workflow tools, like requirements files & Travis CI files & helpers for tox]*
* ``.azure-pipelines``
@@ -105,5 +104,5 @@ Within ``src/``:
.. _`tracking issue`: https://github.com/pypa/pip/issues/6831
.. _GitHub repository: https://github.com/pypa/pip/
.. _tox.ini: https://github.com/pypa/pip/blob/master/tox.ini
.. _tox.ini: https://github.com/pypa/pip/blob/main/tox.ini
.. _improving the pip dependency resolver: https://github.com/pypa/pip/issues/988

@@ -1,7 +1,8 @@
.. note::
This section of the documentation is currently being written. pip
developers welcome your help to complete this documentation. If
This section of the documentation is currently out of date.
pip developers welcome your help to update this documentation. If
you're interested in helping out, please let us know in the
`tracking issue`_, or just submit a pull request and mention it in
that tracking issue.

@@ -11,7 +11,7 @@ We have an in-progress guide to the
Submitting Pull Requests
========================
Submit pull requests against the ``master`` branch, providing a good
Submit pull requests against the ``main`` branch, providing a good
description of what you're doing and why. You must have legal permission to
distribute any code you contribute to pip and it must be available under the
MIT License.
@@ -39,7 +39,7 @@ separately, as a "formatting cleanup" PR, if needed.
Automated Testing
=================
All pull requests and merges to 'master' branch are tested using `Travis CI`_,
All pull requests and merges to 'main' branch are tested using `Travis CI`_,
`Azure Pipelines`_ and `GitHub Actions`_ based on our `.travis.yml`_,
`.azure-pipelines`_ and `.github/workflows`_ files. More details about pip's
Continuous Integration can be found in the `CI Documentation`_
@@ -131,8 +131,8 @@ updating deprecation policy, etc.
Updating your branch
====================
As you work, you might need to update your local master branch up-to-date with
the ``master`` branch in the main pip repository, which moves forward as the
As you work, you might need to update your local main branch up-to-date with
the ``main`` branch in the main pip repository, which moves forward as the
maintainers merge pull requests. Most people working on the project use the
following workflow.
@@ -160,24 +160,24 @@ First, fetch the latest changes from the main pip repository, ``upstream``:
git fetch upstream
Then, check out your local ``master`` branch, and rebase the changes on top of
Then, check out your local ``main`` branch, and rebase the changes on top of
it:
.. code-block:: console
git checkout master
git rebase upstream/master
git checkout main
git rebase upstream/main
At this point, you might have to `resolve merge conflicts`_. Once this is done,
push the updates you have just made to your local ``master`` branch to your
push the updates you have just made to your local ``main`` branch to your
``origin`` repository on GitHub:
.. code-block:: console
git checkout master
git push origin master
git checkout main
git push origin main
Now your local ``master`` branch and the ``master`` branch in your ``origin``
Now your local ``main`` branch and the ``main`` branch in your ``origin``
repo have been updated with the most recent changes from the main pip
repository.
@@ -187,10 +187,10 @@ To keep your branches updated, the process is similar:
git checkout awesome-feature
git fetch upstream
git rebase upstream/master
git rebase upstream/main
Now your branch has been updated with the latest changes from the
``master`` branch on the upstream pip repository.
``main`` branch on the upstream pip repository.
It's good practice to back up your branches by pushing them to your
``origin`` on GitHub as you are working on them. To push a branch,
@@ -230,7 +230,7 @@ If you get an error message like this:
Try force-pushing your branch with ``push -f``.
The ``master`` branch in the main pip repository gets updated frequently, so
The ``main`` branch in the main pip repository gets updated frequently, so
you might have to update your branch at least once while you are working on it.
Thank you for your contribution!
@@ -267,9 +267,9 @@ will initiate a vote among the existing maintainers.
.. _`Travis CI`: https://travis-ci.org/
.. _`Azure Pipelines`: https://azure.microsoft.com/en-in/services/devops/pipelines/
.. _`GitHub Actions`: https://github.com/features/actions
.. _`.travis.yml`: https://github.com/pypa/pip/blob/master/.travis.yml
.. _`.azure-pipelines`: https://github.com/pypa/pip/blob/master/.azure-pipelines
.. _`.github/workflows`: https://github.com/pypa/pip/blob/master/.github/workflows
.. _`.travis.yml`: https://github.com/pypa/pip/blob/main/.travis.yml
.. _`.azure-pipelines`: https://github.com/pypa/pip/blob/main/.azure-pipelines
.. _`.github/workflows`: https://github.com/pypa/pip/blob/main/.github/workflows
.. _`CI Documentation`: https://pip.pypa.io/en/latest/development/ci/
.. _`towncrier`: https://pypi.org/project/towncrier/
.. _`Testing the next-gen pip dependency resolver`: https://pradyunsg.me/blog/2020/03/27/pip-resolver-testing/

View file

@ -229,7 +229,7 @@ Examples:
(`link <https://github.com/pypa/pip/issues/6498#issuecomment-513501112>`__)
- get-pip on system with no ``/usr/lib64``
(`link <https://github.com/pypa/pip/issues/5379#issuecomment-515270576>`__)
- reproducing with ``pip`` from master branch
- reproducing with ``pip`` from current development branch
(`link <https://github.com/pypa/pip/issues/6707#issue-467770959>`__)
@ -285,7 +285,7 @@ An issue may be considered resolved and closed when:
- already tracked by another issue
- A project-specific issue has been identified and the issue no
longer occurs as of the latest commit on the master branch.
longer occurs as of the latest commit on the main branch.
- An enhancement or feature request no longer has a proponent and the maintainers
don't think it's worth keeping open.

View file

@ -7,7 +7,7 @@ Release process
Release Cadence
===============
The pip project has a release cadence of releasing whatever is on ``master``
The pip project has a release cadence of releasing whatever is on ``main``
every 3 months. This gives users a predictable pattern for when releases
are going to happen and prevents locking up improvements for fixes for long
periods of time, while still preventing massively fracturing the user base
@ -22,8 +22,8 @@ The release manager may, at their discretion, choose whether or not there
will be a pre-release period for a release, and if there is may extend that
period into the next month if needed.
Because releases are made direct from the ``master`` branch, it is essential
that ``master`` is always in a releasable state. It is acceptable to merge
Because releases are made direct from the ``main`` branch, it is essential
that ``main`` is always in a releasable state. It is acceptable to merge
PRs that partially implement a new feature, but only if the partially
implemented version is usable in that state (for example, with reduced
functionality or disabled by default). In the case where a merged PR is found
@ -116,13 +116,13 @@ Release Process
Creating a new release
----------------------
#. Checkout the current pip ``master`` branch.
#. Checkout the current pip ``main`` branch.
#. Ensure you have the latest ``nox`` installed.
#. Prepare for release using ``nox -s prepare-release -- YY.N``.
This will update the relevant files and tag the correct commit.
#. Build the release artifacts using ``nox -s build-release -- YY.N``.
This will checkout the tag, generate the distribution files to be
uploaded and checkout the master branch again.
uploaded and checkout the main branch again.
#. Upload the release to PyPI using ``nox -s upload-release -- YY.N``.
#. Push all of the changes including the tag.
#. Regenerate the ``get-pip.py`` script in the `get-pip repository`_ (as
@ -155,20 +155,20 @@ Creating a bug-fix release
Sometimes we need to release a bugfix release of the form ``YY.N.Z+1``. In
order to create one of these the changes should already be merged into the
``master`` branch.
``main`` branch.
#. Create a new ``release/YY.N.Z+1`` branch off of the ``YY.N`` tag using the
command ``git checkout -b release/YY.N.Z+1 YY.N``.
#. Cherry pick the fixed commits off of the ``master`` branch, fixing any
#. Cherry pick the fixed commits off of the ``main`` branch, fixing any
conflicts.
#. Run ``nox -s prepare-release -- YY.N.Z+1``.
#. Merge master into your release branch and drop the news files that have been
#. Merge main into your release branch and drop the news files that have been
included in your release (otherwise they would also appear in the ``YY.N+1``
changelog)
#. Push the ``release/YY.N.Z+1`` branch to github and submit a PR for it against
the ``master`` branch and wait for the tests to run.
#. Once tests run, merge the ``release/YY.N.Z+1`` branch into master, and follow
the above release process starting with step 4.
the ``main`` branch and wait for the tests to run.
#. Once tests run, merge the ``release/YY.N.Z+1`` branch into ``main``, and
follow the above release process starting with step 4.
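Sketched as a command sequence, the steps above look roughly like this (``YY.N.Z+1`` and the commit SHAs are placeholders):

.. code-block:: console

   $ git checkout -b release/YY.N.Z+1 YY.N
   $ git cherry-pick <sha-of-fix-on-main>      # repeat per fix, resolving conflicts
   $ nox -s prepare-release -- YY.N.Z+1
   $ git merge main                            # then drop news files already released
   $ git push origin release/YY.N.Z+1          # open a PR against main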
.. _`get-pip repository`: https://github.com/pypa/get-pip
.. _`psf-salt repository`: https://github.com/python/psf-salt

View file

@ -13,7 +13,7 @@ install packages from the [Python Package Index][pypi] and other indexes.
quickstart
installing
user_guide
reference/index
cli/index
```
```{toctree}

View file

@ -9,4 +9,4 @@ Changelog
.. towncrier-draft-entries:: |release|, unreleased as on
.. include:: ../../NEWS.rst
.. pip-news-include:: ../../NEWS.rst

View file

@ -1,21 +1,11 @@
===============
Reference Guide
===============
:orphan:
.. toctree::
:maxdepth: 2
.. meta::
pip
pip_install
pip_download
pip_uninstall
pip_freeze
pip_list
pip_show
pip_search
pip_cache
pip_check
pip_config
pip_wheel
pip_hash
pip_debug
:http-equiv=refresh: 3; url=../cli/
This page has moved
===================
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/index`

View file

@ -1,255 +1,11 @@
===
pip
===
:orphan:
.. meta::
Usage
*****
:http-equiv=refresh: 3; url=../../cli/pip/
.. tab:: Unix/macOS
This page has moved
===================
.. code-block:: shell
python -m pip <command> [options]
.. tab:: Windows
.. code-block:: shell
py -m pip <command> [options]
Description
***********
.. _`Logging`:
Logging
=======
Console logging
~~~~~~~~~~~~~~~
pip offers :ref:`-v, --verbose <--verbose>` and :ref:`-q, --quiet <--quiet>`
to control the console log level. By default, some messages (error and warnings)
are colored in the terminal. If you want to suppress the colored output use
:ref:`--no-color <--no-color>`.
.. _`FileLogging`:
File logging
~~~~~~~~~~~~
pip offers the :ref:`--log <--log>` option for specifying a file where a maximum
verbosity log will be kept. This option is empty by default. This log appends
to previous logging.
Like all pip options, ``--log`` can also be set as an environment variable, or
placed into the pip config file. See the :ref:`Configuration` section.
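For example, to keep a maximum-verbosity log in a file while quieting console output (``SomePackage`` is a placeholder):

.. code-block:: console

   $ python -m pip install --log ./pip-log.txt --quiet SomePackage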
.. _`exists-action`:
--exists-action option
======================
This option specifies default behavior when path already exists.
Possible cases: downloading files or checking out repositories for installation,
creating archives. If ``--exists-action`` is not defined, pip will prompt
when decision is needed.
*(s)witch*
Only relevant to VCS checkout. Attempt to switch the checkout
to the appropriate URL and/or revision.
*(i)gnore*
Abort current operation (e.g. don't copy file, don't create archive,
don't modify a checkout).
*(w)ipe*
Delete the file or VCS checkout before trying to create, download, or checkout a new one.
*(b)ackup*
Rename the file or checkout to ``{name}{'.bak' * n}``, where n is some number
of ``.bak`` extensions, such that the file didn't exist at some point.
So the most recent backup will be the one with the largest number after ``.bak``.
*(a)bort*
Abort pip and return non-zero exit status.
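As a minimal illustration (``SomePackage`` is a placeholder), the prompt can be avoided by choosing an action up front, for example to wipe and re-create:

.. code-block:: console

   $ python -m pip install --exists-action w SomePackage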
.. _`build-interface`:
Build System Interface
======================
pip builds packages by invoking the build system. By default, builds will use
``setuptools``, but if a project specifies a different build system using a
``pyproject.toml`` file, as per :pep:`517`, pip will use that instead. As well
as package building, the build system is also invoked to install packages
direct from source. This is handled by invoking the build system to build a
wheel, and then installing from that wheel. The built wheel is cached locally
by pip to avoid repeated identical builds.
The current interface to the build system is via the ``setup.py`` command line
script - all build actions are defined in terms of the specific ``setup.py``
command line that will be run to invoke the required action.
Setuptools Injection
~~~~~~~~~~~~~~~~~~~~
When :pep:`517` is not used, the supported build system is ``setuptools``.
However, not all packages use ``setuptools`` in their build scripts. To support
projects that use "pure ``distutils``", pip injects ``setuptools`` into
``sys.modules`` before invoking ``setup.py``. The injection should be
transparent to ``distutils``-based projects, but 3rd party build tools wishing
to provide a ``setup.py`` emulating the commands pip requires may need to be
aware that it takes place.
Projects using :pep:`517` *must* explicitly use setuptools - pip does not do
the above injection process in this case.
Build System Output
~~~~~~~~~~~~~~~~~~~
Any output produced by the build system will be read by pip (for display to the
user if requested). In order to correctly read the build system output, pip
requires that the output is written in a well-defined encoding, specifically
the encoding the user has configured for text output (which can be obtained in
Python using ``locale.getpreferredencoding``). If the configured encoding is
ASCII, pip assumes UTF-8 (to account for the behaviour of some Unix systems).
Build systems should ensure that any tools they invoke (compilers, etc) produce
output in the correct encoding. In practice - and in particular on Windows,
where tools are inconsistent in their use of the "OEM" and "ANSI" codepages -
this may not always be possible. pip will therefore attempt to recover cleanly
if presented with incorrectly encoded build tool output, by translating
unexpected byte sequences to Python-style hexadecimal escape sequences
(``"\x80\xff"``, etc). However, it is still possible for output to be displayed
using an incorrect encoding (mojibake).
Under :pep:`517`, handling of build tool output is the backend's responsibility,
and pip simply displays the output produced by the backend. (Backends, however,
will likely still have to address the issues described above).
PEP 517 and 518 Support
~~~~~~~~~~~~~~~~~~~~~~~
As of version 10.0, pip supports projects declaring dependencies that are
required at install time using a ``pyproject.toml`` file, in the form described
in :pep:`518`. When building a project, pip will install the required
dependencies locally, and make them available to the build process.
Furthermore, from version 19.0 onwards, pip supports projects specifying the
build backend they use in ``pyproject.toml``, in the form described in
:pep:`517`.
When making build requirements available, pip does so in an *isolated
environment*. That is, pip does not install those requirements into the user's
``site-packages``, but rather installs them in a temporary directory which it
adds to the user's ``sys.path`` for the duration of the build. This ensures
that build requirements are handled independently of the user's runtime
environment. For example, a project that needs a recent version of setuptools
to build can still be installed, even if the user has an older version
installed (and without silently replacing that version).
In certain cases, projects (or redistributors) may have workflows that
explicitly manage the build environment. For such workflows, build isolation
can be problematic. If this is the case, pip provides a
``--no-build-isolation`` flag to disable build isolation. Users supplying this
flag are responsible for ensuring the build environment is managed
appropriately (including ensuring that all required build dependencies are
installed).
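For instance, a workflow that manages its own build environment might pre-install the build requirements and then disable isolation (``./SomeProject`` is a placeholder path):

.. code-block:: console

   $ python -m pip install "setuptools>=40.8.0" wheel
   $ python -m pip install --no-build-isolation ./SomeProject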
By default, pip will continue to use the legacy (direct ``setup.py`` execution
based) build processing for projects that do not have a ``pyproject.toml`` file.
Projects with a ``pyproject.toml`` file will use a :pep:`517` backend. Projects
with a ``pyproject.toml`` file, but which don't have a ``build-system`` section,
will be assumed to have the following backend settings::
[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta:__legacy__"
.. note::
``setuptools`` 40.8.0 is the first version of setuptools that offers a
:pep:`517` backend that closely mimics directly executing ``setup.py``.
If a project has ``[build-system]``, but no ``build-backend``, pip will also use
``setuptools.build_meta:__legacy__``, but will expect the project requirements
to include ``setuptools`` and ``wheel`` (and will report an error if the
installed version of ``setuptools`` is not recent enough).
If a user wants to explicitly request :pep:`517` handling even though a project
doesn't have a ``pyproject.toml`` file, this can be done using the
``--use-pep517`` command line option. Similarly, to request legacy processing
even though ``pyproject.toml`` is present, the ``--no-use-pep517`` option is
available (although obviously it is an error to choose ``--no-use-pep517`` if
the project has no ``setup.py``, or explicitly requests a build backend). As
with other command line flags, pip recognises the ``PIP_USE_PEP517``
environment variable and a ``use-pep517`` config file option (set to true or
false) to set this option globally. Note that overriding pip's choice of
whether to use :pep:`517` processing in this way does *not* affect whether pip
will use an isolated build environment (which is controlled via
``--no-build-isolation`` as noted above).
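As a quick sketch of the two flags (``./SomeProject`` is a placeholder path):

.. code-block:: console

   $ python -m pip install --use-pep517 ./SomeProject     # force PEP 517 processing
   $ python -m pip install --no-use-pep517 ./SomeProject  # request legacy processing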
Except in the case noted above (projects with no :pep:`518` ``[build-system]``
section in ``pyproject.toml``), pip will never implicitly install a build
system. Projects **must** ensure that the correct build system is listed in
their ``requires`` list (this applies even if pip assumes that the
``setuptools`` backend is being used, as noted above).
.. _pep-518-limitations:
**Historical Limitations**:
* ``pip<18.0``: only supports installing build requirements from wheels, and
does not support the use of environment markers and extras (only version
specifiers are respected).
* ``pip<18.1``: build dependencies using .pth files are not properly supported;
as a result namespace packages do not work under Python 3.2 and earlier.
Future Developments
~~~~~~~~~~~~~~~~~~~
:pep:`426` notes that the intention is to add hooks to project metadata in
version 2.1 of the metadata spec, to explicitly define how to build a project
from its source. Once this version of the metadata spec is final, pip will
migrate to using that interface. At that point, the ``setup.py`` interface
documented here will be retained solely for legacy purposes, until projects
have migrated.
Specifically, applications should *not* expect to rely on there being any form
of backward compatibility guarantees around the ``setup.py`` interface.
Build Options
~~~~~~~~~~~~~
The ``--global-option`` and ``--build-option`` arguments to the ``pip install``
and ``pip wheel`` inject additional arguments into the ``setup.py`` command
(``--build-option`` is only available in ``pip wheel``). These arguments are
included in the command as follows:
.. tab:: Unix/macOS
.. code-block:: console
python setup.py <global_options> BUILD COMMAND <build_options>
.. tab:: Windows
.. code-block:: shell
py setup.py <global_options> BUILD COMMAND <build_options>
The options are passed unmodified, and presently offer direct access to the
distutils command line. Use of ``--global-option`` and ``--build-option``
should be considered as build system dependent, and may not be supported in the
current form if support for alternative build systems is added to pip.
.. _`General Options`:
General Options
***************
.. pip-general-options::
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip`

View file

@ -1,27 +1,11 @@
:orphan:
.. _`pip cache`:
.. meta::
pip cache
---------
:http-equiv=refresh: 3; url=../../cli/pip_cache/
This page has moved
===================
Usage
*****
.. tab:: Unix/macOS
.. pip-command-usage:: cache "python -m pip"
.. tab:: Windows
.. pip-command-usage:: cache "py -m pip"
Description
***********
.. pip-command-description:: cache
Options
*******
.. pip-command-options:: cache
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_cache`

View file

@ -1,87 +1,11 @@
.. _`pip check`:
:orphan:
=========
pip check
=========
.. meta::
:http-equiv=refresh: 3; url=../../cli/pip_check/
Usage
=====
This page has moved
===================
.. tab:: Unix/macOS
.. pip-command-usage:: check "python -m pip"
.. tab:: Windows
.. pip-command-usage:: check "py -m pip"
Description
===========
.. pip-command-description:: check
Examples
========
#. If all dependencies are compatible:
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip check
No broken requirements found.
$ echo $?
0
.. tab:: Windows
.. code-block:: console
C:\> py -m pip check
No broken requirements found.
C:\> echo %errorlevel%
0
#. If a package is missing:
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip check
pyramid 1.5.2 requires WebOb, which is not installed.
$ echo $?
1
.. tab:: Windows
.. code-block:: console
C:\> py -m pip check
pyramid 1.5.2 requires WebOb, which is not installed.
C:\> echo %errorlevel%
1
#. If a package has the wrong version:
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip check
pyramid 1.5.2 has requirement WebOb>=1.3.1, but you have WebOb 0.8.
$ echo $?
1
.. tab:: Windows
.. code-block:: console
C:\> py -m pip check
pyramid 1.5.2 has requirement WebOb>=1.3.1, but you have WebOb 0.8.
C:\> echo %errorlevel%
1
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_check`

View file

@ -1,30 +1,11 @@
:orphan:
.. _`pip config`:
.. meta::
==========
pip config
==========
:http-equiv=refresh: 3; url=../../cli/pip_config/
This page has moved
===================
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: config "python -m pip"
.. tab:: Windows
.. pip-command-usage:: config "py -m pip"
Description
===========
.. pip-command-description:: config
Options
=======
.. pip-command-options:: config
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_config`

View file

@ -1,35 +1,11 @@
.. _`pip debug`:
:orphan:
=========
pip debug
=========
.. meta::
:http-equiv=refresh: 3; url=../../cli/pip_debug/
Usage
=====
This page has moved
===================
.. tab:: Unix/macOS
.. pip-command-usage:: debug "python -m pip"
.. tab:: Windows
.. pip-command-usage:: debug "py -m pip"
.. warning::
This command is only meant for debugging.
Its options and outputs are provisional and may change without notice.
Description
===========
.. pip-command-description:: debug
Options
=======
.. pip-command-options:: debug
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_debug`

View file

@ -1,226 +1,11 @@
:orphan:
.. _`pip download`:
.. meta::
============
pip download
============
:http-equiv=refresh: 3; url=../../cli/pip_download/
This page has moved
===================
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: download "python -m pip"
.. tab:: Windows
.. pip-command-usage:: download "py -m pip"
Description
===========
.. pip-command-description:: download
Overview
--------
``pip download`` does the same resolution and downloading as ``pip install``,
but instead of installing the dependencies, it collects the downloaded
distributions into the directory provided (defaulting to the current
directory). This directory can later be passed as the value to ``pip install
--find-links`` to facilitate offline or locked down package installation.
``pip download`` with the ``--platform``, ``--python-version``,
``--implementation``, and ``--abi`` options provides the ability to fetch
dependencies for an interpreter and system other than the ones that pip is
running on. ``--only-binary=:all:`` or ``--no-deps`` is required when using any
of these options. It is important to note that these options all default to the
current system/interpreter, and not to the most restrictive constraints (e.g.
platform any, abi none, etc). To avoid fetching dependencies that happen to
match the constraint of the current interpreter (but not your target one), it
is recommended to specify all of these options if you are specifying one of
them. Generic dependencies (e.g. universal wheels, or dependencies with no
platform, abi, or implementation constraints) will still match an over-
constrained download requirement.
Options
=======
.. pip-command-options:: download
.. pip-index-options:: download
Examples
========
#. Download a package and all of its dependencies
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip download SomePackage
python -m pip download -d . SomePackage # equivalent to above
python -m pip download --no-index --find-links=/tmp/wheelhouse -d /tmp/otherwheelhouse SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip download SomePackage
py -m pip download -d . SomePackage # equivalent to above
py -m pip download --no-index --find-links=/tmp/wheelhouse -d /tmp/otherwheelhouse SomePackage
#. Download a package and all of its dependencies with OSX specific interpreter constraints.
This forces OSX 10.10 or lower compatibility. Since OSX deps are forward compatible,
this will also match ``macosx-10_9_x86_64``, ``macosx-10_8_x86_64``, ``macosx-10_8_intel``,
etc.
It will also match deps with platform ``any``. Also force the interpreter version to ``27``
(or more generic, i.e. ``2``) and implementation to ``cp`` (or more generic, i.e. ``py``).
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip download \
--only-binary=:all: \
--platform macosx-10_10_x86_64 \
--python-version 27 \
--implementation cp \
SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip download ^
--only-binary=:all: ^
--platform macosx-10_10_x86_64 ^
--python-version 27 ^
--implementation cp ^
SomePackage
#. Download a package and its dependencies with linux specific constraints.
Force the interpreter to be any minor version of py3k, and only accept
``cp34m`` or ``none`` as the abi.
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip download \
--only-binary=:all: \
--platform linux_x86_64 \
--python-version 3 \
--implementation cp \
--abi cp34m \
SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip download ^
--only-binary=:all: ^
--platform linux_x86_64 ^
--python-version 3 ^
--implementation cp ^
--abi cp34m ^
SomePackage
#. Force platform, implementation, and abi agnostic deps.
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip download \
--only-binary=:all: \
--platform any \
--python-version 3 \
--implementation py \
--abi none \
SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip download ^
--only-binary=:all: ^
--platform any ^
--python-version 3 ^
--implementation py ^
--abi none ^
SomePackage
#. Even when overconstrained, this will still correctly fetch the pip universal wheel.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip download \
--only-binary=:all: \
--platform linux_x86_64 \
--python-version 33 \
--implementation cp \
--abi cp34m \
pip>=8
.. code-block:: console
$ ls pip-8.1.1-py2.py3-none-any.whl
pip-8.1.1-py2.py3-none-any.whl
.. tab:: Windows
.. code-block:: console
C:\> py -m pip download ^
--only-binary=:all: ^
--platform linux_x86_64 ^
--python-version 33 ^
--implementation cp ^
--abi cp34m ^
pip>=8
.. code-block:: console
C:\> dir pip-8.1.1-py2.py3-none-any.whl
pip-8.1.1-py2.py3-none-any.whl
#. Download a package supporting one of several ABIs and platforms.
This is useful when fetching wheels for a well-defined interpreter, whose
supported ABIs and platforms are known and fixed, different than the one pip is
running under.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip download \
--only-binary=:all: \
--platform manylinux1_x86_64 --platform linux_x86_64 --platform any \
--python-version 36 \
--implementation cp \
--abi cp36m --abi cp36 --abi abi3 --abi none \
SomePackage
.. tab:: Windows
.. code-block:: console
C:> py -m pip download ^
--only-binary=:all: ^
--platform manylinux1_x86_64 --platform linux_x86_64 --platform any ^
--python-version 36 ^
--implementation cp ^
--abi cp36m --abi cp36 --abi abi3 --abi none ^
SomePackage
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_download`

View file

@ -1,74 +1,11 @@
:orphan:
.. _`pip freeze`:
.. meta::
==========
pip freeze
==========
:http-equiv=refresh: 3; url=../../cli/pip_freeze/
This page has moved
===================
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: freeze "python -m pip"
.. tab:: Windows
.. pip-command-usage:: freeze "py -m pip"
Description
===========
.. pip-command-description:: freeze
Options
=======
.. pip-command-options:: freeze
Examples
========
#. Generate output suitable for a requirements file.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip freeze
docutils==0.11
Jinja2==2.7.2
MarkupSafe==0.19
Pygments==1.6
Sphinx==1.2.2
.. tab:: Windows
.. code-block:: console
C:\> py -m pip freeze
docutils==0.11
Jinja2==2.7.2
MarkupSafe==0.19
Pygments==1.6
Sphinx==1.2.2
#. Generate a requirements file and then install from it in another environment.
.. tab:: Unix/macOS
.. code-block:: shell
env1/bin/python -m pip freeze > requirements.txt
env2/bin/python -m pip install -r requirements.txt
.. tab:: Windows
.. code-block:: shell
env1\bin\python -m pip freeze > requirements.txt
env2\bin\python -m pip install -r requirements.txt
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_freeze`

View file

@ -1,72 +1,11 @@
.. _`pip hash`:
:orphan:
========
pip hash
========
.. meta::
:http-equiv=refresh: 3; url=../../cli/pip_hash/
Usage
=====
This page has moved
===================
.. tab:: Unix/macOS
.. pip-command-usage:: hash "python -m pip"
.. tab:: Windows
.. pip-command-usage:: hash "py -m pip"
Description
===========
.. pip-command-description:: hash
Overview
--------
``pip hash`` is a convenient way to get a hash digest for use with
:ref:`hash-checking mode`, especially for packages with multiple archives. The
error message from ``pip install --require-hashes ...`` will give you one
hash, but, if there are multiple archives (like source and binary ones), you
will need to manually download and compute a hash for the others. Otherwise, a
spurious hash mismatch could occur when :ref:`pip install` is passed a
different set of options, like :ref:`--no-binary <install_--no-binary>`.
Options
=======
.. pip-command-options:: hash
Example
=======
Compute the hash of a downloaded archive:
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip download SomePackage
Collecting SomePackage
Downloading SomePackage-2.2.tar.gz
Saved ./pip_downloads/SomePackage-2.2.tar.gz
Successfully downloaded SomePackage
$ python -m pip hash ./pip_downloads/SomePackage-2.2.tar.gz
./pip_downloads/SomePackage-2.2.tar.gz:
--hash=sha256:93e62e05c7ad3da1a233def6731e8285156701e3419a5fe279017c429ec67ce0
.. tab:: Windows
.. code-block:: console
C:\> py -m pip download SomePackage
Collecting SomePackage
Downloading SomePackage-2.2.tar.gz
Saved ./pip_downloads/SomePackage-2.2.tar.gz
Successfully downloaded SomePackage
C:\> py -m pip hash ./pip_downloads/SomePackage-2.2.tar.gz
./pip_downloads/SomePackage-2.2.tar.gz:
--hash=sha256:93e62e05c7ad3da1a233def6731e8285156701e3419a5fe279017c429ec67ce0
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_hash`

File diff suppressed because it is too large

View file

@ -1,201 +1,11 @@
.. _`pip list`:
:orphan:
========
pip list
========
.. meta::
:http-equiv=refresh: 3; url=../../cli/pip_list/
This page has moved
===================
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: list "python -m pip"
.. tab:: Windows
.. pip-command-usage:: list "py -m pip"
Description
===========
.. pip-command-description:: list
Options
=======
.. pip-command-options:: list
.. pip-index-options:: list
Examples
========
#. List installed packages.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list
docutils (0.10)
Jinja2 (2.7.2)
MarkupSafe (0.18)
Pygments (1.6)
Sphinx (1.2.1)
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list
docutils (0.10)
Jinja2 (2.7.2)
MarkupSafe (0.18)
Pygments (1.6)
Sphinx (1.2.1)
#. List outdated packages (excluding editables), and the latest version available.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --outdated
docutils (Current: 0.10 Latest: 0.11)
Sphinx (Current: 1.2.1 Latest: 1.2.2)
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --outdated
docutils (Current: 0.10 Latest: 0.11)
Sphinx (Current: 1.2.1 Latest: 1.2.2)
#. List installed packages with column formatting.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --format columns
Package Version
------- -------
docopt 0.6.2
idlex 1.13
jedi 0.9.0
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --format columns
Package Version
------- -------
docopt 0.6.2
idlex 1.13
jedi 0.9.0
#. List outdated packages with column formatting.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list -o --format columns
Package Version Latest Type
---------- ------- ------ -----
retry 0.8.1 0.9.1 wheel
setuptools 20.6.7 21.0.0 wheel
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list -o --format columns
Package Version Latest Type
---------- ------- ------ -----
retry 0.8.1 0.9.1 wheel
setuptools 20.6.7 21.0.0 wheel
#. List packages that are not dependencies of other packages. Can be combined with
other options.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --outdated --not-required
docutils (Current: 0.10 Latest: 0.11)
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --outdated --not-required
docutils (Current: 0.10 Latest: 0.11)
#. Use legacy formatting
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --format=legacy
colorama (0.3.7)
docopt (0.6.2)
idlex (1.13)
jedi (0.9.0)
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --format=legacy
colorama (0.3.7)
docopt (0.6.2)
idlex (1.13)
jedi (0.9.0)
#. Use json formatting
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --format=json
[{'name': 'colorama', 'version': '0.3.7'}, {'name': 'docopt', 'version': '0.6.2'}, ...
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --format=json
[{'name': 'colorama', 'version': '0.3.7'}, {'name': 'docopt', 'version': '0.6.2'}, ...
#. Use freeze formatting
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip list --format=freeze
colorama==0.3.7
docopt==0.6.2
idlex==1.13
jedi==0.9.0
.. tab:: Windows
.. code-block:: console
C:\> py -m pip list --format=freeze
colorama==0.3.7
docopt==0.6.2
idlex==1.13
jedi==0.9.0
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_list`

View file

@ -1,52 +1,11 @@
.. _`pip search`:
:orphan:
==========
pip search
==========
.. meta::
:http-equiv=refresh: 3; url=../../cli/pip_search/
This page has moved
===================
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: search "python -m pip"
.. tab:: Windows
.. pip-command-usage:: search "py -m pip"
Description
===========
.. pip-command-description:: search
Options
=======
.. pip-command-options:: search
Examples
========
#. Search for "peppercorn"
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip search peppercorn
pepperedform - Helpers for using peppercorn with formprocess.
peppercorn - A library for converting a token stream into [...]
.. tab:: Windows
.. code-block:: console
C:\> py -m pip search peppercorn
pepperedform - Helpers for using peppercorn with formprocess.
peppercorn - A library for converting a token stream into [...]
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_search`

View file

@ -1,154 +1,11 @@
.. _`pip show`:
:orphan:
========
pip show
========
.. meta::
:http-equiv=refresh: 3; url=../../cli/pip_show/
This page has moved
===================
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: show "python -m pip"
.. tab:: Windows
.. pip-command-usage:: show "py -m pip"
Description
===========
.. pip-command-description:: show
Options
=======
.. pip-command-options:: show
Examples
========
#. Show information about a package:
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip show sphinx
Name: Sphinx
Version: 1.4.5
Summary: Python documentation generator
Home-page: http://sphinx-doc.org/
Author: Georg Brandl
Author-email: georg@python.org
License: BSD
Location: /my/env/lib/python2.7/site-packages
Requires: docutils, snowballstemmer, alabaster, Pygments, imagesize, Jinja2, babel, six
.. tab:: Windows
.. code-block:: console
C:\> py -m pip show sphinx
Name: Sphinx
Version: 1.4.5
Summary: Python documentation generator
Home-page: http://sphinx-doc.org/
Author: Georg Brandl
Author-email: georg@python.org
License: BSD
Location: /my/env/lib/python2.7/site-packages
Requires: docutils, snowballstemmer, alabaster, Pygments, imagesize, Jinja2, babel, six
#. Show all information about a package
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip show --verbose sphinx
Name: Sphinx
Version: 1.4.5
Summary: Python documentation generator
Home-page: http://sphinx-doc.org/
Author: Georg Brandl
Author-email: georg@python.org
License: BSD
Location: /my/env/lib/python2.7/site-packages
Requires: docutils, snowballstemmer, alabaster, Pygments, imagesize, Jinja2, babel, six
Metadata-Version: 2.0
Installer:
Classifiers:
Development Status :: 5 - Production/Stable
Environment :: Console
Environment :: Web Environment
Intended Audience :: Developers
Intended Audience :: Education
License :: OSI Approved :: BSD License
Operating System :: OS Independent
Programming Language :: Python
Programming Language :: Python :: 2
Programming Language :: Python :: 3
Framework :: Sphinx
Framework :: Sphinx :: Extension
Framework :: Sphinx :: Theme
Topic :: Documentation
Topic :: Documentation :: Sphinx
Topic :: Text Processing
Topic :: Utilities
Entry-points:
[console_scripts]
sphinx-apidoc = sphinx.apidoc:main
sphinx-autogen = sphinx.ext.autosummary.generate:main
sphinx-build = sphinx:main
sphinx-quickstart = sphinx.quickstart:main
[distutils.commands]
build_sphinx = sphinx.setup_command:BuildDoc
.. tab:: Windows
.. code-block:: console
C:\> py -m pip show --verbose sphinx
Name: Sphinx
Version: 1.4.5
Summary: Python documentation generator
Home-page: http://sphinx-doc.org/
Author: Georg Brandl
Author-email: georg@python.org
License: BSD
Location: /my/env/lib/python2.7/site-packages
Requires: docutils, snowballstemmer, alabaster, Pygments, imagesize, Jinja2, babel, six
Metadata-Version: 2.0
Installer:
Classifiers:
Development Status :: 5 - Production/Stable
Environment :: Console
Environment :: Web Environment
Intended Audience :: Developers
Intended Audience :: Education
License :: OSI Approved :: BSD License
Operating System :: OS Independent
Programming Language :: Python
Programming Language :: Python :: 2
Programming Language :: Python :: 3
Framework :: Sphinx
Framework :: Sphinx :: Extension
Framework :: Sphinx :: Theme
Topic :: Documentation
Topic :: Documentation :: Sphinx
Topic :: Text Processing
Topic :: Utilities
Entry-points:
[console_scripts]
sphinx-apidoc = sphinx.apidoc:main
sphinx-autogen = sphinx.ext.autosummary.generate:main
sphinx-build = sphinx:main
sphinx-quickstart = sphinx.quickstart:main
[distutils.commands]
build_sphinx = sphinx.setup_command:BuildDoc
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_show`

View file

@ -1,58 +1,11 @@
.. _`pip uninstall`:
:orphan:
=============
pip uninstall
=============
.. meta::
:http-equiv=refresh: 3; url=../../cli/pip_uninstall/
This page has moved
===================
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: uninstall "python -m pip"
.. tab:: Windows
.. pip-command-usage:: uninstall "py -m pip"
Description
===========
.. pip-command-description:: uninstall
Options
=======
.. pip-command-options:: uninstall
Examples
========
#. Uninstall a package.
.. tab:: Unix/macOS
.. code-block:: console
$ python -m pip uninstall simplejson
Uninstalling simplejson:
/home/me/env/lib/python3.9/site-packages/simplejson
/home/me/env/lib/python3.9/site-packages/simplejson-2.2.1-py3.9.egg-info
Proceed (y/n)? y
Successfully uninstalled simplejson
.. tab:: Windows
.. code-block:: console
C:\> py -m pip uninstall simplejson
Uninstalling simplejson:
/home/me/env/lib/python3.9/site-packages/simplejson
/home/me/env/lib/python3.9/site-packages/simplejson-2.2.1-py3.9.egg-info
Proceed (y/n)? y
Successfully uninstalled simplejson
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_uninstall`

View file

@ -1,125 +1,11 @@
:orphan:
.. _`pip wheel`:
.. meta::
=========
pip wheel
=========
:http-equiv=refresh: 3; url=../../cli/pip_wheel/
This page has moved
===================
Usage
=====
.. tab:: Unix/macOS
.. pip-command-usage:: wheel "python -m pip"
.. tab:: Windows
.. pip-command-usage:: wheel "py -m pip"
Description
===========
.. pip-command-description:: wheel
Build System Interface
----------------------
In order for pip to build a wheel, ``setup.py`` must implement the
``bdist_wheel`` command with the following syntax:
.. tab:: Unix/macOS
.. code-block:: shell
python setup.py bdist_wheel -d TARGET
.. tab:: Windows
.. code-block:: shell
py setup.py bdist_wheel -d TARGET
This command must create a wheel compatible with the invoking Python
interpreter, and save that wheel in the directory TARGET.
No other build system commands are invoked by the ``pip wheel`` command.
Customising the build
^^^^^^^^^^^^^^^^^^^^^
It is possible using ``--global-option`` to include additional build commands
with their arguments in the ``setup.py`` command. This is currently the only
way to influence the building of C extensions from the command line. For
example:
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip wheel --global-option bdist_ext --global-option -DFOO wheel
.. tab:: Windows
.. code-block:: shell
py -m pip wheel --global-option bdist_ext --global-option -DFOO wheel
will result in a build command of
::
setup.py bdist_ext -DFOO bdist_wheel -d TARGET
which passes a preprocessor symbol to the extension build.
Such usage is considered highly build-system specific and more an accident of
the current implementation than a supported interface.
Options
=======
.. pip-command-options:: wheel
.. pip-index-options:: wheel
Examples
========
#. Build wheels for a requirement (and all its dependencies), and then install
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip wheel --wheel-dir=/tmp/wheelhouse SomePackage
python -m pip install --no-index --find-links=/tmp/wheelhouse SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip wheel --wheel-dir=/tmp/wheelhouse SomePackage
py -m pip install --no-index --find-links=/tmp/wheelhouse SomePackage
#. Build a wheel for a package from source
.. tab:: Unix/macOS
.. code-block:: shell
python -m pip wheel --no-binary SomePackage SomePackage
.. tab:: Windows
.. code-block:: shell
py -m pip wheel --no-binary SomePackage SomePackage
You should be redirected automatically in 3 seconds. If that didn't
work, here's a link: :doc:`../cli/pip_wheel`

View file

@ -1887,6 +1887,6 @@ announcements on the `low-traffic packaging announcements list`_ and
.. _low-traffic packaging announcements list: https://mail.python.org/mailman3/lists/pypi-announce.python.org/
.. _our survey on upgrades that create conflicts: https://docs.google.com/forms/d/e/1FAIpQLSeBkbhuIlSofXqCyhi3kGkLmtrpPOEBwr6iJA6SzHdxWKfqdA/viewform
.. _the official Python blog: https://blog.python.org/
.. _requests: https://requests.readthedocs.io/en/master/user/authentication/#netrc-authentication
.. _requests: https://requests.readthedocs.io/en/latest/user/authentication/#netrc-authentication
.. _Python standard library: https://docs.python.org/3/library/netrc.html
.. _Python Windows launcher: https://docs.python.org/3/using/windows.html#launcher

View file

@ -1,24 +1,84 @@
"""pip sphinx extensions"""
import optparse
import pathlib
import re
import sys
from textwrap import dedent
from typing import Iterable, List, Optional
from docutils import nodes
from docutils import nodes, statemachine
from docutils.parsers import rst
from docutils.statemachine import StringList, ViewList
from sphinx.application import Sphinx
from pip._internal.cli import cmdoptions
from pip._internal.commands import commands_dict, create_command
from pip._internal.req.req_file import SUPPORTED_OPTIONS
class PipNewsInclude(rst.Directive):
required_arguments = 1
def _is_version_section_title_underline(self, prev, curr):
"""Find a ==== line that marks the version section title."""
if prev is None:
return False
if re.match(r"^=+$", curr) is None:
return False
if len(curr) < len(prev):
return False
return True
def _iter_lines_with_refs(self, lines):
"""Transform the input lines to add a ref before each section title.
This is done by looking one line ahead and locate a title's underline,
and add a ref before the title text.
Dots in the version are converted into dashes, and a ``v`` is prefixed.
This makes Sphinx use them as HTML ``id`` verbatim without generating
auto numbering (which would make the anchors unstable).
"""
prev = None
for line in lines:
# Transform the previous line to include an explicit ref.
if self._is_version_section_title_underline(prev, line):
vref = prev.split(None, 1)[0].replace(".", "-")
yield f".. _`v{vref}`:"
yield "" # Empty line between ref and the title.
if prev is not None:
yield prev
prev = line
if prev is not None:
yield prev
def run(self):
source = self.state_machine.input_lines.source(
self.lineno - self.state_machine.input_offset - 1,
)
path = (
pathlib.Path(source)
.resolve()
.parent
.joinpath(self.arguments[0])
.resolve()
)
include_lines = statemachine.string2lines(
path.read_text(encoding="utf-8"),
self.state.document.settings.tab_width,
convert_whitespace=True,
)
include_lines = list(self._iter_lines_with_refs(include_lines))
self.state_machine.insert_input(include_lines, str(path))
return []
class PipCommandUsage(rst.Directive):
required_arguments = 1
optional_arguments = 3
def run(self):
def run(self) -> List[nodes.Node]:
cmd = create_command(self.arguments[0])
cmd_prefix = "python -m pip"
if len(self.arguments) > 1:
@ -33,11 +93,12 @@ class PipCommandUsage(rst.Directive):
class PipCommandDescription(rst.Directive):
required_arguments = 1
def run(self):
def run(self) -> List[nodes.Node]:
node = nodes.paragraph()
node.document = self.state.document
desc = ViewList()
cmd = create_command(self.arguments[0])
assert cmd.__doc__ is not None
description = dedent(cmd.__doc__)
for line in description.split("\n"):
desc.append(line, "")
@ -46,7 +107,9 @@ class PipCommandDescription(rst.Directive):
class PipOptions(rst.Directive):
def _format_option(self, option, cmd_name=None):
def _format_option(
self, option: optparse.Option, cmd_name: Optional[str] = None
) -> List[str]:
bookmark_line = (
f".. _`{cmd_name}_{option._long_opts[0]}`:"
if cmd_name
@ -60,22 +123,27 @@ class PipOptions(rst.Directive):
elif option._long_opts:
line += option._long_opts[0]
if option.takes_value():
metavar = option.metavar or option.dest.lower()
metavar = option.metavar or option.dest
assert metavar is not None
line += f" <{metavar.lower()}>"
# fix defaults
opt_help = option.help.replace("%default", str(option.default))
assert option.help is not None
# https://github.com/python/typeshed/pull/5080
opt_help = option.help.replace("%default", str(option.default)) # type: ignore
# fix paths with sys.prefix
opt_help = opt_help.replace(sys.prefix, "<sys.prefix>")
return [bookmark_line, "", line, "", " " + opt_help, ""]
def _format_options(self, options, cmd_name=None):
def _format_options(
self, options: Iterable[optparse.Option], cmd_name: Optional[str] = None
) -> None:
for option in options:
if option.help == optparse.SUPPRESS_HELP:
continue
for line in self._format_option(option, cmd_name):
self.view_list.append(line, "")
def run(self):
def run(self) -> List[nodes.Node]:
node = nodes.paragraph()
node.document = self.state.document
self.view_list = ViewList()
@ -85,14 +153,14 @@ class PipOptions(rst.Directive):
class PipGeneralOptions(PipOptions):
def process_options(self):
def process_options(self) -> None:
self._format_options([o() for o in cmdoptions.general_group["options"]])
class PipIndexOptions(PipOptions):
required_arguments = 1
def process_options(self):
def process_options(self) -> None:
cmd_name = self.arguments[0]
self._format_options(
[o() for o in cmdoptions.index_group["options"]],
@ -103,7 +171,7 @@ class PipIndexOptions(PipOptions):
class PipCommandOptions(PipOptions):
required_arguments = 1
def process_options(self):
def process_options(self) -> None:
cmd = create_command(self.arguments[0])
self._format_options(
cmd.parser.option_groups[0].option_list,
@ -112,7 +180,7 @@ class PipCommandOptions(PipOptions):
class PipReqFileOptionsReference(PipOptions):
def determine_opt_prefix(self, opt_name):
def determine_opt_prefix(self, opt_name: str) -> str:
for command in commands_dict:
cmd = create_command(command)
if cmd.cmd_opts.has_option(opt_name):
@ -120,7 +188,7 @@ class PipReqFileOptionsReference(PipOptions):
raise KeyError(f"Could not identify prefix of opt {opt_name}")
def process_options(self):
def process_options(self) -> None:
for option in SUPPORTED_OPTIONS:
if getattr(option, "deprecated", False):
continue
@ -157,7 +225,7 @@ class PipCLIDirective(rst.Directive):
has_content = True
optional_arguments = 1
def run(self):
def run(self) -> List[nodes.Node]:
node = nodes.paragraph()
node.document = self.state.document
@ -226,7 +294,7 @@ class PipCLIDirective(rst.Directive):
return [node]
def setup(app):
def setup(app: Sphinx) -> None:
app.add_directive("pip-command-usage", PipCommandUsage)
app.add_directive("pip-command-description", PipCommandDescription)
app.add_directive("pip-command-options", PipCommandOptions)
@ -235,4 +303,5 @@ def setup(app):
app.add_directive(
"pip-requirements-file-options-ref-list", PipReqFileOptionsReference
)
app.add_directive('pip-news-include', PipNewsInclude)
app.add_directive("pip-cli", PipCLIDirective)

View file

@ -0,0 +1,2 @@
Update "setuptools extras" ink to match upstream.

news/9565.bugfix.rst Normal file
View file

@ -0,0 +1 @@
Make wheel compatibility tag preferences more important than the build tag

View file

@ -0,0 +1 @@
Update urllib3 to 1.26.4 to fix CVE-2021-28363

View file

@ -12,7 +12,7 @@ import nox
# fmt: off
sys.path.append(".")
from tools.automation import release # isort:skip # noqa
from tools import release # isort:skip # noqa
sys.path.pop()
# fmt: on
@ -174,7 +174,6 @@ def lint(session):
args = session.posargs + ["--all-files"]
else:
args = ["--all-files", "--show-diff-on-failure"]
args.append("--hook-stage=manual")
session.run("pre-commit", "run", *args)

View file

@ -9,7 +9,7 @@ filename = "NEWS.rst"
directory = "news/"
title_format = "{version} ({project_date})"
issue_format = "`#{issue} <https://github.com/pypa/pip/issues/{issue}>`_"
template = "tools/automation/news/template.rst"
template = "tools/news/template.rst"
type = [
{ name = "Process", directory = "process", showcontent = true },
{ name = "Deprecations and Removals", directory = "removal", showcontent = true },
@ -26,7 +26,7 @@ requirements = "src/pip/_vendor/vendor.txt"
namespace = "pip._vendor"
protected-files = ["__init__.py", "README.rst", "vendor.txt"]
patches-dir = "tools/automation/vendoring/patches"
patches-dir = "tools/vendoring/patches"
[tool.vendoring.transformations]
substitute = [

View file

@ -66,7 +66,6 @@ markers =
svn: VCS: Subversion
mercurial: VCS: Mercurial
git: VCS: git
yaml: yaml based tests
search: tests for 'pip search'
[coverage:run]

View file

@ -16,22 +16,21 @@ def read(rel_path):
def get_version(rel_path):
# type: (str) -> str
for line in read(rel_path).splitlines():
if line.startswith('__version__'):
if line.startswith("__version__"):
# __version__ = "0.9"
delim = '"' if '"' in line else "'"
return line.split(delim)[1]
raise RuntimeError("Unable to find version string.")
long_description = read('README.rst')
long_description = read("README.rst")
setup(
name="pip",
version=get_version("src/pip/__init__.py"),
description="The PyPA recommended tool for installing Python packages.",
long_description=long_description,
license='MIT',
license="MIT",
classifiers=[
"Development Status :: 5 - Production/Stable",
"Intended Audience :: Developers",
@ -47,17 +46,14 @@ setup(
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",
],
url='https://pip.pypa.io/',
keywords='distutils easy_install egg setuptools wheel virtualenv',
url="https://pip.pypa.io/",
project_urls={
"Documentation": "https://pip.pypa.io",
"Source": "https://github.com/pypa/pip",
"Changelog": "https://pip.pypa.io/en/stable/news/",
},
author='The pip developers',
author_email='distutils-sig@python.org',
author="The pip developers",
author_email="distutils-sig@python.org",
package_dir={"": "src"},
packages=find_packages(
where="src",
@ -75,12 +71,9 @@ setup(
"console_scripts": [
"pip=pip._internal.cli.main:main",
"pip{}=pip._internal.cli.main:main".format(sys.version_info[0]),
"pip{}.{}=pip._internal.cli.main:main".format(
*sys.version_info[:2]
),
"pip{}.{}=pip._internal.cli.main:main".format(*sys.version_info[:2]),
],
},
zip_safe=False,
python_requires='>=3.6',
python_requires=">=3.6",
)

View file

@ -5,12 +5,12 @@ import sys
# of sys.path, if present to avoid using current directory
# in pip commands check, freeze, install, list and show,
# when invoked as python -m pip <command>
if sys.path[0] in ('', os.getcwd()):
if sys.path[0] in ("", os.getcwd()):
sys.path.pop(0)
# If we are running from a wheel, add the wheel to sys.path
# This allows the usage python pip-*.whl/pip install pip-*.whl
if __package__ == '':
if __package__ == "":
# __file__ is pip-*.whl/pip/__main__.py
# first dirname call strips of '/__main__.py', second strips off '/pip'
# Resulting path is the name of the wheel itself
@ -20,5 +20,5 @@ if __package__ == '':
from pip._internal.cli.main import main as _main
if __name__ == '__main__':
if __name__ == "__main__":
sys.exit(_main())

View file

@ -44,7 +44,7 @@ logger = logging.getLogger(__name__)
BuildTag = Union[Tuple[()], Tuple[int, str]]
CandidateSortingKey = (
Tuple[int, int, int, _BaseVersion, BuildTag, Optional[int]]
Tuple[int, int, int, _BaseVersion, Optional[int], BuildTag]
)
@ -530,7 +530,7 @@ class CandidateEvaluator:
yank_value = -1 * int(link.is_yanked) # -1 for yanked.
return (
has_allowed_hash, yank_value, binary_preference, candidate.version,
build_tag, pri,
pri, build_tag,
)
def sort_best_candidate(

View file

@ -42,12 +42,12 @@ def raise_for_status(resp):
reason = resp.reason
if 400 <= resp.status_code < 500:
http_error_msg = '%s Client Error: %s for url: %s' % (
resp.status_code, reason, resp.url)
http_error_msg = (
f'{resp.status_code} Client Error: {reason} for url: {resp.url}')
elif 500 <= resp.status_code < 600:
http_error_msg = '%s Server Error: %s for url: %s' % (
resp.status_code, reason, resp.url)
http_error_msg = (
f'{resp.status_code} Server Error: {reason} for url: {resp.url}')
if http_error_msg:
raise NetworkConnectionError(http_error_msg, response=resp)

View file

@ -514,7 +514,7 @@ class Factory:
relevant_constraints.add(req.name)
msg = msg + "\n "
if parent:
msg = msg + "{} {} depends on ".format(parent.name, parent.version)
msg = msg + f"{parent.name} {parent.version} depends on "
else:
msg = msg + "The user requested "
msg = msg + req.format_for_error()

View file

@ -166,7 +166,7 @@ class UnsatisfiableRequirement(Requirement):
def __str__(self):
# type: () -> str
return "{} (unavailable)".format(self._name)
return f"{self._name} (unavailable)"
def __repr__(self):
# type: () -> str

View file

@ -49,9 +49,7 @@ def get_path_uid(path):
file_uid = os.stat(path).st_uid
else:
# raise OSError for parity with os.O_NOFOLLOW above
raise OSError(
"{} is a symlink; Will not return uid for symlinks".format(path)
)
raise OSError(f"{path} is a symlink; Will not return uid for symlinks")
return file_uid

View file

@ -193,7 +193,7 @@ def _check_no_input(message):
"""Raise an error if no input is allowed."""
if os.environ.get("PIP_NO_INPUT"):
raise Exception(
"No input was expected ($PIP_NO_INPUT set); question: {}".format(message)
f"No input was expected ($PIP_NO_INPUT set); question: {message}"
)
@ -241,7 +241,7 @@ def strtobool(val):
elif val in ("n", "no", "f", "false", "off", "0"):
return 0
else:
raise ValueError("invalid truth value %r" % (val,))
raise ValueError(f"invalid truth value {val!r}")
def format_size(bytes):

View file

@ -252,7 +252,7 @@ def call_subprocess(
elif on_returncode == "ignore":
pass
else:
raise ValueError("Invalid value: on_returncode={!r}".format(on_returncode))
raise ValueError(f"Invalid value: on_returncode={on_returncode!r}")
return output

View file

@ -36,7 +36,7 @@ class WheelMetadata(DictMetadata):
except UnicodeDecodeError as e:
# Augment the default error with the origin of the file.
raise UnsupportedWheel(
"Error decoding metadata for {}: {}".format(self._wheel_name, e)
f"Error decoding metadata for {self._wheel_name}: {e}"
)

View file

@ -187,7 +187,7 @@ def _verify_one(req, wheel_path):
try:
metadata_version = Version(metadata_version_value)
except InvalidVersion:
msg = "Invalid Metadata-Version: {}".format(metadata_version_value)
msg = f"Invalid Metadata-Version: {metadata_version_value}"
raise UnsupportedWheel(msg)
if (metadata_version >= Version("1.2")
and not isinstance(dist.version, Version)):

View file

@ -1,2 +1,2 @@
# This file is protected via CODEOWNERS
__version__ = "1.26.2"
__version__ = "1.26.4"

View file

@ -67,7 +67,7 @@ port_by_scheme = {"http": 80, "https": 443}
# When it comes time to update this value as a part of regular maintenance
# (ie test_recent_date is failing) update it to ~6 months before the current date.
RECENT_DATE = datetime.date(2019, 1, 1)
RECENT_DATE = datetime.date(2020, 7, 1)
_CONTAINS_CONTROL_CHAR_RE = re.compile(r"[^-!#$%&'*+.^_`|~0-9a-zA-Z]")
@ -215,7 +215,7 @@ class HTTPConnection(_HTTPConnection, object):
def putheader(self, header, *values):
""""""
if SKIP_HEADER not in values:
if not any(isinstance(v, str) and v == SKIP_HEADER for v in values):
_HTTPConnection.putheader(self, header, *values)
elif six.ensure_str(header.lower()) not in SKIPPABLE_HEADERS:
raise ValueError(
@ -490,6 +490,10 @@ class HTTPSConnection(HTTPConnection):
self.ca_cert_dir,
self.ca_cert_data,
)
# By default urllib3's SSLContext disables `check_hostname` and uses
# a custom check. For proxies we're good with relying on the default
# verification.
ssl_context.check_hostname = True
# If no cert was provided, use only the default options for server
# certificate validation

View file

@ -289,7 +289,17 @@ class ProxySchemeUnknown(AssertionError, URLSchemeUnknown):
# TODO(t-8ch): Stop inheriting from AssertionError in v2.0.
def __init__(self, scheme):
message = "Not supported proxy scheme %s" % scheme
# 'localhost' is here because our URL parser parses
# localhost:8080 -> scheme=localhost, remove if we fix this.
if scheme == "localhost":
scheme = None
if scheme is None:
message = "Proxy URL had no scheme, should start with http:// or https://"
else:
message = (
"Proxy URL had unsupported scheme %s, should use http:// or https://"
% scheme
)
super(ProxySchemeUnknown, self).__init__(message)

View file

@ -253,6 +253,7 @@ class Retry(object):
"Using 'method_whitelist' with Retry is deprecated and "
"will be removed in v2.0. Use 'allowed_methods' instead",
DeprecationWarning,
stacklevel=2,
)
allowed_methods = method_whitelist
if allowed_methods is _Default:

View file

@ -13,7 +13,7 @@ requests==2.25.1
certifi==2020.12.05
chardet==4.0.0
idna==2.10
urllib3==1.26.2
urllib3==1.26.4
resolvelib==0.5.4
retrying==1.3.3
setuptools==44.0.0

View file

@ -56,26 +56,26 @@ def pytest_addoption(parser):
def pytest_collection_modifyitems(config, items):
for item in items:
if not hasattr(item, 'module'): # e.g.: DoctestTextfile
if not hasattr(item, "module"): # e.g.: DoctestTextfile
continue
if (item.get_closest_marker('search') and
not config.getoption('--run-search')):
item.add_marker(pytest.mark.skip('pip search test skipped'))
if item.get_closest_marker("search") and not config.getoption("--run-search"):
item.add_marker(pytest.mark.skip("pip search test skipped"))
if "CI" in os.environ:
# Mark network tests as flaky
if item.get_closest_marker('network') is not None:
if item.get_closest_marker("network") is not None:
item.add_marker(pytest.mark.flaky(reruns=3, reruns_delay=2))
if (item.get_closest_marker('incompatible_with_test_venv') and
config.getoption("--use-venv")):
item.add_marker(pytest.mark.skip(
'Incompatible with test venv'))
if (item.get_closest_marker('incompatible_with_venv') and
sys.prefix != sys.base_prefix):
item.add_marker(pytest.mark.skip(
'Incompatible with venv'))
if item.get_closest_marker("incompatible_with_test_venv") and config.getoption(
"--use-venv"
):
item.add_marker(pytest.mark.skip("Incompatible with test venv"))
if (
item.get_closest_marker("incompatible_with_venv")
and sys.prefix != sys.base_prefix
):
item.add_marker(pytest.mark.skip("Incompatible with venv"))
module_path = os.path.relpath(
item.module.__file__,
@ -83,22 +83,21 @@ def pytest_collection_modifyitems(config, items):
)
module_root_dir = module_path.split(os.pathsep)[0]
if (module_root_dir.startswith("functional") or
module_root_dir.startswith("integration") or
module_root_dir.startswith("lib")):
if (
module_root_dir.startswith("functional")
or module_root_dir.startswith("integration")
or module_root_dir.startswith("lib")
):
item.add_marker(pytest.mark.integration)
elif module_root_dir.startswith("unit"):
item.add_marker(pytest.mark.unit)
else:
raise RuntimeError(
f"Unknown test type (filename = {module_path})"
)
raise RuntimeError(f"Unknown test type (filename = {module_path})")
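The collection hook above is a standard conftest.py pattern; a self-contained sketch under assumed option and marker names (it mirrors, rather than reproduces, pip's conftest, and the flaky marker assumes the pytest-rerunfailures plugin):

import pytest

def pytest_addoption(parser):
    parser.addoption("--run-search", action="store_true", default=False)

def pytest_collection_modifyitems(config, items):
    for item in items:
        # Skip tests carrying the "search" marker unless explicitly requested.
        if item.get_closest_marker("search") and not config.getoption("--run-search"):
            item.add_marker(pytest.mark.skip("pip search test skipped"))
        # Rerun network-marked tests a few times to smooth over CI flakiness.
        if item.get_closest_marker("network") is not None:
            item.add_marker(pytest.mark.flaky(reruns=3, reruns_delay=2))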
@pytest.fixture(scope="session", autouse=True)
def resolver_variant(request):
"""Set environment variable to make pip default to the correct resolver.
"""
"""Set environment variable to make pip default to the correct resolver."""
resolver = request.config.getoption("--resolver")
# Handle the environment variables for this test.
@ -118,9 +117,9 @@ def resolver_variant(request):
yield resolver
@pytest.fixture(scope='session')
@pytest.fixture(scope="session")
def tmpdir_factory(request, tmpdir_factory):
""" Modified `tmpdir_factory` session fixture
"""Modified `tmpdir_factory` session fixture
that will automatically cleanup after itself.
"""
yield tmpdir_factory
@ -172,17 +171,17 @@ def isolate(tmpdir, monkeypatch):
fake_root = os.path.join(str(tmpdir), "fake-root")
os.makedirs(fake_root)
if sys.platform == 'win32':
if sys.platform == "win32":
# Note: this will only take effect in subprocesses...
home_drive, home_path = os.path.splitdrive(home_dir)
monkeypatch.setenv('USERPROFILE', home_dir)
monkeypatch.setenv('HOMEDRIVE', home_drive)
monkeypatch.setenv('HOMEPATH', home_path)
monkeypatch.setenv("USERPROFILE", home_dir)
monkeypatch.setenv("HOMEDRIVE", home_drive)
monkeypatch.setenv("HOMEPATH", home_path)
for env_var, sub_path in (
('APPDATA', 'AppData/Roaming'),
('LOCALAPPDATA', 'AppData/Local'),
("APPDATA", "AppData/Roaming"),
("LOCALAPPDATA", "AppData/Local"),
):
path = os.path.join(home_dir, *sub_path.split('/'))
path = os.path.join(home_dir, *sub_path.split("/"))
monkeypatch.setenv(env_var, path)
os.makedirs(path)
else:
@ -191,23 +190,46 @@ def isolate(tmpdir, monkeypatch):
# of the user's actual $HOME directory.
monkeypatch.setenv("HOME", home_dir)
# Isolate ourselves from XDG directories
monkeypatch.setenv("XDG_DATA_HOME", os.path.join(
home_dir, ".local", "share",
))
monkeypatch.setenv("XDG_CONFIG_HOME", os.path.join(
home_dir, ".config",
))
monkeypatch.setenv(
"XDG_DATA_HOME",
os.path.join(
home_dir,
".local",
"share",
),
)
monkeypatch.setenv(
"XDG_CONFIG_HOME",
os.path.join(
home_dir,
".config",
),
)
monkeypatch.setenv("XDG_CACHE_HOME", os.path.join(home_dir, ".cache"))
monkeypatch.setenv("XDG_RUNTIME_DIR", os.path.join(
home_dir, ".runtime",
))
monkeypatch.setenv("XDG_DATA_DIRS", os.pathsep.join([
monkeypatch.setenv(
"XDG_RUNTIME_DIR",
os.path.join(
home_dir,
".runtime",
),
)
monkeypatch.setenv(
"XDG_DATA_DIRS",
os.pathsep.join(
[
os.path.join(fake_root, "usr", "local", "share"),
os.path.join(fake_root, "usr", "share"),
]))
monkeypatch.setenv("XDG_CONFIG_DIRS", os.path.join(
fake_root, "etc", "xdg",
))
]
),
)
monkeypatch.setenv(
"XDG_CONFIG_DIRS",
os.path.join(
fake_root,
"etc",
"xdg",
),
)
# Configure git, because without an author name/email git will complain
# and cause test failures.
@ -224,9 +246,7 @@ def isolate(tmpdir, monkeypatch):
# FIXME: Windows...
os.makedirs(os.path.join(home_dir, ".config", "git"))
with open(os.path.join(home_dir, ".config", "git", "config"), "wb") as fp:
fp.write(
b"[user]\n\tname = pip\n\temail = distutils-sig@python.org\n"
)
fp.write(b"[user]\n\tname = pip\n\temail = distutils-sig@python.org\n")
@pytest.fixture(autouse=True)
@ -245,7 +265,7 @@ def scoped_global_tempdir_manager(request):
yield
@pytest.fixture(scope='session')
@pytest.fixture(scope="session")
def pip_src(tmpdir_factory):
def not_code_files_and_folders(path, names):
# In the root directory...
@ -265,7 +285,7 @@ def pip_src(tmpdir_factory):
ignored.update(fnmatch.filter(names, pattern))
return ignored
pip_src = Path(str(tmpdir_factory.mktemp('pip_src'))).joinpath('pip_src')
pip_src = Path(str(tmpdir_factory.mktemp("pip_src"))).joinpath("pip_src")
# Copy over our source tree so that each use is self contained
shutil.copytree(
SRC_DIR,
@ -276,83 +296,77 @@ def pip_src(tmpdir_factory):
def _common_wheel_editable_install(tmpdir_factory, common_wheels, package):
wheel_candidates = list(
common_wheels.glob(f'{package}-*.whl'))
wheel_candidates = list(common_wheels.glob(f"{package}-*.whl"))
assert len(wheel_candidates) == 1, wheel_candidates
install_dir = Path(str(tmpdir_factory.mktemp(package))) / 'install'
install_dir = Path(str(tmpdir_factory.mktemp(package))) / "install"
Wheel(wheel_candidates[0]).install_as_egg(install_dir)
(install_dir / 'EGG-INFO').rename(
install_dir / f'{package}.egg-info')
(install_dir / "EGG-INFO").rename(install_dir / f"{package}.egg-info")
assert compileall.compile_dir(str(install_dir), quiet=1)
return install_dir
@pytest.fixture(scope='session')
@pytest.fixture(scope="session")
def setuptools_install(tmpdir_factory, common_wheels):
return _common_wheel_editable_install(tmpdir_factory,
common_wheels,
'setuptools')
return _common_wheel_editable_install(tmpdir_factory, common_wheels, "setuptools")
@pytest.fixture(scope='session')
@pytest.fixture(scope="session")
def wheel_install(tmpdir_factory, common_wheels):
return _common_wheel_editable_install(tmpdir_factory,
common_wheels,
'wheel')
return _common_wheel_editable_install(tmpdir_factory, common_wheels, "wheel")
@pytest.fixture(scope='session')
@pytest.fixture(scope="session")
def coverage_install(tmpdir_factory, common_wheels):
return _common_wheel_editable_install(tmpdir_factory,
common_wheels,
'coverage')
return _common_wheel_editable_install(tmpdir_factory, common_wheels, "coverage")
def install_egg_link(venv, project_name, egg_info_dir):
with open(venv.site / 'easy-install.pth', 'a') as fp:
fp.write(str(egg_info_dir.resolve()) + '\n')
with open(venv.site / (project_name + '.egg-link'), 'w') as fp:
fp.write(str(egg_info_dir) + '\n.')
with open(venv.site / "easy-install.pth", "a") as fp:
fp.write(str(egg_info_dir.resolve()) + "\n")
with open(venv.site / (project_name + ".egg-link"), "w") as fp:
fp.write(str(egg_info_dir) + "\n.")
@pytest.fixture(scope='session')
def virtualenv_template(request, tmpdir_factory, pip_src,
setuptools_install, coverage_install):
@pytest.fixture(scope="session")
def virtualenv_template(
request, tmpdir_factory, pip_src, setuptools_install, coverage_install
):
if request.config.getoption('--use-venv'):
venv_type = 'venv'
if request.config.getoption("--use-venv"):
venv_type = "venv"
else:
venv_type = 'virtualenv'
venv_type = "virtualenv"
# Create the virtual environment
tmpdir = Path(str(tmpdir_factory.mktemp('virtualenv')))
venv = VirtualEnvironment(
tmpdir.joinpath("venv_orig"), venv_type=venv_type
)
tmpdir = Path(str(tmpdir_factory.mktemp("virtualenv")))
venv = VirtualEnvironment(tmpdir.joinpath("venv_orig"), venv_type=venv_type)
# Install setuptools and pip.
install_egg_link(venv, 'setuptools', setuptools_install)
pip_editable = Path(str(tmpdir_factory.mktemp('pip'))) / 'pip'
install_egg_link(venv, "setuptools", setuptools_install)
pip_editable = Path(str(tmpdir_factory.mktemp("pip"))) / "pip"
shutil.copytree(pip_src, pip_editable, symlinks=True)
# noxfile.py is Python 3 only
assert compileall.compile_dir(
str(pip_editable), quiet=1, rx=re.compile("noxfile.py$"),
str(pip_editable),
quiet=1,
rx=re.compile("noxfile.py$"),
)
subprocess.check_call(
[venv.bin / "python", "setup.py", "-q", "develop"], cwd=pip_editable
)
subprocess.check_call([venv.bin / 'python', 'setup.py', '-q', 'develop'],
cwd=pip_editable)
# Install coverage and pth file for executing it in any spawned processes
# in this virtual environment.
install_egg_link(venv, 'coverage', coverage_install)
install_egg_link(venv, "coverage", coverage_install)
# zz prefix ensures the file is after easy-install.pth.
with open(venv.site / 'zz-coverage-helper.pth', 'a') as f:
f.write('import coverage; coverage.process_startup()')
with open(venv.site / "zz-coverage-helper.pth", "a") as f:
f.write("import coverage; coverage.process_startup()")
# Drop (non-relocatable) launchers.
for exe in os.listdir(venv.bin):
if not (
exe.startswith('python') or
exe.startswith('libpy') # Don't remove libpypy-c.so...
exe.startswith("python")
or exe.startswith("libpy") # Don't remove libpypy-c.so...
):
(venv.bin / exe).unlink()
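The zz-coverage-helper.pth file above leans on coverage.py's process_startup() hook; a rough standalone sketch (the site-packages location is looked up via sysconfig, and COVERAGE_PROCESS_START is assumed to point at a coverage config file):

import sysconfig
from pathlib import Path

site_packages = Path(sysconfig.get_paths()["purelib"])
# Code on a .pth line is executed at interpreter startup, so every spawned Python
# process starts collecting coverage data.
(site_packages / "zz-coverage-helper.pth").write_text(
    "import coverage; coverage.process_startup()\n"
)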
@ -387,7 +401,7 @@ def virtualenv(virtualenv_factory, tmpdir):
@pytest.fixture
def with_wheel(virtualenv, wheel_install):
install_egg_link(virtualenv, 'wheel', wheel_install)
install_egg_link(virtualenv, "wheel", wheel_install)
@pytest.fixture(scope="session")
@ -398,21 +412,16 @@ def script_factory(virtualenv_factory, deprecated_python):
return PipTestEnvironment(
# The base location for our test environment
tmpdir,
# Tell the Test Environment where our virtualenv is located
virtualenv=virtualenv,
# Do not ignore hidden files, they need to be checked as well
ignore_hidden=False,
# We are starting with an already empty directory
start_clear=False,
# We want to ensure no temporary files are left behind, so the
# PipTestEnvironment needs to capture and assert against temp
capture_temp=True,
assert_no_temp=True,
# Deprecated python versions produce an extra deprecation warning
pip_expect_warning=deprecated_python,
)
@ -434,7 +443,7 @@ def script(tmpdir, virtualenv, script_factory):
@pytest.fixture(scope="session")
def common_wheels():
"""Provide a directory with latest setuptools and wheel wheels"""
return DATA_DIR.joinpath('common_wheels')
return DATA_DIR.joinpath("common_wheels")
@pytest.fixture(scope="session")
@ -482,8 +491,7 @@ def deprecated_python():
def cert_factory(tmpdir_factory):
def factory():
# type: () -> str
"""Returns path to cert/key file.
"""
"""Returns path to cert/key file."""
output_path = Path(str(tmpdir_factory.mktemp("certs"))) / "cert.pem"
# Must be Text on PY2.
cert, key = make_tls_cert("localhost")
@ -537,14 +545,11 @@ class MockServer:
def get_requests(self):
# type: () -> Dict[str, str]
"""Get environ for each received request.
"""
"""Get environ for each received request."""
assert not self._running, "cannot get mock from running server"
# Legacy: replace call[0][0] with call.args[0]
# when pip drops support for python3.7
return [
call[0][0] for call in self._server.mock.call_args_list
]
return [call[0][0] for call in self._server.mock.call_args_list]
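The call[0][0] indexing mentioned in the comment is the pre-3.8 spelling of call.args[0]; a quick sketch with unittest.mock:

from unittest.mock import Mock

m = Mock()
m({"PATH_INFO": "/simple/"}, "start_response")

recorded = m.call_args_list[0]
assert recorded[0][0] == {"PATH_INFO": "/simple/"}    # tuple indexing, works on 3.6+
assert recorded.args[0] == {"PATH_INFO": "/simple/"}  # .args needs Python 3.8+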
@pytest.fixture
@ -558,8 +563,8 @@ def mock_server():
@pytest.fixture
def utc():
# time.tzset() is not implemented on some platforms, e.g. Windows.
tzset = getattr(time, 'tzset', lambda: None)
with patch.dict(os.environ, {'TZ': 'UTC'}):
tzset = getattr(time, "tzset", lambda: None)
with patch.dict(os.environ, {"TZ": "UTC"}):
tzset()
yield
tzset()

View file

@ -163,7 +163,6 @@ def test_pep518_with_namespace_package(script, data, common_wheels):
)
@pytest.mark.timeout(60)
@pytest.mark.parametrize('command', ('install', 'wheel'))
@pytest.mark.parametrize('package', ('pep518_forkbomb',
'pep518_twin_forkbombs_first',

View file

@ -298,7 +298,7 @@ def test_prompt_for_keyring_if_needed(script, data, cert_factory, auth_needed):
response(str(data.packages / "simple-3.0.tar.gz")),
]
url = "https://{}:{}/simple".format(server.host, server.port)
url = f"https://{server.host}:{server.port}/simple"
keyring_content = textwrap.dedent("""\
import os

View file

@ -1,203 +0,0 @@
"""
Tests for the resolver
"""
import os
import re
import sys
import pytest
import yaml
from tests.lib import DATA_DIR, create_basic_wheel_for_package, path_to_url
def generate_yaml_tests(directory):
"""
Generate yaml test cases from the yaml files in the given directory
"""
for yml_file in directory.glob("*.yml"):
data = yaml.safe_load(yml_file.read_text())
assert "cases" in data, "A fixture needs cases to be used in testing"
# Strip the parts of the directory to only get a name without
# extension and resolver directory
base_name = str(yml_file)[len(str(directory)) + 1:-4]
base = data.get("base", {})
cases = data["cases"]
for resolver in 'legacy', '2020-resolver':
for i, case_template in enumerate(cases):
case = base.copy()
case.update(case_template)
case[":name:"] = base_name
if len(cases) > 1:
case[":name:"] += "-" + str(i)
case[":name:"] += "*" + resolver
case[":resolver:"] = resolver
skip = case.pop("skip", False)
assert skip in [False, True, 'legacy', '2020-resolver']
if skip is True or skip == resolver:
case = pytest.param(case, marks=pytest.mark.xfail)
yield case
def id_func(param):
"""
Give a nice parameter name to the generated function parameters
"""
if isinstance(param, dict) and ":name:" in param:
return param[":name:"]
retval = str(param)
if len(retval) > 25:
retval = retval[:20] + "..." + retval[-2:]
return retval
def convert_to_dict(string):
def stripping_split(my_str, splitwith, count=None):
if count is None:
return [x.strip() for x in my_str.strip().split(splitwith)]
else:
return [x.strip() for x in my_str.strip().split(splitwith, count)]
parts = stripping_split(string, ";")
retval = {}
retval["depends"] = []
retval["extras"] = {}
retval["name"], retval["version"] = stripping_split(parts[0], " ")
for part in parts[1:]:
verb, args_str = stripping_split(part, " ", 1)
assert verb in ["depends"], f"Unknown verb {verb!r}"
retval[verb] = stripping_split(args_str, ",")
return retval
def handle_request(script, action, requirement, options, resolver_variant):
if action == 'install':
args = ['install']
if resolver_variant == "legacy":
args.append("--use-deprecated=legacy-resolver")
args.extend(["--no-index", "--find-links",
path_to_url(script.scratch_path)])
elif action == 'uninstall':
args = ['uninstall', '--yes']
else:
raise f"Did not excpet action: {action!r}"
if isinstance(requirement, str):
args.append(requirement)
elif isinstance(requirement, list):
args.extend(requirement)
else:
raise f"requirement neither str nor list {requirement!r}"
args.extend(options)
args.append("--verbose")
result = script.pip(*args,
allow_stderr_error=True,
allow_stderr_warning=True,
allow_error=True)
# Check which packages got installed
state = []
for path in os.listdir(script.site_packages_path):
if path.endswith(".dist-info"):
name, version = (
os.path.basename(path)[:-len(".dist-info")]
).rsplit("-", 1)
# TODO: information about extras.
state.append(" ".join((name, version)))
return {"result": result, "state": sorted(state)}
def check_error(error, result):
return_code = error.get('code')
if return_code:
assert result.returncode == return_code
stderr = error.get('stderr')
if not stderr:
return
if isinstance(stderr, str):
patters = [stderr]
elif isinstance(stderr, list):
patters = stderr
else:
raise "string or list expected, found %r" % stderr
for patter in patters:
match = re.search(patter, result.stderr, re.I)
assert match, 'regex %r not found in stderr: %r' % (
stderr, result.stderr)
@pytest.mark.yaml
@pytest.mark.parametrize(
"case", generate_yaml_tests(DATA_DIR.parent / "yaml"), ids=id_func
)
def test_yaml_based(script, case):
available = case.get("available", [])
requests = case.get("request", [])
responses = case.get("response", [])
assert len(requests) == len(responses), (
"Expected requests and responses counts to be same"
)
# Create a custom index of all the packages that are supposed to be
# available
# XXX: This doesn't work because this isn't making an index of files.
for package in available:
if isinstance(package, str):
package = convert_to_dict(package)
assert isinstance(package, dict), "Needs to be a dictionary"
create_basic_wheel_for_package(script, **package)
# use scratch path for index
for request, response in zip(requests, responses):
for action in 'install', 'uninstall':
if action in request:
break
else:
raise f"Unsupported request {request!r}"
# Perform the requested action
effect = handle_request(script, action,
request[action],
request.get('options', '').split(),
resolver_variant=case[':resolver:'])
result = effect['result']
if 0: # for analyzing output easier
with open(DATA_DIR.parent / "yaml" /
case[':name:'].replace('*', '-'), 'w') as fo:
fo.write("=== RETURNCODE = %d\n" % result.returncode)
fo.write("=== STDERR ===:\n%s\n" % result.stderr)
if 'state' in response:
assert effect['state'] == (response['state'] or []), str(result)
error = response.get('error')
if error and case[":resolver:"] == 'new' and sys.platform != 'win32':
# Note: we currently skip running these tests on Windows, as they
# were failing due to different error codes. There should not
# be a reason for not running this check on Windows.
check_error(error, result)

View file

@ -48,12 +48,12 @@ def path_to_url(path):
path = os.path.normpath(os.path.abspath(path))
drive, path = os.path.splitdrive(path)
filepath = path.split(os.path.sep)
url = '/'.join(filepath)
url = "/".join(filepath)
if drive:
# Note: match urllib.request.pathname2url's
# behavior: uppercase the drive letter.
return 'file:///' + drive.upper() + url
return 'file://' + url
return "file:///" + drive.upper() + url
return "file://" + url
def _test_path_to_file_url(path):
@ -63,12 +63,11 @@ def _test_path_to_file_url(path):
Args:
path: a tests.lib.path.Path object.
"""
return 'file://' + path.resolve().replace('\\', '/')
return "file://" + path.resolve().replace("\\", "/")
def create_file(path, contents=None):
"""Create a file on the path, with the given contents
"""
"""Create a file on the path, with the given contents"""
from pip._internal.utils.misc import ensure_dir
ensure_dir(os.path.dirname(path))
@ -220,51 +219,59 @@ class TestFailure(AssertionError):
"""
An "assertion" failed during testing.
"""
pass
class TestPipResult:
def __init__(self, impl, verbose=False):
self._impl = impl
if verbose:
print(self.stdout)
if self.stderr:
print('======= stderr ========')
print("======= stderr ========")
print(self.stderr)
print('=======================')
print("=======================")
def __getattr__(self, attr):
return getattr(self._impl, attr)
if sys.platform == 'win32':
if sys.platform == "win32":
@property
def stdout(self):
return self._impl.stdout.replace('\r\n', '\n')
return self._impl.stdout.replace("\r\n", "\n")
@property
def stderr(self):
return self._impl.stderr.replace('\r\n', '\n')
return self._impl.stderr.replace("\r\n", "\n")
def __str__(self):
return str(self._impl).replace('\r\n', '\n')
return str(self._impl).replace("\r\n", "\n")
else:
# Python doesn't automatically forward __str__ through __getattr__
def __str__(self):
return str(self._impl)
def assert_installed(self, pkg_name, editable=True, with_files=None,
without_files=None, without_egg_link=False,
use_user_site=False, sub_dir=False):
def assert_installed(
self,
pkg_name,
editable=True,
with_files=None,
without_files=None,
without_egg_link=False,
use_user_site=False,
sub_dir=False,
):
with_files = with_files or []
without_files = without_files or []
e = self.test_env
if editable:
pkg_dir = e.venv / 'src' / pkg_name.lower()
pkg_dir = e.venv / "src" / pkg_name.lower()
# If package was installed in a sub directory
if sub_dir:
pkg_dir = pkg_dir / sub_dir
@ -273,72 +280,76 @@ class TestPipResult:
pkg_dir = e.site_packages / pkg_name
if use_user_site:
egg_link_path = e.user_site / pkg_name + '.egg-link'
egg_link_path = e.user_site / pkg_name + ".egg-link"
else:
egg_link_path = e.site_packages / pkg_name + '.egg-link'
egg_link_path = e.site_packages / pkg_name + ".egg-link"
if without_egg_link:
if egg_link_path in self.files_created:
raise TestFailure(
'unexpected egg link file created: '
f'{egg_link_path!r}\n{self}'
"unexpected egg link file created: " f"{egg_link_path!r}\n{self}"
)
else:
if egg_link_path not in self.files_created:
raise TestFailure(
'expected egg link file missing: '
f'{egg_link_path!r}\n{self}'
"expected egg link file missing: " f"{egg_link_path!r}\n{self}"
)
egg_link_file = self.files_created[egg_link_path]
egg_link_contents = egg_link_file.bytes.replace(os.linesep, '\n')
egg_link_contents = egg_link_file.bytes.replace(os.linesep, "\n")
# FIXME: I don't understand why there's a trailing . here
if not (egg_link_contents.endswith('\n.') and
egg_link_contents[:-2].endswith(pkg_dir)):
expected_ending = pkg_dir + '\n.'
raise TestFailure(textwrap.dedent(
f'''\
if not (
egg_link_contents.endswith("\n.")
and egg_link_contents[:-2].endswith(pkg_dir)
):
expected_ending = pkg_dir + "\n."
raise TestFailure(
textwrap.dedent(
f"""
Incorrect egg_link file {egg_link_file!r}
Expected ending: {expected_ending!r}
------- Actual contents -------
{egg_link_contents!r}
-------------------------------'''
))
if use_user_site:
pth_file = e.user_site / 'easy-install.pth'
else:
pth_file = e.site_packages / 'easy-install.pth'
if (pth_file in self.files_updated) == without_egg_link:
maybe = '' if without_egg_link else 'not '
raise TestFailure(
f'{pth_file} unexpectedly {maybe}updated by install'
-------------------------------
"""
).strip()
)
if use_user_site:
pth_file = e.user_site / "easy-install.pth"
else:
pth_file = e.site_packages / "easy-install.pth"
if (pth_file in self.files_updated) == without_egg_link:
maybe = "" if without_egg_link else "not "
raise TestFailure(f"{pth_file} unexpectedly {maybe}updated by install")
if (pkg_dir in self.files_created) == (curdir in without_files):
maybe = 'not ' if curdir in without_files else ''
maybe = "not " if curdir in without_files else ""
files = sorted(self.files_created)
raise TestFailure(textwrap.dedent(f'''\
raise TestFailure(
textwrap.dedent(
f"""
expected package directory {pkg_dir!r} {maybe}to be created
actually created:
{files}
'''))
"""
)
)
for f in with_files:
normalized_path = os.path.normpath(pkg_dir / f)
if normalized_path not in self.files_created:
raise TestFailure(
f'Package directory {pkg_dir!r} missing '
f'expected content {f!r}'
f"Package directory {pkg_dir!r} missing " f"expected content {f!r}"
)
for f in without_files:
normalized_path = os.path.normpath(pkg_dir / f)
if normalized_path in self.files_created:
raise TestFailure(
f'Package directory {pkg_dir!r} has unexpected content {f}'
f"Package directory {pkg_dir!r} has unexpected content {f}"
)
def did_create(self, path, message=None):
@ -355,8 +366,7 @@ class TestPipResult:
def _one_or_both(a, b):
"""Returns f"{a}\n{b}" if a is truthy, else returns str(b).
"""
"""Returns f"{a}\n{b}" if a is truthy, else returns str(b)."""
if not a:
return str(b)
@ -367,15 +377,19 @@ def make_check_stderr_message(stderr, line, reason):
"""
Create an exception message to use inside check_stderr().
"""
return dedent("""\
return dedent(
"""\
{reason}:
Caused by line: {line!r}
Complete stderr: {stderr}
""").format(stderr=stderr, line=line, reason=reason)
"""
).format(stderr=stderr, line=line, reason=reason)
def _check_stderr(
stderr, allow_stderr_warning, allow_stderr_error,
stderr,
allow_stderr_warning,
allow_stderr_error,
):
"""
Check the given stderr for logged warnings and errors.
@ -396,29 +410,29 @@ def _check_stderr(
# sent directly to stderr and so bypass any configured log formatter.
# The "--- Logging error ---" string is used in Python 3.4+, and
# "Logged from file " is used in Python 2.
if (line.startswith('--- Logging error ---') or
line.startswith('Logged from file ')):
reason = 'stderr has a logging error, which is never allowed'
if line.startswith("--- Logging error ---") or line.startswith(
"Logged from file "
):
reason = "stderr has a logging error, which is never allowed"
msg = make_check_stderr_message(stderr, line=line, reason=reason)
raise RuntimeError(msg)
if allow_stderr_error:
continue
if line.startswith('ERROR: '):
if line.startswith("ERROR: "):
reason = (
'stderr has an unexpected error '
'(pass allow_stderr_error=True to permit this)'
"stderr has an unexpected error "
"(pass allow_stderr_error=True to permit this)"
)
msg = make_check_stderr_message(stderr, line=line, reason=reason)
raise RuntimeError(msg)
if allow_stderr_warning:
continue
if (line.startswith('WARNING: ') or
line.startswith(DEPRECATION_MSG_PREFIX)):
if line.startswith("WARNING: ") or line.startswith(DEPRECATION_MSG_PREFIX):
reason = (
'stderr has an unexpected warning '
'(pass allow_stderr_warning=True to permit this)'
"stderr has an unexpected warning "
"(pass allow_stderr_warning=True to permit this)"
)
msg = make_check_stderr_message(stderr, line=line, reason=reason)
raise RuntimeError(msg)
@ -438,7 +452,7 @@ class PipTestEnvironment(TestFileEnvironment):
# a name of the form xxxx_path and relative paths have a name that
# does not end in '_path'.
exe = sys.platform == 'win32' and '.exe' or ''
exe = sys.platform == "win32" and ".exe" or ""
verbose = False
def __init__(self, base_path, *args, virtualenv, pip_expect_warning=None, **kwargs):
@ -454,16 +468,16 @@ class PipTestEnvironment(TestFileEnvironment):
self.user_base_path = self.venv_path.joinpath("user")
self.user_site_path = self.venv_path.joinpath(
"user",
site.USER_SITE[len(site.USER_BASE) + 1:],
site.USER_SITE[len(site.USER_BASE) + 1 :],
)
if sys.platform == 'win32':
if sys.platform == "win32":
if sys.version_info >= (3, 5):
scripts_base = Path(
os.path.normpath(self.user_site_path.joinpath('..'))
os.path.normpath(self.user_site_path.joinpath(".."))
)
else:
scripts_base = self.user_base_path
self.user_bin_path = scripts_base.joinpath('Scripts')
self.user_bin_path = scripts_base.joinpath("Scripts")
else:
self.user_bin_path = self.user_base_path.joinpath(
os.path.relpath(self.bin_path, self.venv_path)
@ -495,12 +509,21 @@ class PipTestEnvironment(TestFileEnvironment):
super().__init__(base_path, *args, **kwargs)
# Expand our absolute path directories into relative
for name in ["base", "venv", "bin", "lib", "site_packages",
"user_base", "user_site", "user_bin", "scratch"]:
for name in [
"base",
"venv",
"bin",
"lib",
"site_packages",
"user_base",
"user_site",
"user_bin",
"scratch",
]:
real_name = f"{name}_path"
relative_path = Path(os.path.relpath(
getattr(self, real_name), self.base_path
))
relative_path = Path(
os.path.relpath(getattr(self, real_name), self.base_path)
)
setattr(self, name, relative_path)
# Make sure temp_path is a Path object
@ -514,7 +537,7 @@ class PipTestEnvironment(TestFileEnvironment):
self.user_site_path.joinpath("easy-install.pth").touch()
def _ignore_file(self, fn):
if fn.endswith('__pycache__') or fn.endswith(".pyc"):
if fn.endswith("__pycache__") or fn.endswith(".pyc"):
result = True
else:
result = super()._ignore_file(fn)
@ -525,7 +548,7 @@ class PipTestEnvironment(TestFileEnvironment):
# results because of venv `lib64 -> lib/` symlink on Linux.
full = os.path.join(self.base_path, path)
if os.path.isdir(full) and os.path.islink(full):
if not self.temp_path or path != 'tmp':
if not self.temp_path or path != "tmp":
result[path] = FoundDir(self.base_path, path)
else:
super()._find_traverse(path, result)
@ -560,42 +583,40 @@ class PipTestEnvironment(TestFileEnvironment):
compatibility.
"""
if self.verbose:
print(f'>> running {args} {kw}')
print(f">> running {args} {kw}")
assert not cwd or not run_from, "Don't use run_from; it's going away"
cwd = cwd or run_from or self.cwd
if sys.platform == 'win32':
if sys.platform == "win32":
# Partial fix for ScriptTest.run using `shell=True` on Windows.
args = [str(a).replace('^', '^^').replace('&', '^&') for a in args]
args = [str(a).replace("^", "^^").replace("&", "^&") for a in args]
if allow_error:
kw['expect_error'] = True
kw["expect_error"] = True
# Propagate default values.
expect_error = kw.get('expect_error')
expect_error = kw.get("expect_error")
if expect_error:
# Then default to allowing logged errors.
if allow_stderr_error is not None and not allow_stderr_error:
raise RuntimeError(
'cannot pass allow_stderr_error=False with '
'expect_error=True'
"cannot pass allow_stderr_error=False with " "expect_error=True"
)
allow_stderr_error = True
elif kw.get('expect_stderr'):
elif kw.get("expect_stderr"):
# Then default to allowing logged warnings.
if allow_stderr_warning is not None and not allow_stderr_warning:
raise RuntimeError(
'cannot pass allow_stderr_warning=False with '
'expect_stderr=True'
"cannot pass allow_stderr_warning=False with " "expect_stderr=True"
)
allow_stderr_warning = True
if allow_stderr_error:
if allow_stderr_warning is not None and not allow_stderr_warning:
raise RuntimeError(
'cannot pass allow_stderr_warning=False with '
'allow_stderr_error=True'
"cannot pass allow_stderr_warning=False with "
"allow_stderr_error=True"
)
# Default values if not set.
@ -606,7 +627,7 @@ class PipTestEnvironment(TestFileEnvironment):
# Pass expect_stderr=True to allow any stderr. We do this because
# we do our checking of stderr further on in check_stderr().
kw['expect_stderr'] = True
kw["expect_stderr"] = True
result = super().run(cwd=cwd, *args, **kw)
if expect_error and not allow_error:
@ -615,7 +636,8 @@ class PipTestEnvironment(TestFileEnvironment):
raise AssertionError("Script passed unexpectedly.")
_check_stderr(
result.stderr, allow_stderr_error=allow_stderr_error,
result.stderr,
allow_stderr_error=allow_stderr_error,
allow_stderr_warning=allow_stderr_warning,
)
@ -624,24 +646,27 @@ class PipTestEnvironment(TestFileEnvironment):
def pip(self, *args, use_module=True, **kwargs):
__tracebackhide__ = True
if self.pip_expect_warning:
kwargs['allow_stderr_warning'] = True
kwargs["allow_stderr_warning"] = True
if use_module:
exe = 'python'
args = ('-m', 'pip') + args
exe = "python"
args = ("-m", "pip") + args
else:
exe = 'pip'
exe = "pip"
return self.run(exe, *args, **kwargs)
def pip_install_local(self, *args, **kwargs):
return self.pip(
"install", "--no-index",
"--find-links", path_to_url(os.path.join(DATA_DIR, "packages")),
*args, **kwargs
"install",
"--no-index",
"--find-links",
path_to_url(os.path.join(DATA_DIR, "packages")),
*args,
**kwargs,
)
def easy_install(self, *args, **kwargs):
args = ('-m', 'easy_install') + args
return self.run('python', *args, **kwargs)
args = ("-m", "easy_install") + args
return self.run("python", *args, **kwargs)
# FIXME ScriptTest does something similar, but only within a single
@ -679,15 +704,15 @@ def diff_states(start, end, ignore=None):
prefix = prefix.rstrip(os.path.sep) + os.path.sep
return path.startswith(prefix)
start_keys = {k for k in start.keys()
if not any([prefix_match(k, i) for i in ignore])}
end_keys = {k for k in end.keys()
if not any([prefix_match(k, i) for i in ignore])}
start_keys = {
k for k in start.keys() if not any([prefix_match(k, i) for i in ignore])
}
end_keys = {k for k in end.keys() if not any([prefix_match(k, i) for i in ignore])}
deleted = {k: start[k] for k in start_keys.difference(end_keys)}
created = {k: end[k] for k in end_keys.difference(start_keys)}
updated = {}
for k in start_keys.intersection(end_keys):
if (start[k].size != end[k].size):
if start[k].size != end[k].size:
updated[k] = end[k]
return dict(deleted=deleted, created=created, updated=updated)
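A toy example of what diff_states reports; the stand-in file objects only need the .size attribute that the comparison uses:

class _FakeFile:
    def __init__(self, size):
        self.size = size

before = {"lib/pkg/__init__.py": _FakeFile(10)}
after = {"lib/pkg/__init__.py": _FakeFile(12), "bin/pkg": _FakeFile(3)}

diff = diff_states(before, after, ignore=["tmp"])
assert list(diff["created"]) == ["bin/pkg"]              # newly appeared path
assert list(diff["updated"]) == ["lib/pkg/__init__.py"]  # size changed
assert diff["deleted"] == {}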
@ -716,8 +741,10 @@ def assert_all_changes(start_state, end_state, expected_changes):
diff = diff_states(start_files, end_files, ignore=expected_changes)
if list(diff.values()) != [{}, {}, {}]:
raise TestFailure('Unexpected changes:\n' + '\n'.join(
[k + ': ' + ', '.join(v.keys()) for k, v in diff.items()]))
raise TestFailure(
"Unexpected changes:\n"
+ "\n".join([k + ": " + ", ".join(v.keys()) for k, v in diff.items()])
)
# Don't throw away this potentially useful information
return diff
@ -728,14 +755,16 @@ def _create_main_file(dir_path, name=None, output=None):
Create a module with a main() function that prints the given output.
"""
if name is None:
name = 'version_pkg'
name = "version_pkg"
if output is None:
output = '0.1'
text = textwrap.dedent("""\
output = "0.1"
text = textwrap.dedent(
f"""
def main():
print({!r})
""".format(output))
filename = f'{name}.py'
print({output!r})
"""
)
filename = f"{name}.py"
dir_path.joinpath(filename).write_text(text)
@ -755,7 +784,7 @@ def _git_commit(
message: an optional commit message.
"""
if message is None:
message = 'test commit'
message = "test commit"
args = []
@ -766,151 +795,186 @@ def _git_commit(
args.append("--all")
new_args = [
'git', 'commit', '-q', '--author', 'pip <distutils-sig@python.org>',
"git",
"commit",
"-q",
"--author",
"pip <distutils-sig@python.org>",
]
new_args.extend(args)
new_args.extend(['-m', message])
new_args.extend(["-m", message])
env_or_script.run(*new_args, cwd=repo_dir)
def _vcs_add(script, version_pkg_path, vcs='git'):
if vcs == 'git':
script.run('git', 'init', cwd=version_pkg_path)
script.run('git', 'add', '.', cwd=version_pkg_path)
_git_commit(script, version_pkg_path, message='initial version')
elif vcs == 'hg':
script.run('hg', 'init', cwd=version_pkg_path)
script.run('hg', 'add', '.', cwd=version_pkg_path)
def _vcs_add(script, version_pkg_path, vcs="git"):
if vcs == "git":
script.run("git", "init", cwd=version_pkg_path)
script.run("git", "add", ".", cwd=version_pkg_path)
_git_commit(script, version_pkg_path, message="initial version")
elif vcs == "hg":
script.run("hg", "init", cwd=version_pkg_path)
script.run("hg", "add", ".", cwd=version_pkg_path)
script.run(
'hg', 'commit', '-q',
'--user', 'pip <distutils-sig@python.org>',
'-m', 'initial version', cwd=version_pkg_path,
"hg",
"commit",
"-q",
"--user",
"pip <distutils-sig@python.org>",
"-m",
"initial version",
cwd=version_pkg_path,
)
elif vcs == 'svn':
elif vcs == "svn":
repo_url = _create_svn_repo(script, version_pkg_path)
script.run(
'svn', 'checkout', repo_url, 'pip-test-package',
cwd=script.scratch_path
"svn", "checkout", repo_url, "pip-test-package", cwd=script.scratch_path
)
checkout_path = script.scratch_path / 'pip-test-package'
checkout_path = script.scratch_path / "pip-test-package"
# svn internally stores windows drives as uppercase; we'll match that.
checkout_path = checkout_path.replace('c:', 'C:')
checkout_path = checkout_path.replace("c:", "C:")
version_pkg_path = checkout_path
elif vcs == 'bazaar':
script.run('bzr', 'init', cwd=version_pkg_path)
script.run('bzr', 'add', '.', cwd=version_pkg_path)
elif vcs == "bazaar":
script.run("bzr", "init", cwd=version_pkg_path)
script.run("bzr", "add", ".", cwd=version_pkg_path)
script.run(
'bzr', 'whoami', 'pip <distutils-sig@python.org>',
cwd=version_pkg_path)
"bzr", "whoami", "pip <distutils-sig@python.org>", cwd=version_pkg_path
)
script.run(
'bzr', 'commit', '-q',
'--author', 'pip <distutils-sig@python.org>',
'-m', 'initial version', cwd=version_pkg_path,
"bzr",
"commit",
"-q",
"--author",
"pip <distutils-sig@python.org>",
"-m",
"initial version",
cwd=version_pkg_path,
)
else:
raise ValueError(f'Unknown vcs: {vcs}')
raise ValueError(f"Unknown vcs: {vcs}")
return version_pkg_path
def _create_test_package_with_subdirectory(script, subdirectory):
script.scratch_path.joinpath("version_pkg").mkdir()
version_pkg_path = script.scratch_path / 'version_pkg'
version_pkg_path = script.scratch_path / "version_pkg"
_create_main_file(version_pkg_path, name="version_pkg", output="0.1")
version_pkg_path.joinpath("setup.py").write_text(
textwrap.dedent("""
textwrap.dedent(
"""
from setuptools import setup, find_packages
setup(name='version_pkg',
version='0.1',
setup(
name="version_pkg",
version="0.1",
packages=find_packages(),
py_modules=['version_pkg'],
entry_points=dict(console_scripts=['version_pkg=version_pkg:main']))
"""))
py_modules=["version_pkg"],
entry_points=dict(console_scripts=["version_pkg=version_pkg:main"]),
)
"""
)
)
subdirectory_path = version_pkg_path.joinpath(subdirectory)
subdirectory_path.mkdir()
_create_main_file(subdirectory_path, name="version_subpkg", output="0.1")
subdirectory_path.joinpath('setup.py').write_text(
textwrap.dedent("""
from setuptools import setup, find_packages
setup(name='version_subpkg',
version='0.1',
packages=find_packages(),
py_modules=['version_subpkg'],
entry_points=dict(console_scripts=['version_pkg=version_subpkg:main']))
"""))
subdirectory_path.joinpath("setup.py").write_text(
textwrap.dedent(
"""
from setuptools import find_packages, setup
script.run('git', 'init', cwd=version_pkg_path)
script.run('git', 'add', '.', cwd=version_pkg_path)
_git_commit(script, version_pkg_path, message='initial version')
setup(
name="version_subpkg",
version="0.1",
packages=find_packages(),
py_modules=["version_subpkg"],
entry_points=dict(console_scripts=["version_pkg=version_subpkg:main"]),
)
"""
)
)
script.run("git", "init", cwd=version_pkg_path)
script.run("git", "add", ".", cwd=version_pkg_path)
_git_commit(script, version_pkg_path, message="initial version")
return version_pkg_path
def _create_test_package_with_srcdir(script, name='version_pkg', vcs='git'):
def _create_test_package_with_srcdir(script, name="version_pkg", vcs="git"):
script.scratch_path.joinpath(name).mkdir()
version_pkg_path = script.scratch_path / name
subdir_path = version_pkg_path.joinpath('subdir')
subdir_path = version_pkg_path.joinpath("subdir")
subdir_path.mkdir()
src_path = subdir_path.joinpath('src')
src_path = subdir_path.joinpath("src")
src_path.mkdir()
pkg_path = src_path.joinpath('pkg')
pkg_path = src_path.joinpath("pkg")
pkg_path.mkdir()
pkg_path.joinpath('__init__.py').write_text('')
subdir_path.joinpath("setup.py").write_text(textwrap.dedent("""
pkg_path.joinpath("__init__.py").write_text("")
subdir_path.joinpath("setup.py").write_text(
textwrap.dedent(
"""
from setuptools import setup, find_packages
setup(
name='{name}',
version='0.1',
name="{name}",
version="0.1",
packages=find_packages(),
package_dir={{'': 'src'}},
package_dir={{"": "src"}},
)
""".format(
name=name
)
)
)
""".format(name=name)))
return _vcs_add(script, version_pkg_path, vcs)
def _create_test_package(script, name='version_pkg', vcs='git'):
def _create_test_package(script, name="version_pkg", vcs="git"):
script.scratch_path.joinpath(name).mkdir()
version_pkg_path = script.scratch_path / name
_create_main_file(version_pkg_path, name=name, output='0.1')
version_pkg_path.joinpath("setup.py").write_text(textwrap.dedent("""
_create_main_file(version_pkg_path, name=name, output="0.1")
version_pkg_path.joinpath("setup.py").write_text(
textwrap.dedent(
"""
from setuptools import setup, find_packages
setup(
name='{name}',
version='0.1',
name="{name}",
version="0.1",
packages=find_packages(),
py_modules=['{name}'],
entry_points=dict(console_scripts=['{name}={name}:main'])
py_modules=["{name}"],
entry_points=dict(console_scripts=["{name}={name}:main"]),
)
""".format(
name=name
)
)
)
""".format(name=name)))
return _vcs_add(script, version_pkg_path, vcs)
def _create_svn_repo(script, version_pkg_path):
repo_url = path_to_url(
script.scratch_path / 'pip-test-package-repo' / 'trunk')
repo_url = path_to_url(script.scratch_path / "pip-test-package-repo" / "trunk")
script.run("svnadmin", "create", "pip-test-package-repo", cwd=script.scratch_path)
script.run(
'svnadmin', 'create', 'pip-test-package-repo',
cwd=script.scratch_path
)
script.run(
'svn', 'import', version_pkg_path, repo_url,
'-m', 'Initial import of pip-test-package',
cwd=script.scratch_path
"svn",
"import",
version_pkg_path,
repo_url,
"-m",
"Initial import of pip-test-package",
cwd=script.scratch_path,
)
return repo_url
def _change_test_package_version(script, version_pkg_path):
_create_main_file(
version_pkg_path, name='version_pkg', output='some different version'
version_pkg_path, name="version_pkg", output="some different version"
)
# Pass -a to stage the change to the main file.
_git_commit(
script, version_pkg_path, message='messed version', stage_modified=True
)
_git_commit(script, version_pkg_path, message="messed version", stage_modified=True)
def assert_raises_regexp(exception, reg, run, *args, **kwargs):
@ -935,21 +999,25 @@ def requirements_file(contents, tmpdir):
:param tmpdir: A Path to the folder in which to create the file
"""
path = tmpdir / 'reqs.txt'
path = tmpdir / "reqs.txt"
path.write_text(contents)
yield path
path.unlink()
def create_test_package_with_setup(script, **setup_kwargs):
assert 'name' in setup_kwargs, setup_kwargs
pkg_path = script.scratch_path / setup_kwargs['name']
assert "name" in setup_kwargs, setup_kwargs
pkg_path = script.scratch_path / setup_kwargs["name"]
pkg_path.mkdir()
pkg_path.joinpath("setup.py").write_text(textwrap.dedent(f"""
pkg_path.joinpath("setup.py").write_text(
textwrap.dedent(
f"""
from setuptools import setup
kwargs = {setup_kwargs!r}
setup(**kwargs)
"""))
"""
)
)
return pkg_path
@ -961,9 +1029,7 @@ def urlsafe_b64encode_nopad(data):
def create_really_basic_wheel(name, version):
# type: (str, str) -> bytes
def digest(contents):
return "sha256={}".format(
urlsafe_b64encode_nopad(sha256(contents).digest())
)
return "sha256={}".format(urlsafe_b64encode_nopad(sha256(contents).digest()))
def add_file(path, text):
contents = text.encode("utf-8")
@ -983,7 +1049,9 @@ def create_really_basic_wheel(name, version):
Metadata-Version: 2.1
Name: {}
Version: {}
""".format(name, version)
""".format(
name, version
)
),
)
z.writestr(record_path, "\n".join(",".join(r) for r in records))
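The sha256= value above follows the wheel RECORD convention: a urlsafe base64 digest with the padding stripped. A small sketch of producing one RECORD entry (the file name and contents are made up):

from base64 import urlsafe_b64encode
from hashlib import sha256

contents = b"Metadata-Version: 2.1\nName: simple\nVersion: 1.0\n"
digest = urlsafe_b64encode(sha256(contents).digest()).rstrip(b"=").decode("ascii")
record_entry = ",".join(
    ["simple-1.0.dist-info/METADATA", f"sha256={digest}", str(len(contents))]
)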
@ -1043,7 +1111,6 @@ def create_basic_wheel_for_package(
metadata_updates=metadata_updates,
extra_metadata_files={"top_level.txt": name},
extra_files=extra_files,
# Have an empty RECORD because we don't want to be checking hashes.
record="",
)
@ -1052,9 +1119,7 @@ def create_basic_wheel_for_package(
return archive_path
def create_basic_sdist_for_package(
script, name, version, extra_files=None
):
def create_basic_sdist_for_package(script, name, version, extra_files=None):
files = {
"setup.py": """
from setuptools import find_packages, setup
@ -1063,17 +1128,13 @@ def create_basic_sdist_for_package(
}
# Some useful shorthands
archive_name = "{name}-{version}.tar.gz".format(
name=name, version=version
)
archive_name = "{name}-{version}.tar.gz".format(name=name, version=version)
# Replace key-values with formatted values
for key, value in list(files.items()):
del files[key]
key = key.format(name=name)
files[key] = textwrap.dedent(value).format(
name=name, version=version
).strip()
files[key] = textwrap.dedent(value).format(name=name, version=version).strip()
# Add new files after formatting
if extra_files:
@ -1087,7 +1148,7 @@ def create_basic_sdist_for_package(
retval = script.scratch_path / archive_name
generated = shutil.make_archive(
retval,
'gztar',
"gztar",
root_dir=script.temp_path,
base_dir=os.curdir,
)
@ -1104,15 +1165,15 @@ def need_executable(name, check_cmd):
try:
subprocess.check_output(check_cmd)
except (OSError, subprocess.CalledProcessError):
return pytest.mark.skip(
reason=f'{name} is not available')(fn)
return pytest.mark.skip(reason=f"{name} is not available")(fn)
return fn
return wrapper
def is_bzr_installed():
try:
subprocess.check_output(('bzr', 'version', '--short'))
subprocess.check_output(("bzr", "version", "--short"))
except OSError:
return False
return True
@ -1120,27 +1181,23 @@ def is_bzr_installed():
def is_svn_installed():
try:
subprocess.check_output(('svn', '--version'))
subprocess.check_output(("svn", "--version"))
except OSError:
return False
return True
def need_bzr(fn):
return pytest.mark.bzr(need_executable(
'Bazaar', ('bzr', 'version', '--short')
)(fn))
return pytest.mark.bzr(need_executable("Bazaar", ("bzr", "version", "--short"))(fn))
def need_svn(fn):
return pytest.mark.svn(need_executable(
'Subversion', ('svn', '--version')
)(need_executable(
'Subversion Admin', ('svnadmin', '--version')
)(fn)))
return pytest.mark.svn(
need_executable("Subversion", ("svn", "--version"))(
need_executable("Subversion Admin", ("svnadmin", "--version"))(fn)
)
)
def need_mercurial(fn):
return pytest.mark.mercurial(need_executable(
'Mercurial', ('hg', 'version')
)(fn))
return pytest.mark.mercurial(need_executable("Mercurial", ("hg", "version"))(fn))

View file

@ -11,13 +11,13 @@ from cryptography.x509.oid import NameOID
def make_tls_cert(hostname):
# type: (str) -> Tuple[x509.Certificate, rsa.RSAPrivateKey]
key = rsa.generate_private_key(
public_exponent=65537,
key_size=2048,
backend=default_backend()
public_exponent=65537, key_size=2048, backend=default_backend()
)
subject = issuer = x509.Name([
subject = issuer = x509.Name(
[
x509.NameAttribute(NameOID.COMMON_NAME, hostname),
])
]
)
cert = (
x509.CertificateBuilder()
.subject_name(subject)

View file

@ -15,7 +15,6 @@ kinds = pip._internal.configuration.kinds
class ConfigurationMixin:
def setup(self):
self.configuration = pip._internal.configuration.Configuration(
isolated=False,
@ -41,9 +40,7 @@ class ConfigurationMixin:
@contextlib.contextmanager
def tmpfile(self, contents):
# Create a temporary file
fd, path = tempfile.mkstemp(
prefix="pip_", suffix="_config.ini", text=True
)
fd, path = tempfile.mkstemp(prefix="pip_", suffix="_config.ini", text=True)
os.close(fd)
contents = textwrap.dedent(contents).lstrip()

View file

@ -43,6 +43,4 @@ def get_filelist(base):
(join_dirpath(p) for p in filenames),
)
return set(chain.from_iterable(
join(*dirinfo) for dirinfo in os.walk(base)
))
return set(chain.from_iterable(join(*dirinfo) for dirinfo in os.walk(base)))

View file

@ -5,11 +5,11 @@ from tests.lib import _create_main_file, _git_commit
def _create_test_package_submodule(env):
env.scratch_path.joinpath("version_pkg_submodule").mkdir()
submodule_path = env.scratch_path / 'version_pkg_submodule'
env.run('touch', 'testfile', cwd=submodule_path)
env.run('git', 'init', cwd=submodule_path)
env.run('git', 'add', '.', cwd=submodule_path)
_git_commit(env, submodule_path, message='initial version / submodule')
submodule_path = env.scratch_path / "version_pkg_submodule"
env.run("touch", "testfile", cwd=submodule_path)
env.run("git", "init", cwd=submodule_path)
env.run("git", "add", ".", cwd=submodule_path)
_git_commit(env, submodule_path, message="initial version / submodule")
return submodule_path
@ -17,8 +17,8 @@ def _create_test_package_submodule(env):
def _change_test_package_submodule(env, submodule_path):
submodule_path.joinpath("testfile").write_text("this is a changed file")
submodule_path.joinpath("testfile2").write_text("this is an added file")
env.run('git', 'add', '.', cwd=submodule_path)
_git_commit(env, submodule_path, message='submodule change')
env.run("git", "add", ".", cwd=submodule_path)
_git_commit(env, submodule_path, message="submodule change")
def _pull_in_submodule_changes_to_module(env, module_path, rel_path):
@ -27,11 +27,9 @@ def _pull_in_submodule_changes_to_module(env, module_path, rel_path):
rel_path: the location of the submodule relative to the superproject.
"""
submodule_path = module_path / rel_path
env.run('git', 'pull', '-q', 'origin', 'master', cwd=submodule_path)
env.run("git", "pull", "-q", "origin", "master", cwd=submodule_path)
# Pass -a to stage the submodule changes that were just pulled in.
_git_commit(
env, module_path, message='submodule change', stage_modified=True
)
_git_commit(env, module_path, message="submodule change", stage_modified=True)
def _create_test_package_with_submodule(env, rel_path):
@ -40,33 +38,37 @@ def _create_test_package_with_submodule(env, rel_path):
rel_path: the location of the submodule relative to the superproject.
"""
env.scratch_path.joinpath("version_pkg").mkdir()
version_pkg_path = env.scratch_path / 'version_pkg'
version_pkg_path = env.scratch_path / "version_pkg"
version_pkg_path.joinpath("testpkg").mkdir()
pkg_path = version_pkg_path / 'testpkg'
pkg_path = version_pkg_path / "testpkg"
pkg_path.joinpath("__init__.py").write_text("# hello there")
_create_main_file(pkg_path, name="version_pkg", output="0.1")
version_pkg_path.joinpath("setup.py").write_text(textwrap.dedent('''\
version_pkg_path.joinpath("setup.py").write_text(
textwrap.dedent(
"""\
from setuptools import setup, find_packages
setup(name='version_pkg',
version='0.1',
packages=find_packages(),
)
'''))
env.run('git', 'init', cwd=version_pkg_path)
env.run('git', 'add', '.', cwd=version_pkg_path)
_git_commit(env, version_pkg_path, message='initial version')
"""
)
)
env.run("git", "init", cwd=version_pkg_path)
env.run("git", "add", ".", cwd=version_pkg_path)
_git_commit(env, version_pkg_path, message="initial version")
submodule_path = _create_test_package_submodule(env)
env.run(
'git',
'submodule',
'add',
"git",
"submodule",
"add",
submodule_path,
rel_path,
cwd=version_pkg_path,
)
_git_commit(env, version_pkg_path, message='initial version w submodule')
_git_commit(env, version_pkg_path, message="initial version w submodule")
return version_pkg_path, submodule_path

View file

@ -3,12 +3,12 @@ from pip._internal.models.link import Link
def make_mock_candidate(version, yanked_reason=None, hex_digest=None):
url = f'https://example.com/pkg-{version}.tar.gz'
url = f"https://example.com/pkg-{version}.tar.gz"
if hex_digest is not None:
assert len(hex_digest) == 64
url += f'#sha256={hex_digest}'
url += f"#sha256={hex_digest}"
link = Link(url, yanked_reason=yanked_reason)
candidate = InstallationCandidate('mypackage', version, link)
candidate = InstallationCandidate("mypackage", version, link)
return candidate

View file

@ -13,15 +13,15 @@ def _create_svn_initools_repo(initools_dir):
Create the SVN INITools repo.
"""
directory = os.path.dirname(initools_dir)
subprocess.check_call('svnadmin create INITools'.split(), cwd=directory)
subprocess.check_call("svnadmin create INITools".split(), cwd=directory)
filename, _ = urllib.request.urlretrieve(
'http://bitbucket.org/hltbra/pip-initools-dump/raw/8b55c908a320/'
'INITools_modified.dump'
"http://bitbucket.org/hltbra/pip-initools-dump/raw/8b55c908a320/"
"INITools_modified.dump"
)
with open(filename) as dump:
subprocess.check_call(
['svnadmin', 'load', initools_dir],
["svnadmin", "load", initools_dir],
stdin=dump,
stdout=subprocess.DEVNULL,
)
@ -38,27 +38,27 @@ def local_checkout(
temp directory Path object unique to each test function invocation,
created as a sub directory of the base temp directory.
"""
assert '+' in remote_repo
vcs_name = remote_repo.split('+', 1)[0]
assert "+" in remote_repo
vcs_name = remote_repo.split("+", 1)[0]
repository_name = os.path.basename(remote_repo)
directory = temp_path.joinpath('cache')
directory = temp_path.joinpath("cache")
repo_url_path = os.path.join(directory, repository_name)
assert not os.path.exists(repo_url_path)
if not os.path.exists(directory):
os.mkdir(directory)
if vcs_name == 'svn':
assert repository_name == 'INITools'
if vcs_name == "svn":
assert repository_name == "INITools"
_create_svn_initools_repo(repo_url_path)
repo_url_path = os.path.join(repo_url_path, 'trunk')
repo_url_path = os.path.join(repo_url_path, "trunk")
else:
vcs_backend = vcs.get_backend(vcs_name)
vcs_backend.obtain(repo_url_path, url=hide_url(remote_repo))
return '{}+{}'.format(vcs_name, path_to_url(repo_url_path))
return "{}+{}".format(vcs_name, path_to_url(repo_url_path))
def local_repo(remote_repo, temp_path):
return local_checkout(remote_repo, temp_path).split('+', 1)[1]
return local_checkout(remote_repo, temp_path).split("+", 1)[1]

View file

@ -7,7 +7,6 @@ from pip._internal.commands import CommandInfo, commands_dict
class FakeCommand(Command):
def main(self, args):
index_opts = cmdoptions.make_option_group(
cmdoptions.index_group,
@ -18,11 +17,12 @@ class FakeCommand(Command):
class AddFakeCommandMixin:
def setup(self):
commands_dict['fake'] = CommandInfo(
'tests.lib.options_helpers', 'FakeCommand', 'fake summary',
commands_dict["fake"] = CommandInfo(
"tests.lib.options_helpers",
"FakeCommand",
"fake summary",
)
def teardown(self):
commands_dict.pop('fake')
commands_dict.pop("fake")

View file

@ -157,7 +157,7 @@ class Path(str):
# TODO: Remove after removing inheritance from str.
def join(self, *parts):
raise RuntimeError('Path.join is invalid, use joinpath instead.')
raise RuntimeError("Path.join is invalid, use joinpath instead.")
def read_bytes(self):
# type: () -> bytes
@ -188,4 +188,5 @@ class Path(str):
def stat(self):
return os.stat(self)
curdir = Path(os.path.curdir)

View file

@ -5,7 +5,6 @@ from io import BytesIO
class FakeStream:
def __init__(self, contents):
self._io = BytesIO(contents)
@ -20,7 +19,6 @@ class FakeStream:
class MockResponse:
def __init__(self, contents):
self.raw = FakeStream(contents)
self.content = contents
@ -29,12 +27,11 @@ class MockResponse:
self.status_code = 200
self.connection = None
self.url = None
self.headers = {'Content-Length': len(contents)}
self.headers = {"Content-Length": len(contents)}
self.history = []
class MockConnection:
def _send(self, req, **kwargs):
raise NotImplementedError("_send must be overridden for tests")
@ -46,7 +43,6 @@ class MockConnection:
class MockRequest:
def __init__(self, url):
self.url = url
self.headers = {}

View file

@ -34,10 +34,10 @@ if not hasattr(signal, "pthread_sigmask"):
# practice.
blocked_signals = nullcontext
else:
@contextmanager
def blocked_signals():
"""Block all signals for e.g. starting a worker thread.
"""
"""Block all signals for e.g. starting a worker thread."""
# valid_signals() was added in Python 3.8 (and not using it results
# in a warning on pthread_sigmask() call)
try:
@ -82,12 +82,13 @@ def _mock_wsgi_adapter(mock):
"""Uses a mock to record function arguments and provide
the actual function that should respond.
"""
def adapter(environ, start_response):
# type: (Environ, StartResponse) -> Body
try:
responder = mock(environ, start_response)
except StopIteration:
raise RuntimeError('Ran out of mocked responses.')
raise RuntimeError("Ran out of mocked responses.")
return responder(environ, start_response)
return adapter
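A sketch of how the adapter is exercised: the mock's side_effect supplies one responder per expected request, and running out of them is what triggers the RuntimeError above (the responder below is illustrative):

from unittest.mock import Mock

def _ok(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello"]

mock = Mock(side_effect=[_ok])        # one canned responder, consumed in request order
app = _mock_wsgi_adapter(mock)
# A second request would exhaust side_effect, the mock raises StopIteration, and the
# adapter converts it into RuntimeError("Ran out of mocked responses.").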
@ -136,8 +137,7 @@ def make_mock_server(**kwargs):
@contextmanager
def server_running(server):
# type: (BaseWSGIServer) -> None
"""Context manager for running the provided server in a separate thread.
"""
"""Context manager for running the provided server in a separate thread."""
thread = threading.Thread(target=server.serve_forever)
thread.daemon = True
with blocked_signals():
@ -156,45 +156,50 @@ def text_html_response(text):
# type: (str) -> Responder
def responder(environ, start_response):
# type: (Environ, StartResponse) -> Body
start_response("200 OK", [
start_response(
"200 OK",
[
("Content-Type", "text/html; charset=UTF-8"),
])
return [text.encode('utf-8')]
],
)
return [text.encode("utf-8")]
return responder
def html5_page(text):
# type: (str) -> str
return dedent("""
return (
dedent(
"""
<!DOCTYPE html>
<html>
<body>
{}
</body>
</html>
""").strip().format(text)
"""
)
.strip()
.format(text)
)
def index_page(spec):
# type: (Dict[str, str]) -> Responder
def link(name, value):
return '<a href="{}">{}</a>'.format(
value, name
)
return '<a href="{}">{}</a>'.format(value, name)
links = ''.join(link(*kv) for kv in spec.items())
links = "".join(link(*kv) for kv in spec.items())
return text_html_response(html5_page(links))
def package_page(spec):
# type: (Dict[str, str]) -> Responder
def link(name, value):
return '<a href="{}">{}</a>'.format(
value, name
)
return '<a href="{}">{}</a>'.format(value, name)
links = ''.join(link(*kv) for kv in spec.items())
links = "".join(link(*kv) for kv in spec.items())
return text_html_response(html5_page(links))
@ -204,13 +209,14 @@ def file_response(path):
# type: (Environ, StartResponse) -> Body
size = os.stat(path).st_size
start_response(
"200 OK", [
"200 OK",
[
("Content-Type", "application/octet-stream"),
("Content-Length", str(size)),
],
)
with open(path, 'rb') as f:
with open(path, "rb") as f:
return [f.read()]
return responder
@ -223,22 +229,24 @@ def authorization_response(path):
def responder(environ, start_response):
# type: (Environ, StartResponse) -> Body
if environ.get('HTTP_AUTHORIZATION') == correct_auth:
if environ.get("HTTP_AUTHORIZATION") == correct_auth:
size = os.stat(path).st_size
start_response(
"200 OK", [
"200 OK",
[
("Content-Type", "application/octet-stream"),
("Content-Length", str(size)),
],
)
else:
start_response(
"401 Unauthorized", [
"401 Unauthorized",
[
("WWW-Authenticate", "Basic"),
],
)
with open(path, 'rb') as f:
with open(path, "rb") as f:
return [f.read()]
return responder

View file

@ -18,9 +18,7 @@ def assert_error_startswith(exc_type, expected_start):
with pytest.raises(exc_type) as err:
yield
assert str(err.value).startswith(expected_start), (
f'full message: {err.value}'
)
assert str(err.value).startswith(expected_start), f"full message: {err.value}"
def test_tmp_dir_exists_in_env(script):
@ -31,7 +29,7 @@ def test_tmp_dir_exists_in_env(script):
# need these tests to ensure the assert_no_temp feature of scripttest is
# working
script.assert_no_temp() # this fails if env.tmp_path doesn't exist
assert script.environ['TMPDIR'] == script.temp_path
assert script.environ["TMPDIR"] == script.temp_path
assert isdir(script.temp_path)
@ -41,16 +39,16 @@ def test_correct_pip_version(script):
"""
# output is like:
# pip PIPVERSION from PIPDIRECTORY (python PYVERSION)
result = script.pip('--version')
result = script.pip("--version")
# compare the directory tree of the invoked pip with that of this source
# distribution
pip_folder_outputed = re.match(
r'pip \d+(\.[\d]+)+(\.?(b|rc|dev|pre|post)\d+)? from (.*) '
r'\(python \d(.[\d])+\)$',
result.stdout
r"pip \d+(\.[\d]+)+(\.?(b|rc|dev|pre|post)\d+)? from (.*) "
r"\(python \d(.[\d])+\)$",
result.stdout,
).group(4)
pip_folder = join(SRC_DIR, 'src', 'pip')
pip_folder = join(SRC_DIR, "src", "pip")
diffs = filecmp.dircmp(pip_folder, pip_folder_outputed)
@ -59,32 +57,33 @@ def test_correct_pip_version(script):
# primary resources other than .py files, this code will need
# maintenance
mismatch_py = [
x for x in diffs.left_only + diffs.right_only + diffs.diff_files
if x.endswith('.py')
x
for x in diffs.left_only + diffs.right_only + diffs.diff_files
if x.endswith(".py")
]
assert not mismatch_py, (
f'mismatched source files in {pip_folder!r} '
f'and {pip_folder_outputed!r}: {mismatch_py!r}'
f"mismatched source files in {pip_folder!r} "
f"and {pip_folder_outputed!r}: {mismatch_py!r}"
)
def test_as_import(script):
""" test that pip.__init__.py does not shadow
"""test that pip.__init__.py does not shadow
the command submodule with a dictionary
"""
import pip._internal.commands.install as inst
assert inst is not None
class TestPipTestEnvironment:
def run_stderr_with_prefix(self, script, prefix, **kwargs):
"""
Call run() that prints stderr with the given prefix.
"""
text = f'{prefix}: hello, world\\n'
text = f"{prefix}: hello, world\\n"
command = f'import sys; sys.stderr.write("{text}")'
args = [sys.executable, '-c', command]
args = [sys.executable, "-c", command]
script.run(*args, **kwargs)
def run_with_log_command(self, script, sub_string, **kwargs):
@ -96,14 +95,17 @@ class TestPipTestEnvironment:
"import logging; logging.basicConfig(level='INFO'); "
"logging.getLogger().info('sub: {}', 'foo')"
).format(sub_string)
args = [sys.executable, '-c', command]
args = [sys.executable, "-c", command]
script.run(*args, **kwargs)
@pytest.mark.parametrize('prefix', (
'DEBUG',
'INFO',
'FOO',
))
@pytest.mark.parametrize(
"prefix",
(
"DEBUG",
"INFO",
"FOO",
),
)
def test_run__allowed_stderr(self, script, prefix):
"""
Test calling run() with allowed stderr.
@ -117,21 +119,28 @@ class TestPipTestEnvironment:
"""
# Check that no error happens.
self.run_stderr_with_prefix(
script, 'WARNING', allow_stderr_warning=True,
script,
"WARNING",
allow_stderr_warning=True,
)
# Check that an error still happens with ERROR.
expected_start = 'stderr has an unexpected error'
expected_start = "stderr has an unexpected error"
with assert_error_startswith(RuntimeError, expected_start):
self.run_stderr_with_prefix(
script, 'ERROR', allow_stderr_warning=True,
script,
"ERROR",
allow_stderr_warning=True,
)
@pytest.mark.parametrize('prefix', (
'DEPRECATION',
'WARNING',
'ERROR',
))
@pytest.mark.parametrize(
"prefix",
(
"DEPRECATION",
"WARNING",
"ERROR",
),
)
def test_run__allow_stderr_error(self, script, prefix):
"""
Test passing allow_stderr_error=True.
@ -139,11 +148,14 @@ class TestPipTestEnvironment:
# Check that no error happens.
self.run_stderr_with_prefix(script, prefix, allow_stderr_error=True)
@pytest.mark.parametrize('prefix, expected_start', (
('DEPRECATION', 'stderr has an unexpected warning'),
('WARNING', 'stderr has an unexpected warning'),
('ERROR', 'stderr has an unexpected error'),
))
@pytest.mark.parametrize(
"prefix, expected_start",
(
("DEPRECATION", "stderr has an unexpected warning"),
("WARNING", "stderr has an unexpected warning"),
("ERROR", "stderr has an unexpected error"),
),
)
def test_run__unexpected_stderr(self, script, prefix, expected_start):
"""
Test calling run() with unexpected stderr output.
@ -156,70 +168,72 @@ class TestPipTestEnvironment:
Test calling run() with an unexpected logging error.
"""
# Pass a good substitution string.
self.run_with_log_command(script, sub_string='%r')
self.run_with_log_command(script, sub_string="%r")
expected_start = 'stderr has a logging error, which is never allowed'
expected_start = "stderr has a logging error, which is never allowed"
with assert_error_startswith(RuntimeError, expected_start):
# Pass a bad substitution string. Also, pass
# allow_stderr_error=True to check that the RuntimeError occurs
# even under the stricter test condition of when we are allowing
# other types of errors.
self.run_with_log_command(
script, sub_string='{!r}', allow_stderr_error=True,
script,
sub_string="{!r}",
allow_stderr_error=True,
)
def test_run__allow_stderr_error_false_error_with_expect_error(
self, script,
self,
script,
):
"""
Test passing allow_stderr_error=False with expect_error=True.
"""
expected_start = (
'cannot pass allow_stderr_error=False with expect_error=True'
)
expected_start = "cannot pass allow_stderr_error=False with expect_error=True"
with assert_error_startswith(RuntimeError, expected_start):
script.run('python', allow_stderr_error=False, expect_error=True)
script.run("python", allow_stderr_error=False, expect_error=True)
def test_run__allow_stderr_warning_false_error_with_expect_stderr(
self, script,
self,
script,
):
"""
Test passing allow_stderr_warning=False with expect_stderr=True.
"""
expected_start = (
'cannot pass allow_stderr_warning=False with expect_stderr=True'
"cannot pass allow_stderr_warning=False with expect_stderr=True"
)
with assert_error_startswith(RuntimeError, expected_start):
script.run(
'python', allow_stderr_warning=False, expect_stderr=True,
"python",
allow_stderr_warning=False,
expect_stderr=True,
)
@pytest.mark.parametrize('arg_name', (
'expect_error',
'allow_stderr_error',
))
@pytest.mark.parametrize(
"arg_name",
(
"expect_error",
"allow_stderr_error",
),
)
def test_run__allow_stderr_warning_false_error(self, script, arg_name):
"""
Test passing allow_stderr_warning=False when it is not allowed.
"""
kwargs = {'allow_stderr_warning': False, arg_name: True}
kwargs = {"allow_stderr_warning": False, arg_name: True}
expected_start = (
'cannot pass allow_stderr_warning=False with '
'allow_stderr_error=True'
"cannot pass allow_stderr_warning=False with " "allow_stderr_error=True"
)
with assert_error_startswith(RuntimeError, expected_start):
script.run('python', **kwargs)
script.run("python", **kwargs)
def test_run__expect_error_fails_when_zero_returncode(self, script):
expected_start = 'Script passed unexpectedly'
expected_start = "Script passed unexpectedly"
with assert_error_startswith(AssertionError, expected_start):
script.run(
'python', expect_error=True
)
script.run("python", expect_error=True)
def test_run__no_expect_error_fails_when_nonzero_returncode(self, script):
expected_start = 'Script returned code: 1'
expected_start = "Script returned code: 1"
with assert_error_startswith(AssertionError, expected_start):
script.run(
'python', '-c', 'import sys; sys.exit(1)'
)
script.run("python", "-c", "import sys; sys.exit(1)")

View file

@ -161,19 +161,20 @@ def test_make_wheel_default_record():
record_bytes = z.read("simple-0.1.0.dist-info/RECORD")
record_text = record_bytes.decode()
record_rows = list(csv.reader(record_text.splitlines()))
records = {
row[0]: row[1:] for row in record_rows
}
records = {row[0]: row[1:] for row in record_rows}
expected = {
"simple/__init__.py": [
"sha256=ypeBEsobvcr6wjGzmiPcTaeG7_gUfE5yuYB3ha_uSLs", "1"
"sha256=ypeBEsobvcr6wjGzmiPcTaeG7_gUfE5yuYB3ha_uSLs",
"1",
],
"simple-0.1.0.data/purelib/info.txt": [
"sha256=Ln0sA6lQeuJl7PW1NWiFpTOTogKdJBOUmXJloaJa78Y", "1"
"sha256=Ln0sA6lQeuJl7PW1NWiFpTOTogKdJBOUmXJloaJa78Y",
"1",
],
"simple-0.1.0.dist-info/LICENSE": [
"sha256=PiPoFgA5WUoziU9lZOGxNIu9egCI1CxKy3PurtWcAJ0", "1"
"sha256=PiPoFgA5WUoziU9lZOGxNIu9egCI1CxKy3PurtWcAJ0",
"1",
],
"simple-0.1.0.dist-info/RECORD": ["", ""],
}

View file

@ -17,9 +17,9 @@ class VirtualEnvironment:
def __init__(self, location, template=None, venv_type=None):
assert template is None or venv_type is None
assert venv_type in (None, 'virtualenv', 'venv')
assert venv_type in (None, "virtualenv", "venv")
self.location = Path(location)
self._venv_type = venv_type or template._venv_type or 'virtualenv'
self._venv_type = venv_type or template._venv_type or "virtualenv"
self._user_site_packages = False
self._template = template
self._sitecustomize = None
@ -29,11 +29,11 @@ class VirtualEnvironment:
def _update_paths(self):
home, lib, inc, bin = _virtualenv.path_locations(self.location)
self.bin = Path(bin)
self.site = Path(lib) / 'site-packages'
self.site = Path(lib) / "site-packages"
# Workaround for https://github.com/pypa/virtualenv/issues/306
if hasattr(sys, "pypy_version_info"):
version_dir = str(sys.version_info.major)
self.lib = Path(home, 'lib-python', version_dir)
self.lib = Path(home, "lib-python", version_dir)
else:
self.lib = Path(lib)
@ -46,17 +46,15 @@ class VirtualEnvironment:
if self._template:
# On Windows, calling `_virtualenv.path_locations(target)`
# will have created the `target` directory...
if sys.platform == 'win32' and self.location.exists():
if sys.platform == "win32" and self.location.exists():
self.location.rmdir()
# Clone virtual environment from template.
shutil.copytree(
self._template.location, self.location, symlinks=True
)
shutil.copytree(self._template.location, self.location, symlinks=True)
self._sitecustomize = self._template.sitecustomize
self._user_site_packages = self._template.user_site_packages
else:
# Create a new virtual environment.
if self._venv_type == 'virtualenv':
if self._venv_type == "virtualenv":
_virtualenv.create_environment(
self.location,
no_pip=True,
@ -64,7 +62,7 @@ class VirtualEnvironment:
no_setuptools=True,
)
self._fix_virtualenv_site_module()
elif self._venv_type == 'venv':
elif self._venv_type == "venv":
builder = _venv.EnvBuilder()
context = builder.ensure_directories(self.location)
builder.create_configuration(context)
@ -75,46 +73,44 @@ class VirtualEnvironment:
def _fix_virtualenv_site_module(self):
# Patch `site.py` so user site work as expected.
site_py = self.lib / 'site.py'
site_py = self.lib / "site.py"
with open(site_py) as fp:
site_contents = fp.read()
for pattern, replace in (
(
# Ensure enabling user site does not result in adding
# the real site-packages' directory to `sys.path`.
("\ndef virtual_addsitepackages(known_paths):\n"),
(
'\ndef virtual_addsitepackages(known_paths):\n'
),
(
'\ndef virtual_addsitepackages(known_paths):\n'
' return known_paths\n'
"\ndef virtual_addsitepackages(known_paths):\n"
" return known_paths\n"
),
),
(
# Fix sites ordering: user site must be added before system.
(
'\n paths_in_sys = addsitepackages(paths_in_sys)'
'\n paths_in_sys = addusersitepackages(paths_in_sys)\n'
"\n paths_in_sys = addsitepackages(paths_in_sys)"
"\n paths_in_sys = addusersitepackages(paths_in_sys)\n"
),
(
'\n paths_in_sys = addusersitepackages(paths_in_sys)'
'\n paths_in_sys = addsitepackages(paths_in_sys)\n'
"\n paths_in_sys = addusersitepackages(paths_in_sys)"
"\n paths_in_sys = addsitepackages(paths_in_sys)\n"
),
),
):
assert pattern in site_contents
site_contents = site_contents.replace(pattern, replace)
with open(site_py, 'w') as fp:
with open(site_py, "w") as fp:
fp.write(site_contents)
# Make sure bytecode is up-to-date too.
assert compileall.compile_file(str(site_py), quiet=1, force=True)
def _customize_site(self):
contents = ''
if self._venv_type == 'venv':
contents = ""
if self._venv_type == "venv":
# Enable user site (before system).
contents += textwrap.dedent(
'''
"""
import os, site, sys
if not os.environ.get('PYTHONNOUSERSITE', False):
@ -138,9 +134,10 @@ class VirtualEnvironment:
# Third, add back system-sites related paths.
for path in site.getsitepackages():
site.addsitedir(path)
''').strip()
"""
).strip()
if self._sitecustomize is not None:
contents += '\n' + self._sitecustomize
contents += "\n" + self._sitecustomize
sitecustomize = self.site / "sitecustomize.py"
sitecustomize.write_text(contents)
# Make sure bytecode is up-to-date too.
@ -170,11 +167,11 @@ class VirtualEnvironment:
@user_site_packages.setter
def user_site_packages(self, value):
self._user_site_packages = value
if self._venv_type == 'virtualenv':
if self._venv_type == "virtualenv":
marker = self.lib / "no-global-site-packages.txt"
if self._user_site_packages:
marker.unlink()
else:
marker.touch()
elif self._venv_type == 'venv':
elif self._venv_type == "venv":
self._customize_site()

View file

@ -30,9 +30,7 @@ from tests.lib.path import Path
# path, digest, size
RecordLike = Tuple[str, str, str]
RecordCallback = Callable[
[List["Record"]], Union[str, bytes, List[RecordLike]]
]
RecordCallback = Callable[[List["Record"]], Union[str, bytes, List[RecordLike]]]
# As would be used in metadata
HeaderValue = Union[str, List[str]]
@ -97,11 +95,13 @@ def make_metadata_file(
if value is not _default:
return File(path, ensure_binary(value))
metadata = CaseInsensitiveDict({
metadata = CaseInsensitiveDict(
{
"Metadata-Version": "2.1",
"Name": name,
"Version": version,
})
}
)
if updates is not _default:
metadata.update(updates)
@ -128,12 +128,14 @@ def make_wheel_metadata_file(
if value is not _default:
return File(path, ensure_binary(value))
metadata = CaseInsensitiveDict({
metadata = CaseInsensitiveDict(
{
"Wheel-Version": "1.0",
"Generator": "pip-test-suite",
"Root-Is-Purelib": "true",
"Tag": ["-".join(parts) for parts in tags],
})
}
)
if updates is not _default:
metadata.update(updates)
@ -172,10 +174,7 @@ def make_entry_points_file(
def make_files(files):
# type: (Dict[str, AnyStr]) -> List[File]
return [
File(name, ensure_binary(contents))
for name, contents in files.items()
]
return [File(name, ensure_binary(contents)) for name, contents in files.items()]
def make_metadata_files(name, version, files):
@ -203,9 +202,7 @@ def urlsafe_b64encode_nopad(data):
def digest(contents):
# type: (bytes) -> str
return "sha256={}".format(
urlsafe_b64encode_nopad(sha256(contents).digest())
)
return "sha256={}".format(urlsafe_b64encode_nopad(sha256(contents).digest()))
def record_file_maker_wrapper(
@ -219,9 +216,7 @@ def record_file_maker_wrapper(
records = [] # type: List[Record]
for file in files:
records.append(
Record(
file.name, digest(file.contents), str(len(file.contents))
)
Record(file.name, digest(file.contents), str(len(file.contents)))
)
yield file
@ -250,19 +245,20 @@ def record_file_maker_wrapper(
def wheel_name(name, version, pythons, abis, platforms):
# type: (str, str, str, str, str) -> str
stem = "-".join([
stem = "-".join(
[
name,
version,
".".join(pythons),
".".join(abis),
".".join(platforms),
])
]
)
return f"{stem}.whl"
class WheelBuilder:
"""A wheel that can be saved or converted to several formats.
"""
"""A wheel that can be saved or converted to several formats."""
def __init__(self, name, files):
# type: (str, List[File]) -> None
@ -390,9 +386,7 @@ def make_wheel(
tags = list(itertools.product(pythons, abis, platforms))
possible_files = [
make_metadata_file(
name, version, metadata, metadata_updates, metadata_body
),
make_metadata_file(name, version, metadata, metadata_updates, metadata_body),
make_wheel_metadata_file(
name, version, wheel_metadata, tags, wheel_metadata_updates
),
@ -403,9 +397,7 @@ def make_wheel(
possible_files.extend(make_files(extra_files))
if extra_metadata_files is not _default:
possible_files.extend(
make_metadata_files(name, version, extra_metadata_files)
)
possible_files.extend(make_metadata_files(name, version, extra_metadata_files))
if extra_data_files is not _default:
possible_files.extend(make_data_files(name, version, extra_data_files))

View file

@ -209,6 +209,8 @@ class TestWheel:
with pytest.raises(BestVersionAlreadyInstalled):
finder.find_requirement(req, True)
class TestCandidateEvaluator:
def test_link_sorting(self):
"""
Test link sorting
@ -249,7 +251,8 @@ class TestWheel:
results = sorted(links, key=sort_key, reverse=True)
results2 = sorted(reversed(links), key=sort_key, reverse=True)
assert links == results == results2, results2
assert links == results, results
assert links == results2, results2
def test_link_sorting_wheels_with_build_tags(self):
"""Verify build tags affect sorting."""
@ -274,7 +277,47 @@ class TestWheel:
sort_key = candidate_evaluator._sort_key
results = sorted(links, key=sort_key, reverse=True)
results2 = sorted(reversed(links), key=sort_key, reverse=True)
assert links == results == results2, results2
assert links == results, results
assert links == results2, results2
def test_build_tag_is_less_important_than_other_tags(self):
links = [
InstallationCandidate(
"simple",
"1.0",
Link('simple-1.0-1-py3-abi3-linux_x86_64.whl'),
),
InstallationCandidate(
"simple",
'1.0',
Link('simple-1.0-2-py3-abi3-linux_i386.whl'),
),
InstallationCandidate(
"simple",
'1.0',
Link('simple-1.0-2-py3-any-none.whl'),
),
InstallationCandidate(
"simple",
'1.0',
Link('simple-1.0.tar.gz'),
),
]
valid_tags = [
Tag('py3', 'abi3', 'linux_x86_64'),
Tag('py3', 'abi3', 'linux_i386'),
Tag('py3', 'any', 'none'),
]
evaluator = CandidateEvaluator(
'my-project', supported_tags=valid_tags, specifier=SpecifierSet(),
)
sort_key = evaluator._sort_key
results = sorted(links, key=sort_key, reverse=True)
results2 = sorted(reversed(links), key=sort_key, reverse=True)
assert links == results, results
assert links == results2, results2
def test_finder_priority_file_over_page(data):

View file

@ -1,60 +0,0 @@
# New resolver error messages
## Incompatible requirements
Most resolver error messages are due to incompatible requirements.
That is, the dependency tree contains conflicting versions of the same
package. Take the example:
base:
available:
- A 1.0.0; depends B == 1.0.0, C == 2.0.0
- B 1.0.0; depends C == 1.0.0
- C 1.0.0
- C 2.0.0
Here, `A` cannot be installed because it depends on `B` (which depends on
a different version of `C` than `A` itself does). In real-world examples, the
conflicting versions are not so easy to spot. I'm suggesting an error
message which looks something like this:
A 1.0.0 -> B 1.0.0 -> C 1.0.0
A 1.0.0 -> C 2.0.0
That is, for the conflicting package, we show the user where exactly the
requirement came from.
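As a hypothetical sketch (not pip's actual implementation), such chains could be rendered by recording, for each pinned package, the parent pin that required it, and walking back to the root:

# Hypothetical sketch only; not pip's code. "parents" maps each pinned
# package to the pin that required it (None for user-requested packages).
def format_chain(pin, parents):
    chain = [pin]
    while parents.get(chain[-1]) is not None:
        chain.append(parents[chain[-1]])
    return " -> ".join(reversed(chain))

# The example above: A 1.0.0 requires B == 1.0.0 and C == 2.0.0,
# while B 1.0.0 requires C == 1.0.0.
parents = {
    "A 1.0.0": None,
    "B 1.0.0": "A 1.0.0",
    "C 1.0.0": "B 1.0.0",
    "C 2.0.0": "A 1.0.0",
}

# Prints:
#   A 1.0.0 -> B 1.0.0 -> C 1.0.0
#   A 1.0.0 -> C 2.0.0
for conflicting_pin in ("C 1.0.0", "C 2.0.0"):
    print(format_chain(conflicting_pin, parents))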
## Double requirement
I've noticed that in many cases the old resolver messages are more
informative. For example, take this simple index:
base:
available:
- B 1.0.0
- B 2.0.0
Now if we want to install both versions of `B` at the same time,
i.e. the requirement `B==1.0.0 B==2.0.0`, we get:
ERROR: Could not find a version that satisfies the requirement B==1.0.0
ERROR: Could not find a version that satisfies the requirement B==2.0.0
No matching distribution found for b, b
Even though both versions are actually available and each satisfies one of the
requirements, they just cannot be installed at once. When trying to install a
version of `B` which does not exist, say requirement `B==1.5.0`, you get the
same type of error message:
Could not find a version that satisfies the requirement B==1.5.0
No matching distribution found for b
For this case, the old error message was:
Could not find a version that satisfies the requirement B==1.5.0 (from versions: 1.0.0, 2.0.0)
No matching distribution found for B==1.5.0
And the old error message for the requirement `B==1.0.0 B==2.0.0`:
Double requirement given: B==2.0.0 (already in B==1.0.0, name='B')

View file

@ -1,74 +0,0 @@
# YAML tests for pip's resolver
This directory contains fixtures for testing pip's resolver.
The fixtures are written as `.yml` files, with a convenient format
that allows for specifying a custom index for temporary use.
The `.yml` files are typically organized in the following way. Here, we are
going to take a closer look at the `simple.yml` file and step through the
test cases. A `base` section defines which packages are available upstream:
base:
available:
- simple 0.1.0
- simple 0.2.0
- base 0.1.0; depends dep
- dep 0.1.0
Each package has a name and version number. Here, there are two
versions of the package `simple` (`0.1.0` and `0.2.0`). The package
`base 0.1.0` depends on the requirement `dep` (which simply means it
depends on any version of `dep`). More generally, a package can also
depend on a specific version of another package, or a range of versions.
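As an illustration only (the real parsing lives in the test support code and may differ), an `available` entry of this form splits into a name, a version, and an optional list of dependency requirements:

# Illustrative sketch of the entry syntax shown above, e.g.
# "base 0.1.0; depends dep" or "A 1.0.0; depends B == 1.0.0, C == 2.0.0".
def parse_entry(entry):
    head, _, depends = entry.partition("; depends")
    name, version = head.split()
    requirements = [req.strip() for req in depends.split(",")] if depends else []
    return name, version, requirements

assert parse_entry("simple 0.1.0") == ("simple", "0.1.0", [])
assert parse_entry("base 0.1.0; depends dep") == ("base", "0.1.0", ["dep"])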
Next, in our yaml file, we have the `cases:` section which is a list of
test cases. Each test case has a request and a response. The request
is what the user would want to do:
cases:
-
request:
- install: simple
- uninstall: simple
response:
- state:
- simple 0.2.0
- state: null
Here, the first request is to install the package `simple`; this is
basically equivalent to typing `pip install simple`. The corresponding
first response is that the resulting state of installed packages is `simple 0.2.0`.
Note that by default the highest version of an available package will be
installed.
The second request is to uninstall simple again, which will result in the
state `null` (basically an empty list of installed packages).
When the yaml tests are run, each response is verified by checking which
packages actually got installed. Note that this check is done in
alphabetical order.
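For orientation, here is a minimal sketch of that verification loop, assuming PyYAML is available; `install_and_list_packages` is a hypothetical stand-in for actually running the requests against a temporary index built from the `base` section (the real harness is `tests/functional/test_yaml.py`):

# Conceptual sketch only; install_and_list_packages is hypothetical.
import yaml

def check_fixture(path, install_and_list_packages):
    with open(path) as f:
        fixture = yaml.safe_load(f)
    for case in fixture["cases"]:
        for request, response in zip(case["request"], case["response"]):
            state = install_and_list_packages(request)
            # Installed packages are compared in alphabetical order.
            assert sorted(state or []) == sorted(response["state"] or [])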
The linter is very useful for initially checking `.yml` files, e.g.:
$ python linter.py -v simple.yml
To run only the yaml tests, use (from the root of the source tree):
$ tox -e py38 -- -m yaml -vv
Or, in order to avoid collecting all the test cases:
$ tox -e py38 -- tests/functional/test_yaml.py
Or, only a specific test:
$ tox -e py38 -- tests/functional/test_yaml.py -k simple
Or, just a specific test case:
$ tox -e py38 -- tests/functional/test_yaml.py -k simple-0
<!-- TODO: Add a good description of the format and how it can be used. -->

Some files were not shown because too many files have changed in this diff.