Compare commits

...

16 Commits

Author SHA1 Message Date
Sviatoslav Sydorenko 127b56566f
Merge 62fae81223 into a15dd75d98 2023-12-02 21:31:41 -08:00
Tzu-ping Chung a15dd75d98
Merge pull request #12417 from xqm32/fix-outdated-pip-install 2023-11-28 16:08:29 +09:00
Tzu-ping Chung d8ab6dc6c1 Clarify news fragment 2023-11-28 15:06:25 +08:00
Qiming Xu fe10d368f6
Add end line 2023-11-28 14:25:56 +08:00
Qiming Xu 28250baffb
Fix line wrap length and add news entry 2023-11-28 14:17:51 +08:00
Qiming Xu 88ac529219
Fix outdated pip install argument description 2023-11-28 13:15:31 +08:00
Damian Shaw 2a0acb595c
Update and provide fixes for mypy pre-commit (#12389)
* Update mypy to 1.6.1

* Fix mypy "Source file found twice under different module names" error

* Ignore type of initialized abstract class in tests

* Use more specific type ignore method-assign

* Type ignore for message.get_all

* Remove unused type ignore

* Add SizedBuffer type for xmlrpc.client.Transport subclass

* Add Self type for RequestHandlerClass in test

* Add type ignore for shutil.rmtree onexc handler

* Quote SizedBuffer

* Add news entry

* Remove no longer correct comment

* Update self import

* Also ignore type onerror=handler

* Update news entry

* Update news entry
2023-11-07 09:39:01 +00:00
Damian Shaw 68529081c2
Enforce f-strings via Ruff (#12393) 2023-11-07 09:14:56 +00:00
Damian Shaw 9685f64fe8
Update ruff and config (#12390) 2023-11-06 09:30:05 +00:00
Dale fd77ebfc74
Rework the functionality of PIP_CONFIG_FILE (#11850) 2023-10-27 14:59:56 +02:00
efflamlemaillet 6dbd9c68f0
Fix hg: "parse error at 0: not a prefix:" (#12373)
Use the two-hyphen argument `--rev=` instead of `-r=`

Co-authored-by: Efflam Lemaillet <elemaillet@logilab.fr>
Co-authored-by: Pradyun Gedam <pradyunsg@gmail.com>
2023-10-27 11:08:17 +02:00
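The fix above swaps Mercurial's short `-r=` form for the long one. A minimal sketch of the idea (the helper name is illustrative, not pip's actual call site):

```python
# Mercurial parses "-r=abc" as the revset "=abc" and fails with
# "parse error at 0: not a prefix: =". The long "--rev=abc" form is
# unambiguous, so build the command with it instead.
def hg_update_command(ref: str) -> list:
    return ["hg", "update", f"--rev={ref}"]

cmd = hg_update_command("1a2b3c")
```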
Stéphane Bidoul 7aaca9f2c4
Merge pull request #12370 from sbidoul/release/23.3.1
Release/23.3.1
2023-10-21 13:05:39 +02:00
Stéphane Bidoul 576dbd813c Bump for development 2023-10-21 12:57:41 +02:00
Stéphane Bidoul 5364f26f96 Bump for release 2023-10-21 12:57:31 +02:00
Itamar Turner-Trauring 5e7cc16c3b
Fix parallel pip cache downloads causing crash (#12364)
Co-authored-by: Itamar Turner-Trauring <itamar@pythonspeed.com>
2023-10-18 23:14:22 +01:00
Sviatoslav Sydorenko 62fae81223
Log ending round state @ during resolver debug
This change corrects a tiny-little mistake made in
9f318de7b6 that was causing the word
"state" to be printed out in the log instead of the value of the
`ending_round()`'s `state` argument.
2023-09-07 01:08:19 +02:00
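The mistake described above is the classic missing `f` prefix on a would-be f-string; a minimal reproduction (function names hypothetical):

```python
def describe_round_broken(state: str) -> str:
    # Missing f prefix: the literal text "{state}" ends up in the log.
    return "Ending round with {state}"

def describe_round_fixed(state: str) -> str:
    # With the f prefix, the value of the state argument is interpolated.
    return f"Ending round with {state}"
```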
86 changed files with 322 additions and 397 deletions

@@ -22,25 +22,26 @@ repos:
   - id: black

 - repo: https://github.com/astral-sh/ruff-pre-commit
-  rev: v0.0.292
+  rev: v0.1.4
   hooks:
   - id: ruff
+    args: [--fix, --exit-non-zero-on-fix]

 - repo: https://github.com/pre-commit/mirrors-mypy
-  rev: v0.961
+  rev: v1.6.1
   hooks:
   - id: mypy
     exclude: tests/data
     args: ["--pretty", "--show-error-codes"]
     additional_dependencies: [
-      'keyring==23.0.1',
-      'nox==2021.6.12',
+      'keyring==24.2.0',
+      'nox==2023.4.22',
       'pytest',
-      'types-docutils==0.18.3',
-      'types-setuptools==57.4.14',
-      'types-freezegun==1.1.9',
-      'types-six==1.16.15',
-      'types-pyyaml==6.0.12.2',
+      'types-docutils==0.20.0.3',
+      'types-setuptools==68.2.0.0',
+      'types-freezegun==1.1.10',
+      'types-six==1.16.21.9',
+      'types-pyyaml==6.0.12.12',
     ]

 - repo: https://github.com/pre-commit/pygrep-hooks

@@ -9,6 +9,16 @@

 .. towncrier release notes start

+23.3.1 (2023-10-21)
+===================
+
+Bug Fixes
+---------
+
+- Handle a timezone indicator of Z when parsing dates in the self check. (`#12338 <https://github.com/pypa/pip/issues/12338>`_)
+- Fix bug where installing the same package at the same time with multiple pip processes could fail. (`#12361 <https://github.com/pypa/pip/issues/12361>`_)
+
 23.3 (2023-10-15)
 =================

@@ -45,8 +45,8 @@ When looking at the items to be installed, pip checks what type of item
 each is, in the following order:

 1. Project or archive URL.
-2. Local directory (which must contain a ``setup.py``, or pip will report
-   an error).
+2. Local directory (which must contain a ``pyproject.toml`` or ``setup.py``,
+   otherwise pip will report an error).
 3. Local file (a sdist or wheel format archive, following the naming
    conventions for those formats).
 4. A requirement, as specified in :pep:`440`.

@@ -19,8 +19,8 @@ and how they are related to pip's various command line options.

 ## Configuration Files

-Configuration files can change the default values for command line option.
-They are written using a standard INI style configuration files.
+Configuration files can change the default values for command line options.
+The files are written using standard INI format.

 pip has 3 "levels" of configuration files:

@@ -28,11 +28,15 @@ pip has 3 "levels" of configuration files:
 - `user`: per-user configuration file.
 - `site`: per-environment configuration file; i.e. per-virtualenv.

+Additionally, environment variables can be specified which will override any of the above.
+
 ### Location

 pip's configuration files are located in fairly standard locations. This
 location is different on different operating systems, and has some additional
-complexity for backwards compatibility reasons.
+complexity for backwards compatibility reasons. Note that if user config files
+exist in both the legacy and current locations, values in the current file
+will override values in the legacy file.

 ```{tab} Unix

@@ -88,9 +92,10 @@ Site
 ### `PIP_CONFIG_FILE`

 Additionally, the environment variable `PIP_CONFIG_FILE` can be used to specify
-a configuration file that's loaded first, and whose values are overridden by
-the values set in the aforementioned files. Setting this to {any}`os.devnull`
-disables the loading of _all_ configuration files.
+a configuration file that's loaded last, and whose values override the values
+set in the aforementioned files. Setting this to {any}`os.devnull`
+disables the loading of _all_ configuration files. Note that if a file exists
+at the location that this is set to, the user config file will not be loaded.

 (config-precedence)=

@@ -99,10 +104,10 @@ disables the loading of _all_ configuration files.
 When multiple configuration files are found, pip combines them in the following
 order:

-- `PIP_CONFIG_FILE`, if given.
 - Global
 - User
 - Site
+- `PIP_CONFIG_FILE`, if given.

 Each file read overrides any values read from previous files, so if the
 global timeout is specified in both the global file and the per-user file
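The resulting precedence can be sketched as a simple dictionary merge in which later layers win (a hedged illustration of the documented behavior, not pip's internal API):

```python
# Merge config layers in pip's documented order; each update() call lets a
# later layer override keys set by earlier ones, so PIP_CONFIG_FILE wins.
def combine_configs(global_cfg, user_cfg, site_cfg, env_file_cfg=None):
    merged = {}
    for layer in (global_cfg, user_cfg, site_cfg, env_file_cfg or {}):
        merged.update(layer)
    return merged

merged = combine_configs(
    {"timeout": "60"},   # global
    {"timeout": "10"},   # user
    {},                  # site
    {"timeout": "5"},    # PIP_CONFIG_FILE, applied last
)
```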

@@ -194,22 +194,17 @@ class PipReqFileOptionsReference(PipOptions):
         opt = option()
         opt_name = opt._long_opts[0]
         if opt._short_opts:
-            short_opt_name = "{}, ".format(opt._short_opts[0])
+            short_opt_name = f"{opt._short_opts[0]}, "
         else:
             short_opt_name = ""

         if option in cmdoptions.general_group["options"]:
             prefix = ""
         else:
-            prefix = "{}_".format(self.determine_opt_prefix(opt_name))
+            prefix = f"{self.determine_opt_prefix(opt_name)}_"

         self.view_list.append(
-            "* :ref:`{short}{long}<{prefix}{opt_name}>`".format(
-                short=short_opt_name,
-                long=opt_name,
-                prefix=prefix,
-                opt_name=opt_name,
-            ),
+            f"* :ref:`{short_opt_name}{opt_name}<{prefix}{opt_name}>`",
             "\n",
         )

news/11815.doc.rst (new file)

@@ -0,0 +1 @@
+Fix explanation of how PIP_CONFIG_FILE works

@@ -1 +0,0 @@
-Handle a timezone indicator of Z when parsing dates in the self check.

news/12389.bugfix.rst (new file)

@@ -0,0 +1 @@
+Update mypy to 1.6.1 and fix/ignore types

news/12390.trivial.rst (new file)

@@ -0,0 +1 @@
+Update ruff versions and config for dev

news/12393.trivial.rst (new file)

@@ -0,0 +1 @@
+Enforce and update code to use f-strings via Ruff rule UP032

news/12417.doc.rst (new file)

@@ -0,0 +1 @@
+Fix outdated pip install argument description in documentation.

@@ -0,0 +1 @@
+Fix mercurial revision "parse error": use ``--rev={ref}`` instead of ``-r={ref}``

@@ -84,8 +84,8 @@ ignore = [
     "B020",
     "B904", # Ruff enables opinionated warnings by default
     "B905", # Ruff enables opinionated warnings by default
-    "G202",
 ]
-target-version = "py37"
 line-length = 88
 select = [
     "ASYNC",
@@ -102,6 +102,7 @@ select = [
     "PLR0",
     "W",
     "RUF100",
+    "UP032",
 ]

 [tool.ruff.isort]
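Rule UP032 flags `str.format()` calls whose arguments can be inlined; both spellings produce identical strings:

```python
# The two forms below are equivalent; Ruff's UP032 rewrites the first
# into the second.
name, version = "pip", "23.3.1"
formatted = "{} ({})".format(name, version)
fstring = f"{name} ({version})"
```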

@@ -77,7 +77,7 @@ setup(
     entry_points={
         "console_scripts": [
             "pip=pip._internal.cli.main:main",
-            "pip{}=pip._internal.cli.main:main".format(sys.version_info[0]),
+            f"pip{sys.version_info[0]}=pip._internal.cli.main:main",
             "pip{}.{}=pip._internal.cli.main:main".format(*sys.version_info[:2]),
         ],
     },

@@ -582,10 +582,7 @@ def _handle_python_version(
     """
     version_info, error_msg = _convert_python_version(value)
     if error_msg is not None:
-        msg = "invalid --python-version value: {!r}: {}".format(
-            value,
-            error_msg,
-        )
+        msg = f"invalid --python-version value: {value!r}: {error_msg}"
         raise_option_error(parser, option=option, msg=msg)

     parser.values.python_version = version_info

@@ -921,9 +918,9 @@ def _handle_merge_hash(
         algo, digest = value.split(":", 1)
     except ValueError:
         parser.error(
-            "Arguments to {} must be a hash name "
+            f"Arguments to {opt_str} must be a hash name "
             "followed by a value, like --hash=sha256:"
-            "abcde...".format(opt_str)
+            "abcde..."
         )
     if algo not in STRONG_HASHES:
         parser.error(

@@ -229,9 +229,9 @@ class ConfigOptionParser(CustomOptionParser):
                 val = strtobool(val)
             except ValueError:
                 self.error(
-                    "{} is not a valid value for {} option, "
+                    f"{val} is not a valid value for {key} option, "
                     "please specify a boolean value like yes/no, "
-                    "true/false or 1/0 instead.".format(val, key)
+                    "true/false or 1/0 instead."
                 )
         elif option.action == "count":
             with suppress(ValueError):
@@ -240,10 +240,10 @@ class ConfigOptionParser(CustomOptionParser):
                 val = int(val)
             if not isinstance(val, int) or val < 0:
                 self.error(
-                    "{} is not a valid value for {} option, "
+                    f"{val} is not a valid value for {key} option, "
                     "please instead specify either a non-negative integer "
                     "or a boolean value like yes/no or false/true "
-                    "which is equivalent to 1/0.".format(val, key)
+                    "which is equivalent to 1/0."
                 )
         elif option.action == "append":
             val = val.split()
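The error messages above guard a boolean-parsing step; a local stand-in for the `strtobool` helper they wrap (illustrative, mirroring the yes/no, true/false, on/off, 1/0 forms the messages mention):

```python
# Accept the usual truthy/falsy spellings and raise ValueError otherwise,
# which is the condition the self.error() calls above handle.
def strtobool(val: str) -> int:
    val = val.lower()
    if val in ("y", "yes", "t", "true", "on", "1"):
        return 1
    if val in ("n", "no", "f", "false", "off", "0"):
        return 0
    raise ValueError(f"invalid truth value {val!r}")
```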

@@ -175,7 +175,7 @@ class CacheCommand(Command):
             files += self._find_http_files(options)
         else:
             # Add the pattern to the log message
-            no_matching_msg += ' for pattern "{}"'.format(args[0])
+            no_matching_msg += f' for pattern "{args[0]}"'

         if not files:
             logger.warning(no_matching_msg)

@@ -242,17 +242,15 @@ class ConfigurationCommand(Command):
                 e.filename = editor
                 raise
         except subprocess.CalledProcessError as e:
-            raise PipError(
-                "Editor Subprocess exited with exit code {}".format(e.returncode)
-            )
+            raise PipError(f"Editor Subprocess exited with exit code {e.returncode}")

     def _get_n_args(self, args: List[str], example: str, n: int) -> Any:
         """Helper to make sure the command got the right number of arguments"""
         if len(args) != n:
             msg = (
-                "Got unexpected number of arguments, expected {}. "
-                '(example: "{} config {}")'
-            ).format(n, get_prog(), example)
+                f"Got unexpected number of arguments, expected {n}. "
+                f'(example: "{get_prog()} config {example}")'
+            )
             raise PipError(msg)

         if n == 1:

@@ -95,7 +95,7 @@ def show_actual_vendor_versions(vendor_txt_versions: Dict[str, str]) -> None:
     elif parse_version(actual_version) != parse_version(expected_version):
         extra_message = (
             " (CONFLICT: vendor.txt suggests version should"
-            " be {})".format(expected_version)
+            f" be {expected_version})"
         )
     logger.info("%s==%s%s", module_name, actual_version, extra_message)

@@ -120,7 +120,7 @@ def show_tags(options: Values) -> None:
     if formatted_target:
         suffix = f" (target: {formatted_target})"

-    msg = "Compatible tags: {}{}".format(len(tags), suffix)
+    msg = f"Compatible tags: {len(tags)}{suffix}"
     logger.info(msg)

     if options.verbose < 1 and len(tags) > tag_limit:

@@ -134,9 +134,7 @@ def show_tags(options: Values) -> None:
         logger.info(str(tag))

     if tags_limited:
-        msg = (
-            "...\n[First {tag_limit} tags shown. Pass --verbose to show all.]"
-        ).format(tag_limit=tag_limit)
+        msg = f"...\n[First {tag_limit} tags shown. Pass --verbose to show all.]"
         logger.info(msg)

@@ -128,12 +128,12 @@ class IndexCommand(IndexGroupCommand):
         if not versions:
             raise DistributionNotFound(
-                "No matching distribution found for {}".format(query)
+                f"No matching distribution found for {query}"
             )

         formatted_versions = [str(ver) for ver in sorted(versions, reverse=True)]
         latest = formatted_versions[0]

-        write_output("{} ({})".format(query, latest))
+        write_output(f"{query} ({latest})")
         write_output("Available versions: {}".format(", ".join(formatted_versions)))
         print_dist_installation_info(query, latest)

@@ -607,12 +607,8 @@ class InstallCommand(RequirementCommand):
             version = package_set[project_name][0]
             for dependency in missing[project_name]:
                 message = (
-                    "{name} {version} requires {requirement}, "
+                    f"{project_name} {version} requires {dependency[1]}, "
                     "which is not installed."
-                ).format(
-                    name=project_name,
-                    version=version,
-                    requirement=dependency[1],
                 )
                 parts.append(message)

@@ -59,8 +59,8 @@ def _disassemble_key(name: str) -> List[str]:
     if "." not in name:
         error_message = (
             "Key does not contain dot separated section and key. "
-            "Perhaps you wanted to use 'global.{}' instead?"
-        ).format(name)
+            f"Perhaps you wanted to use 'global.{name}' instead?"
+        )
         raise ConfigurationError(error_message)
     return name.split(".", 1)

@@ -327,33 +327,35 @@ class Configuration:
     def iter_config_files(self) -> Iterable[Tuple[Kind, List[str]]]:
         """Yields variant and configuration files associated with it.

-        This should be treated like items of a dictionary.
+        This should be treated like items of a dictionary. The order
+        here doesn't affect what gets overridden. That is controlled
+        by OVERRIDE_ORDER. However this does control the order they are
+        displayed to the user. It's probably most ergonomic to display
+        things in the same order as OVERRIDE_ORDER.
         """
         # SMELL: Move the conditions out of this function

-        # environment variables have the lowest priority
-        config_file = os.environ.get("PIP_CONFIG_FILE", None)
-        if config_file is not None:
-            yield kinds.ENV, [config_file]
-        else:
-            yield kinds.ENV, []
+        env_config_file = os.environ.get("PIP_CONFIG_FILE", None)
         config_files = get_configuration_files()

-        # at the base we have any global configuration
         yield kinds.GLOBAL, config_files[kinds.GLOBAL]

-        # per-user configuration next
+        # per-user config is not loaded when env_config_file exists
         should_load_user_config = not self.isolated and not (
-            config_file and os.path.exists(config_file)
+            env_config_file and os.path.exists(env_config_file)
         )
         if should_load_user_config:
             # The legacy config file is overridden by the new config file
             yield kinds.USER, config_files[kinds.USER]

-        # finally virtualenv configuration first trumping others
+        # virtualenv config
         yield kinds.SITE, config_files[kinds.SITE]

+        if env_config_file is not None:
+            yield kinds.ENV, [env_config_file]
+        else:
+            yield kinds.ENV, []
+
     def get_values_in_config(self, variant: Kind) -> Dict[str, Any]:
         """Get values present in a config file"""
         return self._config[variant]
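The new yield order above can be summarized with a small sketch: the ENV (`PIP_CONFIG_FILE`) source moves to the end, and the user config is skipped when `PIP_CONFIG_FILE` points at an existing file (names here are illustrative, not pip's internal API):

```python
# Return the kinds in the order iter_config_files() would yield them,
# given whether PIP_CONFIG_FILE is set and whether that file exists.
def config_file_order(env_config_file, env_file_exists, isolated=False):
    order = ["global"]
    if not isolated and not (env_config_file and env_file_exists):
        order.append("user")
    order.append("site")
    order.append("env")  # yielded last so its values override the others
    return order
```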

@@ -247,10 +247,7 @@ class NoneMetadataError(PipError):
     def __str__(self) -> str:
         # Use `dist` in the error message because its stringification
         # includes more information, like the version and location.
-        return "None {} metadata found for distribution: {}".format(
-            self.metadata_name,
-            self.dist,
-        )
+        return f"None {self.metadata_name} metadata found for distribution: {self.dist}"

 class UserInstallationInvalid(InstallationError):

@@ -594,7 +591,7 @@ class HashMismatch(HashError):
         self.gots = gots

     def body(self) -> str:
-        return " {}:\n{}".format(self._requirement_name(), self._hash_comparison())
+        return f" {self._requirement_name()}:\n{self._hash_comparison()}"

     def _hash_comparison(self) -> str:
         """
@@ -616,11 +613,9 @@ class HashMismatch(HashError):
         lines: List[str] = []
         for hash_name, expecteds in self.allowed.items():
             prefix = hash_then_or(hash_name)
-            lines.extend(
-                (" Expected {} {}".format(next(prefix), e)) for e in expecteds
-            )
+            lines.extend((f" Expected {next(prefix)} {e}") for e in expecteds)
             lines.append(
-                " Got {}\n".format(self.gots[hash_name].hexdigest())
+                f" Got {self.gots[hash_name].hexdigest()}\n"
             )

         return "\n".join(lines)

@@ -533,8 +533,8 @@ class CandidateEvaluator:
             )
         except ValueError:
             raise UnsupportedWheel(
-                "{} is not a supported wheel for this platform. It "
-                "can't be sorted.".format(wheel.filename)
+                f"{wheel.filename} is not a supported wheel for this platform. It "
+                "can't be sorted."
             )
         if self._prefer_binary:
             binary_preference = 1

@@ -939,9 +939,7 @@ class PackageFinder:
             _format_versions(best_candidate_result.iter_all()),
         )

-        raise DistributionNotFound(
-            "No matching distribution found for {}".format(req)
-        )
+        raise DistributionNotFound(f"No matching distribution found for {req}")

 def _should_install_candidate(
     candidate: Optional[InstallationCandidate],

@@ -56,8 +56,7 @@ def distutils_scheme(
     try:
         d.parse_config_files()
     except UnicodeDecodeError:
-        # Typeshed does not include find_config_files() for some reason.
-        paths = d.find_config_files()  # type: ignore
+        paths = d.find_config_files()
         logger.warning(
             "Ignore distutils configs in %s due to encoding errors.",
             ", ".join(os.path.basename(p) for p in paths),

@@ -64,10 +64,10 @@ def msg_to_json(msg: Message) -> Dict[str, Any]:
         key = json_name(field)
         if multi:
             value: Union[str, List[str]] = [
-                sanitise_header(v) for v in msg.get_all(field)
+                sanitise_header(v) for v in msg.get_all(field)  # type: ignore
             ]
         else:
-            value = sanitise_header(msg.get(field))
+            value = sanitise_header(msg.get(field))  # type: ignore
         if key == "keywords":
             # Accept both comma-separated and space-separated
             # forms, for better compatibility with old data.

@@ -27,8 +27,4 @@ class InstallationCandidate(KeyBasedCompareMixin):
         )

     def __str__(self) -> str:
-        return "{!r} candidate (version {} at {})".format(
-            self.name,
-            self.version,
-            self.link,
-        )
+        return f"{self.name!r} candidate (version {self.version} at {self.link})"

@@ -31,9 +31,7 @@ def _get(
     value = d[key]
     if not isinstance(value, expected_type):
         raise DirectUrlValidationError(
-            "{!r} has unexpected type for {} (expected {})".format(
-                value, key, expected_type
-            )
+            f"{value!r} has unexpected type for {key} (expected {expected_type})"
         )
     return value

@@ -33,9 +33,7 @@ class FormatControl:
         return all(getattr(self, k) == getattr(other, k) for k in self.__slots__)

     def __repr__(self) -> str:
-        return "{}({}, {})".format(
-            self.__class__.__name__, self.no_binary, self.only_binary
-        )
+        return f"{self.__class__.__name__}({self.no_binary}, {self.only_binary})"

     @staticmethod
     def handle_mutual_excludes(value: str, target: Set[str], other: Set[str]) -> None:

@@ -368,9 +368,7 @@ class Link(KeyBasedCompareMixin):
         else:
             rp = ""
         if self.comes_from:
-            return "{} (from {}){}".format(
-                redact_auth_from_url(self._url), self.comes_from, rp
-            )
+            return f"{redact_auth_from_url(self._url)} (from {self.comes_from}){rp}"
         else:
             return redact_auth_from_url(str(self._url))

@@ -33,6 +33,18 @@ class SafeFileCache(SeparateBodyBaseCache):
     """
     A file based cache which is safe to use even when the target directory may
     not be accessible or writable.
+
+    There is a race condition when two processes try to write and/or read the
+    same entry at the same time, since each entry consists of two separate
+    files (https://github.com/psf/cachecontrol/issues/324). We therefore have
+    additional logic that makes sure that both files are present before
+    returning an entry; this fixes the read side of the race condition.
+
+    For the write side, we assume that the server will only ever return the
+    same data for the same URL, which ought to be the case for files pip is
+    downloading. PyPI does not have a mechanism to swap out a wheel for
+    another wheel, for example. If this assumption is not true, the
+    CacheControl issue will need to be fixed.
     """

     def __init__(self, directory: str) -> None:

@@ -49,9 +61,13 @@ class SafeFileCache(SeparateBodyBaseCache):
         return os.path.join(self.directory, *parts)

     def get(self, key: str) -> Optional[bytes]:
-        path = self._get_cache_path(key)
+        # The cache entry is only valid if both metadata and body exist.
+        metadata_path = self._get_cache_path(key)
+        body_path = metadata_path + ".body"
+        if not (os.path.exists(metadata_path) and os.path.exists(body_path)):
+            return None
         with suppressed_cache_errors():
-            with open(path, "rb") as f:
+            with open(metadata_path, "rb") as f:
                 return f.read()

     def _write(self, path: str, data: bytes) -> None:

@@ -77,9 +93,13 @@ class SafeFileCache(SeparateBodyBaseCache):
             os.remove(path + ".body")

     def get_body(self, key: str) -> Optional[BinaryIO]:
-        path = self._get_cache_path(key) + ".body"
+        # The cache entry is only valid if both metadata and body exist.
+        metadata_path = self._get_cache_path(key)
+        body_path = metadata_path + ".body"
+        if not (os.path.exists(metadata_path) and os.path.exists(body_path)):
+            return None
         with suppressed_cache_errors():
-            return open(path, "rb")
+            return open(body_path, "rb")

     def set_body(self, key: str, body: bytes) -> None:
         path = self._get_cache_path(key) + ".body"
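The read-side fix above can be demonstrated in isolation: an entry is only served when both the metadata file and its `.body` companion exist, so a half-written entry becomes a cache miss instead of a crash for a concurrent reader (a standalone sketch, not pip's class):

```python
import os
import tempfile

def read_cache_entry(metadata_path: str):
    # Treat the entry as missing unless both of its files are present.
    body_path = metadata_path + ".body"
    if not (os.path.exists(metadata_path) and os.path.exists(body_path)):
        return None
    with open(metadata_path, "rb") as f:
        return f.read()

with tempfile.TemporaryDirectory() as d:
    meta = os.path.join(d, "entry")
    with open(meta, "wb") as f:
        f.write(b"headers")
    missing = read_cache_entry(meta)   # body not written yet -> cache miss
    with open(meta + ".body", "wb") as f:
        f.write(b"payload")
    present = read_cache_entry(meta)   # both files exist -> metadata bytes
```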

@@ -42,7 +42,7 @@ def _prepare_download(
     logged_url = redact_auth_from_url(url)

     if total_length:
-        logged_url = "{} ({})".format(logged_url, format_size(total_length))
+        logged_url = f"{logged_url} ({format_size(total_length)})"

     if is_from_cache(resp):
         logger.info("Using cached %s", logged_url)

@@ -13,6 +13,8 @@ from pip._internal.network.utils import raise_for_status
 if TYPE_CHECKING:
     from xmlrpc.client import _HostType, _Marshallable

+    from _typeshed import SizedBuffer
+
 logger = logging.getLogger(__name__)

@@ -33,7 +35,7 @@ class PipXmlrpcTransport(xmlrpc.client.Transport):
         self,
         host: "_HostType",
         handler: str,
-        request_body: bytes,
+        request_body: "SizedBuffer",
         verbose: bool = False,
     ) -> Tuple["_Marshallable", ...]:
         assert isinstance(host, str)

@@ -164,16 +164,14 @@ def message_about_scripts_not_on_PATH(scripts: Sequence[str]) -> Optional[str]:
     for parent_dir, dir_scripts in warn_for.items():
         sorted_scripts: List[str] = sorted(dir_scripts)
         if len(sorted_scripts) == 1:
-            start_text = "script {} is".format(sorted_scripts[0])
+            start_text = f"script {sorted_scripts[0]} is"
         else:
             start_text = "scripts {} are".format(
                 ", ".join(sorted_scripts[:-1]) + " and " + sorted_scripts[-1]
             )
         msg_lines.append(
-            "The {} installed in '{}' which is not on PATH.".format(
-                start_text, parent_dir
-            )
+            f"The {start_text} installed in '{parent_dir}' which is not on PATH."
         )

     last_line_fmt = (

@@ -321,9 +319,7 @@ def get_console_script_specs(console: Dict[str, str]) -> List[str]:
         scripts_to_generate.append("pip = " + pip_script)

         if os.environ.get("ENSUREPIP_OPTIONS", "") != "altinstall":
-            scripts_to_generate.append(
-                "pip{} = {}".format(sys.version_info[0], pip_script)
-            )
+            scripts_to_generate.append(f"pip{sys.version_info[0]} = {pip_script}")

         scripts_to_generate.append(f"pip{get_major_minor_version()} = {pip_script}")

         # Delete any other versioned pip entry points

@@ -336,9 +332,7 @@ def get_console_script_specs(console: Dict[str, str]) -> List[str]:
         scripts_to_generate.append("easy_install = " + easy_install_script)

         scripts_to_generate.append(
-            "easy_install-{} = {}".format(
-                get_major_minor_version(), easy_install_script
-            )
+            f"easy_install-{get_major_minor_version()} = {easy_install_script}"
         )

         # Delete any other versioned easy_install entry points
         easy_install_ep = [

@@ -408,10 +402,10 @@ class ScriptFile:

 class MissingCallableSuffix(InstallationError):
     def __init__(self, entry_point: str) -> None:
         super().__init__(
-            "Invalid script entry point: {} - A callable "
+            f"Invalid script entry point: {entry_point} - A callable "
             "suffix is required. Cf https://packaging.python.org/"
             "specifications/entry-points/#use-for-scripts for more "
-            "information.".format(entry_point)
+            "information."
         )

@@ -712,7 +706,7 @@ def req_error_context(req_description: str) -> Generator[None, None, None]:
     try:
         yield
     except InstallationError as e:
-        message = "For req: {}. {}".format(req_description, e.args[0])
+        message = f"For req: {req_description}. {e.args[0]}"
         raise InstallationError(message) from e

@@ -603,8 +603,8 @@ class RequirementPreparer:
             )
         except NetworkConnectionError as exc:
             raise InstallationError(
-                "Could not install requirement {} because of HTTP "
-                "error {} for URL {}".format(req, exc, link)
+                f"Could not install requirement {req} because of HTTP "
+                f"error {exc} for URL {link}"
             )
         else:
             file_path = self._downloaded[link.url]

@@ -684,9 +684,9 @@ class RequirementPreparer:
         with indent_log():
             if self.require_hashes:
                 raise InstallationError(
-                    "The editable requirement {} cannot be installed when "
+                    f"The editable requirement {req} cannot be installed when "
                     "requiring hashes, because there is no single file to "
-                    "hash.".format(req)
+                    "hash."
                 )
             req.ensure_has_source_dir(self.src_dir)
             req.update_editable()

@@ -714,7 +714,7 @@ class RequirementPreparer:
         assert req.satisfied_by, "req should have been satisfied but isn't"
         assert skip_reason is not None, (
             "did not get skip reason skipped but req.satisfied_by "
-            "is set to {}".format(req.satisfied_by)
+            f"is set to {req.satisfied_by}"
         )
         logger.info(
             "Requirement %s: %s (%s)", skip_reason, req, req.satisfied_by.version

View File

@@ -462,7 +462,7 @@ def install_req_from_req_string(
         raise InstallationError(
             "Packages installed from PyPI cannot depend on packages "
             "which are not also hosted on PyPI.\n"
-            "{} depends on {} ".format(comes_from.name, req)
+            f"{comes_from.name} depends on {req} "
         )
     return InstallRequirement(

View File

@@ -191,7 +191,7 @@ class InstallRequirement:
     if self.req:
         s = redact_auth_from_requirement(self.req)
         if self.link:
-            s += " from {}".format(redact_auth_from_url(self.link.url))
+            s += f" from {redact_auth_from_url(self.link.url)}"
     elif self.link:
         s = redact_auth_from_url(self.link.url)
     else:
@@ -221,7 +221,7 @@ class InstallRequirement:
     attributes = vars(self)
     names = sorted(attributes)
-    state = ("{}={!r}".format(attr, attributes[attr]) for attr in sorted(names))
+    state = (f"{attr}={attributes[attr]!r}" for attr in sorted(names))
     return "<{name} object: {{{state}}}>".format(
         name=self.__class__.__name__,
         state=", ".join(state),
@@ -754,8 +754,8 @@ class InstallRequirement:
     if os.path.exists(archive_path):
         response = ask_path_exists(
-            "The file {} exists. (i)gnore, (w)ipe, "
-            "(b)ackup, (a)bort ".format(display_path(archive_path)),
+            f"The file {display_path(archive_path)} exists. (i)gnore, (w)ipe, "
+            "(b)ackup, (a)bort ",
             ("i", "w", "b", "a"),
         )
         if response == "i":

View File

@@ -71,16 +71,16 @@ def uninstallation_paths(dist: BaseDistribution) -> Generator[str, None, None]:
     entries = dist.iter_declared_entries()
     if entries is None:
-        msg = "Cannot uninstall {dist}, RECORD file not found.".format(dist=dist)
+        msg = f"Cannot uninstall {dist}, RECORD file not found."
         installer = dist.installer
         if not installer or installer == "pip":
-            dep = "{}=={}".format(dist.raw_name, dist.version)
+            dep = f"{dist.raw_name}=={dist.version}"
             msg += (
                 " You might be able to recover from this via: "
-                "'pip install --force-reinstall --no-deps {}'.".format(dep)
+                f"'pip install --force-reinstall --no-deps {dep}'."
            )
        else:
-            msg += " Hint: The package was installed by {}.".format(installer)
+            msg += f" Hint: The package was installed by {installer}."
        raise UninstallationError(msg)
    for entry in entries:

View File

@@ -231,9 +231,7 @@ class Resolver(BaseResolver):
     tags = compatibility_tags.get_supported()
     if requirement_set.check_supported_wheels and not wheel.supported(tags):
         raise InstallationError(
-            "{} is not a supported wheel on this platform.".format(
-                wheel.filename
-            )
+            f"{wheel.filename} is not a supported wheel on this platform."
         )
     # This next bit is really a sanity check.
@@ -287,9 +285,9 @@ class Resolver(BaseResolver):
         )
     if does_not_satisfy_constraint:
         raise InstallationError(
-            "Could not satisfy constraints for '{}': "
+            f"Could not satisfy constraints for '{install_req.name}': "
             "installation from path or url cannot be "
-            "constrained to a version".format(install_req.name)
+            "constrained to a version"
         )
     # If we're now installing a constraint, mark the existing
     # object for real installation.
@@ -398,9 +396,9 @@ class Resolver(BaseResolver):
         # "UnicodeEncodeError: 'ascii' codec can't encode character"
         # in Python 2 when the reason contains non-ascii characters.
         "The candidate selected for download or install is a "
-        "yanked version: {candidate}\n"
-        "Reason for being yanked: {reason}"
-    ).format(candidate=best_candidate, reason=reason)
+        f"yanked version: {best_candidate}\n"
+        f"Reason for being yanked: {reason}"
+    )
     logger.warning(msg)
     return link

View File

@@ -159,10 +159,7 @@ class _InstallRequirementBackedCandidate(Candidate):
         return f"{self.name} {self.version}"

     def __repr__(self) -> str:
-        return "{class_name}({link!r})".format(
-            class_name=self.__class__.__name__,
-            link=str(self._link),
-        )
+        return f"{self.__class__.__name__}({str(self._link)!r})"

     def __hash__(self) -> int:
         return hash((self.__class__, self._link))
@@ -354,10 +351,7 @@ class AlreadyInstalledCandidate(Candidate):
         return str(self.dist)

     def __repr__(self) -> str:
-        return "{class_name}({distribution!r})".format(
-            class_name=self.__class__.__name__,
-            distribution=self.dist,
-        )
+        return f"{self.__class__.__name__}({self.dist!r})"

     def __hash__(self) -> int:
         return hash((self.__class__, self.name, self.version))
@@ -455,11 +449,7 @@ class ExtrasCandidate(Candidate):
         return "{}[{}] {}".format(name, ",".join(self.extras), rest)

     def __repr__(self) -> str:
-        return "{class_name}(base={base!r}, extras={extras!r})".format(
-            class_name=self.__class__.__name__,
-            base=self.base,
-            extras=self.extras,
-        )
+        return f"{self.__class__.__name__}(base={self.base!r}, extras={self.extras!r})"

     def __hash__(self) -> int:
         return hash((self.base, self.extras))

View File

@@ -753,8 +753,8 @@ class Factory:
         info = "the requested packages"
     msg = (
-        "Cannot install {} because these package versions "
-        "have conflicting dependencies.".format(info)
+        f"Cannot install {info} because these package versions "
+        "have conflicting dependencies."
     )
     logger.critical(msg)
     msg = "\nThe conflict is caused by:"

View File

@@ -65,7 +65,7 @@ class PipDebuggingReporter(BaseReporter):
         logger.info("Reporter.starting_round(%r)", index)

     def ending_round(self, index: int, state: Any) -> None:
-        logger.info("Reporter.ending_round(%r, state)", index)
+        logger.info("Reporter.ending_round(%r, %r)", index, state)

     def ending(self, state: Any) -> None:
         logger.info("Reporter.ending(%r)", state)

View File
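The one-line reporter change above is a genuine bug fix, not just a style change: the old call logged the literal word `state` and silently dropped the round state, because only `index` was passed as a lazy `%`-formatting argument. A minimal sketch of the difference, using a list handler to capture the rendered messages (the handler and sample values are illustrative):

```python
import logging

logger = logging.getLogger("demo")
logger.setLevel(logging.INFO)

records = []


class ListHandler(logging.Handler):
    """Capture rendered log messages so we can inspect them."""

    def emit(self, record: logging.LogRecord) -> None:
        records.append(record.getMessage())


logger.addHandler(ListHandler())

index, state = 3, {"pinned": "simple==1.0"}

# Before the fix: "state" is part of the format string, so the value is lost.
logger.info("Reporter.ending_round(%r, state)", index)
# After the fix: both values are deferred-formatted via %r placeholders.
logger.info("Reporter.ending_round(%r, %r)", index, state)

print(records)
```

Using `%r` placeholders (rather than f-strings) keeps the formatting lazy, so the `repr` is only computed when the INFO level is actually enabled.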

@@ -15,10 +15,7 @@ class ExplicitRequirement(Requirement):
         return str(self.candidate)

     def __repr__(self) -> str:
-        return "{class_name}({candidate!r})".format(
-            class_name=self.__class__.__name__,
-            candidate=self.candidate,
-        )
+        return f"{self.__class__.__name__}({self.candidate!r})"

     @property
     def project_name(self) -> NormalizedName:
@@ -50,10 +47,7 @@ class SpecifierRequirement(Requirement):
         return str(self._ireq.req)

     def __repr__(self) -> str:
-        return "{class_name}({requirement!r})".format(
-            class_name=self.__class__.__name__,
-            requirement=str(self._ireq.req),
-        )
+        return f"{self.__class__.__name__}({str(self._ireq.req)!r})"

     @property
     def project_name(self) -> NormalizedName:
@@ -116,10 +110,7 @@ class RequiresPythonRequirement(Requirement):
         return f"Python {self.specifier}"

     def __repr__(self) -> str:
-        return "{class_name}({specifier!r})".format(
-            class_name=self.__class__.__name__,
-            specifier=str(self.specifier),
-        )
+        return f"{self.__class__.__name__}({str(self.specifier)!r})"

     @property
     def project_name(self) -> NormalizedName:
@@ -155,10 +146,7 @@ class UnsatisfiableRequirement(Requirement):
         return f"{self._name} (unavailable)"

     def __repr__(self) -> str:
-        return "{class_name}({name!r})".format(
-            class_name=self.__class__.__name__,
-            name=str(self._name),
-        )
+        return f"{self.__class__.__name__}({str(self._name)!r})"

     @property
     def project_name(self) -> NormalizedName:

View File

@@ -77,11 +77,7 @@ def get_pip_version() -> str:
     pip_pkg_dir = os.path.join(os.path.dirname(__file__), "..", "..")
     pip_pkg_dir = os.path.abspath(pip_pkg_dir)

-    return "pip {} from {} (python {})".format(
-        __version__,
-        pip_pkg_dir,
-        get_major_minor_version(),
-    )
+    return f"pip {__version__} from {pip_pkg_dir} (python {get_major_minor_version()})"


 def normalize_version_info(py_version_info: Tuple[int, ...]) -> Tuple[int, int, int]:
@@ -145,9 +141,9 @@ def rmtree(
         )
     if sys.version_info >= (3, 12):
         # See https://docs.python.org/3.12/whatsnew/3.12.html#shutil.
-        shutil.rmtree(dir, onexc=handler)
+        shutil.rmtree(dir, onexc=handler)  # type: ignore
     else:
-        shutil.rmtree(dir, onerror=handler)
+        shutil.rmtree(dir, onerror=handler)  # type: ignore


 def _onerror_ignore(*_args: Any) -> None:
@@ -279,13 +275,13 @@ def strtobool(val: str) -> int:

 def format_size(bytes: float) -> str:
     if bytes > 1000 * 1000:
-        return "{:.1f} MB".format(bytes / 1000.0 / 1000)
+        return f"{bytes / 1000.0 / 1000:.1f} MB"
     elif bytes > 10 * 1000:
-        return "{} kB".format(int(bytes / 1000))
+        return f"{int(bytes / 1000)} kB"
     elif bytes > 1000:
-        return "{:.1f} kB".format(bytes / 1000.0)
+        return f"{bytes / 1000.0:.1f} kB"
     else:
-        return "{} bytes".format(int(bytes))
+        return f"{int(bytes)} bytes"


 def tabulate(rows: Iterable[Iterable[Any]]) -> Tuple[List[str], List[int]]:
@@ -522,9 +518,7 @@ def redact_netloc(netloc: str) -> str:
     else:
         user = urllib.parse.quote(user)
         password = ":****"
-    return "{user}{password}@{netloc}".format(
-        user=user, password=password, netloc=netloc
-    )
+    return f"{user}{password}@{netloc}"


 def _transform_url(
@@ -592,7 +586,7 @@ class HiddenText:
         self.redacted = redacted

     def __repr__(self) -> str:
-        return "<HiddenText {!r}>".format(str(self))
+        return f"<HiddenText {str(self)!r}>"

     def __str__(self) -> str:
         return self.redacted

View File
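The `format_size` rewrite above is behavior-preserving, and the tiered thresholds (decimal, SI-style units) are worth seeing end to end. This sketch reproduces the function exactly as it appears in the new side of the diff:

```python
def format_size(bytes: float) -> str:
    # Thresholds as in the rewritten pip helper: decimal (1000-based) units,
    # one decimal place for MB and small-kB values, integers otherwise.
    if bytes > 1000 * 1000:
        return f"{bytes / 1000.0 / 1000:.1f} MB"
    elif bytes > 10 * 1000:
        return f"{int(bytes / 1000)} kB"
    elif bytes > 1000:
        return f"{bytes / 1000.0:.1f} kB"
    else:
        return f"{int(bytes)} bytes"


print(format_size(2_500_000))  # → 2.5 MB
```

Note the middle tier: anything over 10 kB drops the decimal place, so `20_000` renders as `20 kB` while `1_500` renders as `1.5 kB`.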

@@ -28,7 +28,7 @@ def parse_wheel(wheel_zip: ZipFile, name: str) -> Tuple[str, Message]:
         metadata = wheel_metadata(wheel_zip, info_dir)
         version = wheel_version(metadata)
     except UnsupportedWheel as e:
-        raise UnsupportedWheel("{} has an invalid wheel, {}".format(name, str(e)))
+        raise UnsupportedWheel(f"{name} has an invalid wheel, {str(e)}")

     check_compatibility(version, name)
@@ -60,9 +60,7 @@ def wheel_dist_info_dir(source: ZipFile, name: str) -> str:
     canonical_name = canonicalize_name(name)
     if not info_dir_name.startswith(canonical_name):
         raise UnsupportedWheel(
-            ".dist-info directory {!r} does not start with {!r}".format(
-                info_dir, canonical_name
-            )
+            f".dist-info directory {info_dir!r} does not start with {canonical_name!r}"
         )
     return info_dir

View File

@@ -31,7 +31,7 @@ class Mercurial(VersionControl):
     @staticmethod
     def get_base_rev_args(rev: str) -> List[str]:
-        return [f"-r={rev}"]
+        return [f"--rev={rev}"]

     def fetch_new(
         self, dest: str, url: HiddenText, rev_options: RevOptions, verbosity: int

View File
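The Mercurial change above fixes the "parse error at 0: not a prefix:" failure from the commit list: with the short form `-r=<rev>`, hg treats the whole `=<rev>` string as the revision expression, while the long form `--rev=<rev>` parses unambiguously. A minimal sketch of the fixed helper (the surrounding `hg` command is illustrative):

```python
from typing import List


def get_base_rev_args(rev: str) -> List[str]:
    # Long-option form: "-r=<rev>" made Mercurial parse "=<rev>" as the
    # revision itself, producing "parse error at 0: not a prefix: =".
    return [f"--rev={rev}"]


# Example of how the argument slots into an hg invocation:
cmd = ["hg", "update", "--quiet", *get_base_rev_args("abc123")]
print(cmd)
```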

@@ -405,9 +405,9 @@ class VersionControl:
     scheme, netloc, path, query, frag = urllib.parse.urlsplit(url)
     if "+" not in scheme:
         raise ValueError(
-            "Sorry, {!r} is a malformed VCS url. "
+            f"Sorry, {url!r} is a malformed VCS url. "
             "The format is <vcs>+<protocol>://<url>, "
-            "e.g. svn+http://myrepo/svn/MyApp#egg=MyApp".format(url)
+            "e.g. svn+http://myrepo/svn/MyApp#egg=MyApp"
         )
     # Remove the vcs prefix.
     scheme = scheme.split("+", 1)[1]
@@ -417,9 +417,9 @@ class VersionControl:
         path, rev = path.rsplit("@", 1)
         if not rev:
             raise InstallationError(
-                "The URL {!r} has an empty revision (after @) "
+                f"The URL {url!r} has an empty revision (after @) "
                 "which is not supported. Include a revision after @ "
-                "or remove @ from the URL.".format(url)
+                "or remove @ from the URL."
             )
     url = urllib.parse.urlunsplit((scheme, netloc, path, query, ""))
     return url, rev, user_pass
@@ -566,7 +566,7 @@ class VersionControl:
         self.name,
         url,
     )
-    response = ask_path_exists("What to do? {}".format(prompt[0]), prompt[1])
+    response = ask_path_exists(f"What to do? {prompt[0]}", prompt[1])
     if response == "a":
         sys.exit(-1)

View File
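The URL handling shown above splits a `<vcs>+<protocol>://<url>@<rev>` string into its scheme, URL, and revision parts. A self-contained sketch of that parsing logic (the helper name is hypothetical; error messages are abbreviated):

```python
import urllib.parse
from typing import Optional, Tuple


def split_vcs_url(url: str) -> Tuple[str, Optional[str]]:
    """Hypothetical helper mirroring the parsing in the diff above."""
    scheme, netloc, path, query, frag = urllib.parse.urlsplit(url)
    if "+" not in scheme:
        raise ValueError(f"Sorry, {url!r} is a malformed VCS url.")
    # Remove the vcs prefix, e.g. "git+https" -> "https".
    scheme = scheme.split("+", 1)[1]
    rev = None
    if "@" in path:
        path, rev = path.rsplit("@", 1)
        if not rev:
            raise ValueError(f"The URL {url!r} has an empty revision (after @).")
    clean = urllib.parse.urlunsplit((scheme, netloc, path, query, ""))
    return clean, rev


print(split_vcs_url("git+https://example.com/repo.git@v1.0"))
```

Note that `rsplit("@", 1)` takes the last `@`, so usernames embedded earlier in the path survive, and an explicit trailing `@` with nothing after it is rejected rather than silently ignored.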

@@ -140,15 +140,15 @@ def _verify_one(req: InstallRequirement, wheel_path: str) -> None:
     w = Wheel(os.path.basename(wheel_path))
     if canonicalize_name(w.name) != canonical_name:
         raise InvalidWheelFilename(
-            "Wheel has unexpected file name: expected {!r}, "
-            "got {!r}".format(canonical_name, w.name),
+            f"Wheel has unexpected file name: expected {canonical_name!r}, "
+            f"got {w.name!r}",
         )
     dist = get_wheel_distribution(FilesystemWheel(wheel_path), canonical_name)
     dist_verstr = str(dist.version)
     if canonicalize_version(dist_verstr) != canonicalize_version(w.version):
         raise InvalidWheelFilename(
-            "Wheel has unexpected file name: expected {!r}, "
-            "got {!r}".format(dist_verstr, w.version),
+            f"Wheel has unexpected file name: expected {dist_verstr!r}, "
+            f"got {w.version!r}",
         )
     metadata_version_value = dist.metadata_version
     if metadata_version_value is None:
@@ -160,8 +160,7 @@ def _verify_one(req: InstallRequirement, wheel_path: str) -> None:
         raise UnsupportedWheel(msg)
     if metadata_version >= Version("1.2") and not isinstance(dist.version, Version):
         raise UnsupportedWheel(
-            "Metadata 1.2 mandates PEP 440 version, "
-            "but {!r} is not".format(dist_verstr)
+            f"Metadata 1.2 mandates PEP 440 version, but {dist_verstr!r} is not"
         )

View File
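Both the `.dist-info` check and the wheel-name check above compare *canonicalized* names, so `Sample.Pkg`, `sample_pkg`, and `SAMPLE-PKG` all match. A sketch of that normalization rule (this hand-rolled function follows the PEP 503 regex; pip itself uses the vendored `packaging.utils.canonicalize_name`):

```python
import re


def canonicalize_name(name: str) -> str:
    # PEP 503 normalization: collapse runs of '-', '_', '.' to a single
    # hyphen, then lowercase. Mirrors packaging.utils.canonicalize_name.
    return re.sub(r"[-_.]+", "-", name).lower()


# The .dist-info directory check from the diff compares normalized prefixes:
info_dir = "Sample_Pkg-1.0.dist-info"
ok = canonicalize_name(info_dir).startswith(canonicalize_name("sample.pkg"))
print(ok)
```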

@@ -14,6 +14,7 @@ from hashlib import sha256
 from pathlib import Path
 from textwrap import dedent
 from typing import (
+    TYPE_CHECKING,
     Any,
     AnyStr,
     Callable,
@@ -58,6 +59,9 @@ from tests.lib import (
 from tests.lib.server import MockServer, make_mock_server
 from tests.lib.venv import VirtualEnvironment, VirtualEnvironmentType

+if TYPE_CHECKING:
+    from pip._vendor.typing_extensions import Self
+

 def pytest_addoption(parser: Parser) -> None:
     parser.addoption(
@@ -141,7 +145,7 @@ def pytest_collection_modifyitems(config: Config, items: List[pytest.Function])
         if "script" in item.fixturenames:
             raise RuntimeError(
                 "Cannot use the ``script`` funcarg in a unit test: "
-                "(filename = {}, item = {})".format(module_path, item)
+                f"(filename = {module_path}, item = {item})"
             )
     else:
         raise RuntimeError(f"Unknown test type (filename = {module_path})")
@@ -941,7 +945,7 @@ def html_index_with_onetime_server(
     """

     class InDirectoryServer(http.server.ThreadingHTTPServer):
-        def finish_request(self, request: Any, client_address: Any) -> None:
+        def finish_request(self: "Self", request: Any, client_address: Any) -> None:
             self.RequestHandlerClass(
                 request,
                 client_address,

View File

@@ -23,7 +23,7 @@ def test_entrypoints_work(entrypoint: str, script: PipTestEnvironment) -> None:
     fake_pkg.mkdir()
     fake_pkg.joinpath("setup.py").write_text(
         dedent(
-            """
+            f"""
            from setuptools import setup

            setup(
@@ -31,13 +31,11 @@ def test_entrypoints_work(entrypoint: str, script: PipTestEnvironment) -> None:
                version="0.1.0",
                entry_points={{
                    "console_scripts": [
-                        {!r}
+                        {entrypoint!r}
                    ]
                }}
            )
-            """.format(
-                entrypoint
-            )
+            """
        )
    )

View File

@@ -400,7 +400,7 @@ def test_completion_path_after_option(
 def test_completion_uses_same_executable_name(
     autocomplete_script: PipTestEnvironment, flag: str, deprecated_python: bool
 ) -> None:
-    executable_name = "pip{}".format(sys.version_info[0])
+    executable_name = f"pip{sys.version_info[0]}"
     # Deprecated python versions produce an extra deprecation warning
     result = autocomplete_script.run(
         executable_name,

View File

@@ -68,7 +68,7 @@ def test_debug__tags(script: PipTestEnvironment, args: List[str]) -> None:
     stdout = result.stdout

     tags = compatibility_tags.get_supported()
-    expected_tag_header = "Compatible tags: {}".format(len(tags))
+    expected_tag_header = f"Compatible tags: {len(tags)}"
     assert expected_tag_header in stdout

     show_verbose_note = "--verbose" not in args

View File

@@ -166,13 +166,11 @@ def test_freeze_with_invalid_names(script: PipTestEnvironment) -> None:
     with open(egg_info_path, "w") as egg_info_file:
         egg_info_file.write(
             textwrap.dedent(
-                """\
+                f"""\
                Metadata-Version: 1.0
-                Name: {}
+                Name: {pkgname}
                Version: 1.0
-                """.format(
-                    pkgname
-                )
+                """
            )
        )
@@ -221,12 +219,10 @@ def test_freeze_editable_not_vcs(script: PipTestEnvironment) -> None:
     # We need to apply os.path.normcase() to the path since that is what
     # the freeze code does.
     expected = textwrap.dedent(
-        """\
+        f"""\
        ...# Editable install with no version control (version-pkg==0.1)
-        -e {}
-        ...""".format(
-            os.path.normcase(pkg_path)
-        )
+        -e {os.path.normcase(pkg_path)}
+        ..."""
    )
    _check_output(result.stdout, expected)
@@ -248,12 +244,10 @@ def test_freeze_editable_git_with_no_remote(
     # We need to apply os.path.normcase() to the path since that is what
     # the freeze code does.
     expected = textwrap.dedent(
-        """\
+        f"""\
        ...# Editable Git install with no remote (version-pkg==0.1)
-        -e {}
-        ...""".format(
-            os.path.normcase(pkg_path)
-        )
+        -e {os.path.normcase(pkg_path)}
+        ..."""
    )
    _check_output(result.stdout, expected)
@@ -653,9 +647,9 @@ def test_freeze_with_requirement_option_file_url_egg_not_installed(
         expect_stderr=True,
     )
     expected_err = (
-        "WARNING: Requirement file [requirements.txt] contains {}, "
+        f"WARNING: Requirement file [requirements.txt] contains {url}, "
         "but package 'Does.Not-Exist' is not installed\n"
-    ).format(url)
+    )
     if deprecated_python:
         assert expected_err in result.stderr
     else:

View File

@@ -106,10 +106,10 @@ def test_pep518_refuses_conflicting_requires(
     assert (
         result.returncode != 0
         and (
-            "Some build dependencies for {url} conflict "
+            f"Some build dependencies for {project_dir.as_uri()} conflict "
             "with PEP 517/518 supported "
             "requirements: setuptools==1.0 is incompatible with "
-            "setuptools>=40.8.0.".format(url=project_dir.as_uri())
+            "setuptools>=40.8.0."
         )
         in result.stderr
     ), str(result)
@@ -595,8 +595,8 @@ def test_hashed_install_success(
     with requirements_file(
         "simple2==1.0 --hash=sha256:9336af72ca661e6336eb87bc7de3e8844d853e"
         "3848c2b9bbd2e8bf01db88c2c7\n"
-        "{simple} --hash=sha256:393043e672415891885c9a2a0929b1af95fb866d6c"
-        "a016b42d2e6ce53619b653".format(simple=file_url),
+        f"{file_url} --hash=sha256:393043e672415891885c9a2a0929b1af95fb866d6c"
+        "a016b42d2e6ce53619b653",
         tmpdir,
     ) as reqs_file:
         script.pip_install_local("-r", reqs_file.resolve())
@@ -1735,7 +1735,7 @@ def test_install_builds_wheels(script: PipTestEnvironment, data: TestData) -> No
     # into the cache
     assert wheels != [], str(res)
     assert wheels == [
-        "Upper-2.0-py{}-none-any.whl".format(sys.version_info[0]),
+        f"Upper-2.0-py{sys.version_info[0]}-none-any.whl",
     ]
@@ -2387,7 +2387,7 @@ def test_install_verify_package_name_normalization(
     assert "Successfully installed simple-package" in result.stdout

     result = script.pip("install", package_name)
-    assert "Requirement already satisfied: {}".format(package_name) in result.stdout
+    assert f"Requirement already satisfied: {package_name}" in result.stdout


 def test_install_logs_pip_version_in_debug(

View File

@@ -184,12 +184,10 @@ def test_config_file_override_stack(
     config_file.write_text(
         textwrap.dedent(
-            """\
+            f"""\
            [global]
-            index-url = {}/simple1
-            """.format(
-                base_address
-            )
+            index-url = {base_address}/simple1
+            """
        )
    )
    script.pip("install", "-vvv", "INITools", expect_error=True)
@@ -197,14 +195,12 @@ def test_config_file_override_stack(
     config_file.write_text(
         textwrap.dedent(
-            """\
+            f"""\
            [global]
-            index-url = {address}/simple1
+            index-url = {base_address}/simple1
            [install]
-            index-url = {address}/simple2
-            """.format(
-                address=base_address
-            )
+            index-url = {base_address}/simple2
+            """
        )
    )
    script.pip("install", "-vvv", "INITools", expect_error=True)

View File

@@ -41,13 +41,11 @@ def test_find_links_requirements_file_relative_path(
     """Test find-links as a relative path to a reqs file."""
     script.scratch_path.joinpath("test-req.txt").write_text(
         textwrap.dedent(
-            """
+            f"""
            --no-index
-            --find-links={}
+            --find-links={data.packages.as_posix()}
            parent==0.1
-            """.format(
-                data.packages.as_posix()
-            )
+            """
        )
    )
    result = script.pip(

View File

@@ -95,7 +95,7 @@ def test_requirements_file(script: PipTestEnvironment) -> None:
     result.did_create(script.site_packages / "INITools-0.2.dist-info")
     result.did_create(script.site_packages / "initools")
     assert result.files_created[script.site_packages / other_lib_name].dir
-    fn = "{}-{}.dist-info".format(other_lib_name, other_lib_version)
+    fn = f"{other_lib_name}-{other_lib_version}.dist-info"
     assert result.files_created[script.site_packages / fn].dir
@@ -260,13 +260,13 @@ def test_respect_order_in_requirements_file(
     assert (
         "parent" in downloaded[0]
-    ), 'First download should be "parent" but was "{}"'.format(downloaded[0])
+    ), f'First download should be "parent" but was "{downloaded[0]}"'
     assert (
         "child" in downloaded[1]
-    ), 'Second download should be "child" but was "{}"'.format(downloaded[1])
+    ), f'Second download should be "child" but was "{downloaded[1]}"'
     assert (
         "simple" in downloaded[2]
-    ), 'Third download should be "simple" but was "{}"'.format(downloaded[2])
+    ), f'Third download should be "simple" but was "{downloaded[2]}"'


 def test_install_local_editable_with_extras(
def test_install_local_editable_with_extras( def test_install_local_editable_with_extras(

View File

@@ -169,9 +169,9 @@ def get_header_scheme_path_for_script(
 ) -> Path:
     command = (
         "from pip._internal.locations import get_scheme;"
-        "scheme = get_scheme({!r});"
+        f"scheme = get_scheme({dist_name!r});"
         "print(scheme.headers);"
-    ).format(dist_name)
+    )
     result = script.run("python", "-c", command).stdout
     return Path(result.strip())

View File

@@ -1185,7 +1185,7 @@ def test_new_resolver_presents_messages_when_backtracking_a_lot(
     for index in range(1, N + 1):
         A_version = f"{index}.0.0"
         B_version = f"{index}.0.0"
-        C_version = "{index_minus_one}.0.0".format(index_minus_one=index - 1)
+        C_version = f"{index - 1}.0.0"

         depends = ["B == " + B_version]
         if index != 1:

View File

@@ -71,8 +71,8 @@ def test_new_resolver_conflict_constraints_file(

 def test_new_resolver_requires_python_error(script: PipTestEnvironment) -> None:
-    compatible_python = ">={0.major}.{0.minor}".format(sys.version_info)
-    incompatible_python = "<{0.major}.{0.minor}".format(sys.version_info)
+    compatible_python = f">={sys.version_info.major}.{sys.version_info.minor}"
+    incompatible_python = f"<{sys.version_info.major}.{sys.version_info.minor}"
     pkga = create_test_package_with_setup(
         script,
@@ -99,7 +99,7 @@ def test_new_resolver_requires_python_error(script: PipTestEnvironment) -> None:
 def test_new_resolver_checks_requires_python_before_dependencies(
     script: PipTestEnvironment,
 ) -> None:
-    incompatible_python = "<{0.major}.{0.minor}".format(sys.version_info)
+    incompatible_python = f"<{sys.version_info.major}.{sys.version_info.minor}"
     pkg_dep = create_basic_wheel_for_package(
         script,

View File

@ -24,18 +24,11 @@ def _create_find_links(script: PipTestEnvironment) -> _FindLinks:
index_html = script.scratch_path / "index.html" index_html = script.scratch_path / "index.html"
index_html.write_text( index_html.write_text(
""" f"""
<!DOCTYPE html> <!DOCTYPE html>
<a href="{sdist_url}#sha256={sdist_hash}">{sdist_path.stem}</a> <a href="{sdist_path.as_uri()}#sha256={sdist_hash}">{sdist_path.stem}</a>
<a href="{wheel_url}#sha256={wheel_hash}">{wheel_path.stem}</a> <a href="{wheel_path.as_uri()}#sha256={wheel_hash}">{wheel_path.stem}</a>
""".format( """.strip()
sdist_url=sdist_path.as_uri(),
sdist_hash=sdist_hash,
sdist_path=sdist_path,
wheel_url=wheel_path.as_uri(),
wheel_hash=wheel_hash,
wheel_path=wheel_path,
).strip()
) )
return _FindLinks(index_html, sdist_hash, wheel_hash) return _FindLinks(index_html, sdist_hash, wheel_hash)
@@ -99,9 +92,7 @@ def test_new_resolver_hash_intersect_from_constraint(
     constraints_txt = script.scratch_path / "constraints.txt"
     constraints_txt.write_text(
-        "base==0.1.0 --hash=sha256:{sdist_hash}".format(
-            sdist_hash=find_links.sdist_hash,
-        ),
+        f"base==0.1.0 --hash=sha256:{find_links.sdist_hash}",
     )
     requirements_txt = script.scratch_path / "requirements.txt"
     requirements_txt.write_text(
@@ -200,13 +191,10 @@ def test_new_resolver_hash_intersect_empty_from_constraint(
     constraints_txt = script.scratch_path / "constraints.txt"
     constraints_txt.write_text(
-        """
-        base==0.1.0 --hash=sha256:{sdist_hash}
-        base==0.1.0 --hash=sha256:{wheel_hash}
-        """.format(
-            sdist_hash=find_links.sdist_hash,
-            wheel_hash=find_links.wheel_hash,
-        ),
+        f"""
+        base==0.1.0 --hash=sha256:{find_links.sdist_hash}
+        base==0.1.0 --hash=sha256:{find_links.wheel_hash}
+        """,
     )
     result = script.pip(
@@ -240,19 +228,15 @@ def test_new_resolver_hash_requirement_and_url_constraint_can_succeed(
     requirements_txt = script.scratch_path / "requirements.txt"
     requirements_txt.write_text(
-        """
-        base==0.1.0 --hash=sha256:{wheel_hash}
-        """.format(
-            wheel_hash=wheel_hash,
-        ),
+        f"""
+        base==0.1.0 --hash=sha256:{wheel_hash}
+        """,
     )
     constraints_txt = script.scratch_path / "constraints.txt"
-    constraint_text = "base @ {wheel_url}\n".format(wheel_url=wheel_path.as_uri())
+    constraint_text = f"base @ {wheel_path.as_uri()}\n"
     if constrain_by_hash:
-        constraint_text += "base==0.1.0 --hash=sha256:{wheel_hash}\n".format(
-            wheel_hash=wheel_hash,
-        )
+        constraint_text += f"base==0.1.0 --hash=sha256:{wheel_hash}\n"
     constraints_txt.write_text(constraint_text)
     script.pip(
@@ -280,19 +264,15 @@ def test_new_resolver_hash_requirement_and_url_constraint_can_fail(
     requirements_txt = script.scratch_path / "requirements.txt"
     requirements_txt.write_text(
-        """
-        base==0.1.0 --hash=sha256:{other_hash}
-        """.format(
-            other_hash=other_hash,
-        ),
+        f"""
+        base==0.1.0 --hash=sha256:{other_hash}
+        """,
     )
     constraints_txt = script.scratch_path / "constraints.txt"
-    constraint_text = "base @ {wheel_url}\n".format(wheel_url=wheel_path.as_uri())
+    constraint_text = f"base @ {wheel_path.as_uri()}\n"
     if constrain_by_hash:
-        constraint_text += "base==0.1.0 --hash=sha256:{other_hash}\n".format(
-            other_hash=other_hash,
-        )
+        constraint_text += f"base==0.1.0 --hash=sha256:{other_hash}\n"
     constraints_txt.write_text(constraint_text)
     result = script.pip(
@@ -343,17 +323,12 @@ def test_new_resolver_hash_with_extras(script: PipTestEnvironment) -> None:
     requirements_txt = script.scratch_path / "requirements.txt"
     requirements_txt.write_text(
-        """
-        child[extra]==0.1.0 --hash=sha256:{child_hash}
-        parent_with_extra==0.1.0 --hash=sha256:{parent_with_extra_hash}
-        parent_without_extra==0.1.0 --hash=sha256:{parent_without_extra_hash}
-        extra==0.1.0 --hash=sha256:{extra_hash}
-        """.format(
-            child_hash=child_hash,
-            parent_with_extra_hash=parent_with_extra_hash,
-            parent_without_extra_hash=parent_without_extra_hash,
-            extra_hash=extra_hash,
-        ),
+        f"""
+        child[extra]==0.1.0 --hash=sha256:{child_hash}
+        parent_with_extra==0.1.0 --hash=sha256:{parent_with_extra_hash}
+        parent_without_extra==0.1.0 --hash=sha256:{parent_without_extra_hash}
+        extra==0.1.0 --hash=sha256:{extra_hash}
+        """,
     )
     script.pip(

View File

@@ -58,12 +58,7 @@ def test_new_resolver_target_checks_compatibility_failure(
     if platform:
         args += ["--platform", platform]
-    args_tag = "{}{}-{}-{}".format(
-        implementation,
-        python_version,
-        abi,
-        platform,
-    )
+    args_tag = f"{implementation}{python_version}-{abi}-{platform}"
     wheel_tag_matches = args_tag == fake_wheel_tag
     result = script.pip(*args, expect_error=(not wheel_tag_matches))

View File

@@ -159,9 +159,9 @@ def test_conflicting_pep517_backend_requirements(
         expect_error=True,
     )
     msg = (
-        "Some build dependencies for {url} conflict with the backend "
+        f"Some build dependencies for {project_dir.as_uri()} conflict with the backend "
         "dependencies: simplewheel==1.0 is incompatible with "
-        "simplewheel==2.0.".format(url=project_dir.as_uri())
+        "simplewheel==2.0."
     )
     assert result.returncode != 0 and msg in result.stderr, str(result)
@@ -205,8 +205,8 @@ def test_validate_missing_pep517_backend_requirements(
         expect_error=True,
     )
     msg = (
-        "Some build dependencies for {url} are missing: "
-        "'simplewheel==1.0', 'test_backend'.".format(url=project_dir.as_uri())
+        f"Some build dependencies for {project_dir.as_uri()} are missing: "
+        "'simplewheel==1.0', 'test_backend'."
     )
     assert result.returncode != 0 and msg in result.stderr, str(result)
@@ -231,9 +231,9 @@ def test_validate_conflicting_pep517_backend_requirements(
         expect_error=True,
     )
     msg = (
-        "Some build dependencies for {url} conflict with the backend "
+        f"Some build dependencies for {project_dir.as_uri()} conflict with the backend "
        "dependencies: simplewheel==2.0 is incompatible with "
-        "simplewheel==1.0.".format(url=project_dir.as_uri())
+        "simplewheel==1.0."
     )
     assert result.returncode != 0 and msg in result.stderr, str(result)

View File

@@ -604,9 +604,7 @@ def test_uninstall_without_record_fails(
             "simple.dist==0.1'."
         )
     elif installer:
-        expected_error_message += " Hint: The package was installed by {}.".format(
-            installer
-        )
+        expected_error_message += f" Hint: The package was installed by {installer}."
     assert result2.stderr.rstrip() == expected_error_message
     assert_all_changes(result.files_after, result2, ignore_changes)

View File

@@ -59,9 +59,7 @@ def test_pip_wheel_success(script: PipTestEnvironment, data: TestData) -> None:
     wheel_file_path = script.scratch / wheel_file_name
     assert re.search(
         r"Created wheel for simple: "
-        r"filename={filename} size=\d+ sha256=[A-Fa-f0-9]{{64}}".format(
-            filename=re.escape(wheel_file_name)
-        ),
+        rf"filename={re.escape(wheel_file_name)} size=\d+ sha256=[A-Fa-f0-9]{{64}}",
         result.stdout,
     )
     assert re.search(r"^\s+Stored in directory: ", result.stdout, re.M)
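The regex change above combines a raw string with interpolation: an `rf"..."` literal keeps backslashes untouched while still evaluating `{...}` expressions, so `\d+` survives verbatim and `{{64}}` renders as the regex quantifier `{64}`. A standalone sketch (the wheel filename here is made up for illustration, not taken from pip's test data):

```python
import re

# Hypothetical wheel filename, for illustration only.
wheel_file_name = "simple-3.0-py3-none-any.whl"

# rf-string: raw (backslashes literal) + interpolated ({...} evaluated).
# {{64}} escapes to the literal quantifier {64}; re.escape neutralizes
# the regex metacharacters (".", "-") inside the filename.
pattern = rf"filename={re.escape(wheel_file_name)} size=\d+ sha256=[A-Fa-f0-9]{{64}}"

log_line = f"filename={wheel_file_name} size=1234 sha256={'a' * 64}"
match = re.search(pattern, log_line)
assert match is not None
```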

View File

@@ -747,7 +747,7 @@ class PipTestEnvironment(TestFileEnvironment):
             for val in json.loads(ret.stdout)
         }
         expected = {(canonicalize_name(k), v) for k, v in kwargs.items()}
-        assert expected <= installed, "{!r} not all in {!r}".format(expected, installed)
+        assert expected <= installed, f"{expected!r} not all in {installed!r}"

     def assert_not_installed(self, *args: str) -> None:
         ret = self.pip("list", "--format=json")
@@ -755,9 +755,7 @@ class PipTestEnvironment(TestFileEnvironment):
         # None of the given names should be listed as installed, i.e. their
         # intersection should be empty.
         expected = {canonicalize_name(k) for k in args}
-        assert not (expected & installed), "{!r} contained in {!r}".format(
-            expected, installed
-        )
+        assert not (expected & installed), f"{expected!r} contained in {installed!r}"

     # FIXME ScriptTest does something similar, but only within a single
@@ -1028,7 +1026,7 @@ def _create_test_package_with_srcdir(
     pkg_path.joinpath("__init__.py").write_text("")
     subdir_path.joinpath("setup.py").write_text(
         textwrap.dedent(
-            """
+            f"""
             from setuptools import setup, find_packages
             setup(
                 name="{name}",
@@ -1036,9 +1034,7 @@ def _create_test_package_with_srcdir(
                 packages=find_packages(),
                 package_dir={{"": "src"}},
             )
-            """.format(
-                name=name
-            )
+            """
         )
     )
     return _vcs_add(dir_path, version_pkg_path, vcs)
@@ -1052,7 +1048,7 @@ def _create_test_package(
     _create_main_file(version_pkg_path, name=name, output="0.1")
     version_pkg_path.joinpath("setup.py").write_text(
         textwrap.dedent(
-            """
+            f"""
             from setuptools import setup, find_packages
             setup(
                 name="{name}",
@@ -1061,9 +1057,7 @@ def _create_test_package(
                 py_modules=["{name}"],
                 entry_points=dict(console_scripts=["{name}={name}:main"]),
             )
-            """.format(
-                name=name
-            )
+            """
         )
     )
     return _vcs_add(dir_path, version_pkg_path, vcs)
@@ -1137,7 +1131,7 @@ def urlsafe_b64encode_nopad(data: bytes) -> str:

 def create_really_basic_wheel(name: str, version: str) -> bytes:
     def digest(contents: bytes) -> str:
-        return "sha256={}".format(urlsafe_b64encode_nopad(sha256(contents).digest()))
+        return f"sha256={urlsafe_b64encode_nopad(sha256(contents).digest())}"

     def add_file(path: str, text: str) -> None:
         contents = text.encode("utf-8")
@@ -1153,13 +1147,11 @@ def create_really_basic_wheel(name: str, version: str) -> bytes:
     add_file(
         f"{dist_info}/METADATA",
         dedent(
-            """\
+            f"""\
             Metadata-Version: 2.1
-            Name: {}
-            Version: {}
-            """.format(
-                name, version
-            )
+            Name: {name}
+            Version: {version}
+            """
         ),
     )
     z.writestr(record_path, "\n".join(",".join(r) for r in records))
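The `digest` helper above builds the hash entries that go into a wheel's RECORD file: a sha256 digest, urlsafe-base64 encoded with the trailing `=` padding stripped, which is the encoding the wheel format (PEP 427) specifies. A self-contained sketch of the two helpers:

```python
import base64
from hashlib import sha256

def urlsafe_b64encode_nopad(data: bytes) -> str:
    # Urlsafe base64, with the "=" padding removed, per the wheel spec.
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")

def digest(contents: bytes) -> str:
    # Same shape as the helper in the diff above.
    return f"sha256={urlsafe_b64encode_nopad(sha256(contents).digest())}"

d = digest(b"hello")
assert d.startswith("sha256=")
# A 32-byte digest would normally encode with one "=" pad; it is stripped.
assert not d.endswith("=")
```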

View File

@@ -38,7 +38,7 @@ class ConfigurationMixin:
             old()

         # https://github.com/python/mypy/issues/2427
-        self.configuration._load_config_files = overridden  # type: ignore[assignment]
+        self.configuration._load_config_files = overridden  # type: ignore[method-assign]

     @contextlib.contextmanager
     def tmpfile(self, contents: str) -> Iterator[str]:

View File

@@ -56,7 +56,7 @@ def local_checkout(
     assert vcs_backend is not None
     vcs_backend.obtain(repo_url_path, url=hide_url(remote_repo), verbosity=0)
-    return "{}+{}".format(vcs_name, Path(repo_url_path).as_uri())
+    return f"{vcs_name}+{Path(repo_url_path).as_uri()}"

 def local_repo(remote_repo: str, temp_path: Path) -> str:

View File

@@ -152,7 +152,7 @@ def html5_page(text: str) -> str:
 def package_page(spec: Dict[str, str]) -> "WSGIApplication":
     def link(name: str, value: str) -> str:
-        return '<a href="{}">{}</a>'.format(value, name)
+        return f'<a href="{value}">{name}</a>'

     links = "".join(link(*kv) for kv in spec.items())
     return text_html_response(html5_page(links))

View File

@@ -107,8 +107,8 @@ class TestPipTestEnvironment:
         """
         command = (
             "import logging; logging.basicConfig(level='INFO'); "
-            "logging.getLogger().info('sub: {}', 'foo')"
-        ).format(sub_string)
+            f"logging.getLogger().info('sub: {sub_string}', 'foo')"
+        )
         args = [sys.executable, "-c", command]
         script.run(*args, **kwargs)

View File

@@ -19,12 +19,12 @@ from tests.lib.wheel import (
 def test_message_from_dict_one_value() -> None:
     message = message_from_dict({"a": "1"})
-    assert set(message.get_all("a")) == {"1"}
+    assert set(message.get_all("a")) == {"1"}  # type: ignore

 def test_message_from_dict_multiple_values() -> None:
     message = message_from_dict({"a": ["1", "2"]})
-    assert set(message.get_all("a")) == {"1", "2"}
+    assert set(message.get_all("a")) == {"1", "2"}  # type: ignore

 def message_from_bytes(contents: bytes) -> Message:
@@ -67,7 +67,7 @@ def test_make_metadata_file_custom_value_list() -> None:
     f = default_make_metadata(updates={"a": ["1", "2"]})
     assert f is not None
     message = default_metadata_checks(f)
-    assert set(message.get_all("a")) == {"1", "2"}
+    assert set(message.get_all("a")) == {"1", "2"}  # type: ignore

 def test_make_metadata_file_custom_value_overrides() -> None:
@@ -101,7 +101,7 @@ def default_wheel_metadata_checks(f: File) -> Message:
     assert message.get_all("Wheel-Version") == ["1.0"]
     assert message.get_all("Generator") == ["pip-test-suite"]
     assert message.get_all("Root-Is-Purelib") == ["true"]
-    assert set(message.get_all("Tag")) == {"py2-none-any", "py3-none-any"}
+    assert set(message.get_all("Tag")) == {"py2-none-any", "py3-none-any"}  # type: ignore
     return message
@@ -122,7 +122,7 @@ def test_make_wheel_metadata_file_custom_value_list() -> None:
     f = default_make_wheel_metadata(updates={"a": ["1", "2"]})
     assert f is not None
     message = default_wheel_metadata_checks(f)
-    assert set(message.get_all("a")) == {"1", "2"}
+    assert set(message.get_all("a")) == {"1", "2"}  # type: ignore

 def test_make_wheel_metadata_file_custom_value_override() -> None:

View File

@@ -190,7 +190,7 @@ def urlsafe_b64encode_nopad(data: bytes) -> str:
 def digest(contents: bytes) -> str:
-    return "sha256={}".format(urlsafe_b64encode_nopad(sha256(contents).digest()))
+    return f"sha256={urlsafe_b64encode_nopad(sha256(contents).digest())}"

 def record_file_maker_wrapper(

View File

@@ -23,7 +23,7 @@ def test_dist_get_direct_url_no_metadata(mock_read_text: mock.Mock) -> None:
     class FakeDistribution(BaseDistribution):
         pass

-    dist = FakeDistribution()
+    dist = FakeDistribution()  # type: ignore
     assert dist.direct_url is None
     mock_read_text.assert_called_once_with(DIRECT_URL_METADATA_NAME)
@@ -35,7 +35,7 @@ def test_dist_get_direct_url_invalid_json(
     class FakeDistribution(BaseDistribution):
         canonical_name = cast(NormalizedName, "whatever")  # Needed for error logging.

-    dist = FakeDistribution()
+    dist = FakeDistribution()  # type: ignore
     with caplog.at_level(logging.WARNING):
         assert dist.direct_url is None
@@ -84,7 +84,7 @@ def test_dist_get_direct_url_valid_metadata(mock_read_text: mock.Mock) -> None:
     class FakeDistribution(BaseDistribution):
         pass

-    dist = FakeDistribution()
+    dist = FakeDistribution()  # type: ignore
     direct_url = dist.direct_url
     assert direct_url is not None
     mock_read_text.assert_called_once_with(DIRECT_URL_METADATA_NAME)

View File

@@ -151,7 +151,7 @@ def test_base_command_provides_tempdir_helpers() -> None:
     c = Command("fake", "fake")
     # https://github.com/python/mypy/issues/2427
-    c.run = Mock(side_effect=assert_helpers_set)  # type: ignore[assignment]
+    c.run = Mock(side_effect=assert_helpers_set)  # type: ignore[method-assign]
     assert c.main(["fake"]) == SUCCESS
     c.run.assert_called_once()
@@ -176,7 +176,7 @@ def test_base_command_global_tempdir_cleanup(kind: str, exists: bool) -> None:
     c = Command("fake", "fake")
     # https://github.com/python/mypy/issues/2427
-    c.run = Mock(side_effect=create_temp_dirs)  # type: ignore[assignment]
+    c.run = Mock(side_effect=create_temp_dirs)  # type: ignore[method-assign]
     assert c.main(["fake"]) == SUCCESS
     c.run.assert_called_once()
     assert os.path.exists(Holder.value) == exists
@@ -200,6 +200,6 @@ def test_base_command_local_tempdir_cleanup(kind: str, exists: bool) -> None:
     c = Command("fake", "fake")
     # https://github.com/python/mypy/issues/2427
-    c.run = Mock(side_effect=create_temp_dirs)  # type: ignore[assignment]
+    c.run = Mock(side_effect=create_temp_dirs)  # type: ignore[method-assign]
     assert c.main(["fake"]) == SUCCESS
     c.run.assert_called_once()

View File

@@ -119,8 +119,8 @@ def test_get_index_content_invalid_content_type_archive(
     assert (
         "pip._internal.index.collector",
         logging.WARNING,
-        "Skipping page {} because it looks like an archive, and cannot "
-        "be checked by a HTTP HEAD request.".format(url),
+        f"Skipping page {url} because it looks like an archive, and cannot "
+        "be checked by a HTTP HEAD request.",
     ) in caplog.record_tuples
@@ -417,8 +417,8 @@ def _test_parse_links_data_attribute(
     html = (
         "<!DOCTYPE html>"
         '<html><head><meta charset="utf-8"><head>'
-        "<body>{}</body></html>"
-    ).format(anchor_html)
+        f"<body>{anchor_html}</body></html>"
+    )
     html_bytes = html.encode("utf-8")
     page = IndexContent(
         html_bytes,
@@ -764,8 +764,8 @@ def test_get_index_content_invalid_scheme(
         (
             "pip._internal.index.collector",
             logging.WARNING,
-            "Cannot look at {} URL {} because it does not support "
-            "lookup as web pages.".format(vcs_scheme, url),
+            f"Cannot look at {vcs_scheme} URL {url} because it does not support "
+            "lookup as web pages.",
         ),
     ]

View File

@@ -215,7 +215,7 @@ class TestConfigurationModification(ConfigurationMixin):
         # Mock out the method
         mymock = MagicMock(spec=self.configuration._mark_as_modified)
         # https://github.com/python/mypy/issues/2427
-        self.configuration._mark_as_modified = mymock  # type: ignore[assignment]
+        self.configuration._mark_as_modified = mymock  # type: ignore[method-assign]

         self.configuration.set_value("test.hello", "10")
@@ -231,7 +231,7 @@ class TestConfigurationModification(ConfigurationMixin):
         # Mock out the method
         mymock = MagicMock(spec=self.configuration._mark_as_modified)
         # https://github.com/python/mypy/issues/2427
-        self.configuration._mark_as_modified = mymock  # type: ignore[assignment]
+        self.configuration._mark_as_modified = mymock  # type: ignore[method-assign]

         self.configuration.set_value("test.hello", "10")
@@ -250,7 +250,7 @@ class TestConfigurationModification(ConfigurationMixin):
         # Mock out the method
         mymock = MagicMock(spec=self.configuration._mark_as_modified)
         # https://github.com/python/mypy/issues/2427
-        self.configuration._mark_as_modified = mymock  # type: ignore[assignment]
+        self.configuration._mark_as_modified = mymock  # type: ignore[method-assign]

         self.configuration.set_value("test.hello", "10")

View File

@@ -143,10 +143,7 @@ class TestLink:
     def test_is_hash_allowed(
         self, hash_name: str, hex_digest: str, expected: bool
     ) -> None:
-        url = "https://example.com/wheel.whl#{hash_name}={hex_digest}".format(
-            hash_name=hash_name,
-            hex_digest=hex_digest,
-        )
+        url = f"https://example.com/wheel.whl#{hash_name}={hex_digest}"
         link = Link(url)
         hashes_data = {
             "sha512": [128 * "a", 128 * "b"],

View File

@@ -27,6 +27,11 @@ class TestSafeFileCache:
         cache = SafeFileCache(os.fspath(cache_tmpdir))
         assert cache.get("test key") is None
         cache.set("test key", b"a test string")
+        # Body hasn't been stored yet, so the entry isn't valid yet
+        assert cache.get("test key") is None
+        # With a body, the cache entry is valid:
+        cache.set_body("test key", b"body")
         assert cache.get("test key") == b"a test string"
         cache.delete("test key")
         assert cache.get("test key") is None
@@ -35,6 +40,12 @@ class TestSafeFileCache:
         cache = SafeFileCache(os.fspath(cache_tmpdir))
         assert cache.get_body("test key") is None
         cache.set_body("test key", b"a test string")
+        # Metadata isn't available, so the entry isn't valid yet (this
+        # shouldn't happen, but just in case)
+        assert cache.get_body("test key") is None
+        # With metadata, the cache entry is valid:
+        cache.set("test key", b"metadata")
         body = cache.get_body("test key")
         assert body is not None
         with body:
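The added assertions encode a contract: a `SafeFileCache` entry only becomes valid once both the metadata and the body have been written. A toy in-memory model of that contract (illustrative only; pip's `SafeFileCache` stores the two parts as separate files on disk):

```python
class TwoPartCache:
    """Entry is usable only when metadata AND body are both present."""

    def __init__(self) -> None:
        self._meta: dict = {}
        self._body: dict = {}

    def set(self, key: str, metadata: bytes) -> None:
        self._meta[key] = metadata

    def set_body(self, key: str, body: bytes) -> None:
        self._body[key] = body

    def get(self, key: str):
        # Metadata alone is not a complete cache entry.
        if key in self._meta and key in self._body:
            return self._meta[key]
        return None

cache = TwoPartCache()
cache.set("k", b"metadata")
assert cache.get("k") is None  # body missing: not valid yet
cache.set_body("k", b"body")
assert cache.get("k") == b"metadata"  # both parts present: valid
```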

View File

@@ -21,8 +21,8 @@ def test_raise_for_status_raises_exception(status_code: int, error_type: str) ->
     with pytest.raises(NetworkConnectionError) as excinfo:
         raise_for_status(resp)
     assert str(excinfo.value) == (
-        "{} {}: Network Error for url:"
-        " http://www.example.com/whatever.tgz".format(status_code, error_type)
+        f"{status_code} {error_type}: Network Error for url:"
+        " http://www.example.com/whatever.tgz"
     )

View File

@@ -235,8 +235,8 @@ class TestRequirementSet:
                 r"file \(line 1\)\)\n"
                 r"Can't verify hashes for these file:// requirements because "
                 r"they point to directories:\n"
-                r" file://.*{sep}data{sep}packages{sep}FSPkg "
-                r"\(from -r file \(line 2\)\)".format(sep=sep)
+                rf" file://.*{sep}data{sep}packages{sep}FSPkg "
+                r"\(from -r file \(line 2\)\)"
             ),
         ):
             resolver.resolve(reqset.all_requirements, True)

View File

@@ -297,7 +297,7 @@ class TestProcessLine:
     def test_yield_line_constraint(self, line_processor: LineProcessor) -> None:
         line = "SomeProject"
         filename = "filename"
-        comes_from = "-c {} (line {})".format(filename, 1)
+        comes_from = f"-c {filename} (line {1})"
         req = install_req_from_line(line, comes_from=comes_from, constraint=True)
         found_req = line_processor(line, filename, 1, constraint=True)[0]
         assert repr(found_req) == repr(req)
@@ -326,7 +326,7 @@ class TestProcessLine:
         url = "git+https://url#egg=SomeProject"
         line = f"-e {url}"
         filename = "filename"
-        comes_from = "-c {} (line {})".format(filename, 1)
+        comes_from = f"-c {filename} (line {1})"
         req = install_req_from_editable(url, comes_from=comes_from, constraint=True)
         found_req = line_processor(line, filename, 1, constraint=True)[0]
         assert repr(found_req) == repr(req)
@@ -873,12 +873,10 @@ class TestParseRequirements:
     ) -> None:
         global_option = "--dry-run"

-        content = """
-        --only-binary :all:
-        INITools==2.0 --global-option="{global_option}"
-        """.format(
-            global_option=global_option
-        )
+        content = f"""
+        --only-binary :all:
+        INITools==2.0 --global-option="{global_option}"
+        """

         with requirements_file(content, tmpdir) as reqs_file:
             req = next(

View File

@@ -252,7 +252,7 @@ class TestCheckDistRequiresPython:
             def metadata(self) -> email.message.Message:
                 raise FileNotFoundError(metadata_name)

-        dist = make_fake_dist(klass=NotWorkingFakeDist)
+        dist = make_fake_dist(klass=NotWorkingFakeDist)  # type: ignore

         with pytest.raises(NoneMetadataError) as exc:
             _check_dist_requires_python(
@@ -261,8 +261,8 @@ class TestCheckDistRequiresPython:
                 ignore_requires_python=False,
             )
         assert str(exc.value) == (
-            "None {} metadata found for distribution: "
-            "<distribution 'my-project'>".format(metadata_name)
+            f"None {metadata_name} metadata found for distribution: "
+            "<distribution 'my-project'>"
         )

View File

@@ -66,7 +66,7 @@ def test_rev_options_repr() -> None:
     # First check VCS-specific RevOptions behavior.
     (Bazaar, [], ["-r", "123"], {}),
     (Git, ["HEAD"], ["123"], {}),
-    (Mercurial, [], ["-r=123"], {}),
+    (Mercurial, [], ["--rev=123"], {}),
     (Subversion, [], ["-r", "123"], {}),
     # Test extra_args. For this, test using a single VersionControl class.
     (

View File

@@ -102,15 +102,13 @@ def test_get_legacy_build_wheel_path__multiple_names(
     ],
 )
 def test_get_entrypoints(tmp_path: pathlib.Path, console_scripts: str) -> None:
-    entry_points_text = """
+    entry_points_text = f"""
         [console_scripts]
-        {}
+        {console_scripts}
         [section]
         common:one = module:func
         common:two = module:other_func
-        """.format(
-        console_scripts
-    )
+        """

     distribution = make_wheel(
         "simple",
tools/__init__.py (new, empty file)

View File

@@ -27,7 +27,7 @@ def is_this_a_good_version_number(string: str) -> Optional[str]:
     expected_major = datetime.now().year % 100
     if len(release) not in [2, 3]:
-        return "Not of the form: {0}.N or {0}.N.P".format(expected_major)
+        return f"Not of the form: {expected_major}.N or {expected_major}.N.P"
     return None