Problems found with existing digests:
    Package memconf distfile memconf-2.16/memconf.gz
        b6f4b736cac388dddc5070670351cf7262aba048 [recorded]
        95748686a5ad8144232f4d4abc9bf052721a196f [calculated]

Problems found locating distfiles:
    Package dc-tools: missing distfile dc-tools/abs0-dc-burn-netbsd-1.5-0-gae55ec9
    Package ipw-firmware: missing distfile ipw2100-fw-1.2.tgz
    Package iwi-firmware: missing distfile ipw2200-fw-2.3.tgz
    Package nvnet: missing distfile nvnet-netbsd-src-20050620.tgz
    Package syslog-ng: missing distfile syslog-ng-3.7.2.tar.gz
Otherwise, existing SHA1 digests verified and found to be the same on
the machine holding the existing distfiles (morden). All existing
SHA1 digests retained for now as an audit trail.
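For reference, a mismatch like the memconf one above can be reproduced by
recomputing the distfile's digest and comparing it against the SHA1 line
recorded in the package's distinfo. The paths below assume the default
/usr/pkgsrc layout and that the package lives under sysutils/memconf; they
are illustrative only:

    $ digest SHA1 /usr/pkgsrc/distfiles/memconf-2.16/memconf.gz
    SHA1 (/usr/pkgsrc/distfiles/memconf-2.16/memconf.gz) = 95748686a5ad8144232f4d4abc9bf052721a196f
    $ cd /usr/pkgsrc/sysutils/memconf
    $ make checksum

The checksum target performs the same kind of comparison against distinfo
that produced the [recorded]/[calculated] report above.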
Use INSTALLATION_DIRS, and replace all occurrences of ${PREFIX}/man
with ${PREFIX}/${PKGMANDIR}.
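As a sketch of what such a conversion typically looks like in a package
Makefile (the foo.1 man page and the install line are hypothetical):

    +INSTALLATION_DIRS=     bin ${PKGMANDIR}/man1
     [...]
    -       ${INSTALL_MAN} ${WRKSRC}/foo.1 ${PREFIX}/man/man1
    +       ${INSTALL_MAN} ${WRKSRC}/foo.1 ${PREFIX}/${PKGMANDIR}/man1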
Fixes PR 35265, although I did not use the patch provided therein.
Point MAINTAINER to pkgsrc-users@NetBSD.org in the case where no
developer is officially maintaining the package.
The rationale for changing this from "tech-pkg" to "pkgsrc-users" is
that it implies that any user can try to maintain the package (by
submitting patches to the mailing list). Since the folks most likely
to care about the package are the folks that want to use it or are
already using it, this would leverage the energy of users who aren't
developers.
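In practice this amounts to the usual one-line change in each affected
package Makefile, for example:

    -MAINTAINER=    tech-pkg@NetBSD.org
    +MAINTAINER=    pkgsrc-users@NetBSD.org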
vip lets you run the editor of your choice ($EDITOR or $VISUAL) at any
point in a pipe. Prompted by a nudge from David Maxwell.
Normally, in a pipeline, when you need to edit some phase of the data
stream, you use a standard tool such as sed, grep, or awk to alter,
filter, or otherwise manipulate the stream. One potential problem with
this approach is that the manipulations have to be very well thought out
in advance. Another is that the manipulations will probably need to be
applied uniformly. And third, the data must be very well understood in
advance. Not all situations and data easily conform to these
constraints.
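For instance, a fixed-in-advance pipeline of standard tools might look
like this (the log file name and field layout are made up for
illustration):

    $ grep ERROR app.log | sed -e 's/  */ /g' | awk '{ print $1, $3 }' | sort | uniq -c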
Alternatively, when the changes needed for the data are more than
trivial, or perhaps you just don't feel like expending the mental energy
needed to work out all the expressions in advance, a typical approach
might be to run some process or pipeline, dump output to a file, edit
the file with vi, pico, or emacs, then push the data along to the next
phase by using the file as input to some additional process or pipeline.
The catch here - other than the sheer awkwardness of this process - is
that you have to remember to come back later and clean up all of those
little and not-so-little "temporary" files.
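Concretely, that detour tends to look something like this (file names
are illustrative):

    $ grep ERROR app.log > /tmp/errors.$$
    $ vi /tmp/errors.$$
    $ sort /tmp/errors.$$ | uniq -c > report.txt
    $ rm /tmp/errors.$$        # the cleanup step that is easy to forget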
So, wouldn't you just like to be able to tap in an edit session at any
arbitrary point in the pipeline, do your magic on the data, then have it
automagically continue on its merry way? The vip program provides this
functionality, and operates syntactically just like any other filter.
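Since vip is just another filter, a hypothetical use looks like this
(the log file name is made up):

    $ grep ERROR app.log | vip | sort | uniq -c > report.txt

vip collects the stream, hands it to $EDITOR (or $VISUAL), and sends
whatever you save back down the pipe, leaving no temporary files behind.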