| Commit message | Author | Age | Files | Lines |
|
|
|
|
|
|
|
|
|
|
|
|
| |
This effectively merges branch 'typofixes-vlajos-20150807' of
github.com:vlajos/apt with the following commit:

    commit 13cacb3e2e2352ba701e769fc889e3344fabbf7e
    Author: Veres Lajos <vlajos@gmail.com>
    Date:   Sun Aug 9 00:12:53 2015 +0100

        typofix - https://github.com/vlajos/misspell_fixer

It has been rebased for a better commit message.
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
If a single pdiff fails, we have to fail the entire patching endeavour
and fall back to getting the complete file instead. That is easy for
server-side merged pdiffs as we get them one by one. For client-side
merging we get them all at once though, which means that a failure in
one has to stop the entire pipeline – and that part works as expected
(as proven by the bug reporters, who don't even notice it happening).
The problem is just that the first failing pdiff will do the cleanup, so
another pdiff which happens to be acquired successfully after we
processed the failure no longer finds the file it is supposed to use as
its base, so the patch is renamed to what should be its unique extension
and moved into the current working directory. Processing then stops as
the patch realizes that it isn't the last one to complete downloading.
On the plus side this means it is neither us using a bad temporary
location nor a security problem. It "just" unconditionally overwrites
files in your current working directory (if you happen to have them
named like a pdiff patch – a bit unlikely perhaps) and drops files there
which are never used again.
I guess this was introduced for real in
4e3c5633b1e74b4f58b95f339cfbbf4cbf21ab3e, as that made the need for the
existence of the base file rather explicit, but the potential has
lingered in the code for far longer.
Closes: #816837
|
|
|
|
|
|
|
|
|
| |
Fixed in f7bd44bae0d7cb7f9838490b5eece075da83899e already, but that
commit is missing the Closes tag, and while we are at it we can add a
simple regression test and micro-optimize it a bit.
Thanks: James McCoy for the suggestion.
Closes: 816691
|
|
|
|
|
|
|
|
|
|
|
| |
In a249b3e6fd798935a02b769149c9791a6fa6ef16 I dropped, together with
the manual first resolver step, the support for installing build-deps as
automatically installed, in such a way that it behaved as if this option
was enabled by default.
Restoring support for it means that we go back to marking
build-dependencies as manually installed by default and provide this
option to keep them as automatically installed.
|
|
|
|
|
|
| |
This way we hopefully notice (new) warnings in this little helper.
Git-Dch: Ignore
|
|
|
|
|
|
|
|
|
|
|
|
| |
Changelogs are relatively small and we have no hashes for them, but we
had support for partial files for them before, so let's stick to it.
This also deletes the (partial) file before moving the downloaded file
into its place – rename(2) should be doing this by itself, but testing
on semaphoreci suggests that this isn't always the case (the error is
"Stale file handle") and we don't need an atomic replace here, so be
explicit.
Git-Dch: Ignore
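A minimal sketch of the explicit delete-then-rename pattern described
above (not apt's actual code; the helper name is made up):

    #include <cstdio>
    #include <cerrno>

    // Delete the destination first so a filesystem that mishandles rename(2)
    // cannot leave us with a "Stale file handle", then move the download in.
    static bool MoveIntoPlace(char const *partial, char const *dest)
    {
       if (std::remove(dest) != 0 && errno != ENOENT)
          return false;                    // an existing file we cannot remove
       return std::rename(partial, dest) == 0;
    }

We give up atomicity here, but for a changelog that is an acceptable
trade-off.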
|
|
|
|
|
|
| |
If the architecture list is empty somehow, fail normally.
LP: #1549819
|
|
|
|
|
|
| |
The EDSP output generated by apt didn't include the versioned provides
information, so every provides looked like an unversioned one in the
eyes of an external resolver.
|
|
|
|
|
|
|
|
|
|
| |
pkgAcqChangelog has the default behaviour of downloading a changelog to
a temporary directory (inside /tmp, not /tmp directly), which is cleaned
up on shutdown, but this can be overridden to store the changelog more
permanently – which carries a permission problem.
For changelogs we can 'easily' solve this by always downloading to a
temporary directory and only moving the file out of there once done.
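Roughly, the approach looks like this sketch (illustrative only; error
handling and the actual fetch are omitted, the function name is made up):

    #include <cstdio>
    #include <cstdlib>
    #include <string>

    // Download into a freshly created private directory and only move the
    // result to its (permission-sensitive) final place once we are done.
    static bool FetchChangelogTo(std::string const &finaldest)
    {
       char tmpl[] = "/tmp/apt-changelog-XXXXXX";
       if (mkdtemp(tmpl) == nullptr)
          return false;
       std::string const partial = std::string(tmpl) + "/changelog";
       // … acquire the changelog into 'partial' here …
       return std::rename(partial.c_str(), finaldest.c_str()) == 0;
    }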
|
|
|
|
|
|
|
|
|
|
|
| |
If pkgAcqChangelog is told to acquire the changelog for a version it
will first check if this version is installed on disk and, if so, will
use the local changelog in /usr/share/doc (possibly/likely gz
compressed) instead of downloading the file from the web.
An option is provided to disable this; it is enabled by default for the
Ubuntu vendor as they truncate the local changelogs – and for apt's
--print-uris action.
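A hedged sketch of that lookup (the paths follow Debian policy; the
helper is illustrative and not apt's real interface):

    #include <string>
    #include <sys/stat.h>

    // Prefer the changelog shipped with the installed package, which may or
    // may not be gzip-compressed, and fall back to downloading otherwise.
    static std::string LocalChangelogFor(std::string const &pkg)
    {
       auto const exists = [](std::string const &path) {
          struct stat st;
          return stat(path.c_str(), &st) == 0;
       };
       std::string const base = "/usr/share/doc/" + pkg + "/changelog.Debian";
       if (exists(base + ".gz"))
          return base + ".gz";
       if (exists(base))
          return base;
       return "";   // not usable locally – download it instead
    }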
|
|
|
|
|
|
|
|
|
|
|
| |
Otherwise the test run as root fails seeing the
W: Can't drop privileges for downloading as file 'foo_1.tar.gz' couldn't be
accessed by user '_apt'. - pkgAcquire::Run (13: Permission denied)
warning in a command which isn't supposed to warn.
One trivial test, two fixups and still counting…
Git-Dch: Ignore
|
|
|
|
|
|
|
|
| |
Travis still uses a dpkg version which defaults to gz, and as it isn't
all that important which compression is picked as long as one is, just
accept any.
Git-Dch: Ignore
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Regression introduced in a249b3e6fd798935a02b769149c9791a6fa6ef16,
which in the case of an invalid cache would build the first part
unlocked and later pick up the (still unlocked) cache for further
processing, so the system never got locked and apt would end up
complaining about being unable to release the lock at shutdown.
The far more common case of having a valid cache worked as expected and
hence covered up the problem – especially as the tests which would have
noticed it are simulations only, which do not lock.
Closes: 814139
Reported-By: Balint Reczey <balint@balintreczey.hu>
Reported-By: Helmut Grohne <helmut@subdivi.de> on IRC
|
| |
|
|
|
|
|
|
| |
We don't need the dependencies for obvious reasons and we don't need
the candidate version either, so building a pkgDepCache is wasted
effort, which we can stop doing now that build-dep has cleared the path.
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
The latter just calls the former, but the latter needs the full-blown
dependency cache to be initialized, which is a very costly operation and
isn't done that early in the run anymore, as we would need to throw it
away and rebuild it again after we got all the information about source
packages.
As we end up with a nullptr for the pkgDepCache, we use a slightly
longer calling convention to make sure that we use the pkgCache
directly, avoiding nullptr-induced segfaults and costly operations.
Git-Dch: Ignore
Reported-By: Balint Reczey <balint@balintreczey.hu>
|
|
|
|
| |
Git-Dch: Ignore
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
The Date field in the Release file is useful to avoid allowing an
attacker to 'downgrade' a user to earlier Release files (and hence to
older states of the archive with open security bugs). It is also needed
to allow a user to define min/max values for the validation of a Release
file (with or without the Release file providing a Valid-Until field).
APT wasn't formally requiring this field before though, and the
(arguably not binding and still incomplete) online documentation
declares it optional (until now), so we downgrade the error to a warning
for now to give repository creators a bit more time to adapt – the
bigger ones should have had a Date field for years already, so the
affected group should be small in any case.
It should be noted that earlier apt versions had this as an error
already, but only showed it if a Valid-Until field was present (or the
user tried to use the configuration items for min/max valid-until).
Closes: 809329
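A rough sketch of the kind of check the Date field enables (the option
semantics are simplified; maxAge stands in for the real configuration
items and is an assumption, not apt's actual interface):

    #include <ctime>

    // Reject Release files that move backwards in time (downgrade protection)
    // and derive an implicit expiry from Date if no Valid-Until is given.
    static bool ReleaseDateAcceptable(time_t newDate, time_t previousDate,
                                      time_t now, time_t maxAge /* 0 = off */)
    {
       if (newDate < previousDate)
          return false;               // older than the Release we already have
       if (maxAge > 0 && newDate + maxAge < now)
          return false;               // expired even without a Valid-Until field
       return true;
    }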
|
|
|
|
|
|
|
|
|
| |
In 321213f0dcdcdaab04e01663e7a047b261400c9c Andreas Cadhalpun corrected
the incorrect overriding of earlier better-fitting results with later
(semi-)matches – but that broke the case in which packages are present
in multiple releases in the same version (and the user has both releases
configured).
Closes: 812497
|
|
|
|
|
|
|
|
|
|
|
| |
In commit a221efc331693f8905da870141756c892911c433 I promoted the
source package name and version to the binary cache for faster access by
e.g. EDSP, but due to changing the interpretation length too soon we
always ignored the version part of the Source field, so that packages
ended up having the binary version as their source version – which,
while usually just fine, is wrong for binary rebuilds.
Closes: 812492
|
|
|
|
| |
Git-Dch: Ignore
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
build-dep was implemented by parsing the build-dependencies of a
package and figuring out which packages to install/remove based on this.
That means that for the first level of dependencies build-dep was
implementing its very own resolver, with all the benefits (aka: bugs)
this brings for not using the existing resolver for all levels.
Making this work involves generating a dummy binary package with fitting
Depends and Conflicts, and as we can't create it out of thin air the
cache generation needs to be involved, so we end up writing a Packages
file which we want to parse – after we have parsed the other Packages
files already. With .dsc/.deb files we could add them before we started
parsing anything.
With a bit of care we can avoid generating too much data we have to
throw away again (as many parts assume that e.g. the count of packages
doesn't change midair), so speed-wise there shouldn't be much of a
difference, but the output can be slightly confusing: if we have a
completely valid cache on disk, "Reading package lists... Done" is
printed twice – but apt is pretty quick about it in that case.
Closes: #137560, #444930, #489911, #583914, #728317, #812173
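A hypothetical sketch of the dummy-package idea (the stanza layout and
the 'builddeps:' naming are illustrative assumptions, not necessarily
what apt actually writes):

    #include <string>
    #include <vector>

    // Turn the build-dependencies of a source package into a binary package
    // stanza the normal resolver can simply be asked to install.
    static std::string BuildDepDummyStanza(std::string const &srcpkg,
                                           std::string const &version,
                                           std::vector<std::string> const &deps,
                                           std::vector<std::string> const &conflicts)
    {
       auto const join = [](std::vector<std::string> const &v) {
          std::string out;
          for (auto const &s : v)
             out += (out.empty() ? "" : ", ") + s;
          return out;
       };
       std::string stanza = "Package: builddeps:" + srcpkg + "\n"
                            "Version: " + version + "\n"
                            "Architecture: any\n";
       if (deps.empty() == false)
          stanza += "Depends: " + join(deps) + "\n";
       if (conflicts.empty() == false)
          stanza += "Conflicts: " + join(conflicts) + "\n";
       return stanza;
    }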
|
|
|
|
|
|
|
|
|
| |
To resolve dependencies like "pkg:arch" we create a package with the
name "pkg:arch" and the architecture "any". We create these packages
only if a dependency needs them, as this kind of dependency isn't that
common. This commit ensures that, in the event this architecture-specific
dependency is the only relation the package has, we still create the
underlying package so that it is available for provides resolution.
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Introduced in 9d2a8a7388cf3b0bbbe92f6b0b30a533e1167f40, apt tries to
merge actions like downloading the same file (as judged by hashes) into
doing it only once. The implementation was very simple in that it does
no planning at all. It turns out that this works just fine 90% of the
time, but has issues in more complicated situations in which items can
be in different stages downloading different files, emitting potentially
the "wrong" hash – e.g. while pdiffs are worked on we might end up
copying the patch instead of the result file, giving us very strange
errors in return. Reverting the change until we can implement a better
planning solution seems to be the best course of action, even if it's
sad.
Closes: 810046
|
|
|
|
|
|
|
|
| |
Packages belonging to an architecture which is neither the native nor a
foreign architecture (dubbed 'barbarian' for now) which are marked
M-A:foreign still provide in their own architecture, even if not for
others. Also, other M-A:foreign (and allowed) packages provide in these
barbarian architectures.
|
|
|
|
| |
Closes: #734922
|
|
|
|
|
|
|
|
|
| |
This prevents a test suite failure on systems with weird umasks.
Also set umask 000 at the beginning so we can actually check for
that anywhere.
Gbp-Dch: ignore
|
|
|
|
|
|
|
| |
Just enabling it for everyone sometimes breaks with HTTP/1.0 servers
and proxies.
Closes: #810796
|
|
|
|
|
|
|
| |
The code already deals with compressed leftovers, but forgot about the
uncompressed files. The opportunity is taken to reorder this code and
add debug messages about the actions taken, as well as to produce such a
leftover file in the associated testcase.
|
|
|
|
|
|
|
|
|
|
|
|
| |
With the addition of the $HASH-Download field in the .diff/Index we got
the size of the compressed patches for 'free', so if that information is
available we can use it for a more fitting calculation of the size
requirements of the patches vs. the complete file.
Note that this predicts a too-small size in the transition case in which
the information isn't available for all patches, but figuring this out
would be a lot of code for practically nothing, as only one update can
ever be in such a transition phase.
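In effect the decision becomes something like the following sketch
(names are abbreviated and illustrative; a patch without the new field
simply contributes nothing, which is where the transitional
underestimate comes from):

    #include <vector>

    struct PatchSizes
    {
       unsigned long long DownloadSize;   // from $HASH-Download, 0 if unknown
    };

    // Prefer pdiffs only if fetching all patches is cheaper than fetching
    // the complete file outright.
    static bool PdiffsAreWorthIt(std::vector<PatchSizes> const &patches,
                                 unsigned long long completeFileSize)
    {
       unsigned long long total = 0;
       for (auto const &p : patches)
          total += p.DownloadSize;        // unknown sizes count as 0 for now
       return total < completeFileSize;
    }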
|
|
|
|
|
|
|
|
| |
Some (older) versions of bash seem to be allergic to a method named
"aptautotest_grep_^apt" (note the caret). It is unlikely that we are
going to write autotests for such commands so we could just skip those,
but let's instead just use "normal" characters in the names and strip
the rest, as we already did with the (arguably more common) '-'.
|
|
|
|
|
|
| |
This way it works more similarly to the compressor binaries, which we
can in this way relieve of their job in the test framework, avoiding the
need to add e.g. liblz4-tool to the test dependencies.
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Downloading and storing are two different operations where different
compression types can be preferred. For downloading we provide the
choice via Acquire::CompressionTypes::Order, as there is a choice to be
made between download size and speed – and we are limited by what is
available in the repository.
Storage on the other hand has all compressions currently supported by
apt available, and to reduce the runtime of tools accessing these files
the compression type should be a low-cost format in terms of
decompression.
apt traditionally stores its indexes uncompressed on disk, but has
options to keep them compressed. Now that apt downloads additional files
we also deal with files which simply can't be stored uncompressed as
they are just too big (like Contents for apt-file). Traditionally they
are downloaded in a low-cost format (gz) as repositories do not provide
other formats, but there might be even lower-cost formats, and for
download we could introduce higher-cost ones in the repositories.
Downloading an entire index now potentially requires recompression to
another format, so an update potentially takes longer – but big files
are usually updated via pdiffs, which have to de- and re-compress anyhow
and do it on the fly, so there is no extra time needed, and in general
it seems to be beneficial to invest the time at update to save time
later on file access.
|
|
|
|
|
|
| |
Less hardcoding should help while introducing new compressors.
Git-Dch: Ignore
|
|
|
|
|
|
|
| |
There is no reason to enforce that the file we start the bootstrap with
is compressed with a compressor which is available online. This allows
us to change the on-disk format as well as to deal with repositories
adding/removing support for a specific compressor.
|
|
|
|
|
|
|
|
|
|
|
| |
Adding support for a new compressor meant adding a new method as well –
even if that boiled down to just linking to our generalized decompressor
under a new name. That is unneeded busywork if we can instead just call
the generalized decompressor and let it figure out which compressor to
use based on the filename rather than on the program name.
For compatibility we still ship 'gzip', 'bzip2' and co, but they are
just links to our "new" 'store' method.
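Conceptually the 'store' method now does something like this sketch:
pick the decompressor from the file extension instead of from the name
the binary was invoked as (the mapping shown is illustrative):

    #include <string>

    static std::string CompressorForFilename(std::string const &name)
    {
       auto const endsWith = [&name](std::string const &suffix) {
          return name.size() >= suffix.size() &&
                 name.compare(name.size() - suffix.size(), suffix.size(), suffix) == 0;
       };
       if (endsWith(".gz"))   return "gzip";
       if (endsWith(".bz2"))  return "bzip2";
       if (endsWith(".xz"))   return "xz";
       if (endsWith(".lzma")) return "lzma";
       if (endsWith(".lz4"))  return "lz4";
       return "store";        // no compression: just copy the data through
    }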
|
|
|
|
| |
Gbp-Dch: ignore
|
|
|
|
|
|
|
|
|
|
| |
This option controls whether downloaded packages should be kept after a
successful install or whether they should be deleted. The default for
"apt-get" is that they are kept (just like before); however, the default
for "apt" is that they get deleted.
Closes: #160743
|
|
|
|
|
|
|
|
|
|
| |
apt_preferences and deb822-style sources used the specialized class
pkgUserTagSection to deal with comments before/after a given stanza, but
it couldn't deal with comments inside the stanza at all.
codesearch suggests that nobody else uses it, and a vastly superior way
of working with potentially commented files is implemented now, so we
can officially discourage the use of this old, incomplete hack of a
class.
|
|
|
|
|
|
|
|
|
| |
Now (55153bf94ff28a23318e79aa48242244c4d82b3c) that pkgTagFile can be
told to deal with all sorts of comments, we can use this mode to parse
dsc files (as a by-catch) and debian/control files properly, even in the
face of multiline fields spliced with comments, like Build-Depends.
Closes: 806775
|
|
|
|
| |
This test relies on the ordering of the hash function.
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| |
Debian has a Packages file for arch:all already, but the arch:any files
contain the arch:all packages as well, so downloading it would be a
total waste of resources. Getting this solved is on the list of things
to do, but it is also the hardest part – for index targets like Contents
the situation is much easier and fewer server/client implementations are
involved, so we might not want to stall them.
A repository can now declare via:
  No-Support-for-Architecture-all: Packages
that even if an arch:all Packages file exists, it shouldn't be
downloaded, so that support for Contents files can be added now.
See also 1dd20368486820efb6ef4476ad739e967174bec4 for the implementation
of downloading arch:all index targets, which this is limiting.
The field uses the name of the target from the apt configuration for
simplicity and is negative by design, as this field is intended to be
supported/needed only for a "short" time (one or two Debian releases).
While this commit theoretically supports any target, it is expected to
only see "Packages" as a value in reality.
|
|
|
|
|
|
| |
We tried to acquire the locks, but we didn't stop if we failed to get
them…
Closes: 808561
|
|
|
|
|
|
|
|
|
| |
The output changes slightly between different gpg versions, which we
already dealt with in the main testcase for apt-key, but there are two
more testcases which do not test both versions explicitly and so still
had gpg1 output to check against, as this is the default at the moment.
Git-Dch: Ignore
|
|
|
|
|
|
|
|
|
|
|
|
| |
apt-key internally creates a script (since ~1.1) which it will call to
avoid dealing with an array of different options in the code itself, but
while writing this script it wraps the values in "", which will cause
the shell to evaluate their content upon execution.
To make 'use' of this, either set an absolute gpg command or set TMPDIR
to something as interesting as:
"/tmp/This is fü\$\$ing cràzy, \$(man man | head -n1 | cut -d' ' -f1)\$!"
Whether such paths can be encountered in reality is a different
question…
|
|
|
|
|
|
|
| |
This doesn't allow all tests to run cleanly, but it at least allows
writing tests which could run successfully in such environments.
Git-Dch: Ignore
|
|
|
|
|
| |
This filters out errors due to timing issues. It exits early if enough
pulses have occurred.
|
|
|
|
|
|
|
|
| |
This helps writing test cases. Also adapt the test case that
expected 64-bit.
Nothing changes performance wise, the distribution of the hash
values remains intact.
|
|
|
|
|
|
| |
This makes the test suite work on platforms where long is 32 bits.
Gbp-Dch: ignore
|
|
|
|
|
|
|
|
|
|
|
| |
Use asprintf() so we have easy error detection and do not depend
on PATH_MAX.
Do not add another separator to the generated path, in both cases
the path inside the chroot is guaranteed to have a leading /
already.
Also pass -Wall to gcc.
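The pattern boils down to something like this sketch (the function name
is invented; it assumes glibc's asprintf() and a path that already
carries its leading '/'):

    #ifndef _GNU_SOURCE
    #define _GNU_SOURCE           /* for asprintf() */
    #endif
    #include <stdio.h>

    /* Build "<chroot><path>" without a PATH_MAX-sized buffer and without
     * adding a second separator. */
    static char *chrooted_path(const char *chrootdir, const char *path)
    {
       char *result = NULL;
       if (asprintf(&result, "%s%s", chrootdir, path) < 0)
          return NULL;             /* easy error detection: a single < 0 check */
       return result;              /* caller frees */
    }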
|