[aur-general] [arch-general] Please settle 'base' in 'depends' for all

Xyne xyne at archlinux.ca
Fri Jan 21 10:57:12 EST 2011


On 2011-01-22 01:29 +1000,
Allan McRae wrote:

> On 22/01/11 00:43, Xyne wrote:
> > Allan McRae wrote:
> >
> >> I pointed out that hard rules are not good.   e.g. coreutils should (and
> >> does) depend on glibc as it is not guaranteed that glibc is installed at
> >> the time when you first install coreutils (which is likely the initial
> >> install).   But there is no point putting glibc in the depends list for
> >> (e.g.) openoffice-base as it will be installed by that stage.
> >
> > That's irrelevant to this discussion because it's a bootstrapping issue. I
> > don't know how cyclical dependencies and other initialization problems should
> > be handled, but they constitute a special case that should be detected and
> > dealt with separately.
> 
> Isn't this exactly the issue here?  The original question was whether we 
> should include glibc in the dependency list for a package.  I pointed 
> out a case where including glibc in the depends is critical and a case 
> where it is a waste of time, indicating there is no one answer to such a 
> question.

Sorry, I misread your reply. Ignore what I wrote.


> 
> 
> >> Two points to consider:
> >> 1) How much more complicated would it be to list all dependencies?
> >>
> >>   >  readelf -d $(pacman -Qql openoffice-base) 2>/dev/null | grep NEEDED |
> >> sort | uniq | wc -l
> >> 150
> >>
> >> That is a lot of libraries... although some will be in the same package
> >> so that is an upper estimate.   But that is only libraries and the
> >> complete dep list will be longer than that.
> >
> > I agree that is a lot. Of course we can't reasonably expect a packager to
> > manually enter 150 libraries into a PKGBUILD, but are all of those direct
> > dependencies? Maybe this is a silly question due to my ignorance of linking,
> > but are any of those libraries linked via other packages? For example, if bar
> > provides code that links to baz, and foo builds against bar, would baz turn up
> > in the readelf output for foo? If the answer is yes, then baz would not be a
> > dep of foo, even if foo links to it, because the linking was established
> > "indirectly", i.e. bar could have used something else.
> >
> > Of course, in that case, baz would be a strict runtime dependency (unless
> > sodeps could resolve this, but again, my understanding here is limited), but
> > from a graph-theory pov, foo would only depend on bar. Such a situation would
> > only require a rebuild, just as it would now (i.e. if baz were replaced by
> > something else).
> 
> The answer is no.  "readelf -d" only lists directly linked libraries. 
> "ldd" gives the entire link chain.


So if I wrote bindings to libalpm in Haskell (haskell-libalpm) and then created
a package with a binary that used those bindings (foo), then readelf's output
would not indicate libalpm?
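A quick way to see the distinction Allan describes, using /bin/sh as a stand-in binary (output varies by system; the `|| true` guards are only there so the snippet degrades gracefully where readelf is not installed):

```shell
# Direct dependencies: only the DT_NEEDED entries recorded in the binary itself.
readelf -d /bin/sh | grep NEEDED || true
# Full load-time chain: the above plus the dependencies of those libraries.
ldd /bin/sh || true
```

So in the haskell-libalpm scenario, readelf -d on foo's binary would list only what foo itself was linked against, while ldd would also pull in whatever those libraries need.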


> 
> >> 2) It is worth the effort?   We have very few bug reports about missing
> >> dependencies and most (all?) of those fall into the category of missed
> >> soname bumps or due to people not building in chroots.  I.e. these are
> >> because of poor packaging and not because we make assumptions about what
> >> packages are installed or the dependencies of dependencies.
> >>
> >>
> >> So I see making a change to the current approach as making things (1)
> >> more complicated for (2) no real benefit.
> >
> > The answer depends on the answer to my previous question. The current system
> > does indeed work, but it provides no strict guarantees. I think good practice
> > in general is to make something that is critical as reliable and future-proof
> > as possible, and I see that as a true benefit. It's like wanting to agree upon
> > an open specification instead of just letting everyone do it their way and hope
> > for a triumph of common sense.
> >
> > Admittedly, I doubt it would be a problem in the future and I'm discussing this
> > idealistically.
> 
> Idealistically, I might even agree with you.  I just think it is not a 
> practical thing to do.  And if we move away from just using binary 
> libraries as examples, we can find situations where we can make 
> guarantees that a dep of a dep will never be removed.
> 
> e.g. I have some perl software that depends on the perl-foo module. 
> If we list all dependencies, I would need to list perl and perl-foo. 
> But I can guarantee that perl-foo will always depend on perl, so do I 
> really need to list perl as a dep?

Well, I list perl as a dep even for packages that depend on perl-* packages,
but foo-* packages are a special case in that they necessarily depend on foo
further down the chain. The same cannot be said for other implicit dependencies.
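The perl case can be sketched as a walk over the depends graph. This is a toy model: `deps_of` is a hypothetical stand-in for the real repo database, and `myscript`/`perl-foo` are placeholder names.

```shell
# Hypothetical dependency table standing in for the repo database.
deps_of() {
  case "$1" in
    myscript) echo "perl-foo" ;;
    perl-foo) echo "perl" ;;
    perl)     echo "glibc" ;;
  esac
}

# Print every package pulled in transitively from a starting package.
closure() {
  for dep in $(deps_of "$1"); do
    echo "$dep"
    closure "$dep"
  done
}

closure myscript | sort -u   # prints glibc, perl, perl-foo (one per line)
```

Here perl appears in the closure of myscript whether or not it is listed directly, which is Allan's point: the guarantee comes from perl-foo's own depends.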



> 
> > As for complication, even if there were a large number of deps to consider,
> > there would likely be ways to generate at least a tentative list using simple
> > tools. It could then be refined through feedback.*
> 
> Again, I think this is too idealistic.  Our current dependency checking 
> tool (namcap) has a lot of issues determining the dependencies of a 
> package and it has not been refined much at all...

Maybe not "simple" tools then, but it should be possible to create something
that can generate a list, e.g. a sandbox tool that can determine which
libraries etc. are accessed at build time and run time, which could then be
mapped back to packages.
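The mapping step alone is straightforward on Arch. A rough sketch (not a sandbox tracer, runtime libraries only): take what a binary actually loads and ask pacman which package owns each file. `owners_of_libs` is a made-up name for illustration, and the whole thing assumes pacman is available.

```shell
# List the packages owning every library a given binary loads.
owners_of_libs() {
  ldd "$1" | awk '$3 ~ /^\// { print $3 }' |   # keep resolved library paths
  while read -r lib; do
    pacman -Qo "$lib" 2>/dev/null              # "<path> is owned by <pkg> <ver>"
  done |
  awk '{ print $(NF-1) }' | sort -u            # keep the package names only
}
# Example usage: owners_of_libs /usr/bin/vim
```

A build-time equivalent would need something stronger (e.g. intercepting file access during the build), which is where the sandbox idea comes in.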



Forget that for a moment though and answer this instead: can you think of any
way, other than direct specification, that would guarantee that all dependencies
are installed with a package (presuming that we know exactly what a package
depends on)? E.g. if a package depends on foo and foo-bar, then listing foo-bar
clearly suffices, but how would you formally guarantee something such as glibc?
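Formally, a dep like glibc is guaranteed exactly when it is reachable from some entry in depends=() through the depends graph. A toy sketch of that check, with an entirely hypothetical (and acyclic) `deps_of` table:

```shell
# Hypothetical depends table: mypkg -> foo-bar -> foo -> glibc.
deps_of() {
  case "$1" in
    mypkg)   echo "foo-bar" ;;
    foo-bar) echo "foo" ;;
    foo)     echo "glibc" ;;
  esac
}

# reachable <pkg> <target>: succeed if <target> is in <pkg>'s closure.
reachable() {
  for d in $(deps_of "$1"); do
    [ "$d" = "$2" ] && return 0
    reachable "$d" "$2" && return 0
  done
  return 1
}

reachable mypkg glibc && echo "glibc guaranteed"   # prints "glibc guaranteed"
```

The catch is that the guarantee holds only for the current state of the graph; nothing stops foo from dropping glibc from its depends later, which is the future-proofing question at issue.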

