On Mon 21 Feb 2011 16:42 +0100, Dieter Plaetinck wrote:
On Mon, 21 Feb 2011 16:35:33 +0100 Lukas Fleischer firstname.lastname@example.org wrote:
On Mon, Feb 21, 2011 at 03:46:47PM +0100, Dieter Plaetinck wrote:
On Mon, 21 Feb 2011 14:50:39 +0100 Lukas Fleischer email@example.com wrote:
The only issue that might affect the end users as well is "zip bombs". Most users will probably notice such a thing before it is entirely extracted, however, and can just interrupt tar(1)/gzip(1) and send a removal request to aur-general.
Hmmm, some good points. I guess I could try the suggested approach and see how I like it. However, now that you bring up "zip bombs": do you think it's feasible to scan for them server-side without compromising security and/or making things needlessly complicated? It would be useful for clients if that one aspect could be filtered out in advance.
I don't think this is possible without decompressing the tarball, which is again vulnerable to (D)DoS.
Hmm, maybe we mean different things. You are talking about exhausting RAM/CPU/time, right? http://en.wikipedia.org/wiki/Zip_bomb In that case, sure, just leave it to the client; the problem is trivial enough.
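A client-side check along those lines really is trivial: tar headers state each member's uncompressed size, so you can sum them without extracting anything to disk. A minimal sketch in Python; the specific limits are illustrative assumptions, not anything the AUR actually enforces:

```python
import tarfile

# Illustrative caps only -- nothing in this thread specifies real limits.
MAX_TOTAL_SIZE = 50 * 1024 * 1024  # 50 MiB uncompressed
MAX_FILES = 1000

def looks_like_bomb(path):
    """Return True if the tarball's declared uncompressed size or file
    count exceeds the caps. Only headers are read; nothing is extracted."""
    total = 0
    count = 0
    with tarfile.open(path, mode="r:gz") as tar:
        for member in tar:
            count += 1
            total += member.size
            if total > MAX_TOTAL_SIZE or count > MAX_FILES:
                return True
    return False
```

For tar archives the header size is authoritative for extraction, so summing `member.size` is enough; there is no nested-decompression trick like with recursive zip bombs.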
I was talking about bad filenames (like ../../foo, /foo, /root/foobar, /tmpl/blah, and whatever else is possible) that might be caught with `tar -t`.
Yeah, I was thinking we could be more strict about the source package format as well, for example rejecting any tarball that contains src or pkg directories. Those often contain upstream source code or actual builds by mistake. I think that's mostly from old packages where the uploader didn't use makepkg --source.
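That strictness could be expressed as one more header-only check. A sketch, assuming source tarballs unpack to a single top-level directory (the usual makepkg --source layout); that layout assumption and the function name are mine:

```python
import tarfile

def contains_build_dirs(path):
    """Flag tarballs with src/ or pkg/ directly under the top-level
    directory, which usually means a build tree was tarred up instead
    of running `makepkg --source`."""
    with tarfile.open(path) as tar:
        for member in tar:
            parts = member.name.strip("/").split("/")
            # parts[0] is the top-level package directory; src or pkg
            # immediately below it is makepkg build residue.
            if len(parts) >= 2 and parts[1] in ("src", "pkg"):
                return True
    return False
```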