On Thu, Aug 18, 2016 at 05:39:35PM -0400, Quint Guvernator wrote:
Hello all,
The default value of XferCommand (/usr/bin/curl -C - -f %u > %o) tells curl to try to resume downloads from a particular byte position if they are stopped prematurely. Not all servers support this, though, so curl may die with exit status 33 like this:
:: Checking nwjs-bin integrity...
==> Making package: nwjs-bin 0.16.1-1 (Thu Aug 18 00:38:19 EDT 2016)
==> Retrieving sources...
  -> Downloading nwjs-v0.16.1-linux-x64.tar.gz...
** Resuming transfer from byte position 8736768
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0 47.1M    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
curl: (33) HTTP server doesn't seem to support byte ranges. Cannot resume.
==> ERROR: Failure while downloading http://dl.nwjs.io/v0.16.1/nwjs-v0.16.1-linux-x64.tar.gz
    Aborting...
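As an aside, a quick (if imperfect) way to check whether a given server even advertises byte-range support is to look for the Accept-Ranges header on a HEAD request; some servers honour ranges without advertising them, and vice versa, so treat it only as a hint:

    # a server that supports resuming typically replies "Accept-Ranges: bytes"
    $ curl -sI http://dl.nwjs.io/v0.16.1/nwjs-v0.16.1-linux-x64.tar.gz | grep -i '^accept-ranges'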
(To reproduce, grab the nwjs-bin PKGBUILD from the AUR, run makepkg and kill the download partway through, then run makepkg again. There is probably a package in community or extra that triggers this too, but I haven't run into one yet.)
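In shell terms, roughly this (assuming the AUR git clone URL; exactly when you interrupt the first download doesn't matter):

    $ git clone https://aur.archlinux.org/nwjs-bin.git
    $ cd nwjs-bin
    $ makepkg     # interrupt the download with Ctrl+C once it's underway
    $ makepkg     # curl is re-invoked with -C - against the partial file and bails out as above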
The same could theoretically happen with FTP; the corresponding error code there is 36.
This raises the question: should `curl -C` be the default download command when we can't trust all servers to support resuming incomplete downloads?
It's rare that people use the external downloader. It's also rare that static content servers refuse resumed (byte-range) downloads.
To me, it seems that adding `-C` should be documented on the wiki under pacman/Tips_and_Tricks rather than being enabled by default. This way, users will (at least in theory) understand the gotcha involved before enabling it.
...but the external downloader isn't a default either.
There isn't much technical merit/demerit to discuss here, and there's a potential for bikeshedding, so maybe we could stick to "yes, I like it" or "no, I don't like it". It's barely a 1 LOC change.
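For concreteness, the change would be something like the following against the XferCommand example shipped in pacman.conf (the exact file in the pacman tree may differ):

    -#XferCommand = /usr/bin/curl -C - -f %u > %o
    +#XferCommand = /usr/bin/curl -f %u > %o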
no, I don't like it.

d