On 9/19/18 3:15 PM, Luke Shumaker wrote:
On Sat, 08 Sep 2018 16:31:16 -0400, Luke Shumaker wrote:
From: Luke Shumaker <lukeshu@parabola.nu>
Without --stream, `hg clone` re-encodes and re-compresses the entire repository to the storage settings of the host. But download_hg() already did that on the initial network clone, so it is 100% pointless duplicated work for the local clone.
The work that this saves is CPU-bound (not disk-bound), and is restricted to a single core.
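To make the intent concrete, here is a hedged sketch of the two-step flow the patch targeted; the `$url`, paths, and `-U` flag are illustrative, not the actual download_hg() code:

```shell
# Illustrative command fragment only; mirrors the shape of makepkg's
# hg handling, not its actual implementation.
hg clone -U "$url" "$SRCDEST/$repo"
# ^ network clone: the data is re-encoded/re-compressed to this
#   host's storage settings here, once.
hg clone -U --stream "$SRCDEST/$repo" "$BUILDDIR/$repo"
# ^ local clone: --stream copies the revlogs byte-for-byte instead of
#   re-encoding and re-compressing them a second time.
```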
After more testing, this didn't have the speed-up that I expected. Consider the patch withdrawn.
As a matter of curiosity, was this just "not much savings" or "not actually saving"? What kind of practical effect does it have, ultimately?

... I'm wondering if it's worth doing something similar elsewhere, specifically for `git clone --shared`. It would save cp'ing possibly bloaty files from SRCDEST to BUILDDIR, in the event that the two directories are on different partitions. Normally git would optimize this away by creating hardlinks.

Downsides are:

- If SRCDEST is rm'ed, then the BUILDDIR clone breaks -- but I consider that reasonable; plus, if you re-clone to SRCDEST, it magically works again...
- If the upstream source does a force push and SRCDEST prunes some commits in our ephemeral clone via `git gc --auto`, *and* users treat BUILDDIR as a place to commit changes they want to keep, they may get a broken repo and missing commits. Do we care about this? Worth noting: they will already have makepkg trying to force-reset the default "makepkg" branch.

-- 
Eli Schwartz
Bug Wrangler and Trusted User
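For reference, a minimal self-contained sketch of what `git clone --shared` actually does: it borrows objects from the source repository via `.git/objects/info/alternates` instead of copying (or hardlinking) them. The temporary paths here are throwaway stand-ins for SRCDEST and BUILDDIR:

```shell
# Sketch under stated assumptions: throwaway temp dirs, empty commit.
set -e
tmp=$(mktemp -d)

# A small "SRCDEST-like" source repository with one commit.
git init -q "$tmp/src"
git -C "$tmp/src" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m 'initial commit'

# A shared clone, as a BUILDDIR working copy might be made.
git clone -q --shared "$tmp/src" "$tmp/build"

# The clone's own object store is (nearly) empty; instead it records
# the source's object directory in the alternates file:
cat "$tmp/build/.git/objects/info/alternates"

# Deleting "$tmp/src" at this point would break "$tmp/build" -- which is
# exactly the SRCDEST-removal caveat discussed above.
```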