[pacman-dev] [PATCH] Update existing sources instead of removing them first for VCS sources.

Allan McRae allan at archlinux.org
Thu Oct 10 07:18:01 EDT 2013


On 10/10/13 19:22, Lukas Jirkovsky wrote:
> On Thu, Oct 10, 2013 at 5:05 AM, Allan McRae <allan at archlinux.org> wrote:
>> On 09/10/13 22:13, Lukáš Jirkovský wrote:


I'm responding to just SVN here.


>>>                       error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "git"
>>>                       plain "$(gettext "Aborting...")"
>>>                       exit 1
>>> @@ -662,7 +666,6 @@ extract_hg() {
>>>
>>>       msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "hg"
>>>       pushd "$srcdir" &>/dev/null
>>> -     rm -rf "${dir##*/}"
>>>
>>>       local ref
>>>       if [[ -n $fragment ]]; then
>>
>>
>> <snip>
> 
> I don't see any problem here.
> 
>>> @@ -754,16 +759,20 @@ extract_svn() {
>>>               esac
>>>       fi
>>>
>>> -     cp -a "$dir" .
>>> -
>>
>> Huh...
> 
> It is copied later. See my comment about "cp -au" below.
> 
>>>       if [[ -n ${ref} ]]; then
>>> -             cd_safe "$(get_filename "$netfile")"
>>> +             cd_safe "$dir"
>>
>> Umm...   how are you going into $dir when it has never been copied?
> 
> If I understand it correctly, $dir is the location of the repository
> that has been checked out in download_svn(). I update it here so the
> "cp -au" trick can be used. However, this is not a clean solution. In
> fact, this is what I would like to fix in the future, because in my
> opinion the current handling of SVN is seriously flawed. Let me explain.
> 
> The current code in download_svn() checks out the desired data at the
> most recent revision, because the fragment is not used there. This
> working copy is then copied to $srcdir and updated to the specified
> fragment. The major problem here is that an SVN working copy contains
> only the checked-out files; everything else needs to be downloaded
> from the server.
> 
> Imagine that you have a repository whose most recent revision is 1000
> and the fragment specifies revision 2. The current code will check out
> the repository at revision 1000 in download_svn(). In extract_svn()
> this working copy is copied to $srcdir and svn update is run to go
> back to revision 2, which will basically remove all the files changed
> in the following 998 revisions and download them from the server anew,
> this time at the specified revision.
> 
> TL;DR: the current code for handling Subversion sources may download
> the sources twice if a revision is specified in the fragment.
> Therefore I think SVN should be an exception in that the fragment
> should already be used in download_svn().
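> 
> Just to make the idea concrete, here is a rough sketch of what
> download_svn() could do instead. The fragment parsing is simplified,
> and I'm assuming $url, $dir, $fragment, error and plain are available
> just like in the current code:
> 
>     local ref=HEAD
>     if [[ -n $fragment ]]; then
>         case ${fragment%%=*} in
>             revision)
>                 ref=${fragment##*=}
>                 ;;
>         esac
>     fi
> 
>     if ! svn checkout -r "$ref" "$url" "$dir"; then
>         error "$(gettext "Failure while downloading %s %s repo")" "${repo}" "svn"
>         plain "$(gettext "Aborting...")"
>         exit 1
>     fi
> 
> That way the checkout already happens at the requested revision and
> extract_svn() only has to copy it, nothing has to be fetched twice.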
> 
>>>               if ! svn update -r ${ref}; then
>>>                       error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "svn"
>>>                       plain "$(gettext "Aborting...")"
>>>               fi
>>>       fi
>>>
>>> +     if [[ -d "${dir##*/}" ]]; then
>>> +             cp -au "$dir" "$srcdir"
>>> +     else
>>> +             cp -a "$dir" .
>>> +     fi
>>> +
>>
>> I think this needs to go where the old "cp -a" was.  I also guess that
>> svn will not be affected by the issue with git/hg/bzr above given the
>> master copy is always being copied over and then the reference checked out.
>>
> 
> Actually no. The update of the repository in $dir that is done in the
> previous snippet only updates the affected files. If I then use "cp
> -au" to copy the repository from $dir over the existing copy in
> $srcdir, only the files that were updated are copied, because only
> they have newer timestamps. I've tested it on two packages and it
> seemed to work fine.
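> 
> If it helps, here is a toy demonstration of the -u behaviour (made-up
> paths, nothing to do with makepkg itself):
> 
>     mkdir -p src dst
>     echo old > src/a; echo old > src/b
>     cp -a src/. dst/          # initial full copy, timestamps preserved
>     sleep 1
>     echo new > src/a          # only src/a gets a newer timestamp
>     cp -au src/. dst/         # -u copies just src/a, dst/b is untouched
> 
> The same applies to the working copy: files untouched by "svn update"
> keep their old timestamps and are skipped by the copy.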


I see what you are doing now.  I quickly reviewed this with the mindset
that you were doing the same for SVN as for the rest.

I find your explanation of the handling of the fragment in SVN
reasonable.  But this needs to be done in a separate patch.  One thing
that would need to be confirmed is that changing the value of
"#revision=...", or adding or removing it, works fine.
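
For reference, such a fragment in a PKGBUILD source array looks roughly
like this (made-up URL, just to make the test cases explicit):

    source=('project::svn+https://example.org/svn/project/trunk#revision=2')

So the cases to check are editing the revision number, dropping the
"#revision=2" part entirely, and adding it to a source that previously
had none.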

Allan


