On Thu, Oct 10, 2013 at 5:05 AM, Allan McRae <allan@archlinux.org> wrote:
On 09/10/13 22:13, Lukáš Jirkovský wrote:
Previously, the sources in $srcdir were always removed prior to building.
After this change the sources are only updated. This makes incremental builds possible. It also brings VCS sources in line with the current behaviour for other types of sources, where the sources are simply overwritten without being removed first.
---
 scripts/makepkg.sh.in | 33 +++++++++++++++++++++------------
 1 file changed, 21 insertions(+), 12 deletions(-)
I'm looking at the git functions here as this is what I am testing... I am almost certain the comment below applies to bzr and hg too.
<snip>
@@ -580,9 +582,11 @@ extract_git() {
msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "git" pushd "$srcdir" &>/dev/null - rm -rf "${dir##*/}" - - if ! git clone "$dir"; then + if [[ -d "${dir##*/}" ]]; then + cd_safe "${dir##*/}" + git pull + cd_safe "$srcdir"
No need for that cd there.
I need to cd into the repository that I need to update. Then I need to cd back to $srcdir, because the code later changes directory to "${dir##*/}", which is just the name of the repository, not an absolute path.
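For what it's worth, running the update in a subshell would avoid the explicit cd back to $srcdir; something along these lines (just a rough sketch on my side, not what the patch does):

    if [[ -d "${dir##*/}" ]]; then
        # subshell: the pull happens inside the existing clone, but the
        # caller never leaves $srcdir, so the later relative cd still works
        ( cd_safe "${dir##*/}" && git pull )
    elif ! git clone "$dir"; then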
+	elif ! git clone "$dir"; then
 		error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "git"
 		plain "$(gettext "Aborting...")"
 		exit 1
@@ -607,7 +611,7 @@ extract_git() {
 	fi
 	if [[ -n $ref ]]; then
-		if ! git checkout -b makepkg $ref; then
+		if ! git checkout -B makepkg $ref; then
I immediately noticed that this will not allow me to switch from using a branch/commit/tag and then revert to using master.
I guess removing that "if [[ -n $ref ]]" is enough to fix that?
I hadn't thought of that. I'm not sure how to fix it.
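Maybe something along these lines could work, though I haven't tested it (origin/HEAD as the fallback is only an assumption for "whatever the remote's default branch is"):

    # untested sketch: drop the [[ -n $ref ]] guard and always reset the
    # makepkg branch; origin/HEAD stands in for the remote's default branch
    if [[ -z $ref ]]; then
        ref=origin/HEAD
    fi
    if ! git checkout -B makepkg $ref; then
        error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "git"
        plain "$(gettext "Aborting...")"
        exit 1
    fi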
error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "git" plain "$(gettext "Aborting...")" exit 1 @@ -662,7 +666,6 @@ extract_hg() {
msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "hg" pushd "$srcdir" &>/dev/null - rm -rf "${dir##*/}"
 	local ref
 	if [[ -n $fragment ]]; then
<snip>
I don't see any problem here.
@@ -754,16 +759,20 @@ extract_svn() {
 		esac
 	fi
- cp -a "$dir" . -
Huh...
It is copied later. See my comment about "cp -au" below.
 	if [[ -n ${ref} ]]; then
-		cd_safe "$(get_filename "$netfile")"
+		cd_safe "$dir"
Umm... how are you going into $dir when it has never been copied?
If I understand it correctly, $dir is the location of the repository that was checked out in download_svn(). I update it here so the "cp -au" trick can be used. However, this is not a clean solution. In fact, this is what I would like to fix in the future, because in my opinion the current handling of SVN is seriously flawed. Let me explain.

The current code in download_svn() checks out the desired data at the most recent revision, because the fragment is not used. This working copy is then copied to $srcdir and updated to the revision specified in the fragment. The major problem here is that a SVN working copy contains only the checked-out files; everything else needs to be downloaded from the server. Imagine a repository whose most recent revision is 1000 while the fragment specifies revision 2. The current code checks out the repository at revision 1000 in download_svn(). In extract_svn() this working copy is copied to $srcdir and "svn update" is run to go back to revision 2, which basically removes all the files changed in the following 998 revisions and downloads the rest from the server anew, this time at the specified revision.

TL;DR: the current code for handling subversion sources may download sources twice if a revision is specified in the fragment. Therefore I think SVN should be an exception in that the fragment should be used in download_svn().
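Roughly what I have in mind for download_svn() is something like this (only a sketch; the revision would have to be parsed from the fragment there, and the real checkout invocation in makepkg has more options than shown here):

    # sketch: pass the revision from the fragment to the initial checkout,
    # so extract_svn() never has to update backwards through history
    if [[ -n $ref ]]; then
        svn checkout -r "$ref" "$url" "$dir"
    else
        svn checkout "$url" "$dir"
    fi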
 		if ! svn update -r ${ref}; then
 			error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "svn"
 			plain "$(gettext "Aborting...")"
 		fi
 	fi
+	if [[ -d "${dir##*/}" ]]; then
+		cp -au "$dir" "$srcdir"
+	else
+		cp -a "$dir" .
+	fi
+
I think this needs to go where the old "cp -a" was. I also guess that svn will not be affected by the issue with git/hg/bzr above given the master copy is always being copied over and then the reference checked out.
Actually no. The update of the repository in $dir that is done in the previous snippet only updates the affected files. If I then use "cp -au" to copy the repository from $dir to the repository copy in $srcdir, only the files that were updated are copied, because they have newer timestamps. I've tested it on two packages and it seemed to work fine.

Lukas
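P.S. To spell out the ordering that the -u trick relies on (a toy sketch with hypothetical paths, not makepkg variables):

    master=/path/to/SRCDEST/repo      # checkout kept up to date by download_svn()
    workcopy=/path/to/builddir        # copy left over from a previous build

    svn update -r "$ref" "$master"    # only the files that changed get new mtimes
    cp -au "$master" "$workcopy"      # -a preserves mtimes, -u copies a file only
                                      # if the source is newer than (or missing
                                      # from) the destination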