[pacman-dev] [PATCH] Update existing sources instead of removing them first for VCS sources.
Previously, the sources in $srcdir were always removed prior to building. After this change the sources are only updated. This makes incremental builds possible. This also matches the current behaviour for other source types, where the sources are simply overwritten without being removed first.
---
 scripts/makepkg.sh.in | 33 +++++++++++++++++++++------------
 1 file changed, 21 insertions(+), 12 deletions(-)

diff --git a/scripts/makepkg.sh.in b/scripts/makepkg.sh.in
index 67ec240..bb4297c 100644
--- a/scripts/makepkg.sh.in
+++ b/scripts/makepkg.sh.in
@@ -516,9 +516,11 @@ extract_bzr() {
 	msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "bzr"
 	pushd "$srcdir" &>/dev/null

-	rm -rf "${dir##*/}"
-	if ! { bzr checkout "$dir" "${revision[@]}" --lightweight &&
+	if [[ -d "${dir##*/}" ]]; then
+		cd_safe "${dir##*/}"
+		bzr update
+	elif ! { bzr checkout "$dir" "${revision[@]}" --lightweight &&
 		( cd "$repo" && bzr pull "$dir" -q --overwrite "${revision[@]}" ); }; then
 		error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "bzr"
 		plain "$(gettext "Aborting...")"
@@ -580,9 +582,11 @@ extract_git() {
 	msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "git"
 	pushd "$srcdir" &>/dev/null

-	rm -rf "${dir##*/}"
-
-	if ! git clone "$dir"; then
+	if [[ -d "${dir##*/}" ]]; then
+		cd_safe "${dir##*/}"
+		git pull
+		cd_safe "$srcdir"
+	elif ! git clone "$dir"; then
 		error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "git"
 		plain "$(gettext "Aborting...")"
 		exit 1
@@ -607,7 +611,7 @@ extract_git() {
 	fi

 	if [[ -n $ref ]]; then
-		if ! git checkout -b makepkg $ref; then
+		if ! git checkout -B makepkg $ref; then
 			error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "git"
 			plain "$(gettext "Aborting...")"
 			exit 1
@@ -662,7 +666,6 @@ extract_hg() {
 	msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "hg"
 	pushd "$srcdir" &>/dev/null

-	rm -rf "${dir##*/}"

 	local ref
 	if [[ -n $fragment ]]; then
@@ -677,7 +680,10 @@ extract_hg() {
 		esac
 	fi

-	if ! hg clone "${ref[@]}" "$dir" "${dir##*/}"; then
+	if [[ -d "${dir##*/}" ]]; then
+		cd_safe "${dir##*/}"
+		hg pull -u
+	elif ! hg clone "${ref[@]}" "$dir" "${dir##*/}"; then
 		error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "hg"
 		plain "$(gettext "Aborting...")"
 		exit 1
@@ -739,7 +745,6 @@ extract_svn() {
 	msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "svn"
 	pushd "$srcdir" &>/dev/null

-	rm -rf "${dir##*/}"

 	local ref
 	if [[ -n $fragment ]]; then
@@ -754,16 +759,20 @@ extract_svn() {
 		esac
 	fi

-	cp -a "$dir" .
-
 	if [[ -n ${ref} ]]; then
-		cd_safe "$(get_filename "$netfile")"
+		cd_safe "$dir"
 		if ! svn update -r ${ref}; then
 			error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "svn"
 			plain "$(gettext "Aborting...")"
 		fi
 	fi

+	if [[ -d "${dir##*/}" ]]; then
+		cp -au "$dir" "$srcdir"
+	else
+		cp -a "$dir" .
+	fi
+
 	popd &>/dev/null
 }
--
1.8.4
On Wed, Oct 9, 2013 at 2:13 PM, Lukáš Jirkovský <l.jirkovsky@gmail.com> wrote:
Previously, the sources in $srcdir were always removed prior to building.
After this change the sources are only updated. This makes incremental builds possible. This also matches the current behaviour for other source types, where the sources are simply overwritten without being removed first.
---
One thing that was always bugging me here was the kernel package, where introducing new files through a patch would break another makepkg run. This is relevant here, right?

cheers!
mar77i
On 10/10/13 07:32, Martti Kühne wrote:
On Wed, Oct 9, 2013 at 2:13 PM, Lukáš Jirkovský <l.jirkovsky@gmail.com> wrote:
Previously, the sources in $srcdir were always removed prior to building.
After this change the sources are only updated. This makes incremental builds possible. This also matches the current behaviour for other source types, where the sources are simply overwritten without being removed first.
---
One thing that was always bugging me here was the kernel package, where introducing new files through a patch would break another makepkg run. This is relevant here, right?
Not really. I'm guessing applying the patch with -N would fix that.

Allan
On Wed, Oct 9, 2013 at 11:57 PM, Allan McRae <allan@archlinux.org> wrote:
Not really. I'm guessing applying the patch with -N would fix that.
Allan
It is usually also necessary to add "|| true" to the patch line, so the build continues even if the patch fails.

Lukas
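For illustration, a minimal sketch of what such a patch line could look like in a PKGBUILD prepare() (the directory and patch names are made up; the point is only the --forward flag and the "|| true"):

prepare() {
	cd "$srcdir/linux-example"   # hypothetical source directory
	# --forward (-N) makes patch skip hunks that look already applied, so a
	# second makepkg run over the same tree does not fail on them; patch still
	# exits non-zero for skipped hunks, hence the "|| true".
	patch -Np1 --forward -i "$srcdir/example.patch" || true
}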
On 09/10/13 22:13, Lukáš Jirkovský wrote:
Previously, the sources in $srcdir were always removed prior to building.
After this change the sources are only updated. This makes incremental builds possible. This also matches the current behaviour for other source types, where the sources are simply overwritten without being removed first.
---
 scripts/makepkg.sh.in | 33 +++++++++++++++++++++------------
 1 file changed, 21 insertions(+), 12 deletions(-)
I'm looking at the git functions here as this is what I am testing... I am almost certain the comment below applies to bzr and hg too.

<snip>
@@ -580,9 +582,11 @@ extract_git() {
msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "git" pushd "$srcdir" &>/dev/null - rm -rf "${dir##*/}" - - if ! git clone "$dir"; then + if [[ -d "${dir##*/}" ]]; then + cd_safe "${dir##*/}" + git pull + cd_safe "$srcdir"
No need for that cd there.
+	elif ! git clone "$dir"; then
 		error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "git"
 		plain "$(gettext "Aborting...")"
 		exit 1
@@ -607,7 +611,7 @@ extract_git() {
 	fi

 	if [[ -n $ref ]]; then
-		if ! git checkout -b makepkg $ref; then
+		if ! git checkout -B makepkg $ref; then
I immediately noticed that this will not allow me to switch from using a branch/commit/tag back to using master. I guess removing that "if [[ -n $ref ]]" is enough to fix that?
error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "git" plain "$(gettext "Aborting...")" exit 1 @@ -662,7 +666,6 @@ extract_hg() {
msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "hg" pushd "$srcdir" &>/dev/null - rm -rf "${dir##*/}"
local ref if [[ -n $fragment ]]; then
<snip>
@@ -754,16 +759,20 @@ extract_svn() {
 		esac
 	fi

-	cp -a "$dir" .
-
Huh...
 	if [[ -n ${ref} ]]; then
-		cd_safe "$(get_filename "$netfile")"
+		cd_safe "$dir"
Umm... how are you going into $dir when it has never been copied?
 		if ! svn update -r ${ref}; then
 			error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "svn"
 			plain "$(gettext "Aborting...")"
 		fi
 	fi

+	if [[ -d "${dir##*/}" ]]; then
+		cp -au "$dir" "$srcdir"
+	else
+		cp -a "$dir" .
+	fi
+
I think this needs to go where the old "cp -a" was. I also guess that svn will not be affected by the issue with git/hg/bzr above given the master copy is always being copied over and then the reference checked out.
 	popd &>/dev/null
 }
On Thu, Oct 10, 2013 at 5:05 AM, Allan McRae <allan@archlinux.org> wrote:
On 09/10/13 22:13, Lukáš Jirkovský wrote:
Previously, the sources in $srcdir were always removed prior to building.
After this change the sources are only updated. This makes incremental builds possible. This also matches the current behaviour for other source types, where the sources are simply overwritten without being removed first.
---
 scripts/makepkg.sh.in | 33 +++++++++++++++++++++------------
 1 file changed, 21 insertions(+), 12 deletions(-)
I'm looking at the git functions here as this is what I am testing... I am almost certain the comment below applies to bzr and hg too.
<snip>
@@ -580,9 +582,11 @@ extract_git() {
msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "git" pushd "$srcdir" &>/dev/null - rm -rf "${dir##*/}" - - if ! git clone "$dir"; then + if [[ -d "${dir##*/}" ]]; then + cd_safe "${dir##*/}" + git pull + cd_safe "$srcdir"
No need for that cd there.
I need to cd into the repository that I need to update. Then I need to cd back to $srcdir, because the code later changes directory to "${dir##*/}", which is just the name of the repository, not an absolute path.
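To make that flow concrete, a trimmed-down sketch of the patched extract_git() directory handling (error handling omitted):

pushd "$srcdir" &>/dev/null
if [[ -d "${dir##*/}" ]]; then
	cd_safe "${dir##*/}"   # enter the existing checkout and update it in place
	git pull
	cd_safe "$srcdir"      # go back, since the path used below is relative to $srcdir
elif ! git clone "$dir"; then
	exit 1
fi

cd_safe "${dir##*/}"           # the later relative cd mentioned above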
+	elif ! git clone "$dir"; then
 		error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "git"
 		plain "$(gettext "Aborting...")"
 		exit 1
@@ -607,7 +611,7 @@ extract_git() {
 	fi

 	if [[ -n $ref ]]; then
-		if ! git checkout -b makepkg $ref; then
+		if ! git checkout -B makepkg $ref; then
I immediately noticed that this will not allow me to switch from using a branch/commit/tag back to using master.
I guess removing that "if [[ -n $ref ]]" is enough to fix that?
I haven't thought of that. I'm not sure how to fix that.
error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "git" plain "$(gettext "Aborting...")" exit 1 @@ -662,7 +666,6 @@ extract_hg() {
msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "hg" pushd "$srcdir" &>/dev/null - rm -rf "${dir##*/}"
local ref if [[ -n $fragment ]]; then
<snip>
I don't see any problem here.
@@ -754,16 +759,20 @@ extract_svn() {
 		esac
 	fi

-	cp -a "$dir" .
-
Huh...
It is copied later. See my comment about "cp -au" later.
 	if [[ -n ${ref} ]]; then
-		cd_safe "$(get_filename "$netfile")"
+		cd_safe "$dir"
Umm... how are you going into $dir when it has never been copied?
If I understand it correctly, $dir is the repository location that has been checked out in download_svn(). I update it here so the "cp -au" trick can be used. However, this is not a clean solution. In fact, this is what I would like to fix in the future, because in my opinion the current handling of SVN is seriously flawed. Let me explain.

The current code in download_svn() checks out the desired data at the most recent revision, because the fragment is not used. This working copy is then copied to $srcdir and updated to the specified fragment. The major problem here is that an SVN working copy contains only the checked-out files; everything else needs to be downloaded from the server.

Imagine that you have a repository whose most recent revision is 1000 and the fragment specifies revision 2. The current code will check out the repository at revision 1000 in download_svn(). In extract_svn() this working copy is copied to $srcdir and svn update is run to go back to revision 2, which basically removes all the files changed in the following 998 revisions and downloads the rest from the server anew, this time at the specified revision.

TL;DR: the current code for handling Subversion sources may download sources twice if a revision is specified in the fragment. Therefore I think SVN should be an exception in that the fragment should be used in download_svn().
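A rough sketch of that direction, letting download_svn() honour the fragment directly; the case statement mirrors the fragment handling in extract_svn(), while $url and $dir (checkout URL and local repo location) are assumptions about the surrounding code:

local ref=()
if [[ -n $fragment ]]; then
	case ${fragment%%=*} in
		revision)
			# check out the requested revision right away instead of HEAD, so
			# extract_svn() never has to re-download older files afterwards
			ref=('-r' "${fragment##*=}")
			;;
	esac
fi
svn checkout "${ref[@]}" "$url" "$dir"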
 		if ! svn update -r ${ref}; then
 			error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "svn"
 			plain "$(gettext "Aborting...")"
 		fi
 	fi

+	if [[ -d "${dir##*/}" ]]; then
+		cp -au "$dir" "$srcdir"
+	else
+		cp -a "$dir" .
+	fi
+
I think this needs to go where the old "cp -a" was. I also guess that svn will not be affected by the issue with git/hg/bzr above given the master copy is always being copied over and then the reference checked out.
Actually no. The update of the repository in $dir that is done in the previous snippet only updates the affected files. If I then use "cp -au" to copy the repository from $dir to the repository copy in $srcdir, only the files that were updated are copied, because they have newer timestamps. I've tested it on two packages and it seemed to work fine.

Lukas
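A quick way to see the "cp -au" behaviour being relied on here (throwaway directory and file names, just for illustration):

mkdir -p master copy
echo old > master/a; echo old > master/b
cp -a master/. copy/    # initial full copy, preserving timestamps
sleep 1
echo new > master/a     # only "a" gets a newer timestamp
cp -au master/. copy/   # -u copies only files newer than the destination, so just "a" is re-copied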
On 10/10/13 19:22, Lukas Jirkovsky wrote:
On Thu, Oct 10, 2013 at 5:05 AM, Allan McRae <allan@archlinux.org> wrote:
On 09/10/13 22:13, Lukáš Jirkovský wrote:
I'm responding to just SVN here.
error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "git" plain "$(gettext "Aborting...")" exit 1 @@ -662,7 +666,6 @@ extract_hg() {
msg2 "$(gettext "Creating working copy of %s %s repo...")" "${repo}" "hg" pushd "$srcdir" &>/dev/null - rm -rf "${dir##*/}"
local ref if [[ -n $fragment ]]; then
<snip>
I don't see any problem here.
@@ -754,16 +759,20 @@ extract_svn() {
 		esac
 	fi

-	cp -a "$dir" .
-
Huh...
It is copied later. See my comment about "cp -au" later.
 	if [[ -n ${ref} ]]; then
-		cd_safe "$(get_filename "$netfile")"
+		cd_safe "$dir"
Umm... how are you going into $dir when it has never been copied?
If I understand it correctly, $dir is the repository location that has been checked out in download_svn(). I update it here so the "cp -au" trick can be used. However, this is not a clean solution. In fact, this is what I would like to fix in the future, because in my opinion the current handling of SVN is seriously flawed. Let me explain.

The current code in download_svn() checks out the desired data at the most recent revision, because the fragment is not used. This working copy is then copied to $srcdir and updated to the specified fragment. The major problem here is that an SVN working copy contains only the checked-out files; everything else needs to be downloaded from the server.

Imagine that you have a repository whose most recent revision is 1000 and the fragment specifies revision 2. The current code will check out the repository at revision 1000 in download_svn(). In extract_svn() this working copy is copied to $srcdir and svn update is run to go back to revision 2, which basically removes all the files changed in the following 998 revisions and downloads the rest from the server anew, this time at the specified revision.

TL;DR: the current code for handling Subversion sources may download sources twice if a revision is specified in the fragment. Therefore I think SVN should be an exception in that the fragment should be used in download_svn().
 		if ! svn update -r ${ref}; then
 			error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "svn"
 			plain "$(gettext "Aborting...")"
 		fi
 	fi

+	if [[ -d "${dir##*/}" ]]; then
+		cp -au "$dir" "$srcdir"
+	else
+		cp -a "$dir" .
+	fi
+
I think this needs to go where the old "cp -a" was. I also guess that svn will not be affected by the issue with git/hg/bzr above given the master copy is always being copied over and then the reference checked out.
Actually no. The update of the repository in $dir that is done in the previous snippet only updates the affected files. If I then use "cp -au" to copy the repository from $dir to the repository copy in $srcdir, only the files that were updated are copied, because they have newer timestamps. I've tested it on two packages and it seemed to work fine.
I see what you are doing now. I quickly reviewed with the mindset that you were doing the same in SVN as the rest.

I find your explanation of the handling of the fragment in SVN reasonable, but this needs to be done in a separate patch. One thing that would need to be confirmed is that changing the value of "revision#...", or removing or adding it, works fine.

Allan
On Thu, Oct 10, 2013 at 1:18 PM, Allan McRae <allan@archlinux.org> wrote:
I find your explanation of the handling of the fragment in SVN reasonable, but this needs to be done in a separate patch. One thing that would need to be confirmed is that changing the value of "revision#...", or removing or adding it, works fine.
Allan
What would be the preferred approach? Should I submit the SVN patch first and then this one, or should I do it differently?

Lukas
On 11/10/13 18:02, Lukas Jirkovsky wrote:
On Thu, Oct 10, 2013 at 1:18 PM, Allan McRae <allan@archlinux.org> wrote:
I find your explanation of the handling of the fragment in SVN reasonable, but this needs to be done in a separate patch. One thing that would need to be confirmed is that changing the value of "revision#...", or removing or adding it, works fine.
Allan
What would be the preferred approach? Should I submit the SVN patch first and then this one, or should I do it differently?
How about a set of patches:

1) Change SVN with your aforementioned reasoning.
2) Change SVN to use "cp -au" if the directory exists in $srcdir
3) Have git update itself rather than delete ...

We will deal with hg/bzr once we have git working properly.

For git, the solution appears to be to always do the "git checkout -B" if it is being updated, regardless of whether it has a commit/tag/branch reference or not. So the check will be:

if [[ -n $ref ]] || (( updating ))

where "updating" records whether or not there was a git checkout in $srcdir in the first place.

Allan
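A rough sketch of how that could look in extract_git(); the "updating" variable is from the mail above, while the fallback ref used when no fragment is given (origin/HEAD) is an assumption, not part of the proposal:

local updating=0
if [[ -d "${dir##*/}" ]]; then
	updating=1
	cd_safe "${dir##*/}"
	git pull
	cd_safe "$srcdir"
elif ! git clone "$dir"; then
	exit 1   # error handling trimmed
fi

cd_safe "${dir##*/}"
# ... fragment parsing sets $ref as before ...
if [[ -n $ref ]] || (( updating )); then
	# recreate the makepkg branch, so dropping the fragment also moves an
	# existing working copy back to the default branch
	if ! git checkout -B makepkg ${ref:-origin/HEAD}; then
		exit 1
	fi
fi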
On Sun, Oct 13, 2013 at 10:09 AM, Allan McRae <allan@archlinux.org> wrote:
How about a set of patches:
1) Change SVN with your aforementioned reasoning.
2) Change SVN to use "cp -au" if the directory exists in $srcdir
3) Have git update itself rather than delete ...
Sounds OK to me.
We will deal with hg/bzr once we have git working properly.
Mercurial shouldn't be a problem; I use it every day, whereas I use git rather rarely. But I guess it would make sense to change the Mercurial command from:

hg pull -u

i.e. pull and update (merging uncommitted changes), to:

hg pull && hg update -C

which pulls the sources and updates the working copy while discarding uncommitted changes.
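The update branch of extract_hg() would then look roughly like this (a sketch based on the patch above, with the suggested command swapped in):

if [[ -d "${dir##*/}" ]]; then
	cd_safe "${dir##*/}"
	# pull new changesets, then rebuild the working copy, discarding any local
	# modifications left over from a previous build (-C / --clean)
	hg pull && hg update -C
elif ! hg clone "${ref[@]}" "$dir" "${dir##*/}"; then
	error "$(gettext "Failure while creating working copy of %s %s repo")" "${repo}" "hg"
	plain "$(gettext "Aborting...")"
	exit 1
fi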
For git, the solution appears to be to always do the "git checkout -B" if it is being updated, regardless of whether it has a commit/tag/branch reference or not. So the check will be:
if [[ -n $ref ]] || (( updating ))
where "updating" records whether or not there was a git checkout in $srcdir in the first place.
Is creating a branch actually necessary? git checkout should work just fine when dealing with branches and commits, and for tags the ref could be prefixed with "tags/". I don't know how big a problem a detached HEAD would be. I suppose nobody sane would do real development in makepkg's repository, though.

Lukas
On Wed, Oct 16, 2013 at 3:40 PM, Lukas Jirkovsky <l.jirkovsky@gmail.com> wrote:
Is creating a branch actually necessary? git checkout should work just fine when dealing with branches and commits, and for tags the ref could be prefixed with "tags/". I don't know how big a problem a detached HEAD would be. I suppose nobody sane would do real development in makepkg's repository, though.
Never mind this comment. It kinda makes sense when someone wants to patch the sources with e.g. "git apply".
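For example (tag and patch names made up), the practical difference is having a fixed branch that the next run can reset:

git checkout v1.0                # works, but leaves a detached HEAD
git checkout -B makepkg v1.0     # named branch; prepare() can commit local
                                 # patches on it, e.g. via "git am ../local-fix.patch"
git checkout -B makepkg v1.0     # a later run resets the branch back to the ref,
                                 # dropping those local commits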