On 11/12/13 19:31, Lukas Jirkovsky wrote:
> On Wed, Dec 11, 2013 at 6:08 AM, Allan McRae <allan@archlinux.org> wrote:
>> There will be plenty of build systems that will use @.c (made-up pseudo-make syntax) to compile all files.
> I think I have a few ideas for how to fix the problem with deleted files.
>
> The first idea is:
>
> 1. Check out the new revision into a temporary directory.
> 2. Create a patch between the current checkout in $SRCDEST and the checkout from step 1.
> 3. Replace the checkout in $SRCDEST with the temporary checkout from step 1.
> 4. Apply the patch from step 2 to the sources in $srcdir.
> 5. Update the .svn of the checkout in $srcdir using the one from $SRCDEST.
> The temporary checkout could be avoided if the patch was created using "svn diff", but that would probably download the changes twice.
> The second idea is to do the update the other way around for svn:
>
> 1. Run "svn update" on the checkout in $srcdir.
> 2. Replace the sources in $SRCDEST with an "svn export" from $srcdir.
> 3. Copy the .svn from the checkout in $srcdir to the sources in $SRCDEST, effectively making it a valid checkout.
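[Steps 2 and 3 can be sketched like this; "svn export" writes the tree without the .svn metadata, which is simulated here with a plain copy followed by removing that directory, so the sketch runs without svn. Names are placeholders.]

```shell
# A fake working copy in srcdir: sources plus .svn metadata.
mkdir -p srcdir/.svn
echo 'entries' > srcdir/.svn/entries
echo 'code'    > srcdir/main.c

# Step 2: replace SRCDEST with an export of srcdir.
# "svn export" writes the tree without .svn; simulate with cp + rm.
rm -rf SRCDEST
cp -r srcdir SRCDEST
rm -rf SRCDEST/.svn

# Step 3: copy the .svn back, making SRCDEST a valid checkout again.
cp -r srcdir/.svn SRCDEST/.svn
```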
If we were going for something like these two... I'd say:

1) Copy the checkout in $SRCDEST to a temporary directory.
2) Update the checkout in $SRCDEST.
3) Take the diff and delete the temporary copy.
4) Apply the diff to the copy in $srcdir.

That still involves temporary copies, which could be quite large.
> The last idea: log the deleted files from the "svn update" output and then delete them manually. However, I'm afraid of such an approach, because it would break if the update output changed.
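[For illustration only, the parsing would look something like this; the output format shown is an assumption about what "svn update" prints, which is precisely what makes the approach fragile.]

```shell
# Assumed "svn update" output; this format is not a stable
# interface, which is why parsing it is risky.
update_output='Updating ".":
D    src/removed.c
U    src/changed.c
Updated to revision 1234.'

# Collect paths whose status column says "D" (deleted).
deleted=$(printf '%s\n' "$update_output" | awk '$1 == "D" { print $2 }')
echo "$deleted"
```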
Definitely not! Parsing the output of external software is bad.

In the end, I think we need to just accept that SVN is a centralised system and it is difficult to do this perfectly.

Allan