Inspired by this other topic, I added DOWNLOAD_DIR to our superbuild's
ExternalProject_Add calls to avoid re-downloading external libraries' archive files, which works nicely.
According to the ExternalProject_Add documentation, however, DOWNLOAD_DIR only provides caching for archives retrieved via URL download methods.
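For context, the archive caching that already works for us looks roughly like this (project name, URL, and paths are made up for illustration; the hash is a placeholder):

```cmake
include(ExternalProject)

# Hypothetical example: point DOWNLOAD_DIR at a directory that survives
# across superbuild runs, so the archive is only fetched once.
ExternalProject_Add(zlib
  URL           https://example.org/zlib-1.3.1.tar.gz       # example URL
  URL_HASH      SHA256=<hash-of-the-archive>                # placeholder
  DOWNLOAD_DIR  ${CMAKE_SOURCE_DIR}/.download-cache         # shared archive cache
  # … configure/build/install options …
)
```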
Since we clone a few libraries from Git in our superbuild, I was wondering whether there is something like a "cache" for Git repositories as well. I would imagine a setting specifying a local Git repository that acts as a cache: it would be copied to the source directory, the "original" Git repo would be added as a remote (if not already present), and then the update step could run as normal if necessary. That way only commits not yet in the local repository would need to be fetched.
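As far as I can tell there is no built-in Git cache in ExternalProject (that is my assumption, not something the documentation states), but one way to approximate the idea is to maintain a local mirror clone outside of CMake and point GIT_REPOSITORY at it. A sketch, with hypothetical paths:

```cmake
include(ExternalProject)

# Sketch, not built-in functionality: keep a local mirror of the upstream
# repository and let ExternalProject clone from that local path.
# Create the mirror once with:
#   git clone --mirror https://gitlab.kitware.com/vtk/vtk.git ~/.git-cache/VTK.git
# Refresh it before CI runs with:
#   git -C ~/.git-cache/VTK.git remote update
set(VTK_MIRROR "$ENV{HOME}/.git-cache/VTK.git")  # hypothetical cache location

ExternalProject_Add(VTK
  GIT_REPOSITORY ${VTK_MIRROR}   # local clone instead of a network fetch
  GIT_TAG        v9.3.0          # example tag
  # … configure/build/install options …
)
```

The drawback is that the mirror refresh has to happen outside the superbuild, and the checked-out source no longer points at the upstream URL as a remote.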
The only (somewhat hacky) workaround I can think of at the moment is a step before the call to ExternalProject_Add which, if no source has been cloned yet, takes a local repository path as input and copies it into the place where ExternalProject_Add's download step would otherwise clone it…
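That workaround might look something like the following sketch. All paths are hypothetical, and I have not verified whether ExternalProject's clone step is actually skipped when the source directory is already populated:

```cmake
include(ExternalProject)

# Hacky workaround sketch: seed the ExternalProject source directory from a
# local clone before the download step runs.
set(_vtk_src   ${CMAKE_BINARY_DIR}/VTK-prefix/src/VTK)  # default ExternalProject source dir
set(_vtk_cache "$ENV{HOME}/.git-cache/VTK")             # hypothetical local clone

if(EXISTS ${_vtk_cache} AND NOT EXISTS ${_vtk_src}/.git)
  # copy_directory copies the *contents* of the cache into the source dir
  execute_process(COMMAND ${CMAKE_COMMAND} -E copy_directory
                          ${_vtk_cache} ${_vtk_src})
endif()

ExternalProject_Add(VTK
  GIT_REPOSITORY https://gitlab.kitware.com/vtk/vtk.git
  GIT_TAG        v9.3.0                                  # example tag
  SOURCE_DIR     ${_vtk_src}   # pin the source dir so the copy above matches
  # … configure/build/install options …
)
```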
My use case: I have a superbuild which I want to test automatically. While I want to run the superbuild from scratch (to fully test it), I also want to avoid large bandwidth usage, i.e. avoid re-downloading or re-cloning an entire Git repository such as VTK's on every run.