CMake and vcpkg on Ubuntu 20.04 - getting around the 2+ hour build time

I’m sure there is a tutorial I haven’t found, because vcpkg just can’t be this bad. On Windows it seemed okay, so I decided to use vcpkg and CPack to create my packages instead of continuing with shell scripts and old-world tools for Debian & RPM packages.

Prior to adding vcpkg, the CMake build, using packaged libraries from the distro repos, completed in well under three minutes.

Introducing vcpkg into the build burns 2+ hours building dependencies from source. That wouldn’t be bad if it were a one-time thing, but it’s hard to make changes when every build takes well over two hours . . . assuming it doesn’t run out of disk or get killed by the OOM killer.

Project is here. Branch ls-cs-0.2.1 is where the code is.

LsCs-Deb-build-dependencies.sh is the script for setting up a clean Ubuntu 20.04 VM (be certain to have at least 500GB of disk allocated to the VM). At the end, the script describes a couple of manual steps.

build-LsCs-local.sh is where I go to build the libraries. When I build the libraries I always want to build my code 100% clean, and I don’t want to sit through basically the same 2+ hour vcpkg time waster every time. This is a 13th-gen i9 with ~120GB of physical RAM; the VM is assigned 6 cores and ~48GB of RAM. I cap the cores CMake/vcpkg can use, because if I let it default to 7 we get OOM-killed.
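For what it’s worth, the capping can be done with environment variables; a rough sketch (the value 6 matches my VM, and I’m assuming the documented VCPKG_MAX_CONCURRENCY and CMAKE_BUILD_PARALLEL_LEVEL variables are honored by the tool versions in play):

```sh
# Cap parallelism so 6 cores / ~48GB RAM doesn't get OOM-killed:
export VCPKG_MAX_CONCURRENCY=6        # limits vcpkg's parallel port builds
export CMAKE_BUILD_PARALLEL_LEVEL=6   # limits "cmake --build" job count
```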

Is there a way, after one has installed the dependencies, completed the manual steps, and pulled down the project, to run vcpkg install in the source tree (telling .gitignore to ignore it, of course) and have every clean build use those libraries? I can make peace with burning 2+ hours the first time I spin up each VM, but not on every build.
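To make the question concrete, here is a sketch of what I’m picturing (the vcpkg checkout location and the build directory are illustrative, not what the project actually uses):

```sh
# One-time setup: vendor a vcpkg checkout inside the source tree
# (and add it to .gitignore):
git clone https://github.com/microsoft/vcpkg.git vcpkg
./vcpkg/bootstrap-vcpkg.sh -disableMetrics

# One-time dependency build (the 2+ hour part):
./vcpkg/vcpkg install

# Every clean rebuild of my own code should then just reuse those libraries:
rm -rf build
cmake -B build -S . \
    -DCMAKE_TOOLCHAIN_FILE="$PWD/vcpkg/scripts/buildsystems/vcpkg.cmake"
cmake --build build
```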

Hey, I do Yocto builds for embedded systems. I made peace with 3.5-5 hour builds for targets because I had no choice. I have a choice here: use vcpkg only for Windows, making Windows the ugly red-headed stepchild chained in the basement, and continue on with my curmudgeonly ways for Debian & RPM, kicking Arch and other packaging waaaaaaaaaaay down the list, to some time long after my death.

Thanks for reading this and any help you can provide.

Using vcpkg for dependencies is a different philosophy from the typical Linux approach of “install all this junk and get it from the system”. It gives you more control over your dependencies, but also more responsibility for managing them.

In order for GitHub CI builds to work reasonably, you want to use binary caching to avoid rebuilding the dependencies from source on every CI build.

(When you build locally with vcpkg, the built dependencies are cached and get re-used the next time you build, assuming nothing has changed in your set of dependencies.)

If you take a look at my workflow for Iterated Dynamics you will see how I manage a personal access token for my repository and configure vcpkg to use that for binary caching via NuGet.

There was an easier caching mechanism before, but GitHub deprecated it, so the advice now is to use NuGet packages. If you have the appropriate permissions (and access token), the first time you build this way it will compile dependencies from source; on a successful build, they become packages associated with your repository. The next build should re-use the binary packages.
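The moving parts, roughly, look like this (OWNER is a placeholder and the token comes from a secret; this is a sketch of the documented NuGet setup, not a copy of my actual workflow):

```sh
# Register the GitHub Packages NuGet feed as a source (on Linux, mono
# drives the nuget.exe that vcpkg fetches):
nuget_exe="$(./vcpkg fetch nuget | tail -n 1)"
mono "$nuget_exe" sources add \
    -Source "https://nuget.pkg.github.com/OWNER/index.json" \
    -Name github \
    -UserName OWNER \
    -Password "$GH_PACKAGES_TOKEN" \
    -StorePasswordInClearText

# Tell vcpkg to read and write binary packages through that feed:
export VCPKG_BINARY_SOURCES="clear;nuget,https://nuget.pkg.github.com/OWNER/index.json,readwrite"
```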

I am not now, nor will I be, using CI or GitHub workflows. Agile is not now, nor will it ever be, Software Engineering. Engineering == Do it right the first time.

It is the local caching I’m asking about. There should be no package registry or extra GitHub machinery required for this. I should be able to build the dependencies to a specific location on my drive and tell CMake where they are, so when I clean-build my code they just get used.

LsCs is only on GitHub because CopperSpice was there. Once I complete the series of changes I’m making, the entire project will move to SourceForge.

Thank you for the time you have put into this.

Ah!

This is what I was looking for.

Well, everything before NuGet.

That is how it is supposed to work; you don’t even need to do anything special for it, except that you don’t set a “specific location on my drive”: vcpkg caches the build artifacts in a particular place, which by default on Linux is ~/.cache/vcpkg/archives, unless you’ve changed it via the VCPKG_DEFAULT_BINARY_CACHE environment variable. Do you have any files in that location, just to confirm that things are getting cached at all?
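A quick way to check, assuming a stock setup:

```sh
# Default binary cache location on Linux; zip archives under here
# (in hashed subdirectories) mean caching is working:
ls ~/.cache/vcpkg/archives

# To relocate the cache, point the variable at an existing directory:
export VCPKG_DEFAULT_BINARY_CACHE="$HOME/my-vcpkg-cache"
mkdir -p "$VCPKG_DEFAULT_BINARY_CACHE"
```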

This tutorial walks you through setting up a custom cache, which is rarely needed, as the default caching is already supposed to work out of the box. So I would rather investigate why the default one doesn’t(?) work in your environment.

It is supposed to be exactly a one-time thing. The first-ever build will indeed take a lot of time, but after that the artifacts are cached on disk (or elsewhere, if you’ve configured it otherwise, such as a remote NuGet server). All subsequent builds simply unpack the cached artifacts with pre-built binaries instead of building them from source again, which only takes a few seconds (unless you have some really big dependencies with lots of files, such as Boost or Qt).

But that is only if nothing has changed, as Richard said above, meaning you have the same build tool versions, environment variables, etc. So if all the dependencies build again instead of being restored from the cache, it means either something has in fact changed in your environment (and it changes on every build?) or you’ve explicitly disabled caching somehow.

First off let me thank everyone for chiming in.

I have decided to kick vcpkg to the curb. I will use it for Windows, where it will be the ugly red-headed stepchild chained in the basement. It is not ready for prime time on Linux; I wish you all well with it. It was nearly impossible to get it to work on Ubuntu 18.04 except for the simplest “Hello World!” programs. The real problem is that it constantly wants to build from tip-of-tip, which is not cool. Yes, there are methods of locking versions:

https://learn.microsoft.com/en-us/vcpkg/consume/lock-package-versions?tabs=inspect-powershell

https://learn.microsoft.com/en-us/vcpkg/users/versioning
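For completeness, the pinning those pages describe boils down to a manifest along these lines (the baseline commit hash and the override version here are placeholders, not values I have verified):

```sh
# Sketch: a vcpkg.json that pins the port catalog to one vcpkg commit
# (builtin-baseline) and force-pins an individual package version:
cat > vcpkg.json <<'EOF'
{
  "name": "example",
  "version": "0.1.0",
  "builtin-baseline": "0123456789abcdef0123456789abcdef01234567",
  "dependencies": [ "zlib" ],
  "overrides": [
    { "name": "zlib", "version": "1.2.13" }
  ]
}
EOF
```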

But vcpkg itself doesn’t have the concept of Age Appropriate Content. Yes, one can spend hours on this site

https://vcpkg.link/

but it is all poke and hope. vcpkg just isn’t designed for production environments where you only want to build your changes and never, ever, under any circumstances, pull in something new from the outside.

As to the builds: it kept dying wanting something newer than what was in the Ubuntu 20.04 VM. Once the missing deb was installed, that kicked off another 2+ hour build. Then it was littered with problems like this:

Dependency vdpau found: NO. Found 1.3 but need: '>= 1.4'
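(For context, that failure is the port’s configure step checking the system library version; my understanding is it amounts to something like this on the build VM, where Ubuntu 20.04 ships libvdpau 1.3:)

```sh
# The port demands >= 1.4, but the system library on 20.04 reports 1.3:
pkg-config --modversion vdpau
```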

Again, if one wanted to spend a large chunk of their life on that link page, poking and hoping with different versions and scouring dependency lists, one might be able to get it to work. For Linux, unless you are doing school projects, it is just not worth the pain.

I’m only chiming in here now because those helping me are building a library. Unless they are okay with it only running on the very latest Linux distro, they are going to hit the same problems. Windows has been playing games with Win32 for decades; that is how they do backward compatibility. Not how it goes on Linux.

This library needs to work at least as far back as Ubuntu 18.04. A few wish it would go as far back as 12.04. Somewhere between 12.04 and 16.04, though, there was a C++ ABI change (the GCC 5 libstdc++ dual ABI) that was not backward compatible. Had to straddle that great divide once before; not really interested in doing it again.

It appears CPack can still be used to generate the packages, but vcpkg is an extreme amount of pain for almost no gain.

Thanks again for chiming in.

Just to share a different experience: we use vcpkg in our local CI/CD, which builds our SDK about 30-50 times a day (triggered on every commit) on several buildbots running Windows, GNU/Linux, and macOS. We have about 60-something third-party dependencies, and among the reasons we started using vcpkg was exactly its out-of-the-box functionality for caching and restoring pre-built packages, which saves a tremendous amount of build time; it really is only our changes that get built. Everything works isolated from the internet too, after we established asset caching for the required build tools as well (not to be confused with the “regular” binary caching of pre-built packages).
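The split between the two mechanisms looks roughly like this in environment-variable form (the paths and URLs are placeholders; the asset-source syntax is the experimental x-azurl form):

```sh
# 1) Binary caching: where pre-built packages are stored and restored from.
export VCPKG_BINARY_SOURCES="clear;files,/srv/vcpkg-binary-cache,readwrite"

# 2) Asset caching: a mirror for source tarballs and build tools, so the
#    buildbots never need to reach the public internet.
export X_VCPKG_ASSET_SOURCES="clear;x-azurl,https://mirror.example.internal/vcpkg-assets,,readwrite"
```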

Quite unfortunate that it didn’t work out in your environment.

And just what Linux versions are you supporting?

My guess is you only build for tip-of-tip. You don’t build for 18.04 and prior.

Mainly it is the Ubuntu distribution; the oldest we have at the moment is 20.04 (the one you mentioned in your original post and topic), then there is 22.04, and recently we added 24.04.

That’s not been my experience. I find it works great on Linux, both for CI builds and for builds on my local machine.

Honestly, it’s always the Windows build that is harder to get right than the Linux build (unless you have to target really old server distributions like CentOS, etc.).

But not 18.04 or prior. I only tried 20.04 after vcpkg failed abysmally on 18.04.

In the end, vcpkg isn’t worth the pain for production code that has to be supported for 20-30 years, with a starting point roughly a decade before today. Much of my work, and the stuff I work on, is in the medical device world. Once you build it, that’s where your dev system has to stay. Minor enhancement changes generally don’t include dev-system OS upgrades, because Agile poops another cesspit of bugs out the back end of every sprint. Medical devices must have stability.

vcpkg needs to expand its triplet list. Maybe you (or someone on your project) went through the pain like this guy did.

Merging two answers here: yes, 18.04, 16.04, and possibly 14.04.

Adding insult to injury, the current version of this pulls in Nvidia and Vulkan dependencies.