Best practices for dependency management in a nested library / superproject setup (FetchContent vs. find_package)

Hello,

My team and I really like the versatility of FetchContent and we use it almost everywhere. We’ve built up a fair number of internal libraries to increase development speed, so things feel like Lego blocks now. That’s great, but it has also brought a lot of dependency-handling problems with it. I see many projects using very different approaches (variables, custom functions, etc.) and I just can’t figure out what the best practice is supposed to be.

The main goal for us is not only to make things stable, but also to require the bare minimum work from the end user: ideally either FetchContent or find_package (and that’s it), unless they explicitly want to override something.

For context: we’re using CMake 3.30.

I have a fairly complex dependency tree that looks something like this:

  • LibA, LibB – external libraries (third-party).
  • LibC – external library that depends on LibA and LibB (not ours).
  • LibD – our library, depends on LibC and LibB.
  • LibE – our standalone library (no dependencies).
  • LibF – our library, depends on LibD.
  • exec – final executable, links against LibD, LibE, LibF.

My questions are:

  1. Where is it okay to use FetchContent, and where is it not advised?

    • Should FetchContent only appear at the top-level superproject (exec), or is it acceptable inside libraries (like LibD) to pull in their dependencies?
  2. Is it possible to make this work without introducing options/variables in each subproject?

    • For example, can I just write each library using find_package() for its dependencies, and then let a superproject override everything with FetchContent and OVERRIDE_FIND_PACKAGE?
  3. What would be the needed order?

    • If I want to use FetchContent in the superproject for everything, do I need to declare and make available A, B, C before D and F, etc.?
  4. Export sets and collisions.

    • I often see errors like:
install(EXPORT ...) includes target "foo" which requires target "bar" that is not in any export set

This is most apparent when building gRPC with FetchContent, for instance; the errors go away after setting set(libname_ENABLE_INSTALL OFF).

  5. Handling SO name conflicts.
    • If both my project and the superproject rely on (say) LibB as a .so with the same SONAME/symbols, what’s the right way to avoid collisions?
      • Should I never export third-party libraries along with my package?
      • Should I let the superproject always own the third-party deps (no FetchContent in the lib)?
      • Or is it okay to re-export them? And if so, what do I do when my exec wants a different version, do I always have to remember to update the lib along with the exec?

In short:

  • We want stability and a simple user experience (just find_package or FetchContent, nothing more), if that’s possible. For instance, setting up a lib to use gRPC caused so many errors and so much fiddling with the variables it uses (set() or option() with forced cache values to fix things), and those problems may not show up for a really long time.
  • But I’m really confused about what is considered “best practice” for balancing find_package, FetchContent, and install(EXPORT …).
  • Sometimes things work fine for months, then break with the smallest environment change.

Thanks in advance for the help.

You should be able to use FetchContent throughout all levels of a dependency tree, assuming all projects are well-behaved. FetchContent is specifically designed to support such usage. The important thing to remember is that FetchContent works by a “first to declare, wins” principle. That means if the details of a dependency need to be specified by an outer project, that outer project needs to declare the dependency details before pulling in any dependencies. If you declare any details after FetchContent_MakeAvailable() has been called for any other dependency, it might already be too late as the one you wanted to override might already have been pulled in by something else. This is why approaches like that used in CPM are problematic, because they merge the call to FetchContent_Declare() and FetchContent_MakeAvailable() into a single call, and that is not how FetchContent is designed to work.

→ Do all your calls to FetchContent_Declare() before you make any calls to FetchContent_MakeAvailable().

In theory, yes you could do that, but I try to discourage folks from using the OVERRIDE_FIND_PACKAGE keyword, except perhaps in a controlled company environment. When you add that keyword, you’re forcing whoever or whatever is using your project to adopt FetchContent. If that project later becomes a dependency of another project, that’s not being nice to the consumer.

Having your projects use find_package() to obtain their dependencies is the way to go as a general principle. That gets a little more complicated with company projects where you often want to control dependencies more tightly rather than leaving it to the developer’s environment to supply some arbitrary version. Sometimes using FetchContent with the FIND_PACKAGE_ARGS keyword is what you want, as it will try using find_package() first, and only if that fails will it fall back to using FetchContent. This is a better fit for dependencies that are private projects and won’t be installed as a system package (which would be an uncontrolled version of the dependency, from your project’s perspective).

It’s worth pointing out that the two main package managers for C++ projects (conan and vcpkg) are both based around find_package(). If you use find_package() to specify your dependencies and bring them into your build, you give consumers the flexibility of using these off-the-shelf package managers to supply their dependencies. That can be a big win. Note that this doesn’t preclude you still overriding a dependency with FetchContent if you want to. With a little bit of work (not a lot), you can set things up such that the package manager provides the dependencies by default, but if a developer sets a FETCHCONTENT_SOURCE_DIR_<DEPNAME> variable, that is used with FetchContent to supply the dependency instead of the package manager. I do this with my consulting clients so they can work across multiple repositories when needed, but otherwise let conan provide their dependencies normally when they don’t need to do that.
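Tying the last two paragraphs together, a rough sketch of that pattern (the dependency name, URL, version and FIND_PACKAGE_ARGS details here are only placeholders, not a recommendation):

include(FetchContent)

FetchContent_Declare(
        LibC
        GIT_REPOSITORY https://example.com/LibC.git
        GIT_TAG v1.2.3
        FIND_PACKAGE_ARGS 1.2   # try find_package(LibC 1.2 ...) first, only fetch if that fails
)
FetchContent_MakeAvailable(LibC)

# A developer who needs to work against a local checkout can redirect the
# dependency without touching the project, e.g. by configuring with
#   -D FETCHCONTENT_SOURCE_DIR_LIBC=/path/to/local/LibC

If find_package(LibC) succeeds (because a package manager or an installed package provides it), the fetch details are never used.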

  3. What would be the needed order?
  • If I want to use FetchContent in the superproject for everything, do I need to declare and make available A, B, C before D and F, etc.?

As long as you call FetchContent_Declare() for A, B, C, D, and F before you make any call to FetchContent_MakeAvailable(), it will do what you want.
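So for your tree, the superproject would look roughly like this (the URLs and tags are placeholders for your real details):

include(FetchContent)

# Declare everything up front so these details win over anything the
# sub-projects might declare for the same dependencies.
FetchContent_Declare(LibA GIT_REPOSITORY https://example.com/LibA.git GIT_TAG v1.2.3)
FetchContent_Declare(LibB GIT_REPOSITORY https://example.com/LibB.git GIT_TAG v2.0.1)
FetchContent_Declare(LibC GIT_REPOSITORY https://example.com/LibC.git GIT_TAG v0.9.0)
FetchContent_Declare(LibD GIT_REPOSITORY https://example.com/LibD.git GIT_TAG v1.0.0)
FetchContent_Declare(LibF GIT_REPOSITORY https://example.com/LibF.git GIT_TAG v1.1.0)

# Only now populate them. Each one is added only the first time it is requested,
# so sub-projects asking for the same names reuse what was made available here.
FetchContent_MakeAvailable(LibA LibB LibC LibD LibF)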

Install rules are one area where things can get difficult with FetchContent. Each dependency needs to be individually installable on its own for parent projects to be able to install anything that depends on them. If you can’t check out a dependency separately and then build and install it standalone, you won’t easily be able to install any project that depends on it without patching up the missing bits and pieces.
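To be concrete, each project has to put its own targets into an export set and install that export set itself. A bare-bones sketch, with illustrative names and destinations (a real project would also install a config and version file so find_package() can locate it):

# Inside LibD's own CMakeLists.txt:
install(TARGETS LibD EXPORT LibDTargets)      # LibD's targets go into LibD's own export set
install(EXPORT LibDTargets
        NAMESPACE LibD::
        DESTINATION lib/cmake/LibD)           # consumers then import LibD::LibD from here

The error you quoted earlier usually means a target that gets linked into an exported target was never added to any install(TARGETS ... EXPORT ...) call by the project providing it.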

FetchContent can be a good fit for projects you control. It is well-suited to companies with a set of repositories they control, where you can enforce good structure and naming. FetchContent can be challenging if trying to use it on externally maintained projects, and many open source projects do not meet some of the requirements for using FetchContent safely and reliably. You might be able to use FetchContent with an open source project, but it will need to be a case-by-case assessment. Install rules are one area where many open source projects fall over. Unfortunately, this applies even to some widely used projects. This is where the value of a dedicated package manager comes in, because they have had to solve those problems for you already.

I don’t know how you’d get conflicts unless you’re trying to combine two different dependencies that both try to supply the same thing. There’s no way around that; such dependencies cannot be combined in the one build (or if you can hack it to make it do that, you shouldn’t).

When you package up your project, you shouldn’t include its dependencies unless your project cannot ever be a dependency of something else. Examples of this would be projects that provide applications rather than libraries for others to build against. You might sometimes find an SDK provider gives you a package that includes pre-built libraries of its dependencies, but in my experience those are nearly always more of a problem than a help. They are frequently built incorrectly, and they actually make it harder to incorporate their SDK into a build that has other dependencies (their bundled dependencies end up conflicting with those provided by package managers, for example).


Thank you so much for such a detailed answer, that’s rare.

My understanding so far (please correct me if I’m wrong):

  1. FetchContent “first declare wins”:

    • If the superproject calls FetchContent_Declare(A …) and FetchContent_MakeAvailable(A) before any subproject brings in A, then the superproject’s version is used.
    • If a subproject later calls FetchContent_Declare(A …) with a different GIT_TAG, it is ignored, because A is already populated.
    • This applies transitively: if C depends on A and B, and the superproject makes A and B available first, then C will automatically use those, not its own declared versions.
  2. Order matters:

    • All FetchContent_Declare() calls for things I want to override should appear before the first FetchContent_MakeAvailable().
    • That way, subprojects that use FetchContent (with or without FIND_PACKAGE_ARGS) pick up the superproject’s pinned versions.
  3. find_package vs FetchContent:

    • find_package() should be the normal/default way to resolve dependencies, because it plays nice with package managers (vcpkg, conan, system packages).
    • The superproject can still pin or override by declaring dependencies with FetchContent early, or by setting FETCHCONTENT_SOURCE_DIR_<NAME> for local dev.

Is the above correct?


Issue with FetchContent I had

I had a related issue with a library Y that uses spdlog in its public templates in Debug mode.

if (CMAKE_BUILD_TYPE STREQUAL "Debug")
    message(STATUS "Debug mode enabled")

    include(FetchContent)
    FetchContent_Declare(
            spdlog
            GIT_REPOSITORY https://github.com/gabime/spdlog.git
            GIT_TAG v1.14.1
            FIND_PACKAGE_ARGS
    )
    FetchContent_MakeAvailable(spdlog)
endif ()

...

if (CMAKE_BUILD_TYPE STREQUAL "Debug")
     target_link_libraries(Y PRIVATE # or PUBLIC
            ZLIB::ZLIB
            pthread
            spdlog::spdlog
    )
endif()
  • If I only link spdlog privately, consumers fail because they see spdlog symbols in headers.
  • If I link it PUBLIC, then when I export Y, CMake complains that spdlog also needs to be part of the export set, otherwise consumers can’t resolve it.

I do declare and make spdlog available before library Y.
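Roughly, the setup looks like this (stripped down, with illustrative names rather than my real project):

add_library(Y src/y.cpp)

# Y's public headers use spdlog, so the dependency has to be PUBLIC for
# consumers to get spdlog's include directories and link line.
target_link_libraries(Y PUBLIC spdlog::spdlog)

install(TARGETS Y EXPORT YTargets)
# This is where CMake complains that spdlog also has to be part of an
# export set, because spdlog::spdlog came from FetchContent and was never
# installed/exported anywhere.
install(EXPORT YTargets NAMESPACE Y:: DESTINATION lib/cmake/Y)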

I could export it under my namespace. But here’s my concern: won’t that create a second spdlog::spdlog target if the superproject also fetches/provides spdlog? That’s the conflict of .so libs I mentioned above.
How do I avoid that duplication/conflict?

In other words, how do I correctly expose spdlog in Y’s public interface without exporting it myself, while still ensuring it works cleanly when Y is consumed as a dependency?

Our superprojects usually use spdlog for regular logging, and not only in Debug.

Not quite. The “first to declare, wins” philosophy is literally the first to “declare”. That means the first call to FetchContent_Declare(A ...) will determine A’s details that will be fetched. There could be many calls to FetchContent_Declare(A ...) across multiple directory scopes before any call to FetchContent_MakeAvailable(A). It’s still the first call to FetchContent_Declare(A ...) that determines the details used by the FetchContent_MakeAvailable(A) call.

It’s not because of the population, it’s just the declaration. Hopefully my comment above already makes that clear.
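A tiny illustration of that (the details are placeholders):

# Seen first (e.g. in the superproject), so these details win.
FetchContent_Declare(A
        GIT_REPOSITORY https://example.com/A.git
        GIT_TAG v2.0
)

# Seen later (e.g. in a subproject): ignored, A has already been declared.
FetchContent_Declare(A
        GIT_REPOSITORY https://example.com/A.git
        GIT_TAG v1.5
)

# Whoever calls this first, A is populated using the v2.0 details above.
FetchContent_MakeAvailable(A)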

The rest of your points 1-3 seem about right.

Hi, again thanks for the answer. We’ve finally gone open source, so I can show my problem to you with proper examples:
uvent
Webserver - sorry for the mess, it’s been a very bad month for my CMake skills.

Just a few minutes ago, I was tracking down the problem. In our superproject we did:

FetchContent_Declare(
        uvent
        GIT_REPOSITORY https://github.com/Usub-development/uvent.git
        GIT_TAG timer_optimizations
)

FetchContent_MakeAvailable(uvent)

FetchContent_Declare(
        server
        GIT_REPOSITORY https://github.com/Usub-development/webserver
        GIT_TAG main
)

FetchContent_MakeAvailable(server)

and it just didn’t work, failing with this error:

CMake Error at build/_deps/server-src/CMakeLists.txt:24 (Find_Package):
  By not providing "Finduvent.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "uvent", but
  CMake did not find one.

  Could not find a package configuration file provided by "uvent" with any of
  the following names:

    uventConfig.cmake
    uvent-config.cmake

  Add the installation prefix of "uvent" to CMAKE_PREFIX_PATH or set
  "uvent_DIR" to a directory containing one of the above files.  If "uvent"
  provides a separate development package or SDK, be sure it has been
  installed.

Reordering Declare and MakeAvailable (and other things) didn’t help.

The only thing that helped was OVERRIDE_FIND_PACKAGE (CMake 3.30):

FetchContent_Declare(
        uvent
        GIT_REPOSITORY https://github.com/Usub-development/uvent.git
        GIT_TAG timer_optimizations
        OVERRIDE_FIND_PACKAGE
)

I just don’t get it, am I getting something really important wrong? I just want to make the setup process easier for users, and everything I try just doesn’t work universally…

If you want to play nice with your project’s consumers, add the FIND_PACKAGE_ARGS option to your FetchContent_Declare() call. That gives the consumer the ability to use any find_package()-based method to obtain the uvent dependency. Only if the dependency isn’t found using find_package() will FetchContent then be used to build uvent from source.

An advantage of using FIND_PACKAGE_ARGS is that it gives dependency providers that only implement the FIND_PACKAGE provider method a chance to provide the dependency first. It is very common for providers to only implement the FIND_PACKAGE method (conan does this, for example).

Projects should not generally use the OVERRIDE_FIND_PACKAGE keyword unless they will always be the top level project. It is usually only appropriate for projects within an organisation, not for open source projects. It forces consumers to obtain that dependency via FetchContent unless they override what your project does by calling FetchContent_Declare() first with its own customised details for the dependency. You don’t want to force consumers to do that.

And regarding the ordering of FetchContent_Declare() and FetchContent_MakeAvailable() calls, put all of the former first before calling the latter. The updated version of your example would look like this:

FetchContent_Declare(
        uvent
        GIT_REPOSITORY https://github.com/Usub-development/uvent.git
        GIT_TAG timer_optimizations
        FIND_PACKAGE_ARGS
)
FetchContent_Declare(
        server
        GIT_REPOSITORY https://github.com/Usub-development/webserver
        GIT_TAG main   # <---- Don't do this, use a tag or commit hash, not a branch name
        FIND_PACKAGE_ARGS
)

FetchContent_MakeAvailable(
        uvent
        server
)

The example I gave wasn’t from the library itself but from the superproject. I don’t quite see how FIND_PACKAGE_ARGS could help here. My understanding is that it only modifies the behavior of the FetchContent call so that it tries find_package first and, if not found, falls back to fetching. If that’s the case, I don’t see how it would resolve the issue.

Here’s the situation:

  • The webserver relies on uvent to make requests (and uvent is a public dependency).
  • I want to use webserver in my project.
  • I fetch uvent, so that the header-only parts of webserver can make use of it.
  • But the call fails: it can’t find uvent, even though uvent has already been declared earlier.

In this setup, I don’t see how FIND_PACKAGE_ARGS would help. Running pure find_package without OVERRIDE_FIND_PACKAGE fails, even though the uvent-build, uvent-subbuild, and uvent-src directories are all properly set up.

So the question that arises: does FetchContent only work with fetched dependencies, and the only way to integrate it with pure find_package is to use OVERRIDE_FIND_PACKAGE?

As I’m writing this, I think I may have figured out the intended structure:

  • Libraries that can be built independently (without requiring a prior installation of dependencies) should use FetchContent with FIND_PACKAGE_ARGS.
  • In the superproject / top-level project, we should use OVERRIDE_FIND_PACKAGE to avoid the problem I described (roughly as sketched below).
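To show what I mean, a rough sketch using the repos from above (the tags here are just placeholders):

# In a library that must also build standalone (e.g. webserver):
FetchContent_Declare(
        uvent
        GIT_REPOSITORY https://github.com/Usub-development/uvent.git
        GIT_TAG v1.0.0
        FIND_PACKAGE_ARGS
)
FetchContent_MakeAvailable(uvent)

# In the superproject / top-level project:
FetchContent_Declare(
        uvent
        GIT_REPOSITORY https://github.com/Usub-development/uvent.git
        GIT_TAG v1.0.0
        OVERRIDE_FIND_PACKAGE
)
FetchContent_MakeAvailable(uvent)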

If that assumption is correct, what happens when someone installs a package that uses FIND_PACKAGE_ARGS, and later builds another project that also fetches the same dependency library (possibly a different version) and uses find_package for the lib it installed? I know it’s just a bad pattern, mixing the installation methods. The best example: the version of pulsar we used wasn’t working with FetchContent, then we introduced pure protobuf that was working with FetchContent, and all hell broke loose. Introducing FIND_PACKAGE_ARGS did fix it, but it still rests in the hands of the developer, and I think my responsibility is to make things as easy as possible for both beginner and mature programmers, minimizing the margin for error.

  • Should I even try to detect this version mismatch and issue a warning? Can I?
  • Or is this simply not something I should worry about, since conflicts are expected in such cases? Are they not my responsibility and not under my control, and should I not even try to fix the programmer’s error?

Finally, thanks for your advice. The reason I tested on main was just to validate a change quickly. Normally we use a branch or tag; it’s just this dev project where I test things directly on main or a branch instead of tags or hashes.