I would like to ask for some guidance on best practices and suggestions on how to handle deps in CMake. I am developing CMake-based C++ projects for customers; these projects depend on a couple of libraries, but I wish to provide a clean out-of-the-box experience for their consumers. This means that git clone, configure, build should succeed without end users lifting a finger.
The solution in one case was Craig Scott’s DownloadProject, which has a crucial shortcoming that FetchContent is supposed to address (and it does, but more on that later). The solution in another case, git submodules, pulls in the source code of the dependencies even before configuration starts. I have also tried the ExternalProject module in some cases, but having to set up INTERFACE targets for every dep manually is tedious, error prone and not always sufficient, because the builds are still missing some files at configure time, sometimes needed as sources, sometimes to decide the link language, etc.
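To make the ExternalProject pain concrete, the hand-rolled boilerplate I mean looks roughly like this (a sketch only; the target names and install layout are made up):

include(ExternalProject)
ExternalProject_Add(glm_external
  GIT_REPOSITORY https://github.com/g-truc/glm
  GIT_TAG        0.9.9.8
  CMAKE_ARGS     -DCMAKE_INSTALL_PREFIX=<INSTALL_DIR>
)
ExternalProject_Get_Property(glm_external INSTALL_DIR)
# Hand-written stand-in for the imported target a proper glm config file would
# define. The include directory only exists once the external project has been
# built and installed (the configure-time gap mentioned above), and consumers
# may still need their own add_dependencies(<consumer> glm_external) to get
# the build order right.
add_library(glm_headers INTERFACE)
target_include_directories(glm_headers INTERFACE "${INSTALL_DIR}/include")
add_dependencies(glm_headers glm_external)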
My experience is that none of these solutions scale. DownloadProject, ExternalProject, FetchContent, git submodules… they all turn into intricate fixups of other projects’ messes.
IMHO all of these methods should only be used when the dependency is really part of the same project.
Git submodules are for projects that have multiple… modules, and haven’t transitioned to a monorepo. The same goes for FetchContent and similar CMake modules: they only work reliably when one is in total control of everything pulled in. In all other cases, a single call to find_package(MyDep REQUIRED) is the only level of concern downstreams should have to deal with. Much like Find Module and Package Config files, such calls need not be guarded against the targets already existing, because those files are safe for multiple inclusion. The idiom
if(NOT TARGET glm::glm)
  find_package(glm CONFIG REQUIRED)
endif()
is a sad consequence of the new practice of pulling dependencies into our own builds. find_package(glm CONFIG REQUIRED) only conflicts with glm when glm is pulled in as part of the build, because then the ALIAS and INTERFACE targets would clash. What’s really annoying is that every project has to guard its own targets too, because no project knows whether a superproject pulls it into the build multiple times:
if(NOT TARGET MyLib)
  add_library(MyLib ...)
endif()
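The kind of superproject layout that makes the guard necessary looks something like the following (all project names are made up for illustration):

# Top-level CMakeLists.txt of a superproject. LibA and LibB are two in-house
# libraries that each vendor MyLib (e.g. as a git submodule) and call
# add_subdirectory() on their own copy. The second time MyLib’s CMakeLists.txt
# runs, an unguarded add_library(MyLib ...) stops the configure step with a
# duplicate target error.
cmake_minimum_required(VERSION 3.11)
project(SuperProject CXX)
add_subdirectory(external/LibA)   # adds its vendored copy of MyLib
add_subdirectory(external/LibB)   # adds another vendored copy of MyLib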
Moreover, when trying to cater both to those supplying their own deps through APT, Vcpkg, etc. and to those enjoying our custom-crafted out-of-the-box experience, one has to deal with the subtleties of ALIAS vs. INTERFACE targets, as they have different sets of properties. Here is one of the less complicated ones:
if(NOT DEPENDENCIES_FORCE_DOWNLOAD AND NOT EXISTS "${CMAKE_BINARY_DIR}/_deps/glm-external-src")
  find_package(glm CONFIG)
  # NOTE 1: GLM 0.9.9.0 in the Ubuntu 18.04 repo doesn't install the IMPORTED
  #         INTERFACE target, only the legacy variable is defined in glm-config.cmake
  # NOTE 2: the auto-fetched subproject build doesn't define the (legacy) variable
  #         anymore, only the INTERFACE target
  #
  # To avoid every test depending on GLM having to define its deps using
  #
  #   add_sample(
  #     LIBS
  #       $<$<TARGET_EXISTS:glm::glm>:glm::glm>
  #     INCLUDES
  #       $<$<NOT:$<TARGET_EXISTS:glm::glm>>:"${GLM_INCLUDE_DIRS}">
  #   )
  #
  # we create an IMPORTED INTERFACE target in case it didn't exist.
  if(glm_FOUND AND NOT TARGET glm::glm)
    add_library(glm::glm INTERFACE IMPORTED)
    target_include_directories(glm::glm INTERFACE "${GLM_INCLUDE_DIRS}")
  endif()
endif()
if(NOT (glm_FOUND OR TARGET glm::glm))
  if(NOT EXISTS "${CMAKE_BINARY_DIR}/_deps/glm-external-src")
    if(DEPENDENCIES_FORCE_DOWNLOAD)
      message(STATUS "DEPENDENCIES_FORCE_DOWNLOAD is ON. Fetching glm.")
    else()
      message(STATUS "Fetching glm.")
    endif()
    message(STATUS "Adding glm subproject: ${CMAKE_BINARY_DIR}/_deps/glm-external-src")
  endif()
  cmake_minimum_required(VERSION 3.11)
  include(FetchContent)
  FetchContent_Declare(
    glm-external
    GIT_REPOSITORY https://github.com/g-truc/glm
    GIT_TAG        0.9.9.8 # e79109058964d8d18b8a3bb7be7b900be46692ad
  )
  FetchContent_MakeAvailable(glm-external)
endif()
25 lines of script (without comments) for one dep. And I have 5-7 deps, each with subtly different annoyances:
- some are abandoned, so I have to patch their 8-year-old CMake scripts,
- some don’t build warning-free, so I have to scope their builds and patch up CMAKE_CXX_FLAGS for the deps only, removing the warning-related flags for all supported compilers,
- my favorite: having to find the location of a target’s headers, because they double as shader sources and I need to copy them next to my executable. Finding that path for both the ALIAS and the IMPORTED case is something I still haven’t figured out (one possible direction is sketched right after this list),
- some projects (SFML) don’t support consuming their dependencies (freetype) as part of the build, only as pre-built binaries via find_package,
- sometimes other shenanigans.
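(On the header-location point: one direction that might work, since both ALIAS and IMPORTED targets can be queried with get_target_property(), is reading the interface include directories off the target. A rough sketch, where MyApp is a placeholder executable and a single, plain include directory is assumed:)

# Works for an ALIAS from a subproject build and for an IMPORTED target from
# find_package() alike.
get_target_property(glm_includes glm::glm INTERFACE_INCLUDE_DIRECTORIES)
# The value is a ;-list and, in the subproject case, may be wrapped in
# $<BUILD_INTERFACE:...>, hence the cleanup below.
list(GET glm_includes 0 glm_include_dir)
string(REGEX REPLACE "[$]<BUILD_INTERFACE:([^>]+)>" "\\1" glm_include_dir "${glm_include_dir}")
# Copy the headers-as-shaders next to the executable after every build.
add_custom_command(TARGET MyApp POST_BUILD
  COMMAND "${CMAKE_COMMAND}" -E copy_directory
          "${glm_include_dir}/glm" "$<TARGET_FILE_DIR:MyApp>/glm")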
Bottom line is: I feel like this brave new world of using FetchContent for handling dependencies, and having to defend all our scripts against superprojects potentially including them multiple times, is a dead end. I am aware that 3.24 has some degree of find_package() integration for FetchContent, but I’m not sure it will resolve this duality. This trend of pulling other people’s stuff into our build, whether at build time or at configure time, has to stop.
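(For reference, my understanding of the 3.24 integration is that FetchContent_Declare() gained a FIND_PACKAGE_ARGS option, so FetchContent_MakeAvailable() tries find_package() first and only downloads when that fails. A minimal sketch, assuming the FetchContent name matches the package name:)

cmake_minimum_required(VERSION 3.24)
include(FetchContent)
FetchContent_Declare(
  glm
  GIT_REPOSITORY https://github.com/g-truc/glm
  GIT_TAG        0.9.9.8
  FIND_PACKAGE_ARGS CONFIG  # try find_package(glm CONFIG) first, download only on failure
)
FetchContent_MakeAvailable(glm)
# Either way glm::glm exists afterwards, but the ALIAS vs. IMPORTED
# differences described above still leak through.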
(I’m also aware of dependency providers, but that means mandating a minimum of 3.24 (not a short-term solution) and also getting all of our customers on board with using package managers. I’d absolutely love that, but I’m afraid it’s easier said than done. At best I can try to detect from the script whether any package manager is in use and, if not, FetchContent Vcpkg itself, install all the deps in one go at configure time, and consume everything through the one-liner find_package interface.)
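(Roughly what I have in mind, as a sketch only: version pinning, caching and detection of other package managers are omitted, it assumes the project carries a vcpkg.json manifest, and MyProject/glm stand in for the real project and deps:)

cmake_minimum_required(VERSION 3.14)
# If the user already supplies a toolchain (Vcpkg, Conan, ...) or the deps come
# from a system package manager, do nothing and let find_package() do its job.
if(NOT DEFINED CMAKE_TOOLCHAIN_FILE AND NOT DEFINED ENV{VCPKG_ROOT})
  include(FetchContent)
  FetchContent_Declare(
    vcpkg
    GIT_REPOSITORY https://github.com/microsoft/vcpkg
    GIT_TAG        master   # pin a real release tag in practice
  )
  FetchContent_MakeAvailable(vcpkg)  # only populates; vcpkg has no top-level CMakeLists.txt
  set(CMAKE_TOOLCHAIN_FILE "${vcpkg_SOURCE_DIR}/scripts/buildsystems/vcpkg.cmake"
      CACHE FILEPATH "vcpkg toolchain" FORCE)
endif()
project(MyProject CXX)  # the toolchain kicks in here and installs the manifest deps
find_package(glm CONFIG REQUIRED)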