I’m trying to integrate a number of dependencies into my project. Many of these are CMake-based, so integration is quick and easy. Unfortunately, I’m having a heck of a time getting the non-CMake ones to play ball.
Here’s the big picture:
I have a third-party library that is distributed as a source tar file. This library is NOT built with CMake.
I may or may not need to configure, calling ./configure with some arguments
I must build, calling make
I can optionally install, using make install and passing a destination
I need to turn the result — the outputted include directory and library — into a CMake dependency for consumption in my project. Configuring, then building, then installing results in a file structure that contains the include directory and the library.
To be more concrete, if you need specifics: in a toy example I am trying to integrate LuaJIT. That download can be found here.
The issue here is that ${SOURCE_DIR}/dest/usr/local/lib/libluajit-5.1.a does not exist until after the INSTALL step of ExternalProject. So it’s a bit of a chicken-and-egg problem.
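For reference, a minimal sketch of what I’m attempting — the source path, install layout, and target name are assumptions from my toy setup, not anything official:

```cmake
include(ExternalProject)

ExternalProject_Add(luajit
  SOURCE_DIR        ${CMAKE_SOURCE_DIR}/third_party/LuaJIT   # unpacked tarball
  CONFIGURE_COMMAND ""                                       # no ./configure step here
  BUILD_IN_SOURCE   TRUE
  BUILD_COMMAND     make
  INSTALL_COMMAND   make install DESTDIR=${CMAKE_SOURCE_DIR}/dest
)

# The imported target points at a file that only exists after luajit's
# install step has run — which is exactly the chicken-and-egg problem:
add_library(luajit::luajit STATIC IMPORTED)
set_target_properties(luajit::luajit PROPERTIES
  IMPORTED_LOCATION ${CMAKE_SOURCE_DIR}/dest/usr/local/lib/libluajit-5.1.a)
```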
I’d be thrilled to provide any more context you need. I feel like I must be missing something silly…
Basically, if you use ExternalProject, you want a superbuild pattern, where your dependencies AND your project all get built via ExternalProject_Add in the right order.
Alternatively make it a two-step process - first build dependencies, and then the project.
In any case generally you cannot use the build artifacts of ExternalProject in the same build that calls it.
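As a rough illustration of the superbuild idea — the names (luajit, my_project, LUAJIT_ROOT) and paths here are hypothetical — the top-level CMakeLists.txt builds everything, including your own project, as external projects:

```cmake
# Hypothetical top-level "superbuild" CMakeLists.txt: the real project is
# itself an ExternalProject that depends on the dependency projects.
include(ExternalProject)

ExternalProject_Add(luajit
  SOURCE_DIR        ${CMAKE_CURRENT_SOURCE_DIR}/third_party/LuaJIT
  CONFIGURE_COMMAND ""
  BUILD_IN_SOURCE   TRUE
  BUILD_COMMAND     make
  INSTALL_COMMAND   make install DESTDIR=<INSTALL_DIR>
)

# Retrieve the install directory chosen for the luajit step above.
ExternalProject_Get_Property(luajit INSTALL_DIR)

ExternalProject_Add(my_project
  SOURCE_DIR      ${CMAKE_CURRENT_SOURCE_DIR}/src
  CMAKE_ARGS      -DLUAJIT_ROOT=${INSTALL_DIR}  # tell the inner build where to look
  DEPENDS         luajit                        # luajit builds and installs first
  INSTALL_COMMAND ""
)
```

Because my_project only configures after luajit’s install step has finished, the library file exists by the time the inner build looks for it.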
Ah, got it. I had seen that pattern before, I just wasn’t sure whether it was necessary or not.
Is there a way to do what I’m trying to do here, and have a one step build create everything I need in a reliable way? I really like how FetchContent does it, but of course that assumes a CMake build system in the dependency.
Can some combo of add_custom_target or add_custom_command work for me? Or is the superbuild the way to go for most people?
(I guess I should have not said “of course”, I am obviously out of my depth here )
It was my assumption that FetchContent didn’t work with non-CMake builds, based on the idea that FetchContent_MakeAvailable calls add_subdirectory on the downloaded source. I could 100% be wrong about this. Also, this snippet from the FetchContent docs:
In addition to the above explicit options, any other unrecognized options are passed through unmodified to ExternalProject_Add() to perform the download, patch and update steps. The following options are explicitly prohibited (they are disabled by the FetchContent_Populate() command):
Oh, that makes sense then; it only shares the downloading logic with ExternalProject. Yeah, FetchContent isn’t going to work here. A superbuild is probably the best way forward if you’re skipping over package managers (conan, vcpkg, etc.).
Yeah, I’m not sure I’ll be able to get a package manager to fly. We are currently on a make-based build system, and this investigation is to try to convince folks that CMake would ease some pains.
I think moving to both CMake and a package manager in one swoop would possibly weaken my case.
I’ll investigate the superbuild route, unless something else pops up.
An ExternalProject-based superbuild project would be the traditional recommendation for this sort of scenario. @ben.boeckel has more experience working with such projects (e.g. VTK) so I’d follow his advice on how to get such an arrangement to work and what impact it might have on how your developers work day-to-day.
Using a package manager would free you from having to restructure your main project into a superbuild. That may ultimately be less disruptive to your developers, since they would continue to work on the main project’s sources the same way they do now. But yes, the introduction of a package manager is not trivial and that also changes the developer workflow.