Dynamically generating add_custom_command dependencies during build time

We have a code base that we are trying to convert from autotools to CMake. The project builds both C++ libraries and a Python interface. The Python interface/module is generated by Python scripts during the build using add_custom_command. We would like the build system to be able to determine whether the Python generator script, or any of the .py files it imports, has been altered between builds.

Currently we have a script (moduleCheck.py) that outputs a list of all of the files imported by a selected generator script (*MOD.py). We call moduleCheck.py during the configuration stage with execute_process and pass its output as the list of dependencies of the add_custom_command that runs *MOD.py during the build. This works well as long as we only edit *MOD.py or imports that already existed as of the last configure.
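For context, the standard-library modulefinder can produce this kind of import list. Below is only a sketch of what a moduleCheck.py-style scanner might look like; the actual moduleCheck.py may well work differently:

```python
import os
import sys
from modulefinder import ModuleFinder

def imported_py_files(script_path):
    """Return the .py files that script_path imports, directly or indirectly."""
    script_dir = os.path.dirname(os.path.abspath(script_path))
    # Mimic normal script execution: the script's own directory is searched first.
    finder = ModuleFinder(path=[script_dir] + sys.path)
    finder.run_script(script_path)
    files = set()
    for name, mod in finder.modules.items():
        if name == "__main__":  # skip the generator script itself
            continue
        # Keep only modules backed by real .py files (skips builtins/extensions).
        if mod.__file__ and mod.__file__.endswith(".py"):
            files.add(mod.__file__)
    return sorted(files)

if __name__ == "__main__":
    # One path per line, so CMake can split the output into a dependency list.
    print("\n".join(imported_py_files(sys.argv[1])))
```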


'from PYB11Generator import * ;
import ${PYB11_MODULE_NAME}MOD ;
PYB11generateModule(${PYB11_MODULE_NAME}MOD, "Spheral${PYB11_MODULE_NAME}")'

The issue we are facing is handling new Python files that get imported into our existing files during development. Once imported, changes to these new files are not identified as dependencies during the build, so the Python/C++ package is not regenerated properly, which can be a pain during development.

So the questions are:
Is there a way of creating a command/rule that generates a dynamic set of dependencies for a target during build time?
Is there a way for a target to read its dependencies from a file during build time? (I saw that add_custom_command can take a DEPFILE for Ninja; is there something like this for Make?)
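For reference, a depfile is just a Make-style fragment of the form `target: dep1 dep2 ...`. (At the time of this thread, add_custom_command's DEPFILE option was Ninja-only; CMake 3.20 and later extended it to the Makefile generators.) A hypothetical helper that emits a depfile from a scanned import list might look like this; the names and paths are illustrative only:

```python
def write_depfile(depfile_path, target, deps):
    """Write a Make-format depfile: 'target: dep1 dep2 ...'."""
    esc = lambda p: p.replace(" ", "\\ ")  # spaces must be escaped in Make syntax
    with open(depfile_path, "w") as f:
        f.write("%s: %s\n" % (esc(target), " ".join(esc(d) for d in deps)))
```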

Thank you,

This would require dyndep support from add_custom_command. It certainly isn’t implemented right now and I don’t know if there are plans at the moment. C++20 modules support will require it, but getting that plumbed through is likely higher on the priority list.

When such support would be available, you’d have to provide a command that would “scan” the work to be done and output a list of files that will be generated later. Something would have to bridge the gap and add the dependencies on the generated files though. Not sure how that’d work off-hand.

Right now, the best you can do is one of:

  • know what files will be generated at configure time (problematic if files can appear/disappear based on source file contents)
  • use a bridging “stamp” file to represent the entirety of the Python generation step (though any Python change will now trigger users of any of the files, not just the modified ones)
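The stamp-file idea can be illustrated with a small timestamp check: one file whose modification time stands in for the whole Python generation step, so the rule regenerates (and re-touches the stamp) only when some input is newer. This is a sketch; file names are illustrative, not from the actual project:

```python
import os

def needs_regen(stamp, inputs):
    """True if the generation step should rerun: stamp missing or out of date."""
    if not os.path.exists(stamp):
        return True
    stamp_mtime = os.path.getmtime(stamp)
    return any(os.path.getmtime(p) > stamp_mtime for p in inputs)

def touch(stamp):
    """Create the stamp if needed and update its modification time to now."""
    with open(stamp, "a"):
        os.utime(stamp, None)
```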

Cc: @brad.king

See CMake Issue 20286.