Is there any practical way to generate build benchmarks? I want to use them to find code that takes a long time to compile in a huge code base, so I can try to improve the code in question.
ROS catkin does it, so I suppose it is possible. But my main question is whether it is possible without externally controlling the build.
What do you mean by “externally controlling the build”? Do you mean that timing the ninja invocation (or similar) is not suitable? You can extract per-edge timings from .ninja_log by parsing it, but I don’t think other generators have comparable support (though Visual Studio may give per-target timings).
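For the ninja case, a minimal sketch of pulling per-target timings out of .ninja_log is below. It assumes the v5 tab-separated log format (start_ms, end_ms, restat_mtime, output, command hash); the helper name slowest_targets is hypothetical.

```python
from pathlib import Path


def slowest_targets(log_path, top=10):
    """Return the longest-running build edges from a .ninja_log file.

    Assumes the v5 format: tab-separated start_ms, end_ms, restat_mtime,
    output path, and command hash, with comment lines starting with '#'.
    """
    durations = {}
    for line in Path(log_path).read_text().splitlines():
        if line.startswith("#"):
            continue  # header line, e.g. "# ninja log v5"
        fields = line.split("\t")
        if len(fields) < 5:
            continue  # skip malformed lines
        start_ms, end_ms, _mtime, output, _cmdhash = fields[:5]
        # ninja appends new entries on rebuilds; keep the latest per output
        durations[output] = (int(end_ms) - int(start_ms)) / 1000.0
    return sorted(durations.items(), key=lambda kv: kv[1], reverse=True)[:top]
```

Running this against a build directory's .ninja_log gives a quick "top N slowest object files" list without touching the build itself.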
Getting useful graphs out of Chrome’s about:tracing viewer can be more difficult (it crashed on some particularly large builds I’ve tried), but the catapult project is what you should look for there.
Okay, I’ll have a look at catapult. Simply parsing ninja’s output is not a valid option, because the project I’m working with is not guaranteed to build with ninja.
Also, clang’s -ftime-trace is way more detailed than I thought.
My goal was simply to get the time the compiler took to build one of the targets, so I can print it on the command line.
If clang >= 9 is an option, ClangBuildAnalyzer produces a nice summary, listing which files, headers, functions, or templates were the most expensive.