Most Fortran compilers avoid rewriting a module's generated .mod file if the interface didn't change. Clang could do the same here: write a temporary .pcm file under a different name, then rename it over the real .pcm file only if the content changed, or some equivalent replace-if-different logic.
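The replace-if-different idea can be sketched as a small shell wrapper (purely illustrative; Clang would do this internally when emitting the .pcm):

```shell
# Hypothetical sketch of replace-if-different: write the new output to a
# temporary file, then only move it over the existing .pcm when the bytes
# actually differ, so an unchanged file keeps its old timestamp.
write_if_different() {
  content="$1" dest="$2"
  tmp="$dest.tmp.$$"
  printf '%s' "$content" > "$tmp"   # stand-in for the compiler emitting the .pcm
  if cmp -s "$tmp" "$dest" 2>/dev/null; then
    rm -f "$tmp"                    # unchanged: keep the old file and its mtime
  else
    mv "$tmp" "$dest"               # changed (or new): replace it
  fi
}
```

Because the old file's mtime survives when nothing changed, downstream build steps that depend on the .pcm would not be re-run.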
Clang issue: “Clang should not re-write .pcm for C++ module if it's unchanged” (llvm/llvm-project issue #61457)
Hi, I am also trying C++20 modules with Clang, and came across this discussion. My simple example compiles now. However, the compile_commands.json generated by CMake does not contain any information about modules.
I read the discussion above, and @brad.king says module information is not included before the dependencies are scanned. However, even after I successfully compile the project (which means the dependencies have been scanned), I still get errors from my LSP server. Any workaround? Thanks.
The files are generated during the build and look like @path/to/some/name.modmap
on the command line.
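For reference, each of those modmap files is (with the clang modmap format) just a response file of compiler flags, one per line, roughly like this (paths illustrative; exact flags vary by CMake and Clang version):

```
-fmodule-output=CMakeFiles/my_program.dir/importable.pcm
-fmodule-file=importable=CMakeFiles/my_program.dir/importable.pcm
```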
What errors?
The Xcode generator isn’t working with C++ modules. Will it work if I setup a new Xcode “toolchain” that points to the latest Clang?
I’m trying to figure out how to build and deploy to my iPad, and eventually to the App Store.
CMake Error in AppleInterop/CMakeLists.txt:
The "AppleInterop" target contains C++ module sources which are not
supported by the generator
Thanks for your reply. I will present you with some details of it.
The CMakeLists.txt is as follows:
cmake_minimum_required(VERSION 3.26.0)
project("nativetest")
set(CMAKE_EXPERIMENTAL_CXX_MODULE_CMAKE_API "2182bf5c-ef0d-489a-91da-49dbc3090d2a")
set(CMAKE_CXX_STANDARD 20)
set(CMAKE_EXPERIMENTAL_CXX_MODULE_DYNDEP 1)
string(CONCAT CMAKE_EXPERIMENTAL_CXX_SCANDEP_SOURCE
  "${CMAKE_CXX_COMPILER_CLANG_SCAN_DEPS}"
  " -format=p1689"
  " --"
  " <CMAKE_CXX_COMPILER> <DEFINES> <INCLUDES> <FLAGS>"
  " -x c++ <SOURCE> -c -o <OBJECT>"
  " -MT <DYNDEP_FILE>"
  " -MD -MF <DEP_FILE>"
  " > <DYNDEP_FILE>")
set(CMAKE_EXPERIMENTAL_CXX_MODULE_MAP_FORMAT "clang")
set(CMAKE_EXPERIMENTAL_CXX_MODULE_MAP_FLAG "@<MODULE_MAP_FILE>")
set(CMAKE_CXX_EXTENSIONS OFF)
add_executable(my_program)
target_sources(my_program
  PRIVATE
    main.cxx
  PRIVATE
    FILE_SET CXX_MODULES
    BASE_DIRS .
    FILES
      importable.cxx
)
target_compile_features(my_program PUBLIC cxx_std_20)
main.cxx, on the other hand, is:
import importable;
int main(int argc, char **argv) {
  return from_import();
}
and importable.cxx is:
export module importable;
export int from_import() {
  return 0;
}
I used VS Code with the CMake Tools and clangd plugins, and it shows the error: Module 'importable' not found clang(module_not_found).
The generated compile_commands.json is:
[
  {
    "directory": "/home/loves/test/build",
    "command": "/usr/bin/clang++-16 -O3 -DNDEBUG -std=c++20 -o CMakeFiles/my_program.dir/main.cxx.o -c /home/loves/test/main.cxx",
    "file": "/home/loves/test/main.cxx",
    "output": "CMakeFiles/my_program.dir/main.cxx.o"
  },
  {
    "directory": "/home/loves/test/build",
    "command": "/usr/bin/clang++-16 -O3 -DNDEBUG -std=c++20 -o CMakeFiles/my_program.dir/importable.cxx.o -c /home/loves/test/importable.cxx",
    "file": "/home/loves/test/importable.cxx",
    "output": "CMakeFiles/my_program.dir/importable.cxx.o"
  }
]
And apparently there’s nothing about modules.
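For comparison, module-aware compile commands for this two-file example look roughly like the following when run by hand. This is a sketch assuming Clang 16 or newer; flag spellings can differ across versions, and the guard lines at the top exist only so the sketch bails out quietly on machines without a suitable clang++.

```shell
# Guard: this sketch needs clang++ 16 or newer; bail out quietly otherwise.
command -v clang++ >/dev/null 2>&1 || exit 0
major=$(clang++ --version | sed -n 's/.*clang version \([0-9][0-9]*\).*/\1/p' | head -n 1)
{ [ -n "$major" ] && [ "$major" -ge 16 ]; } || exit 0

# Recreate the two sources from the thread.
cat > importable.cxx <<'EOF'
export module importable;
export int from_import() { return 0; }
EOF
cat > main.cxx <<'EOF'
import importable;
int main() { return from_import(); }
EOF

# 1. Precompile the module interface to a .pcm (-x c++-module because the
#    file does not use a module-specific extension such as .cppm).
clang++ -std=c++20 -x c++-module importable.cxx --precompile -o importable.pcm
# 2. Compile the .pcm to an object file.
clang++ -std=c++20 -c importable.pcm -o importable.o
# 3. Compile the importer, telling it where the named module's .pcm lives.
clang++ -std=c++20 -fmodule-file=importable=importable.pcm -c main.cxx -o main.o
# 4. Link.
clang++ main.o importable.o -o my_program
```

The -fmodule-file=importable=importable.pcm flag in step 3 is exactly the kind of information that a module-unaware compile_commands.json is missing.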
No. There is no known mechanism in Xcode to express build-time discovered dependencies; it requires xcodebuild features that cannot be emulated.
This is issue 24618.
Thanks. I will keep an eye on this issue.
Using Clang, which commands generate the .pcm files? Is it commands like this?
[2/6] cmake -E cmake_ninja_dyndep --tdi=MyUI/CMakeFiles/MyUI.dir/CXXDependInfo.json
--lang=CXX --modmapfmt=clang --dd=MyUI/CMakeFiles/MyUI.dir/CXX.dd
@MyUI/CMakeFiles/MyUI.dir/CXX.dd.rsp
That command took a long time. Later, straight compile commands with clang++ also take a long time, but they seem to be only doing the usual .cc → .o compilation.
I’m trying to improve my compile times, because they’re out of control right now: 30-50 seconds for some files that don’t look like they should be that bad. I’m wondering if I’m abusing modules in some way that generates too much .pcm data.
Also, are header units supported? I want to try import <Eigen/Dense> instead of #include <Eigen/Dense> to see if it helps.
If I just plop it after the module declaration I get an error:
View.cc:20:8: fatal error: 'Eigen.Dense' file not found
No, header units are not yet supported. cmake_ninja_dyndep is the collation phase and should be pretty quick. How many files/modules are you working with, and what is slow?
I have 330 C++ files in my entire “monorepo”, controlled by a root CMakeLists.txt file, but most of them are not being compiled for this target. The number of modules: I’m guessing around 20; including partitions, around 100. I could get exact numbers if you’d like more than a ballpark. 50K lines of code more or less – not even close to a “giant” C++ codebase.
Watching the build, that command appeared slow, but it could be the way things print out. I used -ftime-trace and looked at the one C++ file that seemed to take a long time to compile, but I don’t know how to see what the problem is. It seemed like all the colored “time slices” were long. It spent 25 seconds in “WriteAST”, with “detail” showing a path to its .pcm file. On disk that .pcm is 50 MB. All the .pcm files in my CMake build dir total 1.5 GB, which seemed excessive, but I have no idea.
I think things started going south when I started including Eigen, so maybe I’m hitting classic C++ problems with complex headers, and this is not related to modules.
But… I wondered if I could be making things worse with modules, because many of the libraries I’m using are not modules yet, so I have them #included in the global module fragment at the top of the file. If all of those headers are becoming massive .pcm files, then it could be worse than just using old headers. That’s why I was wondering whether import <Eigen/Dense> would mean that one .pcm is created for all of that, rather than it being redone for every global module fragment (if that is indeed what is happening).
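To get actual numbers rather than a guess, a generic shell sketch that lists the largest .pcm files under the build directory (run from the build dir):

```shell
# List the ten largest .pcm files under the current directory, biggest first,
# to see which module interfaces (or included headers) dominate BMI size.
find . -name '*.pcm' -exec du -k {} + | sort -rn | head -n 10
```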
The .pcm files are created at the same time as the .o file for the TU. The cmake_ninja_dyndep call is what looks at the results of scanning for module usages and providers, creates the relevant response files telling each TU compilation where to find or place .pcm files, and writes out the information ninja needs to know what order to run the compiles in. It also writes out some CMake metadata for installation and exporting of module information.
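For the curious, the per-TU scanner output that gets collated is P1689-format JSON; for a file providing a module it looks roughly like this (abbreviated and illustrative; exact fields depend on the format revision):

```json
{
  "version": 1,
  "revision": 0,
  "rules": [
    {
      "primary-output": "CMakeFiles/my_program.dir/importable.cxx.o",
      "provides": [
        { "logical-name": "importable", "is-interface": true }
      ],
      "requires": []
    }
  ]
}
```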
CMake MR 5926 added support for .ixx and .cppm. Could CMake use these extensions (.cppm, .ixx) to automatically add such files to the CXX_MODULES file set?
A simple

add_executable(myexe main.cpp module.cppm)

would be much more ergonomic than

add_executable(myexe main.cpp)
target_sources(myexe PUBLIC FILE_SET CXX_MODULES FILES
  module.cppm
)
Only if the target then becomes un-exportable (or, if it is exportable, it has to treat all such files as PRIVATE and therefore not importable from other targets, even in the same project). The issue is that, without file sets, we don’t have the information needed to install the modules and provide them to consuming projects.
How is that different from the long form? Where/how is the missing information added?
Namely, BASE_DIRS is important (to avoid colliding files when using only the basename), but knowing where and how the files will be installed, so that the target’s target_sources can be written into the export configuration, is also tied to the install(FILE_SET) arguments.
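As an illustration of where that information ends up, the long form feeds directly into the install rules; something like the following (target names, paths, and destinations here are hypothetical):

```cmake
# Hypothetical example: the file-set declaration carries the BASE_DIRS and
# the install(FILE_SET) destination that the exported target file needs.
add_library(mylib)
target_sources(mylib
  PUBLIC
    FILE_SET CXX_MODULES
    BASE_DIRS src
    FILES
      src/io/reader.cppm
      src/net/reader.cppm   # same basename; BASE_DIRS disambiguates them
)
install(TARGETS mylib
  EXPORT mylib-targets
  FILE_SET CXX_MODULES DESTINATION lib/cxx/mylib
)
install(EXPORT mylib-targets DESTINATION lib/cmake/mylib)
```

None of that could be inferred from a bare source listing in add_executable().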
There was a new GCC release today and there’s still no support for modules [it still says it doesn’t recognise the -fdep option, and so on]. It seems one still has to apply the patch mentioned in import CMake. The nice thing is: the link it sends us to shows “page not found”.
My question is: what am I missing?
MSVC got really far ahead on this: two lines and there we go.
Thank you.
The GCC patch is still on-list. I haven’t had time to fix the test suite that got broken with some changes made during a review cycle.