
[Runtimes] Merge 'compile_commands.json' files from runtimes build #116303


Merged
3 commits merged into llvm:main on Nov 28, 2024

Conversation

@jhuber6 (Contributor) commented Nov 15, 2024

Summary:
When building a project in a runtime mode, the compilation database is a
separate CMake invocation. So its compile_commands.json file will be
placed elsewhere in the runtimes/runtime-bins directory. This is
somewhat annoying for ongoing development when a runtimes build is
necessary. This patch adds some CMake magic to merge the two files.
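The core of such a merge is simple: a compilation database is a flat JSON array of command entries, so merging is just concatenation. A minimal sketch in Python (the function name and CLI here are invented for illustration; this is not the actual script from the patch):

```python
import json
import sys


def merge_compile_commands(output_path, input_paths):
    """Concatenate several compile_commands.json files into one.

    A compilation database is a flat JSON array of command objects,
    so merging is simply list concatenation.
    """
    merged = []
    for path in input_paths:
        with open(path) as f:
            merged.extend(json.load(f))
    with open(output_path, "w") as f:
        json.dump(merged, f, indent=2)


if __name__ == "__main__" and len(sys.argv) > 2:
    # Usage: merge_compile_commands.py <output> <input>...
    merge_compile_commands(sys.argv[1], sys.argv[2:])
```

A real script also has to cope with missing or partially written inputs, since the runtimes database only appears after the first build.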


github-actions bot commented Nov 15, 2024

✅ With the latest revision this PR passed the Python code formatter.

@jhuber6 removed the request for review from a team on November 15, 2024
@petrhosek (Member)

This slightly changes the semantics of compile_commands.json generation. Normally, compile_commands.json is generated by CMake during the configuration stage which means that you don't need to run build at all; after this change, we'll generate the top-level compile_commands.json every time we (re-)run CMake, but that file will be replaced by the merged one during the build.
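The build-time replacement described here can be wired up in CMake roughly as follows. This is a hedged sketch under assumptions: the script path `utils/merge-json.py`, the target name, and the exact directory layout are invented, not the code from this patch.

```cmake
# Hypothetical sketch: re-run the merge whenever a per-runtime
# compile_commands.json changes, replacing the top-level file that
# the CMake configure step generated.
set(RUNTIMES_COMPILE_COMMANDS
    ${CMAKE_BINARY_DIR}/runtimes/runtime-bins/compile_commands.json)

add_custom_command(
  OUTPUT ${CMAKE_BINARY_DIR}/compile_commands.json.merged
  COMMAND ${Python3_EXECUTABLE}
          ${CMAKE_SOURCE_DIR}/utils/merge-json.py # invented path
          ${CMAKE_BINARY_DIR}/compile_commands.json
          ${RUNTIMES_COMPILE_COMMANDS}
  DEPENDS ${RUNTIMES_COMPILE_COMMANDS}
  COMMENT "Merging runtimes compile_commands.json into the top-level one")

add_custom_target(merge-compile-commands ALL
  DEPENDS ${CMAKE_BINARY_DIR}/compile_commands.json.merged)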

@jhuber6 (Contributor, Author) commented Nov 15, 2024

This slightly changes the semantics of compile_commands.json generation. Normally, compile_commands.json is generated by CMake during the configuration stage which means that you don't need to run build at all; after this change, we'll generate the top-level compile_commands.json every time we (re-)run CMake, but that file will be replaced by the merged one during the build.

Well, you'll still get the normal compile_commands.json from a configure, but since the runtimes directories are only triggered during the build, the runtime entries are merged in at that point.

@llvmbot added the "cmake: Build system in general and CMake in particular" label on Nov 16, 2024
@jhuber6 (Contributor, Author) commented Nov 18, 2024

I fixed the issue where it would error out if you manually deleted the dependency. It now properly re-merges whenever the runtime compile commands change. I believe this is very useful for making runtimes builds less painful, so hopefully there aren't any additional issues.

@jhuber6 (Contributor, Author) commented Nov 26, 2024

Ping

@jhuber6 jhuber6 requested a review from AaronBallman November 27, 2024 04:19
@AaronBallman (Collaborator) left a comment


I don't feel particularly qualified to accept or reject the changes, but they seem plausible to me FWIW.

@jhuber6 (Contributor, Author) commented Nov 27, 2024

I don't feel particularly qualified to accept or reject the changes, but they seem plausible to me FWIW.

No problem, I think this is fairly innocuous as-is, unless people are really concerned about a few extra MB of JSON. I've personally gotten really tired of doing this manually every time CMake reconfigured.

@shiltian (Contributor)

Just want to confirm that, when the top-level configure runs again, the top-level compile_commands.json will be overwritten with only the top-level project's commands, right? The commands added by the merge will simply be gone. If that is the case, I'm okay with the change; if not, I don't think this change is valid.

@jhuber6 (Contributor, Author) commented Nov 27, 2024

Just want to confirm that, when the top-level configure runs again, the top-level compile_commands.json will be overwritten with only the top-level project's commands, right? The commands added by the merge will simply be gone. If that is the case, I'm okay with the change; if not, I don't think this change is valid.

Yes, if only the project's configure stage runs. If the runtimes configure stage runs afterwards, it will trigger that file to be updated, which will then make it stale and trigger the merge again.

@shiltian (Contributor)

That is fine. We don't want to change the semantics of the top-level configure, and we also don't want any stale commands in it if only the top-level configure runs.

@petrhosek (Member)

Have you measured the performance on larger builds? For example, in the Fuchsia toolchain build that I normally use, I have 38 compile_commands.json files; the top-level one is 12 MB and the individual runtime ones are around 1-2 MB. Python is not the fastest language out there, and I just want to make sure this doesn't become a bottleneck for larger builds.

@jhuber6 (Contributor, Author) commented Nov 27, 2024

Have you measured the performance on larger builds? For example, in the Fuchsia toolchain build that I normally use, I have 38 compile_commands.json files; the top-level one is 12 MB and the individual runtime ones are around 1-2 MB. Python is not the fastest language out there, and I just want to make sure this doesn't become a bottleneck for larger builds.

My build has three, but I'm on a beefier server. I'd assume someone doing a standard release build would turn off the compile commands if they don't want the overhead? Would it be possible for you to try it locally? It'd be hard for me to test Fuchsia's build. I didn't notice any considerable slowdown, and it only happens once per reconfigure, so rebuilds are fast for the most part.

@jhuber6 (Contributor, Author) commented Nov 27, 2024

So, on my machine it takes about 0.3 seconds to merge an 8 MB compile_commands.json into a 23 MB one. That will likely run in parallel with other work, and the time is comparable to some of the slower feature tests in CMake, so it might not be too much of an issue.

@jhuber6 merged commit 054f914 into llvm:main on Nov 28, 2024
62 checks passed
@llvm-ci (Collaborator) commented Nov 28, 2024

LLVM Buildbot has detected a new failure on builder libc-aarch64-ubuntu-fullbuild-dbg running on libc-aarch64-ubuntu while building llvm,runtimes at step 4 "annotate".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/71/builds/11493

Here is the relevant piece of the build log for reference:
Step 4 (annotate) failure: 'python ../llvm-zorg/zorg/buildbot/builders/annotated/libc-linux.py ...' (failure)
...
[ RUN      ] LlvmLibcStrtouint64Test.MessyBaseSixteenDecode
[       OK ] LlvmLibcStrtouint64Test.MessyBaseSixteenDecode (2 us)
[ RUN      ] LlvmLibcStrtouint64Test.AutomaticBaseSelection
[       OK ] LlvmLibcStrtouint64Test.AutomaticBaseSelection (3 us)
Ran 14 tests.  PASS: 14  FAIL: 0
[883/884] Running unit test libc.test.src.time.nanosleep_test.__unit__
[==========] Running 1 test from 1 test suite.
[ RUN      ] LlvmLibcNanosleep.SmokeTest
[       OK ] LlvmLibcNanosleep.SmokeTest (35 us)
Ran 1 tests.  PASS: 1  FAIL: 0
command timed out: 1200 seconds without output running [b'python', b'../llvm-zorg/zorg/buildbot/builders/annotated/libc-linux.py', b'--debug'], attempting to kill
process killed by signal 9
program finished with exit code -1
elapsedTime=1243.582528
Step 8 (libc-unit-tests) failure: libc-unit-tests (failure)
...
[ RUN      ] LlvmLibcStrtouint32Test.DecodeInOtherBases
[       OK ] LlvmLibcStrtouint32Test.DecodeInOtherBases (318 ms)
[ RUN      ] LlvmLibcStrtouint32Test.CleanBaseSixteenDecode
[       OK ] LlvmLibcStrtouint32Test.CleanBaseSixteenDecode (8 us)
[ RUN      ] LlvmLibcStrtouint32Test.MessyBaseSixteenDecode
[       OK ] LlvmLibcStrtouint32Test.MessyBaseSixteenDecode (2 us)
[ RUN      ] LlvmLibcStrtouint32Test.AutomaticBaseSelection
[       OK ] LlvmLibcStrtouint32Test.AutomaticBaseSelection (3 us)
Ran 14 tests.  PASS: 14  FAIL: 0
[882/884] Running unit test libc.test.src.stdlib.strtoint64_test.__unit__
[==========] Running 14 tests from 1 test suite.
[ RUN      ] LlvmLibcStrtoint64Test.InvalidBase
[       OK ] LlvmLibcStrtoint64Test.InvalidBase (3 us)
[ RUN      ] LlvmLibcStrtoint64Test.CleanBaseTenDecode
[       OK ] LlvmLibcStrtoint64Test.CleanBaseTenDecode (7 us)
[ RUN      ] LlvmLibcStrtoint64Test.MessyBaseTenDecode
[       OK ] LlvmLibcStrtoint64Test.MessyBaseTenDecode (5 us)
[ RUN      ] LlvmLibcStrtoint64Test.DecodeInOtherBases
[       OK ] LlvmLibcStrtoint64Test.DecodeInOtherBases (307 ms)
[ RUN      ] LlvmLibcStrtoint64Test.CleanBaseSixteenDecode
[       OK ] LlvmLibcStrtoint64Test.CleanBaseSixteenDecode (11 us)
[ RUN      ] LlvmLibcStrtoint64Test.MessyBaseSixteenDecode
[       OK ] LlvmLibcStrtoint64Test.MessyBaseSixteenDecode (2 us)
[ RUN      ] LlvmLibcStrtoint64Test.AutomaticBaseSelection
[       OK ] LlvmLibcStrtoint64Test.AutomaticBaseSelection (4 us)
[ RUN      ] LlvmLibcStrtouint64Test.InvalidBase
[       OK ] LlvmLibcStrtouint64Test.InvalidBase (1 us)
[ RUN      ] LlvmLibcStrtouint64Test.CleanBaseTenDecode
[       OK ] LlvmLibcStrtouint64Test.CleanBaseTenDecode (6 us)
[ RUN      ] LlvmLibcStrtouint64Test.MessyBaseTenDecode
[       OK ] LlvmLibcStrtouint64Test.MessyBaseTenDecode (5 us)
[ RUN      ] LlvmLibcStrtouint64Test.DecodeInOtherBases
[       OK ] LlvmLibcStrtouint64Test.DecodeInOtherBases (306 ms)
[ RUN      ] LlvmLibcStrtouint64Test.CleanBaseSixteenDecode
[       OK ] LlvmLibcStrtouint64Test.CleanBaseSixteenDecode (8 us)
[ RUN      ] LlvmLibcStrtouint64Test.MessyBaseSixteenDecode
[       OK ] LlvmLibcStrtouint64Test.MessyBaseSixteenDecode (2 us)
[ RUN      ] LlvmLibcStrtouint64Test.AutomaticBaseSelection
[       OK ] LlvmLibcStrtouint64Test.AutomaticBaseSelection (3 us)
Ran 14 tests.  PASS: 14  FAIL: 0
[883/884] Running unit test libc.test.src.time.nanosleep_test.__unit__
[==========] Running 1 test from 1 test suite.
[ RUN      ] LlvmLibcNanosleep.SmokeTest
[       OK ] LlvmLibcNanosleep.SmokeTest (35 us)
Ran 1 tests.  PASS: 1  FAIL: 0

command timed out: 1200 seconds without output running [b'python', b'../llvm-zorg/zorg/buildbot/builders/annotated/libc-linux.py', b'--debug'], attempting to kill
process killed by signal 9
program finished with exit code -1
elapsedTime=1243.582528

@llvm-ci (Collaborator) commented Nov 28, 2024

LLVM Buildbot has detected a new failure on builder lld-x86_64-win running on as-worker-93 while building llvm,runtimes at step 7 "test-build-unified-tree-check-all".

Full details are available at: https://lab.llvm.org/buildbot/#/builders/146/builds/1707

Here is the relevant piece of the build log for reference:
Step 7 (test-build-unified-tree-check-all) failure: test (failure)
******************** TEST 'LLVM-Unit :: Support/./SupportTests.exe/37/87' FAILED ********************
Script(shard):
--
GTEST_OUTPUT=json:C:\a\lld-x86_64-win\build\unittests\Support\.\SupportTests.exe-LLVM-Unit-18764-37-87.json GTEST_SHUFFLE=0 GTEST_TOTAL_SHARDS=87 GTEST_SHARD_INDEX=37 C:\a\lld-x86_64-win\build\unittests\Support\.\SupportTests.exe
--

Script:
--
C:\a\lld-x86_64-win\build\unittests\Support\.\SupportTests.exe --gtest_filter=ProgramEnvTest.CreateProcessLongPath
--
C:\a\lld-x86_64-win\llvm-project\llvm\unittests\Support\ProgramTest.cpp(160): error: Expected equality of these values:
  0
  RC
    Which is: -2

C:\a\lld-x86_64-win\llvm-project\llvm\unittests\Support\ProgramTest.cpp(163): error: fs::remove(Twine(LongPath)): did not return errc::success.
error number: 13
error message: permission denied



C:\a\lld-x86_64-win\llvm-project\llvm\unittests\Support\ProgramTest.cpp:160
Expected equality of these values:
  0
  RC
    Which is: -2

C:\a\lld-x86_64-win\llvm-project\llvm\unittests\Support\ProgramTest.cpp:163
fs::remove(Twine(LongPath)): did not return errc::success.
error number: 13
error message: permission denied




********************


@philnik777 (Contributor)

@jhuber6 This seems to break the runtimes build. With `CC=clang CXX=clang++ cmake -G Ninja -DCMAKE_EXPORT_COMPILE_COMMANDS=On -DCMAKE_BUILD_TYPE=Debug -DLLVM_ENABLE_RUNTIMES="libcxx;libcxxabi;libunwind" ../runtimes/` I get:

```
CMake Error at CMakeLists.txt:358 (add_custom_command):
  Attempt to add a custom rule to output

    <...>/build/compile_commands.json.rule

  which already has a custom rule.
```

@jhuber6 (Contributor, Author) commented Nov 28, 2024

@jhuber6 This seems to break the runtimes build. With `CC=clang CXX=clang++ cmake -G Ninja -DCMAKE_EXPORT_COMPILE_COMMANDS=On -DCMAKE_BUILD_TYPE=Debug -DLLVM_ENABLE_RUNTIMES="libcxx;libcxxabi;libunwind" ../runtimes/` I get:

```
CMake Error at CMakeLists.txt:358 (add_custom_command):
  Attempt to add a custom rule to output

    <...>/build/compile_commands.json.rule

  which already has a custom rule.
```

Ah, so this seems to come from invoking the runtimes build directly? In that case we probably don't want this to trigger; I wonder if there's a way to detect that. Feel free to revert for now.

ldionne added a commit that referenced this pull request Nov 28, 2024
…build (#116303)"

This reverts commit 054f914, which was
found to break the runtimes build.
jhuber6 added a commit that referenced this pull request Nov 30, 2024
…build (#116303)

Summary:
When building a project in a runtime mode, the compilation database is a
separate CMake invocation. So its `compile_commands.json` file will be
placed elsewhere in the `runtimes/runtime-bins` directory. This is
somewhat annoying for ongoing development when a runtimes build is
necessary. This patch adds some CMake magic to merge the two files.

Fixed issue w/ standalone runtimes build by checking if the LLVM src and
CMake src are the same.
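That guard might look roughly like the following. This is a hedged sketch under assumptions: the actual patch may use different variables, and `LLVM_MAIN_SRC_DIR` is only a plausible candidate for the comparison it describes.

```cmake
# Hypothetical sketch: skip the merge machinery when the runtimes are
# configured standalone (cmake ../runtimes/), where the top-level CMake
# source directory is not the LLVM source directory.
if(CMAKE_EXPORT_COMPILE_COMMANDS AND
   CMAKE_SOURCE_DIR STREQUAL LLVM_MAIN_SRC_DIR)
  # ... register the add_custom_command that merges the databases ...
endif()
```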
```python
            data = json.load(f)
            merged_data.extend(data)
    except (IOError, json.JSONDecodeError) as e:
        print("Failed to parse {json_file}: {e}", file=sys.stderr)
```
@Meinersbur (Member) commented Dec 17, 2024


Suggested change:

```diff
-print("Failed to parse {json_file}: {e}", file=sys.stderr)
+print(f"Failed to parse {json_file}: {e}", file=sys.stderr)
```

@jhuber6 (Contributor, Author)


Honestly, I think this print shouldn't exist, so people don't get confused.
