indirect calls are always used (LLVM-321) #90

Open
yamt opened this issue Feb 6, 2024 · 10 comments

yamt commented Feb 6, 2024

Currently, -mlongcalls is effectively always on.
I had to patch LLVM for my use case; cf. yamt@3f04244.
It would be nicer to have a way to use direct calls.
Ideally, the backend could automatically choose direct calls when the call distance is not too long, I guess.

related:
bytecodealliance/wasm-micro-runtime#3141
bytecodealliance/wasm-micro-runtime#3142
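
For context, a minimal sketch of what the two lowerings look like at a call site; the assembly in the comments is illustrative (windowed ABI, a8 as the scratch register), not actual compiler output:

```c
/*
 * Illustrative only: the two call lowerings on Xtensa (windowed ABI assumed).
 *
 *   direct:    call8   helper            ; PC-relative, reaches roughly ±512 KB
 *   longcall:  l32r    a8, .Lhelper_ptr  ; load the absolute address from a literal
 *              callx8  a8                ; register-indirect call, works at any distance
 *
 * The complaint in this issue is that the backend currently always emits
 * the second form, as if -mlongcalls were in effect.
 */
extern void helper(void);

void caller(void) {
    helper();
}
```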

github-actions bot changed the title from "indirect calls are always used" to "indirect calls are always used (LLVM-321)" on Feb 6, 2024

yamt (Author) commented Feb 16, 2024

For my use case, it's OK to manually add attributes at the IR level, e.g. xtensa-short-call.
MIPS has something along those lines: https://clang.llvm.org/docs/AttributeReference.html#short-call-near
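
For reference, the MIPS attributes behind that link are applied to function declarations like this (the function names are just placeholders); an Xtensa analogue would presumably look the same, with the attribute name still to be decided:

```c
/* MIPS-only attributes documented by clang (see the link above):
 * short_call/near allow a direct jal, while long_call/far force an
 * indirect call sequence that works at any distance. */
void nearby_handler(int code) __attribute__((short_call));
void faraway_handler(void)    __attribute__((long_call));
```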

gerekon (Collaborator) commented Feb 16, 2024

@yamt Thanks for pointing this out. We are working on direct calls support using callN instruction.

cc @sstefan1

sstefan1 (Collaborator) commented

> For my use case, it's OK to manually add attributes at the IR level, e.g. xtensa-short-call. MIPS has something along those lines: https://clang.llvm.org/docs/AttributeReference.html#short-call-near

We're working on implementing support for direct calls, the same way GCC does it. That is a bit more complex. In the meantime, we can implement short-call attributes for Xtensa.
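
For comparison, this is roughly how that GCC flow is driven from the command line; -mlongcalls/-mno-longcalls are the documented GCC options for Xtensa, while the toolchain prefix and the relaxation note are just a sketch of that setup, not a statement about the clang driver:

```c
/*
 * Build the same translation unit two ways with the Xtensa GCC toolchain:
 *
 *   xtensa-esp32-elf-gcc -O2 -mno-longcalls -c calls.c   # direct CALL8 where it fits
 *   xtensa-esp32-elf-gcc -O2 -mlongcalls    -c calls.c   # assembler rewrites calls it
 *                                                        # cannot prove in range to
 *                                                        # L32R + CALLX8
 *
 * The GNU linker can then relax such an indirect call back to a direct one
 * when the final layout puts the target in range, which appears to be the
 * flow "the same way gcc does it" refers to.
 */
extern int compute(int x);

int wrapper(int x) {
    return compute(x);
}
```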

dongsheng28849455 commented

Hi @sstefan1, can I ask about the progress of this feature?

sstefan1 (Collaborator) commented

> Hi @sstefan1, can I ask about the progress of this feature?

There hasn't been any real progress since that comment. I'll write here when I have some updates.

dongsheng28849455 commented

> There hasn't been any real progress since that comment. I'll write here when I have some updates.

Is that fixed by https://github.com/espressif/llvm-project/releases/tag/esp-17.0.1_20240408?

yamt (Author) commented May 13, 2024

> Is that fixed by https://github.com/espressif/llvm-project/releases/tag/esp-17.0.1_20240408?

The short-call attribute is already in the releases, and it seems to work as far as I have tested.
The "ideal" solution (i.e. automatically choosing short or long calls) is not there yet.
I guess he was asking about the latter.
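
For anyone who wants to try it: a sketch of how such an attribute would be applied at the source level. The spelling `short_call` below is an assumption (borrowed from the MIPS attribute and from the xtensa-short-call IR attribute mentioned earlier); check the Espressif release notes for the authoritative name, and verify the result with `llvm-objdump -d`.

```c
/* Assumed attribute spelling; the function names are placeholders.
 * The intent is that calls to the annotated function lower to a direct
 * CALL8 instead of L32R + CALLX8, while unannotated functions keep the
 * default longcall lowering. */
__attribute__((short_call)) void hot_helper(void);

void hot_path(void) {
    hot_helper();
}
```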

gerekon (Collaborator) commented May 14, 2024

> The short-call attribute is already in the releases, and it seems to work as far as I have tested. The "ideal" solution (i.e. automatically choosing short or long calls) is not there yet. I guess he was asking about the latter.

Yes, exactly. Automatic use of short/long calls is in progress.
Just for information: we do not expect it to be ready in the first half of this year.

andyross commented

Just bumped into this myself. Is there no support for at least changing the default, e.g. with -mno-longcalls? The range of a direct CALLn instruction is ±512 KB, which is quite a bit larger than the .text segment of many (if not most) firmware images in the wild. Right now the conservative default is a pretty severe performance regression, and it's needless for a lot of applications.
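
To make the ±512 KB figure concrete: the direct CALL0/CALL4/CALL8/CALL12 instructions encode an 18-bit signed offset scaled by 4 bytes. A rough per-call-site range check, of the kind an automatic short/long selection would need, could look like this (the helper is hypothetical and ignores the exact `(PC & ~3) + 4` base the hardware uses):

```c
#include <stdbool.h>
#include <stdint.h>

/* CALLn offset field: 18-bit signed, in units of 4 bytes. */
#define CALLN_MIN_DISP  (-(1 << 17) * 4)        /* -524288 bytes */
#define CALLN_MAX_DISP  (((1 << 17) - 1) * 4)   /* +524284 bytes */

/* Hypothetical helper: could a direct CALLn reach `callee` from `call_site`? */
static bool calln_in_range(uintptr_t call_site, uintptr_t callee) {
    intptr_t disp = (intptr_t)(callee - call_site);
    return disp >= CALLN_MIN_DISP && disp <= CALLN_MAX_DISP;
}
```

Any image whose .text stays within that window never needs the longcall sequence, which is the point about typical firmware sizes above.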

andreisfr (Collaborator) commented

@andyross, thank you for the question. We are currently working on implementing CALLX support in LLD.
