
Implement asynchronous evaluation of scripts #3044

Merged
merged 4 commits into main from async-run
Oct 20, 2023

Conversation

@jedel1043 jedel1043 (Member) commented Jun 16, 2023

This Pull Request implements an API to asynchronously run scripts.
I had to do some refactoring in order to share code between run and run_async_with_budget, but I ran the QuickJS benchmark and everything looks good!

However, our engine has a big limitation right now: we cannot track the executed instructions of called functions, because those internally use run. For this reason, this PR only tracks instructions executed in the root script; tracking for called functions can be implemented in the future. Solved by #3295
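
To make the budget idea concrete, here is a minimal, self-contained sketch of how a budgeted asynchronous run could look; the `Vm` and `BudgetedRun` types below are hypothetical stand-ins, not Boa's actual `run_async_with_budget` implementation. The future executes at most `budget` instructions per poll, then wakes itself and yields control back to the async executor.

```rust
use std::future::Future;
use std::pin::Pin;
use std::task::{Context as TaskContext, Poll};

/// Hypothetical stand-in for the VM's bytecode loop.
struct Vm {
    remaining_instructions: u32,
}

impl Vm {
    /// Executes one instruction; returns `false` once the script is done.
    fn step(&mut self) -> bool {
        if self.remaining_instructions == 0 {
            return false;
        }
        self.remaining_instructions -= 1;
        true
    }
}

/// Runs at most `budget` instructions per poll, then yields to the executor
/// so other tasks (timers, I/O, other scripts) can make progress.
struct BudgetedRun {
    vm: Vm,
    budget: u32,
}

impl Future for BudgetedRun {
    type Output = ();

    fn poll(self: Pin<&mut Self>, cx: &mut TaskContext<'_>) -> Poll<()> {
        let this = self.get_mut(); // fine: `BudgetedRun` is `Unpin`
        for _ in 0..this.budget {
            if !this.vm.step() {
                return Poll::Ready(()); // script finished
            }
        }
        // Budget exhausted: reschedule ourselves and suspend.
        cx.waker().wake_by_ref();
        Poll::Pending
    }
}
```

Awaiting `BudgetedRun { vm: Vm { remaining_instructions: 10_000 }, budget: 256 }` on any executor then interleaves script execution with whatever other futures are pending.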

Some context

We have used this pattern of freely calling run inside several functions because it's easy to reason about, and Context owns the current VM state. However, it is really unusual to have VM state inside a shared Context, and using run this way causes problems that we've already seen but ignored, like duplicate prints of functions' bytecode when using tracing.

If we can pull off the separation of the VM from the Context, we could even make the context reference we pass to every function an immutable one (&Context<'_>), which will unblock a LOT of patterns that we've been largely working around.

Having said that, I have some ideas on how to make this work. This will probably involve changing the return value of our functions and some generator shenanigans, but I found an interesting crate that implements generators using async/await, which makes them available on stable Rust (https://github.com/whatisaphone/genawaiter).
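
For reference, this is roughly the shape of generator that genawaiter enables on stable Rust, using its documented `rc::Gen` API (a hedged illustration based on the crate's README, not code from this PR): the producer is a plain `async fn` that suspends at every yield, which is exactly the kind of resumable function we would need here.

```rust
use genawaiter::rc::{Co, Gen};

// The producer suspends at every `yield_` and resumes only when the caller
// asks for the next value, all on stable Rust via async/await.
async fn odd_numbers(co: Co<i32>) {
    let mut n = 1;
    while n < 10 {
        co.yield_(n).await;
        n += 2;
    }
}

fn main() {
    // With a `()` resume argument the generator can be iterated directly.
    for n in Gen::new(odd_numbers) {
        println!("{n}");
    }
}
```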

I'm open to any other ideas you have! The gist of the problem is: "How to bubble up JS function calls to the original Context::run call".

Anyways, sorry for the info dump.

@jedel1043 jedel1043 added the enhancement (New feature or request) and execution (Issues or PRs related to code execution) labels Jun 16, 2023
@jedel1043 jedel1043 added this to the v0.17.0 milestone Jun 16, 2023
@jedel1043 jedel1043 requested a review from a team June 16, 2023 03:37
github-actions bot commented Jun 16, 2023

Test262 conformance changes

Test result | main count | PR count | difference
Total       | 95,609     | 95,609   | 0
Passed      | 75,690     | 75,690   | 0
Ignored     | 19,160     | 19,160   | 0
Failed      | 759        | 759      | 0
Panics      | 0          | 0        | 0
Conformance | 79.17%     | 79.17%   | 0.00%

codecov bot commented Jun 16, 2023

Codecov Report

Attention: 68 lines in your changes are missing coverage. Please review.

Comparison is base (1d66836) 45.78% compared to head (3236b28) 45.73%.
Report is 1 commit behind head on main.

Additional details and impacted files
@@            Coverage Diff             @@
##             main    #3044      +/-   ##
==========================================
- Coverage   45.78%   45.73%   -0.06%     
==========================================
  Files         483      483              
  Lines       49349    49427      +78     
==========================================
+ Hits        22594    22603       +9     
- Misses      26755    26824      +69     
Files Coverage Δ
boa_engine/src/vm/opcode/await/mod.rs 73.75% <ø> (ø)
boa_engine/src/vm/opcode/binary_ops/logical.rs 100.00% <ø> (ø)
...a_engine/src/vm/opcode/binary_ops/macro_defined.rs 85.71% <ø> (ø)
boa_engine/src/vm/opcode/binary_ops/mod.rs 57.37% <ø> (ø)
boa_engine/src/vm/opcode/call/mod.rs 21.73% <ø> (ø)
boa_engine/src/vm/opcode/concat/mod.rs 70.00% <ø> (ø)
boa_engine/src/vm/opcode/control_flow/jump.rs 97.56% <ø> (ø)
boa_engine/src/vm/opcode/control_flow/return.rs 68.42% <ø> (ø)
boa_engine/src/vm/opcode/control_flow/throw.rs 41.02% <ø> (ø)
boa_engine/src/vm/opcode/copy/mod.rs 46.66% <ø> (ø)
... and 55 more


@raskad raskad (Member) commented Jun 17, 2023

@HalidOdat you mentioned in #2904 that you wanted to refactor how we call functions to not recurse on the Rust side. That would probably impact the design of this feature, right?

@jedel1043 jedel1043 (Member, Author) commented Jun 17, 2023

Ah, seems like @HalidOdat and I arrived at the same problem from two different but related features. We could probably solve both problems using the same solution then :)

@HalidOdat HalidOdat (Member) commented Jun 18, 2023

I'm open to any other ideas you have! The gist of the problem is: "How to bubble up JS function calls to the original Context::run call?".

The solution I thought of was to have only one run. We'd have a function that is called before every run (prepare_vm_call or something like this) that pushes the call frame and anything else that is needed; then we call run, which just runs instructions in a loop and never calls the prepare function itself.

The Opcode::Call instructions would only prepare the VM and continue with the next instruction; because the frame has been pushed, the loop automatically fetches the next instruction from the called function.

How do we know whether we should return or keep executing until we run out of call frames? For this we can add a flag to the call frame that tells run to return.

This would greatly simplify our calling logic! We wouldn't have to clone the passed arguments, reverse them, and push them back again; we could just use the ones that were already pushed by the opcodes. We also wouldn't need to create separate value and environment stacks (for ordinary functions, at least).

Note: this stops recursion (and the resulting stack overflows) from JavaScript. You can still recurse in Rust by recursively calling a function that calls object.call(.., context).
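
A rough sketch of the single-`run` design described above (the names `CallFrame`, `prepare_vm_call`, and `exit_on_return` are illustrative, not Boa's actual internals): `Call` only pushes a frame and lets the same loop keep going, while a per-frame flag tells `run` when to hand control back to the native caller.

```rust
/// Illustrative call frame: `exit_on_return` marks a frame pushed by a
/// native caller, so `run` knows where to stop.
struct CallFrame {
    pc: usize,
    exit_on_return: bool,
}

enum Opcode {
    /// Jump into another function's bytecode.
    Call { target_pc: usize },
    Return,
    Nop,
}

struct Vm {
    frames: Vec<CallFrame>,
}

impl Vm {
    /// Called before `run`: pushes the call frame and anything else needed.
    fn prepare_vm_call(&mut self, target_pc: usize, exit_on_return: bool) {
        self.frames.push(CallFrame { pc: target_pc, exit_on_return });
    }

    /// The single instruction loop. `Call` never recurses on the Rust side;
    /// it just pushes a frame, and the next iteration fetches from the callee.
    fn run(&mut self, fetch: impl Fn(usize) -> Opcode) {
        while !self.frames.is_empty() {
            let pc = {
                let frame = self.frames.last_mut().expect("non-empty");
                let pc = frame.pc;
                frame.pc += 1;
                pc
            };
            match fetch(pc) {
                Opcode::Call { target_pc } => self.prepare_vm_call(target_pc, false),
                Opcode::Return => {
                    let finished = self.frames.pop().expect("non-empty");
                    if finished.exit_on_return {
                        // Bubble back to the original native `run` caller.
                        return;
                    }
                }
                Opcode::Nop => {}
            }
        }
    }
}
```

A native call would then be `vm.prepare_vm_call(entry, true); vm.run(fetch);`, and nested JS calls reuse the same loop instead of growing the Rust stack.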

@jedel1043 jedel1043 (Member, Author)

That sounds perfect to avoid recursing inside JS functions. We could implement that first, then think about how to avoid recursing inside Rust functions next.

@HalidOdat HalidOdat (Member) left a comment

Looks good! Just a memory optimization.

I don't think this addition would make the call refactor any harder :)

boa_engine/src/vm/opcode/mod.rs (outdated review thread, resolved)
boa_engine/src/vm/mod.rs (outdated review thread, resolved)
@jedel1043 jedel1043 requested a review from a team June 21, 2023 01:13
@Razican Razican modified the milestones: v0.17.0, v0.18.0 Jul 8, 2023
@HalidOdat HalidOdat (Member) left a comment

Looks good to me! :)

We can reduce the size of the execution table, which I did in #3401. Should we merge this first, or merge that into this?

@jedel1043 jedel1043 (Member, Author)

Should we merge this first, or merge that into this?

Let's just merge it into this, since they're related changes :)

@HalidOdat HalidOdat requested a review from a team October 19, 2023 22:36
@nekevss nekevss (Member) left a comment

Nice work! I like the updates! 😄

@jedel1043 jedel1043 added this pull request to the merge queue Oct 20, 2023
Merged via the queue into main with commit 9ef368f Oct 20, 2023
14 checks passed
@HalidOdat HalidOdat deleted the async-run branch October 20, 2023 04:56
sam-finch-tezos pushed a commit to trilitech/boa that referenced this pull request Nov 29, 2023
* Implement asynchronous evaluation of scripts

* cargo fmt

* FIx fuzz build

* Reduce execution table size

---------

Co-authored-by: Haled Odat <8566042+HalidOdat@users.noreply.github.com>
Labels
enhancement (New feature or request), execution (Issues or PRs related to code execution)
Projects
None yet
Development

Successfully merging this pull request may close these issues.

5 participants