
Make sure each invocation of block_on uses its own Parker #358

Merged

1 commit merged into async-rs:master from fix-recursive-block_on on Oct 17, 2019

Conversation

@ghost commented on Oct 16, 2019

This fixes a bug that occurs with recursive (nested) invocations of block_on:

use async_std::task;

fn main() {
    task::block_on(async {
        task::block_on(async {
            // ...
        })
    })
}

Each block_on behaves as an independent task and has an associated waker.

The problem was that both of these block_on invocations reused the same Thread descriptor for wakeups. When someone called .wake(), there was no way to tell which of the two block_on tasks was being woken.

Sometimes the outer task was the one that should have been woken, but the inner task would pick up the wakeup because it had taken over the Thread descriptor.
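The lost-wakeup pattern can be reproduced outside async-std. Here is a standalone sketch (not async-std code) using plain std::thread::park/unpark, which stores at most one wakeup token per thread; it shows how a wakeup intended for an outer waiter can be consumed by an inner one, leaving the outer wait blocked:

use std::thread;
use std::time::Duration;

fn main() {
    let main_thread = thread::current();

    // A wakeup intended for the *outer* wait below.
    thread::spawn(move || {
        thread::sleep(Duration::from_millis(100));
        main_thread.unpark();
    });

    // The *inner* wait parks first and consumes the single wakeup token...
    thread::park();
    println!("inner wait returned");

    // ...so the outer wait finds no token left and blocks forever, which is
    // how the reported deadlocks manifested.
    thread::park();
    println!("outer wait returned"); // never reached
}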

The problem is solved by associating a distinct Parker with each block_on invocation so that parkers are never shared among recursive calls. Note that Parker is essentially an extracted implementation of std::thread::park() and std::thread::unpark() that now lives in crossbeam-utils.
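For reference, here is a minimal sketch of what a per-invocation Parker looks like in a hand-rolled block_on. This is not the actual async-std implementation; it assumes crossbeam-utils' Parker/Unparker API and the standard library's Wake trait, but it shows the key point: each call polls against a waker tied to its own Parker.

use std::future::Future;
use std::pin::pin;
use std::sync::Arc;
use std::task::{Context, Poll, Wake, Waker};

use crossbeam_utils::sync::{Parker, Unparker};

// A waker that unparks exactly one block_on invocation.
struct UnparkWaker(Unparker);

impl Wake for UnparkWaker {
    fn wake(self: Arc<Self>) {
        self.0.unpark();
    }
}

fn block_on<F: Future>(future: F) -> F::Output {
    // A fresh Parker per call: nested calls each own their wakeup token,
    // so an inner call can no longer consume a wakeup meant for an outer one.
    let parker = Parker::new();
    let waker = Waker::from(Arc::new(UnparkWaker(parker.unparker().clone())));
    let mut cx = Context::from_waker(&waker);

    // Pin the future on the stack and poll it until completion, parking
    // this call's own Parker whenever the future is pending.
    let mut future = pin!(future);
    loop {
        match future.as_mut().poll(&mut cx) {
            Poll::Ready(output) => return output,
            Poll::Pending => parker.park(),
        }
    }
}

With this structure, each call in the nested example above parks and unparks on its own token, so a wakeup delivered to the outer task can no longer be picked up by the inner one.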

This should fix the deadlocks @svartalf and @jebrosen reported this week. Can you confirm the problem goes away with this branch?

@jebrosen

Yes, this branch does appear to fix the deadlock I observed.

@svartalf

Yes, I can confirm that it resolves the issue I had. Great work, as usual!

@ghost merged commit 46f0fb1 into async-rs:master on Oct 17, 2019
@ghost deleted the fix-recursive-block_on branch on October 17, 2019 at 09:52