Nudge people to the default chunk_size setting #8


Merged (1 commit) on Jun 10, 2021

Conversation

@ctk21 (Contributor) commented Jun 9, 2021

This PR nudges people to use the default chunk_size for parallel_for. In all the cases I've seen, the default chunk_size is superior in our sandmark numerical benchmarks, particularly when the number of domains gets large. This is often because tasks turn out to be more imbalanced than people expect: a domain does GC marking, runs slower due to cache effects, has a hiccup getting scheduled by the OS, etc.

I have left the parallel initialization examples as they are. I don't know whether experiments have determined that a small chunk_size is best for these.
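The change amounts to dropping the explicit `~chunk_size` argument so Domainslib chooses one itself. A minimal sketch, assuming Domainslib's `Task` API (the array size, pool size, and body here are illustrative, not from this repository):

```ocaml
(* Sketch of the change this PR suggests: omit ~chunk_size and let
   Domainslib pick the default, which tolerates imbalance between
   iterations (GC marking, cache effects, OS scheduling) better than
   a hand-picked value. *)
let () =
  let pool = Domainslib.Task.setup_pool ~num_domains:3 () in
  let n = 1_000_000 in
  let a = Array.make n 0.0 in
  Domainslib.Task.run pool (fun () ->
    (* Before: something like ~chunk_size:(n / 16) was passed here.
       After: no ~chunk_size argument at all. *)
    Domainslib.Task.parallel_for pool ~start:0 ~finish:(n - 1)
      ~body:(fun i -> a.(i) <- float_of_int i));
  Domainslib.Task.teardown_pool pool
```

Because `chunk_size` is an optional argument to `parallel_for`, callers who have measured a better value for their workload can still pass one explicitly.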

@Sudha247 (Collaborator) commented Jun 9, 2021

Thanks @ctk21! It definitely makes sense to move to the default chunk_size. I will run some experiments on the parallel initialization examples; we can probably change them to the default chunk_size as well if the results are comparable.

@Sudha247 (Collaborator) commented:
Results on IITM machines show that the default chunk size is comparable to the hard-coded value for the parallel initialization examples.

DLS version (float_init_par3), times in seconds:

| Cores | Hard-coded | Default |
|------:|-----------:|--------:|
| 1     | 3.09       | 3.07    |
| 2     | 1.56       | 1.56    |
| 4     | 0.81       | 0.88    |
| 8     | 0.43       | 0.43    |
| 12    | 0.3        | 0.3     |
| 16    | 0.25       | 0.24    |
| 20    | 0.22       | 0.2     |
| 24    | 0.19       | 0.2     |

Array of Random States (float_init_par2), times in seconds:

| Cores | Hard-coded | Default |
|------:|-----------:|--------:|
| 1     | 6.16       | 6.15    |
| 2     | 5.78       | 8.18    |
| 4     | 4.55       | 2.19    |
| 8     | 2.21       | 1.17    |
| 12    | 1.28       | 0.75    |
| 16    | 2.21       | 0.71    |
| 20    | 1.49       | 0.57    |
| 24    | 1.46       | 0.59    |

It might be a good idea to stick to the default chunk size in the initialization examples too. We can address that later in #7, which needs a rewrite after the standard library random module changes are merged. Merging this one now.

@Sudha247 Sudha247 merged commit 2c7e8d9 into master Jun 10, 2021