
failing example_scripts/run_simulation.py last step #445

Closed
Takadonet opened this issue Sep 14, 2022 · 6 comments

Comments

@Takadonet
Contributor

When running run_simulation.py single-threaded or with mpirun, I get a failure when it tries to aggregate the results.

```
Traceback (most recent call last):
  File "/locationJUNE/example_scripts/run_simulation.py", line 180, in <module>
    combine_records(save_path)
  File "/location/lib/python3.9/site-packages/june-1.1.2-py3.9.egg/june/records/records_writer.py", line 386, in combine_records
    combine_summaries(
  File "/location/lib/python3.9/site-packages/june-1.1.2-py3.9.egg/june/records/records_writer.py", line 337, in combine_summaries
    df = df.groupby(["region", "time_stamp"], as_index=False).agg(aggregator)
  File "/location/lib/python3.9/site-packages/pandas/core/groupby/generic.py", line 945, in aggregate
    result, how = aggregate(self, func, *args, **kwargs)
  File "/location/lib/python3.9/site-packages/pandas/core/aggregation.py", line 582, in aggregate
    return agg_dict_like(obj, arg, _axis), True
  File "/location/lib/python3.9/site-packages/pandas/core/aggregation.py", line 768, in agg_dict_like
    results = {key: obj._gotitem(key, ndim=1).agg(how) for key, how in arg.items()}
  File "/location/lib/python3.9/site-packages/pandas/core/aggregation.py", line 768, in <dictcomp>
    results = {key: obj._gotitem(key, ndim=1).agg(how) for key, how in arg.items()}
  File "/location/lib/python3.9/site-packages/pandas/core/groupby/generic.py", line 253, in aggregate
    return getattr(self, cyfunc)()
  File "/location/lib/python3.9/site-packages/pandas/core/groupby/groupby.py", line 1496, in mean
    return self._cython_agg_general(
  File "/location/lib/python3.9/site-packages/pandas/core/groupby/groupby.py", line 1081, in _cython_agg_general
    raise DataError("No numeric types to aggregate")
pandas.core.base.DataError: No numeric types to aggregate
```
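
For what it's worth, the error is easy to reproduce in isolation: grouping a header-only (zero-row) frame and asking for a numeric aggregation raises the same DataError on pandas 1.x. A minimal sketch, with illustrative column names and aggregator rather than JUNE's actual summary schema:

```python
import pandas as pd

# A header-only summary, like the ones produced by a subdomain with nothing
# to report. (Column names are illustrative, not JUNE's actual schema.)
df = pd.DataFrame(columns=["region", "time_stamp", "daily_deaths"])

# With zero rows every column is object dtype, so there is nothing numeric
# to aggregate, and pandas 1.x raises DataError: No numeric types to aggregate.
df.groupby(["region", "time_stamp"], as_index=False).agg({"daily_deaths": "mean"})
```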

I see multiple summary.##.csv files produced (when running with mpirun), and a few contain only the header row with no data. I believe that is why the combine step fails. Is this an edge case that was not taken into account, or did I do something wrong when running the example script?
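
In case it helps with reproducing, this is roughly how I spotted the header-only files (save_path is whatever the script passes to combine_records, and the summary.*.csv pattern is my guess at the per-rank naming):

```python
from pathlib import Path

import pandas as pd

# save_path: the results directory passed to combine_records (assumption).
save_path = Path("results")
for csv_file in sorted(save_path.glob("summary.*.csv")):
    df = pd.read_csv(csv_file)
    status = "header only" if df.empty else f"{len(df)} rows"
    print(f"{csv_file.name}: {status}")
```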

@arnauqb
Member

arnauqb commented Sep 15, 2022

Hi @Takadonet, can you tell me which world file you are using so I can try to recreate it? Is it the default one created by example_scripts/create_world.py?

@Takadonet
Contributor Author

Yes, the default world created with example_scripts/create_world.py @arnauqb

@arnauqb
Member

arnauqb commented Sep 15, 2022

I see. When running in parallel, the "world" is divided into subdomains, and each CPU takes care of simulating one of them. The summaries are stored per subdomain and combined at the end.

What you are seeing is an edge case where a subdomain has no deaths recorded, either because it is too small or because not enough time has passed. When the script then tries to combine the summaries, it fails because at least one of them is empty, as you've noticed.

As a temporary workaround, you could run with fewer CPUs. With a larger world, or perhaps a higher infection rate, this should not happen. I just tested it with 6 CPUs and it runs fine. How many CPUs are you using?

@Takadonet
Contributor Author

42 :)
Edge case it is!
Is it OK if I create a PR to address this, so it does not die but simply skips empty summary file(s)? Something along the lines of the sketch below.
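
Roughly what I have in mind (a sketch only, not the actual change to combine_summaries in june/records/records_writer.py; the helper name and glob pattern are mine):

```python
from pathlib import Path

import pandas as pd


def load_non_empty_summaries(record_path):
    """Read every per-rank summary CSV under record_path, skipping
    header-only files (hypothetical helper; names and pattern are mine)."""
    frames = []
    for csv_file in sorted(Path(record_path).glob("summary.*.csv")):
        df = pd.read_csv(csv_file)
        if df.empty:
            # This subdomain recorded nothing (e.g. no deaths yet), so skip
            # it rather than letting the later groupby/agg choke on it.
            continue
        frames.append(df)
    if not frames:
        raise ValueError(f"No non-empty summary files found in {record_path}")
    return pd.concat(frames, ignore_index=True)
```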

@arnauqb
Member

arnauqb commented Sep 15, 2022

Yes, please go ahead!

@Takadonet
Contributor Author

Thanks for merging my changes and removing unused dependencies!
