PyNIO backend doesn't play well with open_mfdataset #936
As reported on StackOverflow: http://stackoverflow.com/questions/38711915/segmentation-fault-writing-xarray-datset-to-netcdf-or-dataframe/

It appears that we can only open a single file at a time with PyNIO? Adding a thread lock via lock=True didn't solve the issue.

cc @david-ian-brown
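For concreteness, a minimal sketch of the failing pattern (the GRIB file names and the output path are hypothetical; the StackOverflow post has the actual reproduction):

```python
import xarray as xr

# Hypothetical file names; any multi-file dataset readable by PyNIO applies.
paths = ["forecast.t00z.grb", "forecast.t06z.grb"]

# Reported failure: combining several files through the pynio engine and then
# writing (which forces the lazy reads) segfaults, even with lock=True.
ds = xr.open_mfdataset(paths, engine="pynio", lock=True)
ds.to_netcdf("combined.nc")  # the crash reportedly occurs once data is read
```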
Hi Stephan, I will look into this issue.
The fix I posted on StackOverflow is a band-aid solution -- it requires loading every file into memory at once, which can be problematic with large amounts of data. It occurs to me that the real problem might be that we were attempting to concurrently load data from multiple variables in a single file. If that's the issue, it's something we can work around pretty easily in xarray. I'll run some tests later to verify.
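For reference, a sketch of that kind of band-aid (an assumed shape, not the verbatim StackOverflow answer; the file names and concatenation dimension are hypothetical): open the files one at a time, force each fully into memory with .load(), and only then combine, so PyNIO never has to service overlapping reads.

```python
import xarray as xr

paths = ["forecast.t00z.grb", "forecast.t06z.grb"]  # hypothetical files

# Open and eagerly load one file at a time so no two PyNIO reads overlap.
datasets = [xr.open_dataset(p, engine="pynio").load() for p in paths]

# Combine entirely in memory; "time" is an assumed concatenation dimension.
combined = xr.concat(datasets, dim="time")
combined.to_netcdf("combined.nc")
```

The cost is exactly the one noted above: every file is resident in memory at once.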