Description
Code Sample, a copy-pastable example if possible
(Essentially the same as #1629.)
Opening a netCDF file via xr.open_dataset
locks the file, preventing a write to a file with the same name (as pointed out and answered as expected behavior in #1629).
import xarray as xr

ds = xr.Dataset({'var': ('x', [0, 1, 2])})
ds.to_netcdf('test.nc')

ds_read = xr.open_dataset('test.nc')
ds.to_netcdf('test.nc')  # -> PermissionError

ds_read = xr.open_dataset('test.nc').load()
ds.to_netcdf('test.nc')  # -> PermissionError: load() alone does not release the file

ds_read = xr.open_dataset('test.nc').load()
ds_read.close()
ds.to_netcdf('test.nc')  # no error once the file is explicitly closed
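A workaround sketch: opening the file inside a with block releases the file handle on exit, so the subsequent write succeeds without an explicit close() call (assuming a netCDF backend such as netCDF4 or scipy is installed):

```python
import xarray as xr

ds = xr.Dataset({'var': ('x', [0, 1, 2])})
ds.to_netcdf('test.nc')

# The context manager closes the underlying file on exit,
# so the write below does not raise PermissionError.
with xr.open_dataset('test.nc') as ds_read:
    data = ds_read.load()  # pull the values into memory first

ds.to_netcdf('test.nc')  # no error
```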
Problem description
Another program cannot write to a netCDF file that xarray has opened until the close
method is called.
-- EDIT --
close()
does not return the dataset, so it cannot be used in a method chain such as
some_function(xr.open_dataset('test.nc').close())
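Since close() returns None, one way around the chaining limitation is a small helper that loads the data and releases the file before returning. The name read_and_close below is hypothetical, not part of xarray's API:

```python
import xarray as xr

def read_and_close(path):
    """Hypothetical helper: load a dataset fully into memory and
    release the underlying file handle before returning it."""
    with xr.open_dataset(path) as ds:
        return ds.load()  # load() returns the dataset itself

ds = xr.Dataset({'var': ('x', [0, 1, 2])})
ds.to_netcdf('test.nc')

loaded = read_and_close('test.nc')
ds.to_netcdf('test.nc')  # file handle released, so no PermissionError
some_result = loaded['var'].sum()  # safe to keep using the in-memory data
```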
Keeping the file open is understandable when we do not want to load the entire file into memory.
However, sometimes I want to read a file that will soon be updated by another program.
Also, I think many users who are not accustomed to netCDF will expect eager loading (as np.loadtxt
does) and will be surprised to get a PermissionError.
I think it would be nice to have an option such as load_all=True,
or even to make this the default.
Expected Output
No error
Output of xr.show_versions()
xarray: 0.12.0+11.g7d0e895f.dirty
pandas: 0.23.4
numpy: 1.15.4
scipy: 1.2.0
netCDF4: 1.4.2
pydap: None
h5netcdf: None
h5py: 2.8.0
Nio: None
zarr: None
cftime: 1.0.2.1
nc_time_axis: None
PseudonetCDF: None
rasterio: None
cfgrib: None
iris: None
bottleneck: 1.2.1
dask: 1.0.0
distributed: 1.25.0
matplotlib: 2.2.2
cartopy: None
seaborn: 0.9.0
setuptools: 40.5.0
pip: 18.1
conda: None
pytest: 4.0.1
IPython: 7.1.1
sphinx: 1.8.2