Opened on Oct 6, 2023
I have a (currently proof-of-concept) library that extends netCDF to support complex numbers, covering the most common conventions people are currently using. It is implemented as a C library so that all of the language bindings can be extended consistently. The Python API is built on top of netcdf4-python and is an entirely drop-in replacement/extension. Here's an example of writing a complex array and reading it back, demonstrating the correct dtype in both directions:
```python
import nc_complex as netCDF4
import numpy as np

filename = "complex_example.nc"  # any writable path

complex_array = np.array([0 + 0j, 1 + 0j, 0 + 1j, 1 + 1j, 0.25 + 0.75j], dtype="c16")

with netCDF4.Dataset(filename, "w") as f:
    f.createDimension("x", size=len(complex_array))
    complex_var = f.createVariable("complex_data", "c16", ("x",))
    complex_var[:] = complex_array

with netCDF4.Dataset(filename, "r") as f:
    print(f["complex_data"])
    print(f["complex_data"][:])

# <class 'nc_complex.Variable'>
# compound complex_data(x)
# compound data type: complex128
# unlimited dimensions:
# current shape = (5,)
# [0.  +0.j   1.  +0.j   0.  +1.j   1.  +1.j   0.25+0.75j]
```
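To make the on-disk representation concrete, here is a minimal sketch (plain NumPy, no nc-complex required) of one common convention: a compound type with real and imaginary fields. The field names `"r"` and `"i"` are an assumption for illustration; other conventions use different names or a trailing dimension of length 2. Because the two float64 fields sit contiguously, the buffer reinterprets as `complex128` without copying:

```python
import numpy as np

# A compound/structured dtype mirroring a netCDF compound type with
# separate real and imaginary float64 fields (field names assumed).
compound = np.zeros(5, dtype=[("r", "f8"), ("i", "f8")])
compound["r"] = [0.0, 1.0, 0.0, 1.0, 0.25]
compound["i"] = [0.0, 0.0, 1.0, 1.0, 0.75]

# Itemsizes match (2 x 8 bytes = 16 bytes), so a zero-copy view as
# complex128 recovers the complex array directly:
as_complex = compound.view("c16")
print(as_complex)
# [0.  +0.j   1.  +0.j   0.  +1.j   1.  +1.j   0.25+0.75j]
```

This is why a library can translate between the stored compound type and NumPy's native complex dtype essentially for free.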
Is there any interest in merging the modifications for the Python API here? It would mean depending on my C library and switching some calls over to my wrappers. This behaviour could also be made optional if that is a concern.
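As a rough sketch of what "optional" could look like: gate the complex handling behind a flag so that the default path is unchanged and only complex dtypes with the flag enabled are routed to the wrappers. The function and flag names below are hypothetical, purely for illustration:

```python
import numpy as np

def needs_complex_path(datatype, auto_complex: bool) -> bool:
    """Decide whether a variable should go through the complex-number
    wrappers (hypothetical dispatch helper, not part of any real API).

    Only complex dtypes, and only when the user opted in, take the
    new code path; everything else falls through to the existing
    netcdf4-python behaviour.
    """
    return auto_complex and np.dtype(datatype).kind == "c"
```

With a default of `auto_complex=False` on `Dataset`, existing user code would be completely unaffected.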
As a side note, I found scikit-build-core really, really useful for building the Cython module. Might simplify the build system here too?
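For reference, a minimal scikit-build-core setup is just a build-backend declaration plus a CMakeLists.txt; the project name and version below are placeholders, not taken from any real package:

```toml
[build-system]
requires = ["scikit-build-core", "cython"]
build-backend = "scikit_build_core.build"

[project]
name = "example-cython-extension"  # placeholder
version = "0.1.0"
```

scikit-build-core then drives CMake to configure, build, and install the extension into the wheel, which removes most of the custom build logic a hand-rolled setup.py needs.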