Pin blosc to latest version 1.11.3 #3225


Open · wants to merge 1 commit into base: main

Conversation

pyup-bot (Collaborator)

This PR pins blosc to the latest release 1.11.3.

Changelog

1.11.2

Enabled use as a CMake subproject, exporting shared & static library targets for super-projects to use. See PRs 178, 179 and 180.  Thanks to Kevin Murray.

Also, the internal codecs have been updated: LZ4 and LZ4HC to 1.7.5, and Zstd to 1.1.2.

For more info, please see the release notes in:

https://github.com/Blosc/c-blosc/blob/master/RELEASE_NOTES.rst

1.11.1

* Internal C-Blosc sources updated to 1.21.3.

1.11.0

* Internal C-Blosc sources updated to 1.21.2 (they are a git submodule now).

* Many small code improvements, improved consistency and typo fixes.
Thanks to Dimitri Papadopoulos Orfanos.

* Support for Python 3.11.  Support for Python 3.7 has been dropped.
Thanks to Dimitri Papadopoulos Orfanos.

* Several other fixes, mainly related to the building process, which
should now be more solid in different situations.

1.10.6

* Add a missing cmake folder to the distributed files.  See 253.
Thanks to Ben Greiner.

1.10.5

- Re-enable the ability to use an already installed C-Blosc library.
See 244.  Thanks to Ben Greiner.
- Add aarch64 wheels. See 250.  Thanks to odidev.
- Deactivate SSE2 and AVX2 if the CPU does not have those flags.  See 242.
Thanks to Graham Inggs.
- Wheels for 32-bit Linux are no longer distributed.
- Updated vendored C-Blosc to 1.21.1.

1.10.4

* Update `blosc.nthreads` when `blosc.set_nthreads()` is called.
Fixes 246
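
A minimal sketch of the fixed behavior (assuming python-blosc >= 1.10.4; the thread count is just an illustration):

import blosc

blosc.set_nthreads(6)
# Before 1.10.4 the module-level attribute could stay stale; now it tracks the setter.
print(blosc.nthreads)  # -> 6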

1.10.2

- Updated README.rst with wheels information. See: 
https://pypi.org/project/blosc/

1.10.1

- Added pyproject.toml to fix issues when building the package for a
Python version that does not have a wheel. See:
https://github.com/Blosc/python-blosc/issues/239

- Added blosc/c-blosc/README.md in the source distribution. See:
https://github.com/Blosc/python-blosc/pull/240

- Vendored cpuinfo.py updated to version 7.0.0.

1.10.0

- Updated vendored C-Blosc to 1.21.0.

- Wheels for Intel (32- and 64-bit) and all major operating systems (Windows, Linux, macOS) are here.
The wheels support runtime detection of AVX2, so it is leveraged
automatically when the local host has AVX2.  There is no longer any need to
worry about using different binaries for CPUs without AVX2 hardware.

Also, we are distributing binaries for the C-Blosc libraries (dynamic and static)
and headers.  This way, people who want to use the C-Blosc library can use the
python-blosc wheels to install the necessary development files.  For details,
see: https://github.com/Blosc/c-blosc/blob/master/COMPILING_WITH_WHEELS.rst

We gratefully acknowledge Jeff Hammerbacher for supporting the addition of
wheels for Blosc.

- Officially drop support for Python < 3.7.  Although we did not take any explicit
action that is incompatible with older Python versions, we only provide
wheels for Python >= 3.7 (up to 3.9).

1.9.2

- Internal C-Blosc updated to 1.20.1.  This fixes https://github.com/Blosc/python-blosc/issues/229, and also brings many new updates in internal codecs, providing interesting bumps in performance in some cases.

- Due to the recent addition of more cores in new CPUs, the number of internal threads to be used by default has been increased from 4 to 8.

- Allow zero-copy decompression by allowing bytes-like input.  See PR: https://github.com/Blosc/python-blosc/pull/230.  Thanks to Lehman Garrison.  A sketch is shown at the end of this entry.

- Fix DeprecationWarning due to invalid escape sequence and use array.tobytes for Python 3.9.
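
A minimal sketch of the bytes-like input feature from this release (assuming python-blosc >= 1.9.2; the array is just an illustration):

import numpy as np
import blosc

a = np.arange(1_000_000, dtype=np.int64)
packed = blosc.compress(a.tobytes(), typesize=8)
# A memoryview is bytes-like, so it can be fed to decompress() without copying the input first.
restored = blosc.decompress(memoryview(packed))
assert restored == a.tobytes()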

1.9.1

- Disable the attempt to include support for SSE2 and AVX2 on non-Intel
platforms, allowing the build on such platforms (see 244).  Thanks
to Lehman Garrison.

1.9.0

- Dropped support for Python 2.7 and 3.5.

- Fixed the copy of the leftovers of a chunk when its size is not a
multiple of the typesize.  Although this is a very unusual situation,
it can certainly happen (e.g.
https://github.com/Blosc/python-blosc/issues/220).

0.6.6

* Add arm64 wheels for macosx (this time for real).

0.6.5

* Add arm64 wheels for macosx.

0.6.4

* Add arm64 wheels and remove musl builds (NumPy not having them makes the build process too long).

0.6.3

* Use oldest-supported-numpy for maximum compatibility.

0.6.2

* Updated C-Blosc2 to 2.6.0.

0.6.1

* Support for Python prefilters and postfilters.  With these, you can pre-process or post-process data in super-chunks automatically.  This machinery is handled internally by C-Blosc2, so it is very efficient (although it cannot run in multi-threaded mode due to the GIL).  See the examples/ directory for different ways of using this, and the sketch at the end of this entry.

* Support for fillers.  This is a specialization of a prefilter, and it allows you to use Python functions to create new super-chunks from different kinds of inputs (NumPy arrays, SChunk instances, scalars), computing among them and getting the result automatically compressed.  See a sample script in the examples/ directory.

* Lots of small improvements to style and consistency, and fixes for other glitches in the code.  Thanks to Dimitri Papadopoulos for his attention to detail.

* No need to compile C-Blosc2 tests, benchs or fuzzers.  Compilation time is much shorter now.

* Added `cratio`, `nbytes` and `cbytes` properties to `SChunk` instances.

* Added setters for `dparams` and `cparams` attributes in `SChunk`.
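
A minimal sketch of attaching a Python prefilter to an `SChunk` and reading the new `nbytes` / `cbytes` / `cratio` properties.  The decorator signature, the `(input, output, offset)` callback arguments and the single-thread `cparams` follow the pattern of the examples/ directory, but treat them as assumptions rather than a definitive reference:

import numpy as np
import blosc2

nelem = 100_000
# Python prefilters run under the GIL, so compression is kept single-threaded here.
schunk = blosc2.SChunk(chunksize=nelem * 4,
                       cparams={"typesize": 4, "nthreads": 1})

@schunk.prefilter(np.int32, np.float32)
def center(input, output, offset):
    # Called automatically on every chunk before it is compressed.
    output[:] = input - input.mean()

schunk.append_data(np.arange(nelem, dtype=np.int32))
print(schunk.nbytes, schunk.cbytes, schunk.cratio)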

0.5.2

* Honor nested cparams properties in kwargs.  E.g. you can do:


import numpy as np
import blosc2

a = np.linspace(0, 1, 1_000_000)  # any NumPy array works here (assumed for the example)
blosc2.save_tensor(a, "test.bl2", mode="w",
                   filters=[blosc2.Filter.TRUNC_PREC, blosc2.Filter.BITSHUFFLE],
                   filters_meta=[13, 0],
                   codec=blosc2.Codec.LZ4,
                   clevel=9)


without a need to build a proper `cparams` dict first.

* C-Blosc2 upgraded to 2.4.3.  It should improve cratio for BloscLZ in combination with bitshuffle.

* Prefer `pack_tensor` / `save_tensor` in benchs and examples.

0.5.1

* Remove the testing of packing PyTorch or TensorFlow objects during wheels build.

0.5.0

* New `pack_tensor`, `unpack_tensor`, `save_tensor` and `load_tensor` functions for serializing/deserializing PyTorch and TensorFlow tensor objects.  They also understand NumPy arrays, so these are now the recommended functions for serialization.  A sketch is shown at the end of this entry.

* `pack_array2` does not modify the value of a possible `cparams` parameter anymore.

* `pack_array2` / `save_array` have changed the serialization format to follow the new standard introduced in `pack_tensor`.  In the future `pack_array2` / `save_array` will probably be deprecated, so please switch to `pack_tensor` / `save_tensor` as soon as you can.

* The new 'standard' for serialization relies on using the `__pack_tensor__` attribute as a `vlmeta` (variable length) metalayer.
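
A minimal sketch of the new serialization functions with a NumPy array (the array and file name are just illustrations):

import numpy as np
import blosc2

a = np.arange(1_000_000, dtype=np.float64)

cframe = blosc2.pack_tensor(a)            # in-memory serialization
b = blosc2.unpack_tensor(cframe)
assert np.array_equal(a, b)

blosc2.save_tensor(a, "a.bl2", mode="w")  # on-disk variant
c = blosc2.load_tensor("a.bl2")
assert np.array_equal(a, c)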

0.4.1

* Add `msgpack` as a runtime requirement

0.4.0

* New `pack_array2()` and `unpack_array2()` functions for packing NumPy arrays.  Contrary to their `pack_array()` and `unpack_array()` counterparts, the new ones allow compressing arrays larger than 2 GB in size (see the sketch at the end of this entry).

* New `SChunk.to_cframe()` and `blosc2.from_cframe()` methods for serializing/deserializing `SChunk` instances.

* New `SChunk.get_slice()`, `SChunk.__getitem__()` and `SChunk.__setitem__()` methods for getting/setting slices from/to `SChunk` instances.

* The `compcode` parameter has been renamed to `codec`.  A `NameError` exception will be raised when using the old name.  Please update your code when you see this exception.

* More doc restructurings.  Hopefully, they are more pleasant to read now :-)
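
A minimal sketch of `pack_array2()` / `unpack_array2()` together with the renamed `codec` parameter (the array and the compression parameters are just illustrations):

import numpy as np
import blosc2

a = np.linspace(0, 1, 1_000_000)
# `codec` replaces the old `compcode` name inside cparams.
cframe = blosc2.pack_array2(a, cparams={"codec": blosc2.Codec.LZ4, "clevel": 5})
b = blosc2.unpack_array2(cframe)
assert np.array_equal(a, b)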

0.3.2

* Several leaks fixed.  Thanks to Christoph Gohlke.

* Internal C-Blosc2 updated to 2.3.1

0.3.0

* Added a new `blosc2.open(urlpath, mode)` function to be able to open persisted super-chunks (see the sketch at the end of this entry).

* Added a new tutorial in notebook format (`examples/tutorial-basics.ipynb`) about the basics of python-blosc2.

* Internal C-Blosc2 updated to 2.2.0
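
A minimal sketch of reopening a persisted super-chunk (the `urlpath` / `mode` keyword arguments to the `SChunk` constructor are assumptions; the file name is just an illustration):

import numpy as np
import blosc2

data = np.arange(1_000_000, dtype=np.int32)
# Create an on-disk super-chunk.
schunk = blosc2.SChunk(chunksize=100_000 * 4, data=data,
                       urlpath="data.b2frame", mode="w")
del schunk

# Later (or from another process), open it again and keep appending.
schunk = blosc2.open("data.b2frame", mode="a")
schunk.append_data(data)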

0.2.0

* Internal C-Blosc updated to 2.0.4.

New super-chunk implementation

* New `SChunk` class that allows creating super-chunks.
This includes the capability of storing data in 4
different ways (sparse/contiguous and in-memory/on-disk),
as well as storing variable-length metalayers.
A sketch is shown at the end of this entry.

* Also, during the construction of an `SChunk` instance,
an arbitrarily large data buffer can be given so that it is
automatically split into chunks, which are appended to the
`SChunk`.

* See `examples/schunk.py` and `examples/vlmeta.py` for some examples.

* Documentation of the new API is here: https://python-blosc2.readthedocs.io
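
A minimal sketch of creating an in-memory `SChunk` from a large buffer (the constructor arguments and the mapping-style `vlmeta` access are assumptions based on the description above):

import numpy as np
import blosc2

data = np.arange(10_000_000, dtype=np.int64)
# The whole buffer is split into chunks of `chunksize` bytes and appended automatically.
schunk = blosc2.SChunk(chunksize=10_000 * 8, data=data)

# Variable-length metalayers behave like a small key/value store.
schunk.vlmeta["description"] = "demo super-chunk"
print(schunk.vlmeta["description"])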

This release is the result of a grant offered by
the Python Software Foundation to Marta Iborra.
A blog entry was written describing the difficulties and relevant 
aspects learned during the work: 
https://www.blosc.org/posts/python-blosc2-initial-release/

0.1.10

* Release with C-Blosc 2.0.2 sources and binaries.

0.1.9

* Release with C-Blosc 2.0.1 sources and binaries.

0.1.7

* Headers and binaries for the C-Blosc2 library are starting
to be distributed inside wheels.

* Internal C-Blosc2 submodule updated to 2.0.0-rc2.

* Repeating measurements 4 times in benchs so as to get more
consistent figures.

0.1.5

Fix some issues with PyPI packaging.  Wheels are here!
See:  https://github.com/Blosc/python-blosc2/issues/9

0.1.1

Initial release

python-blosc2 aims to leverage the new C-Blosc2 API so as to support super-chunks, serialization and all the features introduced in C-Blosc2. This is a work in progress and will be done incrementally in future releases.

Note: python-blosc2 is meant to be backward compatible with python-blosc data. That means that it can read data generated with python-blosc, but the opposite is not true (i.e. there is no forward compatibility).

Changes from python-blosc to python-blosc2

* The functions `compress_ptr` and `decompress_ptr`
are replaced by `pack` and `unpack`, since
Pickle protocol 5 comes with out-of-band data.

* The function `pack_array` is equivalent to `pack`,
which accepts any object with attributes `itemsize`
and `size`.

* On the other hand, the function `unpack` doesn't
return a NumPy array, whereas `unpack_array`
builds that array.

* `blosc.NOSHUFFLE` is replaced
by `blosc2.NOFILTER`, but for backward
compatibility `blosc2.NOSHUFFLE` still exists.

* A bytearray or NumPy object can be passed to
the `blosc2.decompress` function to store the 
decompressed data.
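
A minimal sketch of the `pack` / `unpack` pair and of decompressing into a preallocated destination (the positional destination argument is an assumption based on the description above):

import numpy as np
import blosc2

a = np.arange(1_000_000, dtype=np.int64)

packed = blosc2.pack(a)    # works for any object with `itemsize` and `size` attributes
b = blosc2.unpack(packed)  # note: this does not rebuild a NumPy array

compressed = blosc2.compress(a)
out = bytearray(a.nbytes)
blosc2.decompress(compressed, out)  # the decompressed bytes land in `out`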