The HDF5 Field Guide
Last Updated on 2025-12-16
The HDF5 Filter interface (H5Z) provides a flexible pipeline mechanism for processing dataset data during I/O operations. Filters can perform data compression, error checking, data transformation, and other custom operations on dataset chunks.
Filters operate on chunked datasets only (see Using HDF5 Filters for details on dataset chunking) and are applied independently to each chunk. Multiple filters can be chained together in a pipeline, where the output of one filter becomes the input to the next.
HDF5 includes several standard filters:

- Deflate (gzip) compression, enabled with H5Pset_deflate
- Szip compression (H5Pset_szip), when the build includes the Szip library
- Shuffle (H5Pset_shuffle), which reorders bytes within chunks to improve compression
- Fletcher32 (H5Pset_fletcher32), which adds checksums for error detection
- N-bit and scale-offset filters (H5Pset_nbit, H5Pset_scaleoffset) for packing data more compactly
Filters are configured through dataset creation property lists. Enable chunking first using H5Pset_chunk, then add compression with functions like H5Pset_deflate.
Multiple filters can be combined in a pipeline. Filters are applied in the order they are added during write operations and in reverse order during read operations. Common pipelines combine H5Pset_shuffle, H5Pset_deflate, and H5Pset_fletcher32.
Applications can create and register custom filters using the H5Z_class2_t structure and the H5Zregister function.
Custom filters enable domain-specific data transformations, specialized compression algorithms, encryption, and other custom processing.
The H5Z interface provides functions to query available filters, such as H5Zfilter_avail and H5Zget_filter_info.
HDF5 supports dynamic loading of filter plugins, allowing filters to be added without recompiling applications. See HDF5 Filter Plugins for details on creating and using filter plugins.
The H5Z filter interface provides:

- Standard compression and checksum filters for chunked datasets
- A chainable pipeline applied independently to each chunk
- Registration of application-defined custom filters
- Dynamic loading of third-party filter plugins
Filters are essential for reducing storage requirements and ensuring data integrity in HDF5 files while maintaining compatibility and performance.