Streaming (battdat.streaming)#
Retrieve data in smaller chunks from a large HDF5 file
- battdat.streaming.iterate_cycles_from_file(hdf5_path: Path | str | HDFStore, make_dataset: bool = False, key: str | Collection[str] | None = 'raw_data') → Iterator[DataFrame | Dict[str, DataFrame] | BatteryDataset]#
Stream single-cycle datasets from the HDF5 file
- Parameters:
hdf5_path – Path to the data file
make_dataset – Whether to form a BatteryDataset for each cycle, including the metadata from the source file.
key – Which table(s) to read. Supply either a single key, a list of keys, or None to read all tables
- Yields:
All rows belonging to each cycle from the requested table of the HDF5 file. Generates a BatteryDataset if make_dataset is True. Otherwise, yields a single DataFrame if key is a single string or a dictionary of DataFrames if key is a list.
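A minimal sketch of streaming per-cycle data; the file name cell_data.h5 and the voltage column are placeholders for an actual battdat HDF5 file and its raw-data schema:

```python
from battdat.streaming import iterate_cycles_from_file

# Stream one cycle at a time from the "raw_data" table.
# "cell_data.h5" is a placeholder path; the "voltage" column is assumed
# to exist in the raw data table.
for cycle in iterate_cycles_from_file('cell_data.h5', key='raw_data'):
    print(f'{len(cycle)} rows, mean voltage {cycle["voltage"].mean():.3f} V')
```

Because each yield holds only one cycle, this keeps memory usage bounded even for files too large to load with a single read.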
- battdat.streaming.iterate_records_from_file(hdf5_path: Path | str | HDFStore, key: str = 'raw_data') → Iterator[Dict[str, str | float | int]]#
Stream individual records from a file
- Parameters:
hdf5_path – Path to the data file
key – Which table to read
- Yields:
Individual rows from the requested table (by default, “raw_data”) of the HDF5 file
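A minimal sketch of record-by-record streaming; the path cell_data.h5 is a placeholder:

```python
from itertools import islice

from battdat.streaming import iterate_records_from_file

# Each yielded record is a dict mapping column names to scalar values.
# "cell_data.h5" is a placeholder path; only the first 5 rows are printed here.
for record in islice(iterate_records_from_file('cell_data.h5', key='raw_data'), 5):
    print(record)
```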
HDF5 Streaming (battdat.streaming.hdf5)#
Streaming tools related to the HDF5 format
- class battdat.streaming.hdf5.HDF5Writer(hdf5_output: pathlib.Path | str | tables.file.File, write_mode: str = 'a', metadata: battdat.schemas.BatteryMetadata = <factory>, schema: battdat.schemas.column.ColumnSchema = <factory>, complevel: int = 0, complib: str = 'zlib', key: str = '', buffer_size: int = 32768, _file: tables.file.File | None = None, _dtype: numpy.dtype | None = None, _table: tables.table.Table | None = None, _write_buffer: typing.List[typing.Dict] | None = None)#
Bases: AbstractContextManager
Tool to write raw time series data to an HDF5 file incrementally
Writes data to the raw_data key of a different dataset.
- complevel: int = 0#
Compression level. Can be between 0 (no compression) and 9 (maximum compression). Ignored if data table already exists
- complib: str = 'zlib'#
Compression algorithm. Consult read_hdf() for available options. Ignored if data table already exists
- key: str = ''#
Name of the root group in which to store the data. Ignored if hdf5_output is a File.
- metadata: BatteryMetadata#
Metadata describing the cell
- schema: ColumnSchema#
Schema describing columns of the cell
- write_mode: str = 'a'#
Mode to use when opening the HDF5 file. Ignored if hdf5_output is a File.
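A minimal usage sketch of the writer as a context manager. The per-row call write_row(), the file name streamed.h5, and the column names are assumptions not documented in the attribute listing above; consult the HDF5Writer method documentation for the exact per-row API.

```python
from battdat.streaming.hdf5 import HDF5Writer

# Example rows; the column names (test_time, current, voltage) are assumptions
# about the raw-data schema -- adjust them to match your ColumnSchema.
rows = [
    {'test_time': 0.0, 'current': 1.0, 'voltage': 3.20},
    {'test_time': 1.0, 'current': 1.0, 'voltage': 3.25},
]

# "streamed.h5" is a placeholder path; complevel=4 enables moderate compression.
# write_row() is an assumed per-row method, not shown in the listing above.
with HDF5Writer('streamed.h5', complevel=4) as writer:
    for row in rows:
        writer.write_row(row)
```

The buffer_size and _write_buffer attributes suggest that rows accumulate in an in-memory buffer and are flushed to the HDF5 table in batches, which is what makes incremental writing practical for long experiments.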