kedro.extras.datasets.dask.ParquetDataSet

class kedro.extras.datasets.dask.ParquetDataSet(filepath, load_args=None, save_args=None, credentials=None, fs_args=None)[source]

Bases: kedro.io.core.AbstractDataSet

ParquetDataSet loads and saves data to parquet file(s). It uses Dask remote data services to handle the corresponding load and save operations: https://docs.dask.org/en/latest/remote-data-services.html

Example (AWS S3):

from kedro.extras.datasets.dask import ParquetDataSet
import pandas as pd
import dask.dataframe as dd

data = pd.DataFrame({'col1': [1, 2], 'col2': [4, 5],
                     'col3': [5, 6]})
ddf = dd.from_pandas(data, npartitions=2)

data_set = ParquetDataSet(
    filepath="s3://bucket_name/path/to/folder",
    credentials={
        'client_kwargs': {
            'aws_access_key_id': 'YOUR_KEY',
            'aws_secret_access_key': 'YOUR_SECRET',
        }
    },
    save_args={"compression": "GZIP"}
)
data_set.save(ddf)
reloaded = data_set.load()

assert ddf.compute().equals(reloaded.compute())

Attributes

ParquetDataSet.DEFAULT_LOAD_ARGS
ParquetDataSet.DEFAULT_SAVE_ARGS
ParquetDataSet.fs_args Property of optional file system parameters.

Methods

ParquetDataSet.__init__(filepath[, …]) Creates a new instance of ParquetDataSet pointing to concrete parquet files.
ParquetDataSet.exists() Checks whether a data set’s output already exists by calling the provided _exists() method.
ParquetDataSet.from_config(name, config[, …]) Create a data set instance using the configuration provided.
ParquetDataSet.load() Loads data by delegation to the provided load method.
ParquetDataSet.release() Release any cached data.
ParquetDataSet.save(data) Saves data by delegation to the provided save method.
DEFAULT_LOAD_ARGS = {}
DEFAULT_SAVE_ARGS = {'write_index': False}
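The class-level defaults above are merged with any user-supplied save_args, with the user's values taking precedence. A minimal sketch of that merge (an illustrative stand-in using the usual dict-merge pattern, not Kedro's actual implementation):

```python
# Illustrative merge of class defaults with user-supplied save_args
# (assumption: standard dict-merge pattern, not Kedro's actual code).
DEFAULT_SAVE_ARGS = {"write_index": False}

def merged_save_args(user_save_args=None):
    """Return the defaults overlaid with any user-provided options."""
    merged = dict(DEFAULT_SAVE_ARGS)
    if user_save_args:
        merged.update(user_save_args)
    return merged

print(merged_save_args({"compression": "GZIP"}))
# → {'write_index': False, 'compression': 'GZIP'}
```

Note that a user-supplied key such as 'write_index' would override the default rather than duplicate it.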
__init__(filepath, load_args=None, save_args=None, credentials=None, fs_args=None)[source]

Creates a new instance of ParquetDataSet pointing to concrete parquet files.

Parameters:
  • filepath (str) – Filepath to a parquet file or a directory of parquet files.
  • load_args (Optional[Dict[str, Any]]) – Additional loading options passed on to Dask when reading the parquet data.
  • save_args (Optional[Dict[str, Any]]) – Additional saving options passed on to Dask when writing the parquet data.
  • credentials (Optional[Dict[str, Any]]) – Credentials required to access the underlying filesystem.
  • fs_args (Optional[Dict[str, Any]]) – Optional parameters to the backend file system driver.
Return type: None

exists()

Checks whether a data set’s output already exists by calling the provided _exists() method.

Return type: bool
Returns: Flag indicating whether the output already exists.
Raises: DataSetError – When underlying exists method raises error.
classmethod from_config(name, config, load_version=None, save_version=None)

Create a data set instance using the configuration provided.

Parameters:
  • name (str) – Data set name.
  • config (Dict[str, Any]) – Data set config dictionary.
  • load_version (Optional[str]) – Version string to be used for load operation if the data set is versioned. Has no effect on the data set if versioning was not enabled.
  • save_version (Optional[str]) – Version string to be used for save operation if the data set is versioned. Has no effect on the data set if versioning was not enabled.
Return type: AbstractDataSet
Returns: An instance of an AbstractDataSet subclass.
Raises: DataSetError – When the function fails to create the data set from its config.
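In practice the config dictionary mirrors a catalog entry, with a dotted "type" string identifying the dataset class and the remaining keys matching the constructor signature documented above. A hedged sketch of such a config and of splitting the type string into a module path and class name (a hypothetical helper for illustration; Kedro's real resolution also imports the module):

```python
# Hypothetical example config for from_config; key names follow the
# constructor signature documented above.
config = {
    "type": "kedro.extras.datasets.dask.ParquetDataSet",
    "filepath": "s3://bucket_name/path/to/folder",
    "save_args": {"compression": "GZIP"},
}

def split_type(dataset_type):
    """Split a dotted type string into (module_path, class_name).

    Illustrative only -- not Kedro's actual resolver.
    """
    module_path, _, class_name = dataset_type.rpartition(".")
    return module_path, class_name

print(split_type(config["type"]))
# → ('kedro.extras.datasets.dask', 'ParquetDataSet')
```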

fs_args

Property of optional file system parameters.

Return type: Dict[str, Any]
Returns: A dictionary of backend file system parameters, including credentials.
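The property's note that credentials are included suggests fs_args and credentials are folded into a single dictionary of backend parameters. A minimal sketch of that combination (an assumption about the merge, not Kedro's exact code):

```python
def combine_fs_args(fs_args=None, credentials=None):
    # Combine optional filesystem parameters with credentials into one
    # dict of backend parameters. Illustrative merge order (assumption):
    # credentials override plain fs_args on key collisions.
    params = dict(fs_args or {})
    params.update(credentials or {})
    return params

print(combine_fs_args({"anon": False}, {"key": "YOUR_KEY"}))
# → {'anon': False, 'key': 'YOUR_KEY'}
```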
load()

Loads data by delegation to the provided load method.

Return type: Any
Returns: Data returned by the provided load method.
Raises: DataSetError – When underlying load method raises error.
release()

Release any cached data.

Return type: None
Raises: DataSetError – When underlying release method raises error.
save(data)

Saves data by delegation to the provided save method.

Parameters: data (Any) – The value to be saved by provided save method.
Raises: DataSetError – When underlying save method raises error.
Return type: None
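load(), save() and exists() all follow the same delegation pattern: call the underlying _load/_save/_exists implementation and wrap any failure in DataSetError. A simplified stand-in (not Kedro's actual AbstractDataSet, and with a plain in-memory store instead of Dask) sketches this:

```python
class DataSetError(Exception):
    """Raised when an underlying dataset operation fails."""

class SketchDataSet:
    # Simplified stand-in for the delegation pattern described above;
    # a real subclass would implement _load/_save/_exists with Dask.
    def __init__(self):
        self._store = None

    def _save(self, data):
        self._store = data

    def _load(self):
        if self._store is None:
            raise ValueError("nothing saved yet")
        return self._store

    def _exists(self):
        return self._store is not None

    def save(self, data):
        try:
            self._save(data)
        except Exception as exc:
            raise DataSetError(f"save failed: {exc}") from exc

    def load(self):
        try:
            return self._load()
        except Exception as exc:
            raise DataSetError(f"load failed: {exc}") from exc

    def exists(self):
        try:
            return self._exists()
        except Exception as exc:
            raise DataSetError(f"exists check failed: {exc}") from exc

ds = SketchDataSet()
assert not ds.exists()
ds.save({"col1": [1, 2]})
assert ds.exists()
print(ds.load())
# → {'col1': [1, 2]}
```

Loading before anything has been saved raises DataSetError rather than leaking the underlying exception, which matches the Raises clauses documented above.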