Spark (contrib)

Description

The Spark offline store is an offline store, currently in alpha development, that provides support for reading SparkSources.

Disclaimer

This Spark offline store does not yet have full test coverage and still fails some integration tests in the Feast universal test suite. Please do NOT assume the API is stable.
  • Spark tables and views are allowed as sources, loaded in from some Spark store (e.g. in Hive or in memory).
  • Entity dataframes can be provided as a SQL query or as a Pandas dataframe. A Pandas dataframe will be converted to a Spark dataframe and processed as a temporary view.
  • A SparkRetrievalJob is returned when calling get_historical_features().
    • This allows you to call
      • to_df to retrieve a Pandas dataframe.
      • to_arrow to retrieve the dataframe as a PyArrow Table.
      • to_spark_df to retrieve the dataframe as a Spark dataframe.
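As a sketch of the retrieval flow above: the entity dataframe construction below is concrete, while the feature store, feature view name (`driver_hourly_stats`), and feature name are assumptions for illustration and are shown as comments.

```python
from datetime import datetime
import pandas as pd

# Entity dataframe: one row per (entity key, event_timestamp) pair.
# Feast joins feature values onto these rows point-in-time correctly.
entity_df = pd.DataFrame(
    {
        "driver_id": [1001, 1002],
        "event_timestamp": [
            datetime(2021, 4, 12, 10, 59, 42),
            datetime(2021, 4, 12, 8, 12, 10),
        ],
    }
)

# Hypothetical retrieval (assumes a configured feature repo with a
# `driver_hourly_stats` feature view; names are illustrative only):
#
# from feast import FeatureStore
# store = FeatureStore(repo_path=".")
# job = store.get_historical_features(
#     entity_df=entity_df,
#     features=["driver_hourly_stats:conv_rate"],
# )
# df = job.to_df()          # Pandas dataframe
# tbl = job.to_arrow()      # PyArrow Table
# sdf = job.to_spark_df()   # Spark dataframe
```

Because a Pandas entity dataframe is converted to a Spark temporary view internally, the same entity rows could equivalently be supplied as a SQL query string.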

Example

feature_store.yaml
```yaml
project: my_project
registry: data/registry.db
provider: local
offline_store:
    type: spark
    spark_conf:
        spark.master: "local[*]"
        spark.ui.enabled: "false"
        spark.eventLog.enabled: "false"
        spark.sql.catalogImplementation: "hive"
        spark.sql.parser.quotedRegexColumnNames: "true"
        spark.sql.session.timeZone: "UTC"
online_store:
    path: data/online_store.db
```