Hybrid

Description

The HybridOfflineStore allows routing offline feature operations to different offline store backends based on the batch_source of the FeatureView. This enables a single Feast deployment to support multiple offline store backends, each configured independently and selected dynamically at runtime.

Getting started

To use the HybridOfflineStore, install Feast with the offline store dependencies for each backend you plan to route to (e.g., Spark, Snowflake). For example:

pip install 'feast[spark,snowflake]'

Example

feature_store.yaml
project: my_feature_repo
registry: data/registry.db
provider: local
offline_store:
  type: hybrid_offline_store.HybridOfflineStore
  offline_stores:
    - type: spark
      conf:
        spark_master: local[*]
        spark_app_name: feast_spark_app
    - type: snowflake
      conf:
        account: my_snowflake_account
        user: feast_user
        password: feast_password
        database: feast_database
        schema: feast_schema

Example FeatureView

You can then call the materialize API; the HybridOfflineStore reads data from whichever configured offline store backend matches the batch_source of each FeatureView.

Functionality Matrix

Feature/Functionality                                 Supported
pull_latest_from_table_or_query                       Yes
pull_all_from_table_or_query                          Yes
offline_write_batch                                   Yes
validate_data_source                                  Yes
get_table_column_names_and_types_from_data_source     Yes
write_logged_features                                 No
get_historical_features                               Only with the same data source
