Providers

Please see the Provider documentation for an explanation of providers.

This page covers the following providers: Local, Google Cloud Platform, Amazon Web Services, and Azure.

Azure

Description

  • Offline Store: Uses the MsSql offline store by default. Also supports File as the offline store.

  • Online Store: Uses the Redis online store by default. Also supports Sqlite as an online store.

Disclaimer

The Azure provider does not achieve full test coverage. Please do not assume complete stability.

Getting started

In order to use this provider, you'll need to run pip install 'feast[azure]'. You can get started by then following this tutorial.

Example

feature_store.yaml
registry:
  registry_store_type: AzureRegistryStore
  path: ${REGISTRY_PATH} # Environment Variable
project: production
provider: azure
online_store:
  type: redis
  connection_string: ${REDIS_CONN} # Environment Variable
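
The example above configures only the registry and the Redis online store. Because the Azure provider defaults to the MsSql offline store, a complete configuration will usually add an offline_store block as well. The snippet below is a sketch under the assumption that the feast-azure plugin accepts a connection_string option for its MsSql store; check the plugin's documentation for the exact option names.

offline_store:
  type: mssql
  connection_string: ${SQL_CONN} # Environment Variable; exact option name per the feast-azure plugin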

Local

Description

  • Offline Store: Uses the File offline store by default. Also supports BigQuery as the offline store.

  • Online Store: Uses the Sqlite online store by default. Also supports Redis and Datastore as online stores.

Example

feature_store.yaml
project: my_feature_repo
registry: data/registry.db
provider: local
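
As noted in the description above, the local provider can also be paired with the Redis online store (or a BigQuery offline store). A minimal sketch of overriding the default Sqlite online store follows; the connection string is a placeholder for a locally running Redis instance.

feature_store.yaml
project: my_feature_repo
registry: data/registry.db
provider: local
online_store:
  type: redis
  connection_string: "localhost:6379" # placeholder for a local Redis instance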

Google Cloud Platform

Description

  • Offline Store: Uses the BigQuery offline store by default. Also supports File as the offline store.

  • Online Store: Uses the Datastore online store by default. Also supports Sqlite as an online store.

Getting started

In order to use this provider, you'll need to run pip install 'feast[gcp]'. You can get started by then running feast init -t gcp.

Example

feature_store.yaml
project: my_feature_repo
registry: gs://my-bucket/data/registry.db
provider: gcp
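
The minimal example above relies on the provider defaults (BigQuery offline store, Datastore online store). The sketch below spells those stores out explicitly; the dataset and namespace values are placeholders, and the full set of options belongs to the BigQuery and Datastore store references.

feature_store.yaml
project: my_feature_repo
registry: gs://my-bucket/data/registry.db
provider: gcp
online_store:
  type: datastore
  namespace: my_namespace # placeholder
offline_store:
  type: bigquery
  dataset: my_feast_dataset # placeholder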

Permissions

| Command | Component | Permissions | Recommended Role |
| --- | --- | --- | --- |
| Apply | BigQuery (source) | bigquery.jobs.create, bigquery.readsessions.create, bigquery.readsessions.getData | roles/bigquery.user |
| Apply | Datastore (destination) | datastore.entities.allocateIds, datastore.entities.create, datastore.entities.delete, datastore.entities.get, datastore.entities.list, datastore.entities.update | roles/datastore.owner |
| Materialize | BigQuery (source) | bigquery.jobs.create | roles/bigquery.user |
| Materialize | Datastore (destination) | datastore.entities.allocateIds, datastore.entities.create, datastore.entities.delete, datastore.entities.get, datastore.entities.list, datastore.entities.update, datastore.databases.get | roles/datastore.owner |
| Get Online Features | Datastore | datastore.entities.get | roles/datastore.user |
| Get Historical Features | BigQuery (source) | bigquery.datasets.get, bigquery.tables.get, bigquery.tables.create, bigquery.tables.updateData, bigquery.tables.update, bigquery.tables.delete, bigquery.tables.getData | roles/bigquery.dataEditor |

Amazon Web Services

Description

  • Offline Store: Uses the Redshift offline store by default. Also supports File as the offline store.

  • Online Store: Uses the DynamoDB online store by default. Also supports Sqlite as an online store.

Getting started

In order to use this provider, you'll need to run pip install 'feast[aws]' (Redshift) or pip install 'feast[aws, snowflake]' (Snowflake).

You can get started by then running feast init -t aws or feast init -t snowflake.

Example

feature_store.yaml
project: my_feature_repo
registry: data/registry.db
provider: aws
online_store:
  type: dynamodb
  region: us-west-2
offline_store:
  type: redshift
  region: us-west-2
  cluster_id: feast-cluster
  database: feast-database
  user: redshift-user
  s3_staging_location: s3://feast-bucket/redshift
  iam_role: arn:aws:iam::123456789012:role/redshift_s3_access_role
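
Since the Getting started note above also covers a Snowflake installation, the sketch below shows the Redshift offline_store block swapped for a Snowflake one. All account, credential, and warehouse values are placeholders, and the option names should be verified against the Snowflake offline store reference.

feature_store.yaml
project: my_feature_repo
registry: data/registry.db
provider: aws
online_store:
  type: dynamodb
  region: us-west-2
offline_store:
  type: snowflake.offline
  account: my_snowflake_account # placeholder
  user: my_user # placeholder
  password: ${SNOWFLAKE_PASSWORD} # Environment Variable
  role: my_role # placeholder
  warehouse: my_warehouse # placeholder
  database: my_database # placeholder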