The BigQuery offline store provides support for reading BigQuerySources.
* BigQuery tables and views are allowed as sources.
* All joins happen within BigQuery.
* Entity dataframes can be provided as a SQL query or as a Pandas dataframe. Pandas dataframes will be uploaded to BigQuery in order to complete join operations.
* A BigQueryRetrievalJob is returned when calling get_historical_features().
Configuration options are available here.
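For example, a training dataframe can be retrieved by passing the entity dataframe as a SQL query. This is a minimal sketch: the project, dataset, and feature names are placeholders, and the exact `get_historical_features()` signature varies across Feast versions.

```python
from feast import FeatureStore

store = FeatureStore(repo_path=".")

# The entity dataframe is a SQL query, so the point-in-time join runs entirely in BigQuery.
retrieval_job = store.get_historical_features(
    entity_df="SELECT driver_id, event_timestamp FROM my_project.my_dataset.entity_rows",
    features=[
        "driver_hourly_stats:conv_rate",
        "driver_hourly_stats:avg_daily_trips",
    ],
)

# retrieval_job is a BigQueryRetrievalJob; to_df() materializes the result as a Pandas dataframe.
training_df = retrieval_job.to_df()
```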
The File offline store provides support for reading FileSources.
* Only Parquet files are currently supported.
* All data is downloaded and joined using Python and may not scale to production workloads.
Configuration options are available here.
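For example, a Parquet file can be registered as a source roughly as follows (a sketch; the path and column names are placeholders, and the exact `FileSource` arguments vary across Feast versions):

```python
from feast import FileSource

# A local Parquet file; the File offline store reads and joins it in Python.
driver_stats_source = FileSource(
    path="data/driver_stats.parquet",
    event_timestamp_column="event_timestamp",
    created_timestamp_column="created",
)
```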
The Redshift offline store provides support for reading RedshiftSources.

* Redshift tables and views are allowed as sources.
* All joins happen within Redshift.
* Entity dataframes can be provided as a SQL query or as a Pandas dataframe. Pandas dataframes will be uploaded to Redshift in order to complete join operations.
* A RedshiftRetrievalJob is returned when calling get_historical_features().

Configuration options are available here.
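As a rough sketch, a Redshift offline store might be configured in `feature_store.yaml` like this. The values, and fields such as `cluster_id` and `s3_staging_location`, are illustrative and vary by Feast version; `iam_role` is covered below.

```yaml
project: my_feature_repo
registry: data/registry.db
provider: aws
offline_store:
  type: redshift
  cluster_id: my_redshift_cluster
  region: us-west-2
  database: feast_database
  user: redshift_user
  s3_staging_location: s3://my-bucket/redshift
  iam_role: arn:aws:iam::123456789012:role/redshift_s3_access_role
```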
Feast requires the following permissions in order to execute commands for the Redshift offline store:

| Command | Permissions | Resources |
| --- | --- | --- |
| Apply | redshift-data:DescribeTable, redshift:GetClusterCredentials | arn:aws:redshift:<region>:<account_id>:dbuser:<redshift_cluster_id>/<redshift_username>, arn:aws:redshift:<region>:<account_id>:dbname:<redshift_cluster_id>/<redshift_database_name>, arn:aws:redshift:<region>:<account_id>:cluster:<redshift_cluster_id> |
| Materialize | redshift-data:ExecuteStatement | arn:aws:redshift:<region>:<account_id>:cluster:<redshift_cluster_id> |
| Materialize | redshift-data:DescribeStatement | * |
| Materialize | s3:ListBucket, s3:GetObject, s3:DeleteObject | arn:aws:s3:::<bucket_name>, arn:aws:s3:::<bucket_name>/* |
| Get Historical Features | redshift-data:ExecuteStatement, redshift:GetClusterCredentials | arn:aws:redshift:<region>:<account_id>:dbuser:<redshift_cluster_id>/<redshift_username>, arn:aws:redshift:<region>:<account_id>:dbname:<redshift_cluster_id>/<redshift_database_name>, arn:aws:redshift:<region>:<account_id>:cluster:<redshift_cluster_id> |
| Get Historical Features | redshift-data:DescribeStatement | * |
| Get Historical Features | s3:ListBucket, s3:GetObject, s3:PutObject, s3:DeleteObject | arn:aws:s3:::<bucket_name>, arn:aws:s3:::<bucket_name>/* |
The following inline policy can be used to grant Feast the necessary permissions:
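(A sketch assembled from the permissions table above; replace the `<region>`, `<account_id>`, `<redshift_cluster_id>`, `<redshift_username>`, `<redshift_database_name>`, and `<bucket_name>` placeholders with your own values.)

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "redshift-data:DescribeTable",
        "redshift-data:ExecuteStatement",
        "redshift:GetClusterCredentials"
      ],
      "Resource": [
        "arn:aws:redshift:<region>:<account_id>:dbuser:<redshift_cluster_id>/<redshift_username>",
        "arn:aws:redshift:<region>:<account_id>:dbname:<redshift_cluster_id>/<redshift_database_name>",
        "arn:aws:redshift:<region>:<account_id>:cluster:<redshift_cluster_id>"
      ]
    },
    {
      "Effect": "Allow",
      "Action": ["redshift-data:DescribeStatement"],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::<bucket_name>",
        "arn:aws:s3:::<bucket_name>/*"
      ]
    }
  ]
}
```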
In addition to this, the Redshift offline store requires an IAM role that will be used by Redshift itself to interact with S3. More concretely, Redshift has to use this IAM role to run UNLOAD and COPY commands. Once created, this IAM role needs to be configured in the `feature_store.yaml` file as `offline_store: iam_role`.

The following inline policy can be used to grant Redshift the necessary permissions to access S3:
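(Again a sketch: this grants the role read/write access to the staging bucket that Feast and Redshift exchange data through; scope it to your own bucket.)

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::<bucket_name>",
        "arn:aws:s3:::<bucket_name>/*"
      ]
    }
  ]
}
```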
The following trust relationship is necessary to make sure that Redshift, and only Redshift, can assume this role:
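(This is the standard form of such a trust policy; it restricts `sts:AssumeRole` to the Redshift service principal.)

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "redshift.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```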