Upgrading for Feast 0.20+

Overview

Starting with Feast 0.20, the APIs of many core objects (e.g. feature views and entities) have changed; for example, many parameters have been renamed. These changes were made in a backwards-compatible fashion, so existing Feast repositories continue to work through Feast 0.23 without any changes. However, Feast 0.24 fully deprecates all of the old parameters, so users must update their Feast repositories in order to use Feast 0.24+.
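
For example, the Feature class used to declare a feature view's schema is replaced by Field. The snippet below is a minimal sketch that mirrors the repo-upgrade diff shown later on this page; it is not an exhaustive list of the renamed parameters.

from feast import Feature, Field, ValueType

# Pre-0.20 style: emits deprecation warnings in Feast 0.20-0.23 and is removed in 0.24.
conv_rate_old = Feature(name="conv_rate", dtype=ValueType.FLOAT)

# Replacement style, as produced by `feast repo-upgrade` (both classes coexist only
# during the 0.20-0.23 transition window).
conv_rate_new = Field(name="conv_rate", dtype=ValueType.FLOAT)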

Deprecation warnings already indicate to users exactly how to modify their repos. To make the process easier, Feast 0.23 also introduces a new CLI command, repo-upgrade, which partially automates the process of upgrading Feast repositories.

The upgrade command aims to automatically modify the object definitions in a feature repo to match the API required by Feast 0.24+. When the command is run, the Feast CLI analyzes the source code in the feature repo files using bowler and attempts to rewrite the files on a best-effort basis. Some parts of the API may not be upgraded automatically.
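
For reference, the kind of refactoring query bowler supports looks like the following. This is a hypothetical sketch modeled on bowler's own documentation, not the actual rules used by the Feast CLI: it renames Feature(...) call sites to Field(...) and prints the proposed diff without touching the files.

from bowler import Query

# Hypothetical illustration only; the Feast CLI's real rewrite rules may differ.
(
    Query("example.py")          # path(s) to analyze
    .select_function("Feature")  # match usages of the name `Feature`
    .is_call()                   # restrict to call sites, e.g. Feature(...)
    .rename("Field")             # rewrite them to Field(...)
    .diff()                      # print a unified diff instead of writing files
)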

The repo-upgrade command is specifically meant for upgrading Feast repositories that were initially created in versions 0.23 and below to be compatible with versions 0.24 and above. It is not intended to work for any future upgrades.

Usage

At the root of a feature repo, you can run feast repo-upgrade. By default, the CLI only echoes the changes it plans to make and does not modify any files in place. If the changes look reasonable, you can specify the --write flag to have the changes written out to disk.

An example:

$ feast repo-upgrade
--- /Users/achal/feast/prompt_dory/example.py
+++ /Users/achal/feast/prompt_dory/example.py
@@ -13,7 +13,6 @@
     path="/Users/achal/feast/prompt_dory/data/driver_stats.parquet",
     event_timestamp_column="event_timestamp",
     created_timestamp_column="created",
-    date_partition_column="created"
 )

 # Define an entity for the driver. You can think of entity as a primary key used to
--- /Users/achal/feast/prompt_dory/example.py
+++ /Users/achal/feast/prompt_dory/example.py
@@ -3,7 +3,7 @@
 from google.protobuf.duration_pb2 import Duration
 import pandas as pd

-from feast import Entity, Feature, FeatureView, FileSource, ValueType, FeatureService, OnDemandFeatureView
+from feast import Entity, FeatureView, FileSource, ValueType, FeatureService, OnDemandFeatureView

 # Read data from parquet files. Parquet is convenient for local development mode. For
 # production, you can use your favorite DWH, such as BigQuery. See Feast documentation
--- /Users/achal/feast/prompt_dory/example.py
+++ /Users/achal/feast/prompt_dory/example.py
@@ -4,6 +4,7 @@
 import pandas as pd

 from feast import Entity, Feature, FeatureView, FileSource, ValueType, FeatureService, OnDemandFeatureView
+from feast import Field

 # Read data from parquet files. Parquet is convenient for local development mode. For
 # production, you can use your favorite DWH, such as BigQuery. See Feast documentation
--- /Users/achal/feast/prompt_dory/example.py
+++ /Users/achal/feast/prompt_dory/example.py
@@ -28,9 +29,9 @@
     entities=[driver_id],
     ttl=Duration(seconds=86400 * 365),
     features=[
-        Feature(name="conv_rate", dtype=ValueType.FLOAT),
-        Feature(name="acc_rate", dtype=ValueType.FLOAT),
-        Feature(name="avg_daily_trips", dtype=ValueType.INT64),
+        Field(name="conv_rate", dtype=ValueType.FLOAT),
+        Field(name="acc_rate", dtype=ValueType.FLOAT),
+        Field(name="avg_daily_trips", dtype=ValueType.INT64),
     ],
     online=True,
     batch_source=driver_hourly_stats,

To write these changes out, you can run the same command with the --write flag:

$ feast repo-upgrade --write

You should see the same output, and the changes will now be reflected in your feature repo on disk.
