The Python feature server is an HTTP endpoint that serves features with JSON I/O. This enables users to write and read features from the online store using any programming language that can make HTTP requests.
There is a CLI command that starts the server: `feast serve`. By default, Feast uses port 6566; the port can be overridden with the `--port` flag.
One can deploy a feature server by building a Docker image that bundles in the project's `feature_store.yaml`. See this helm chart for an example of how to run Feast on Kubernetes.
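A minimal Dockerfile sketch for such an image (the base image, repo layout, and unpinned Feast version are assumptions to adapt to your project):

```dockerfile
FROM python:3.10-slim

# Install Feast; pin the version you actually deploy with
RUN pip install feast

# Bundle the feature repo, including feature_store.yaml
COPY feature_repo/ /feature_repo
WORKDIR /feature_repo

# Start the Python feature server; add host/port flags as needed for your container setup
CMD ["feast", "serve"]
```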
Here's an example of how to start the Python feature server with a local feature repo:
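A sketch, assuming a demo repo created with `feast init` (the project name and the materialization step are illustrative):

```bash
# Create a demo feature repository and register its definitions
feast init feature_repo
cd feature_repo/feature_repo
feast apply

# Load features into the online store up to the current time
feast materialize-incremental $(date -u +"%Y-%m-%dT%H:%M:%S")

# Start the feature server on the default port 6566
feast serve
```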
After the server starts, we can execute cURL commands from another terminal tab:
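For example, a request for online features (the feature and entity names below come from the demo driver-stats repo and are assumptions):

```bash
curl -X POST \
  "http://localhost:6566/get-online-features" \
  -d '{
    "features": [
      "driver_hourly_stats:conv_rate",
      "driver_hourly_stats:acc_rate",
      "driver_hourly_stats:avg_daily_trips"
    ],
    "entities": {
      "driver_id": [1001, 1002, 1003]
    }
  }'
```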
It's also possible to specify a feature service name instead of the list of features:
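For example (the feature service name `driver_activity_v1` is illustrative and assumed to be defined in the repo):

```bash
curl -X POST \
  "http://localhost:6566/get-online-features" \
  -d '{
    "feature_service": "driver_activity_v1",
    "entities": {
      "driver_id": [1001, 1002, 1003]
    }
  }'
```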
The Python feature server also exposes an endpoint for push sources. This endpoint allows you to push data to the online and/or offline store.
The request definition for `PushMode` is a string parameter `to`, where the options are: `["online", "offline", "online_and_offline"]`.
Note: timestamps need to be strings, and might need to be timezone-aware (matching the schema of the offline store).
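A sketch of a push request, assuming a push source named `driver_stats_push_source` with the demo driver-stats schema:

```bash
curl -X POST "http://localhost:6566/push" \
  -d '{
    "push_source_name": "driver_stats_push_source",
    "df": {
      "driver_id": [1001],
      "event_timestamp": ["2022-05-13 10:59:42+00:00"],
      "created": ["2022-05-13 10:59:42+00:00"],
      "conv_rate": [1.0],
      "acc_rate": [1.0],
      "avg_daily_trips": [1000]
    },
    "to": "online_and_offline"
  }'
```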
or equivalently from Python:
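A sketch using the `requests` library, with the same assumed push source and schema:

```python
import requests

push_data = {
    "push_source_name": "driver_stats_push_source",
    "df": {
        "driver_id": [1001],
        "event_timestamp": ["2022-05-13 10:59:42+00:00"],
        "created": ["2022-05-13 10:59:42+00:00"],
        "conv_rate": [1.0],
        "acc_rate": [1.0],
        "avg_daily_trips": [1000],
    },
    "to": "online_and_offline",
}

# POST the payload to the feature server's push endpoint
requests.post("http://localhost:6566/push", json=push_data)
```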
Enabling TLS mode ensures that data between the Feast client and server is transmitted securely. In a production environment, it is recommended to start the feature server in TLS mode.
In development, we can generate a self-signed certificate for testing. In an actual production environment it is always recommended to get the certificate from a trusted TLS certificate provider.
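For example, a self-signed certificate can be generated with OpenSSL (the subject and validity period are placeholders):

```bash
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
  -keyout key.pem -out cert.pem -subj "/CN=localhost"
```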
The above command will generate two files:

- `key.pem` : certificate private key
- `cert.pem` : certificate public key
To start the feature server in TLS mode, you need to provide the private and public keys using the `--key` and `--cert` arguments with the `feast serve` command.
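For example (the paths are placeholders):

```bash
feast serve --key /path/to/key.pem --cert /path/to/cert.pem
```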
Please refer to the page for more details on how to configure authentication and authorization.
The Offline feature server is an Apache Arrow Flight Server that uses the gRPC communication protocol to exchange data. This server wraps calls to existing offline store implementations and exposes interfaces as Arrow Flight endpoints.
There is a CLI command that starts the Offline feature server: `feast serve_offline`. By default, the remote offline server uses port 8815; the port can be overridden with the `--port` flag.
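For example, to start the offline server on a non-default port:

```bash
feast serve_offline --port 8816
```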
The Offline feature server can be deployed using a Helm chart; see this helm chart.
You need to set `feast_mode=offline` when installing the Offline feature server, as shown in the Helm command below:
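A sketch of the Helm install, assuming the `feast-charts/feast-feature-server` chart and its `feature_store_yaml_base64` value (check the chart's documentation for the exact release, chart, and value names):

```bash
helm install feast-offline-server feast-charts/feast-feature-server \
  --set feast_mode=offline \
  --set feature_store_yaml_base64=$(base64 < feature_store.yaml)
```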
The complete example can be found under remote-offline-store-example.
Please see remote-offline-store.md for details on how to configure the offline store client.
The set of functionalities supported by remote offline stores is the same as those supported by offline stores with the SDK, which are described in detail here.
Please refer to the page for more details on how to configure authentication and authorization.
Feast users can choose to retrieve features from a feature server, as opposed to through the Python SDK.
The Go feature server is an HTTP/gRPC endpoint that serves features. It is written in Go, and is therefore significantly faster than the Python feature server. See this for more details on the comparison between Python and Go. In general, we recommend the Go feature server for all production use cases that require extremely low-latency feature serving. Currently only the Redis and SQLite online stores are supported.
By default, the Go feature server is turned off. To turn it on, you can add `go_feature_serving: True` to your `feature_store.yaml`:
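A sketch of such a `feature_store.yaml`, assuming a local provider with a Redis online store:

```yaml
project: my_feature_repo
registry: data/registry.db
provider: local
online_store:
  type: redis
  connection_string: "localhost:6379"
go_feature_serving: True
```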
Then the `feast serve` CLI command will start the Go feature server. As with Python, the Go feature server uses port 6566 by default; the port can be overridden with the `--port` flag. Moreover, the server uses HTTP by default, but can be set to use gRPC with `--type=grpc`.
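For example, to serve over gRPC on a non-default port:

```bash
feast serve --type=grpc --port 6567
```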
Alternatively, if you wish to experiment with the Go feature server instead of permanently turning it on, you can just run `feast serve --go`.
The Go component comes pre-compiled when you install Feast with Python versions 3.8-3.10 on macOS or Linux (on x86). In order to install the additional Python dependencies, you should install Feast with the Go extra:
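For example (assuming the `go` extra is available for your Feast version):

```bash
pip install 'feast[go]'
```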
For macOS, run `brew install apache-arrow`. For Linux users, you have to install `libarrow-dev`.
For developers, if you want to build from source, run `make compile-go-lib` to build and compile the Go server. In order to build the Go binaries, you will need to install the Apache Arrow C++ libraries.
The Go feature server can log all requested entities and served features to a configured destination inside an offline store. This allows users to create new datasets from features served online; those datasets can be used for future training or for feature validation. To enable feature logging, we need to edit `feature_store.yaml`:
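A sketch extending the configuration above; the exact keys may vary by Feast version, but logging is enabled under the feature server configuration:

```yaml
project: my_feature_repo
registry: data/registry.db
provider: local
online_store:
  type: redis
  connection_string: "localhost:6379"
go_feature_serving: True
feature_server:
  feature_logging:
    enabled: True
```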
Feature logging configuration in `feature_store.yaml` also allows tweaking some low-level parameters to achieve the best performance:
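For illustration, a sketch of such a tuning block; the parameter names below (queue capacity, emit timeout, flush intervals) are assumptions and may differ by Feast version:

```yaml
feature_server:
  feature_logging:
    enabled: True
    queue_capacity: 10000            # size of the in-memory log queue (assumed name)
    emit_timeout_micro_secs: 10000   # how long to wait when enqueueing a log record (assumed name)
    write_to_disk_interval_secs: 30  # how often buffered logs are written to local disk (assumed name)
    flush_interval_secs: 600         # how often logs are flushed to the offline store (assumed name)
```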
All these parameters are optional.
The logic for the Go feature server can also be used to retrieve features during a Python `get_online_features` call. To enable this behavior, you must add `go_feature_retrieval: True` to your `feature_store.yaml`, as in the sketch below. You must also have all the dependencies installed as detailed above.
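For example, adding the flag to the `feature_store.yaml` sketch above:

```yaml
# Use the Go logic for Python get_online_features calls
go_feature_retrieval: True
```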
You must also install the Apache Arrow C++ libraries. This is because the Go feature server uses the cgo memory allocator from the Apache Arrow C++ library for interoperability between Go and Python, which prevents memory from being accidentally garbage collected when executing on-demand feature views. You can read more about the usage of the cgo memory allocator in these docs.

When authorization is enabled, the online feature server's HTTP endpoints require the following permissions:
Endpoint | Resource Type | Permission | Description
---|---|---|---
`/get-online-features` | FeatureView, OnDemandFeatureView | Read Online | Get online features from the feature store
`/push` | FeatureView | Write Online, Write Offline, Write Online and Offline | Push features to the feature store (online, offline, or both)
`/write-to-online-store` | FeatureView | Write Online | Write features to the online store
`/materialize` | FeatureView | Write Online | Materialize features within a specified time range
`/materialize-incremental` | FeatureView | Write Online | Incrementally materialize features up to a specified timestamp
The offline feature server's methods require the following permissions:

Endpoint | Resource Type | Permission | Description
---|---|---|---
`offline_write_batch` | FeatureView | Write Offline | Write a batch of data to the offline store
`write_logged_features` | FeatureService | Write Offline | Write logged features to the offline store
`persist` | DataSource | Write Offline | Persist the result of a read in the offline store
`get_historical_features` | FeatureView | Read Offline | Retrieve historical features
`pull_all_from_table_or_query` | DataSource | Read Offline | Pull all data from a table or read it
`pull_latest_from_table_or_query` | DataSource | Read Offline | Pull the latest data from a table or read it