Warning: This is an experimental feature. It is intended for early testing and feedback, and may change without warning in future releases.
To enable this feature, run `feast alpha enable direct_ingest_to_online_store`.
Streaming data sources are important sources of feature values. A typical setup with streaming data looks like:

1. Raw events come in (stream 1)
2. Streaming transformations are applied (e.g. `last_N_purchased_categories`; see the sketch after this list) (stream 2)
3. Write stream 2 values to an offline store as a historical log for training
4. Write stream 2 values to an online store for low-latency feature serving
5. Periodically materialize feature values from the offline store into the online store for improved correctness
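As a framework-agnostic sketch of step 2, the transformation might fold raw purchase events into a rolling `last_N_purchased_categories` value per user. The event fields, `N`, and function name here are hypothetical; in practice this logic would live inside a Beam or Spark Streaming job:

```python
from collections import defaultdict, deque

# Hypothetical rolling window per user; N and field names are illustrative.
N = 5
last_categories = defaultdict(lambda: deque(maxlen=N))

def handle_purchase_event(event: dict) -> dict:
    """Fold a raw purchase event (stream 1) into a feature row (stream 2)."""
    history = last_categories[event["user_id"]]
    history.append(event["category"])
    return {
        "user_id": event["user_id"],
        "event_timestamp": event["event_timestamp"],
        "last_N_purchased_categories": list(history),
    }
```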
Feast now allows users to push features previously registered in a feature view to the online store. This would most commonly be done from a stream processing job (e.g. a Beam or Spark Streaming job). Future versions of Feast will also allow writing features directly to the offline store.
See https://github.com/feast-dev/feast-demo for an example of how to ingest stream data into Feast.
We register a feature view as usual; then, during stream processing (e.g. in a Kafka consumer), we push a dataframe matching the feature view's schema:
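For instance, here is a minimal sketch of such a push using Feast's `write_to_online_store` method. The feature view name `driver_hourly_stats` and its entity, timestamp, and feature columns are illustrative and assume the view was already registered with `feast apply`:

```python
from datetime import datetime

import pandas as pd
from feast import FeatureStore

store = FeatureStore(repo_path=".")

# One row per entity key, matching the registered feature view schema
# (entity column, timestamps, and feature columns are illustrative).
event_df = pd.DataFrame.from_records([
    {
        "driver_id": 1001,
        "event_timestamp": datetime.utcnow(),
        "created": datetime.utcnow(),
        "conv_rate": 0.85,
        "acc_rate": 0.91,
        "avg_daily_trips": 14,
    }
])

# Push the latest feature values directly to the online store.
store.write_to_online_store("driver_hourly_stats", event_df)
```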
Feast will coordinate between pushed stream data and regular materialization jobs to ensure that only the latest feature values are written to the online store, keeping the features served for model inference correct.
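Under the same illustrative setup, the periodic materialization step can be triggered from Python with Feast's `materialize_incremental` method (a sketch; schedules and repo paths will vary by deployment):

```python
from datetime import datetime

from feast import FeatureStore

store = FeatureStore(repo_path=".")

# Materialize feature values from the offline store into the online store
# up to the current time; Feast keeps only the latest value per entity key,
# so newer pushed stream values are not clobbered by older offline rows.
store.materialize_incremental(end_date=datetime.utcnow())
```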