
Building streaming features

Feast supports registering streaming feature views with Kafka and Kinesis streaming sources. It also provides a stream processing interface called the Stream Processor. An example Kafka/Spark StreamProcessor is implemented in the contrib folder. For more details, please see the RFC.
Please see here for a tutorial on how to build a versioned streaming pipeline that registers your transformations, features, and data sources in Feast.