Docker Compose


This guide walks through deploying Feast using Docker Compose.

The Docker Compose setup is recommended if you are running Feast locally to try things out. It includes a built-in Jupyter Notebook Server preloaded with Feast example notebooks to get you started.

0. Requirements

  • Docker and Docker Compose installed

1. Set up environment

Clone the latest stable version of the Feast repository and set up the environment before deploying:

git clone --depth 1 --branch v0.7.0 https://github.com/feast-dev/feast.git
cd feast/infra/docker-compose
cp .env.sample .env

2. Start Feast for Online Serving

Use Docker Compose to deploy Feast for Online Serving only:

docker-compose up

The Docker Compose deployment will take some time to fully start up:

  • During this time you may see some connection failures and container restarts, which should be automatically corrected after a few minutes.

  • If container restarts do not stop after 10 minutes, try redeploying by:

    • Terminating the current deployment with Ctrl-C

    • Deleting any attached volumes with docker-compose down -v

    • Redeploying with docker-compose up

You may see feast_historical_serving exiting with code 1; this is expected and does not affect the functionality of Feast for Online Serving.

Once deployed, you should be able to connect to the bundled Jupyter Notebook Server at localhost:8888 and follow the Online Serving sections of the example notebooks.

3. Start Feast for Training and Online Serving

Historical serving currently requires Google Cloud Platform to function, specifically a Service Account with access to Google Cloud Storage (GCS) and BigQuery.

3.1 Set up Google Cloud Platform

Create a service account for Feast to use. Make sure to copy the JSON key to infra/docker-compose/gcp-service-accounts/key.json under the cloned Feast repository.

gcloud iam service-accounts create feast-service-account
gcloud projects add-iam-policy-binding my-gcp-project \
  --member serviceAccount:feast-service-account@my-gcp-project.iam.gserviceaccount.com \
  --role roles/editor
gcloud iam service-accounts keys create credentials.json \
  --iam-account feast-service-account@my-gcp-project.iam.gserviceaccount.com
cp credentials.json ${FEAST_REPO}/infra/docker-compose/gcp-service-accounts/key.json
# Required to prevent permissions error in Feast Jupyter:
chown 1000:1000 ${FEAST_REPO}/infra/docker-compose/gcp-service-accounts/key.json
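Before starting the containers, it can save time to sanity-check the copied key. The sketch below is an addition of this guide (not part of Feast): it verifies that key.json parses as JSON and carries the "type": "service_account" field that every GCP service account key file contains.

```shell
# Path used in the commands above.
KEY=${FEAST_REPO}/infra/docker-compose/gcp-service-accounts/key.json

# Does the file parse as JSON at all?
python -m json.tool "$KEY" > /dev/null 2>&1 && echo "key.json: valid JSON" || echo "key.json: invalid or missing"

# Every GCP service account key file carries "type": "service_account".
grep -q '"type": "service_account"' "$KEY" 2>/dev/null && echo "key.json: service account key" || echo "key.json: not a service account key"
```

If either check fails, re-download the key before troubleshooting the containers.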

Create a Google Cloud Storage bucket that Feast will use to load data into and out of BigQuery:

gsutil mb gs://my-feast-staging-bucket

Create a BigQuery Dataset for Feast to store historical data:

bq --location=US mk --dataset my_project:feast

3.2 Configure Docker Compose

Configure the .env file under ${FEAST_REPO}/infra/docker-compose/ based on your environment. At the very least, you have to set the variable that enables historical serving (BigQuery) to true.
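For illustration, the relevant line in .env might look like the following. The variable name here is an assumption, not confirmed for your Feast version; use the name already present in your copy of .env.sample.

```shell
# Assumed variable name -- check .env.sample for the exact key in your version.
FEAST_HISTORICAL_SERVING_ENABLED=true
```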

3.3 Configure Services

The following configuration has to be set in serving/historical-serving.yml:

  • Your GCP project ID.

  • The name of the BigQuery dataset to use.

  • The staging location on Google Cloud Storage for retrieval of training datasets. Make sure you append a suffix (e.g. gs://mybucket/suffix).
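As a sketch, these three values come together in serving/historical-serving.yml roughly as follows. The key names below are assumptions about a typical Feast 0.7 BigQuery store config, not verified against your file; keep the keys already present in the file and edit only the values.

```yaml
feast:
  stores:
    - name: historical
      type: BIGQUERY
      # Key names are illustrative -- edit the values next to the
      # corresponding keys that already exist in the file.
      config:
        project_id: my-gcp-project                             # your GCP project ID
        dataset_id: feast                                      # the BigQuery dataset created above
        staging_location: gs://my-feast-staging-bucket/suffix  # GCS staging path with suffix
```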

The following configuration has to be set in jobcontroller/jobcontroller.yml:

  • The staging location where Beam ingestion jobs will persist data before loading it into BigQuery. Use the same bucket as above and make sure you append a different suffix (e.g. gs://mybucket/anothersuffix).
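The two staging locations must share a bucket but use different suffixes. A small sketch (an addition of this guide, using the example URIs from the steps above) to sanity-check a pair of values before pasting them into the two files:

```shell
SERVING_STAGING=gs://my-feast-staging-bucket/suffix
JOB_STAGING=gs://my-feast-staging-bucket/anothersuffix

# The bucket is everything up to the first '/' after gs://
bucket() { echo "$1" | cut -d/ -f1-3; }

[ "$(bucket "$SERVING_STAGING")" = "$(bucket "$JOB_STAGING")" ] && echo "same bucket: ok"
[ "$SERVING_STAGING" != "$JOB_STAGING" ] && echo "different suffixes: ok"
```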

3.4 Start Feast

Use Docker Compose to deploy Feast:

docker-compose up

Once deployed, you should be able to connect to the bundled Jupyter Notebook Server at localhost:8888 and work through the example notebooks.

4. Further Reading