Unit tests are contained in `sdk/python/tests/unit`. Integration tests are contained in `sdk/python/tests/integration`. Let's inspect the structure of `sdk/python/tests/integration`:
* `feature_repos` has setup files for most tests in the test suite.
* `conftest.py` (in the parent directory) contains the most common fixtures, which are designed as an abstraction on top of specific offline/online stores, so tests do not need to be rewritten for different stores. Individual test files also contain more specific fixtures.

The universal feature repo refers to a set of fixtures (e.g. `environment` and `universal_data_sources`) that can be parametrized to cover various combinations of offline stores, online stores, and providers. This allows tests to run against all these combinations without requiring excess code. The universal feature repo is constructed by fixtures in `conftest.py` with help from the various files in `feature_repos`.

The integration test suite includes the following test files:

* `test_universal_e2e.py`
* `test_go_feature_server.py`
* `test_python_feature_server.py`
* `test_usage_e2e.py`
* `test_validation.py`
* `test_push_features_to_offline_store.py`
* `test_push_features_to_online_store.py`
* `test_offline_write.py`
* `test_universal_historical_retrieval.py`
* `test_universal_online.py`
* `test_feature_logging.py`
* `test_lambda.py`
Most tests use the `environment` and `universal_data_sources` fixtures, which are defined in the `feature_repos` directory and the `conftest.py` file. By default, this pulls in a standard pre-defined dataset with driver and customer entities, certain feature views, and feature values.

* The `environment` fixture sets up a feature store, parametrized by the provider and the online/offline store. It allows a test to query against that feature store without needing to worry about the underlying implementation or any setup that may be involved in creating instances of these datastores. Each parametrization is described by an `IntegrationTestRepoConfig`, which pytest uses to generate a unique test for each environment that requires testing.
* The `@pytest.mark.integration` marker designates integration tests, which run when you call `make test-python-integration`.
* The `@pytest.mark.universal_offline_stores` marker parametrizes the test on all of the universal offline stores, including file, Redshift, BigQuery, and Snowflake.
* The `full_feature_names` parametrization defines whether the test should reference features by their full feature name (a fully qualified path) or just the feature name itself.
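As a concrete illustration of the `full_feature_names` parametrization: when enabled, Feast qualifies an output feature name with its feature view name joined by a double underscore. A minimal sketch (the feature view and feature names below are hypothetical):

```python
def feature_column_name(feature_view: str, feature: str, full_feature_names: bool) -> str:
    """Sketch of how an output column is named under the two parametrizations."""
    # full_feature_names=True  -> fully qualified "<feature_view>__<feature>"
    # full_feature_names=False -> just the feature name
    return f"{feature_view}__{feature}" if full_feature_names else feature

print(feature_column_name("driver_hourly_stats", "conv_rate", True))   # driver_hourly_stats__conv_rate
print(feature_column_name("driver_hourly_stats", "conv_rate", False))  # conv_rate
```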
To add a new test to an existing test file:

* Use the same function signature as an existing test (e.g. take `environment` and `universal_data_sources` as arguments) to include the relevant test fixtures.
* Use the `universal_offline_stores` and `universal_online_store` markers to parametrize the test against different offline store and online store combinations. You can also designate specific online and offline stores to test by using the `only` parameter on the marker.

To test a new offline / online store from a plugin repo:

* Install Feast in editable mode with `pip install -e .`.
* The core offline / online store tests are parametrized by the `FULL_REPO_CONFIGS` variable defined in `feature_repos/repo_configuration.py`. To overwrite this variable without modifying the Feast repo, create your own file that contains a `FULL_REPO_CONFIGS` variable (which will require adding a new `IntegrationTestRepoConfig` or two) and set the environment variable `FULL_REPO_CONFIGS_MODULE` to point to that file. Then the core offline / online store tests can be run with `make test-python-universal`.
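A custom `FULL_REPO_CONFIGS` module might look like the sketch below. The module name, constructor arguments, and connection string are illustrative, and a stand-in dataclass is used so the sketch is self-contained; the real `IntegrationTestRepoConfig` should be imported from the `feature_repos` package instead.

```python
# my_custom_repo_configs.py -- hypothetical module name.  Point the test
# suite at it with:  export FULL_REPO_CONFIGS_MODULE=my_custom_repo_configs
from dataclasses import dataclass
from typing import Optional

# Stand-in for Feast's real IntegrationTestRepoConfig so this sketch is
# self-contained; the actual class and its fields live in feature_repos.
@dataclass
class IntegrationTestRepoConfig:
    provider: str = "local"
    online_store: Optional[dict] = None

FULL_REPO_CONFIGS = [
    # Default local environment.
    IntegrationTestRepoConfig(),
    # Local provider backed by Redis (connection string is illustrative).
    IntegrationTestRepoConfig(
        provider="local",
        online_store={"type": "redis", "connection_string": "localhost:6379"},
    ),
]
```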
To include a new offline / online store in the main Feast repo:

* Extend `inference.py` so that Feast can infer your data source schemas.
* Extend `type_map.py` so that Feast knows how to convert your datastore's types to Feast-recognized types in `feast/types.py`.
* Extend `data_source_creator.py` for your offline store.
* In `repo_configuration.py`, add a new `IntegrationTestRepoConfig` or two (depending on how many online stores you want to test).
* Run the full test suite with `make test-python-integration`.
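The `type_map.py` extension essentially boils down to a lookup from your datastore's type names to Feast value types. A hedged sketch (the source type names and mapping are hypothetical, and plain strings stand in for the actual type classes in `feast/types.py` so the sketch stays self-contained):

```python
# Hypothetical source-type -> Feast type mapping for a new offline store.
_MY_STORE_TO_FEAST_TYPE = {
    "BIGINT": "Int64",
    "DOUBLE": "Float64",
    "VARCHAR": "String",
    "BOOLEAN": "Bool",
    "TIMESTAMP": "UnixTimestamp",
}

def my_store_type_to_feast_type(source_type: str) -> str:
    """Convert a datastore type name to a Feast-recognized type name."""
    feast_type = _MY_STORE_TO_FEAST_TYPE.get(source_type.upper())
    if feast_type is None:
        raise ValueError(f"Unsupported source type: {source_type}")
    return feast_type
```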
To include a new offline store in contrib:

* Contrib offline stores live under `feast/infra/offline_stores/contrib/`.
* Extend `data_source_creator.py` for your offline store and implement the required APIs.
* In `contrib_repo_configuration.py`, add a new `IntegrationTestRepoConfig` (depending on how many online stores you want to test).
* Run the contrib test suite with `make test-python-contrib-universal`.
To include a new online store:

* In `repo_configuration.py`, add a new config that maps to a serialized version of the configuration you need in `feature_store.yaml` to set up the online store.
* In `repo_configuration.py`, add new `IntegrationTestRepoConfig`s for the online stores you want to test.
* Run the full test suite with `make test-python-integration`.
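The new online store config is simply the dict form of what would appear under `online_store:` in `feature_store.yaml`. For example, a Redis config might look like the following sketch (the exact connection string used in `repo_configuration.py` may differ):

```python
# Dict that serializes to the online_store section of feature_store.yaml.
REDIS_CONFIG = {
    "type": "redis",
    "connection_string": "localhost:6379",
}

# Equivalent feature_store.yaml fragment:
# online_store:
#   type: redis
#   connection_string: "localhost:6379"
```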
To use custom data in a new test, see `test_universal_types.py` for an example of how to do this.

To run your own Redis cluster for testing:

* Install Redis (e.g. with `brew install redis` on macOS). After installation, `redis-server --help` and `redis-cli --help` should show the corresponding help menus.
* Run `./infra/scripts/redis-cluster.sh start`, then `./infra/scripts/redis-cluster.sh create` to start the Redis cluster locally.
* To tear down the cluster, run `./infra/scripts/redis-cluster.sh stop` and then `./infra/scripts/redis-cluster.sh clean`.