# [Alpha] Data quality monitoring
Data Quality Monitoring (DQM) is a Feast module that helps users validate their data against a user-curated set of rules. Validation can be applied during:
- Historical retrieval (training dataset generation)
- [planned] Writing features into an online store
- [planned] Reading features from an online store
Its goal is to address several complex data problems, namely:
- Data consistency - new training datasets can be significantly different from previous datasets. This might require a change in model architecture.
- Issues/bugs in the upstream pipeline - bugs in upstream pipelines can cause invalid values to overwrite existing valid values in an online store.
- Training/serving skew - distribution shift could significantly decrease the performance of the model.
To monitor data quality, we check that the characteristics of the tested dataset (aka the tested dataset's profile) are "equivalent" to the characteristics of the reference dataset. How exactly profile equivalency should be measured is up to the user.
The validation process consists of the following steps:
1. The user prepares a reference dataset (currently, only saved datasets from historical retrieval are supported).
2. The user defines a profiler function, which produces a profile from a given dataset (currently, only profilers based on Great Expectations are supported).
3. Validation of the tested dataset is performed with the reference dataset and profiler provided as parameters.
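The flow above can be sketched in plain Python. This is an illustrative toy, not Feast's API: the "profile" here is simply the per-column value range observed in the reference data, and validation checks that every tested value falls inside it.

```python
def profiler(dataset):
    """Build a toy profile: the (min, max) range of each column in the reference data."""
    return {column: (min(values), max(values)) for column, values in dataset.items()}

def validate(tested, profile):
    """Return a list of (column, value) pairs that fall outside the reference ranges."""
    failures = []
    for column, (lo, hi) in profile.items():
        for value in tested.get(column, []):
            if not lo <= value <= hi:
                failures.append((column, value))
    return failures

# Step 1: reference dataset; step 2: profile it; step 3: validate a tested dataset.
reference = {"conv_rate": [0.1, 0.5, 0.9]}
profile = profiler(reference)
assert validate({"conv_rate": [0.2, 0.8]}, profile) == []
assert validate({"conv_rate": [1.5]}, profile) == [("conv_rate", 1.5)]
```

Real profilers (such as those built on Great Expectations below) produce far richer profiles, but the contract is the same: reference data in, a reusable set of checks out.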
Feast with Great Expectations support can be installed via:

```shell
pip install 'feast[ge]'
```
Great Expectations supports automatic profiling as well as manually specifying expectations:
```python
from great_expectations.dataset import Dataset
from great_expectations.core.expectation_suite import ExpectationSuite

from feast.dqm.profilers.ge_profiler import ge_profiler

@ge_profiler
def automatic_profiler(dataset: Dataset) -> ExpectationSuite:
    from great_expectations.profile.user_configurable_profiler import UserConfigurableProfiler

    # Let Great Expectations infer an expectation suite from the dataset
    return UserConfigurableProfiler(
        profile_dataset=dataset,
        value_set_threshold="few",
    ).build_suite()
```
However, in our experience the capabilities of the automatic profiler are quite limited, so we recommend crafting your own expectations:
```python
@ge_profiler
def manual_profiler(dataset: Dataset) -> ExpectationSuite:
    dataset.expect_column_max_to_be_between("column", 1, 2)
    return dataset.get_expectation_suite()
```
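For intuition, `expect_column_max_to_be_between("column", 1, 2)` passes only when the column's observed maximum lies within `[1, 2]`. A plain-Python equivalent of that check (illustrative only, not Great Expectations' implementation):

```python
def column_max_to_be_between(values, min_value, max_value):
    """Pass when the observed column maximum falls within [min_value, max_value]."""
    observed_max = max(values)
    return min_value <= observed_max <= max_value

assert column_max_to_be_between([0.5, 1.7], 1, 2)       # max 1.7 is within [1, 2]
assert not column_max_to_be_between([0.5, 3.0], 1, 2)   # max 3.0 is above the range
```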
During retrieval of historical features, `validation_reference` can be passed as a parameter to methods such as `.to_arrow(validation_reference=...)` of `RetrievalJob`. If the parameter is provided, Feast will run validation once the dataset is materialized. If validation succeeds, the materialized dataset is returned. Otherwise, a `feast.dqm.errors.ValidationFailed` exception is raised, containing the details of all expectations that did not pass.
```python
from feast import FeatureStore

fs = FeatureStore(".")
job = fs.get_historical_features(...)
job.to_arrow(
    # dataset name is illustrative; use a dataset previously saved in your store
    validation_reference=fs.get_saved_dataset("my_reference_dataset")
    .as_reference(profiler=manual_profiler)
)
```
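The success/failure contract described above (return the materialized dataset when validation passes, raise with failure details otherwise) can be sketched in plain Python. The class and function names here are illustrative stand-ins, not Feast's actual implementation:

```python
class ValidationFailed(Exception):
    """Raised when one or more expectations fail; carries the failed names."""
    def __init__(self, failed):
        super().__init__(f"{len(failed)} expectation(s) failed: {failed}")
        self.failed_expectations = failed

def materialize_and_validate(rows, expectations):
    """Run every expectation against the materialized rows; return rows on success."""
    failed = [name for name, check in expectations.items() if not check(rows)]
    if failed:
        raise ValidationFailed(failed)
    return rows

expectations = {"max_in_range": lambda rows: max(rows) <= 2}
assert materialize_and_validate([1, 2], expectations) == [1, 2]
```

Calling code can catch the exception and inspect which expectations failed, which mirrors how `feast.dqm.errors.ValidationFailed` reports the non-passing expectations.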