Welcome! The best way to work with Great Expectations is in this iterative dev loop.
What are Expectations?
An expectation is a falsifiable, verifiable statement about data.
Expectations provide a shared language for describing data characteristics and data quality: human to human, human to machine, and machine to machine.
Expectations are both data tests and docs!
An expectation can be expressed as machine-friendly JSON, so a machine can test whether a dataset conforms to it.
Validation produces a validation result object.
Here's an example validation result in JSON (not from your data). This object carries rich context about the test failure.
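To make this concrete, here is a minimal sketch in plain Python of how an expectation (as machine-readable JSON) can be checked against data to produce a validation result. This is an illustration of the idea only, not the real Great Expectations API; the function name, column name, and result keys are assumptions modeled on the shape of Great Expectations validation results.

```python
import json

# Illustrative sketch -- NOT the real Great Expectations API.
# An expectation is a machine-readable statement; validating it
# yields a result object with rich context about any failure.

def validate_expectation(expectation, values):
    """Check an expect_column_values_to_be_between-style expectation."""
    kwargs = expectation["kwargs"]
    lo, hi = kwargs["min_value"], kwargs["max_value"]
    unexpected = [v for v in values if not (lo <= v <= hi)]
    return {
        "success": len(unexpected) == 0,
        "expectation_config": expectation,
        "result": {
            "element_count": len(values),
            "unexpected_count": len(unexpected),
            "unexpected_percent": 100.0 * len(unexpected) / len(values),
            # A sample of the offending values, to speed up debugging.
            "partial_unexpected_list": unexpected[:20],
        },
    }

# The expectation itself, as machine-friendly JSON-style data.
expectation = {
    "expectation_type": "expect_column_values_to_be_between",
    "kwargs": {"column": "passenger_count", "min_value": 1, "max_value": 6},
}

result = validate_expectation(expectation, [1, 2, 9, 4])
print(json.dumps(result, indent=2))
```

Because the failed result carries the unexpected values themselves, you can see at a glance which rows broke the rule instead of re-querying the data.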
Validation results save you time.
This is an example of what a single failed Expectation looks like in Data Docs. Note that the failure includes unexpected values from your data. This helps you debug pipelines faster.
Great Expectations provides a large library of expectations.
Built-in expectations let you express how you understand your data, and you can add custom expectations if you need a new one.
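The library-plus-custom idea can be sketched as a registry of named checks that you extend. This is a conceptual sketch only; the registry, function names, and the custom date check are assumptions for illustration, not the real Great Expectations plugin mechanism.

```python
# Illustrative sketch -- NOT the real Great Expectations plugin API.
# Built-in expectations live in a library keyed by expectation type;
# custom expectations are just new entries in that library.

EXPECTATION_LIBRARY = {
    "expect_column_values_to_not_be_null": lambda values, **kw: all(
        v is not None for v in values
    ),
    "expect_column_values_to_be_between": lambda values, min_value, max_value, **kw: all(
        min_value <= v <= max_value for v in values
    ),
}

def register_expectation(name, fn):
    """Add a custom expectation to the library."""
    EXPECTATION_LIBRARY[name] = fn

# A hypothetical custom expectation: values look like YYYY-MM-DD dates.
register_expectation(
    "expect_column_values_to_look_like_dates",
    lambda values, **kw: all(
        len(v) == 10 and v[4] == "-" and v[7] == "-" for v in values
    ),
)

ok = EXPECTATION_LIBRARY["expect_column_values_to_look_like_dates"](
    ["2024-01-01", "2024-02-15"]
)
print(ok)  # True
```

Once registered, a custom expectation is invoked the same way as a built-in one, so suites can mix both freely.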
Now explore and edit the sample suite!
This sample suite shows you a few examples of expectations.
Note that this is not a production suite; it was autogenerated from only a small sample of your data.
When you are ready, press the How to Edit button to kick off the iterative dev loop.