GX with Databricks and Azure Blob Storage

Hello @Chijioke ,

I do not have config_variables.yml and great_expectations.yml either.
Everything GX needs at runtime is created by the function get_gx_context() in the class GX_Context. You can put the code from the class directly into your Databricks notebook for testing purposes.
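For illustration, a helper like that can be as small as the sketch below. This is a hypothetical reconstruction, not my actual class: the function name follows the post, but the default mount path is a placeholder you would replace with your own.

```python
# Hypothetical sketch of a get_gx_context()-style helper; the default path
# is a placeholder for your own ADLS mount point.
def get_gx_context(context_root_dir: str = "/dbfs/mnt/dataquality/DataQuality/GX/"):
    """Build a file-backed GX context on the fly -- no great_expectations.yml
    or config_variables.yml needs to exist beforehand."""
    # Imported lazily so this sketch only needs GX when actually called
    # (e.g. inside a Databricks notebook).
    import great_expectations as gx

    # gx.get_context() with context_root_dir returns a file-backed context and
    # creates the standard directories (checkpoints/, expectations/, profilers/,
    # uncommitted/) under that directory if they are missing.
    return gx.get_context(context_root_dir=context_root_dir)
```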

I just had my mount pointing to DataQuality/GX/ on ADLS, where four directories were created (checkpoints/, expectations/, profilers/, and uncommitted/).

After profiling my dataframes (see the code below “To get the expectation-suite JSON files and the checkpoint YML files you can do a profiling…”), I had a .yml file in the checkpoints/ directory for each dataframe I profiled, as well as a .json file in the expectations/ directory.
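To give an idea of what lands in checkpoints/, a generated checkpoint YAML looks roughly like this. All names here are placeholders for illustration, not copied from my setup:

```yaml
name: my_dataframe_checkpoint        # placeholder checkpoint name
config_version: 1.0
class_name: Checkpoint
run_name_template: '%Y%m%d-%H%M%S-my-run'
validations:
  - batch_request:
      datasource_name: my_datasource           # placeholder
      data_asset_name: my_dataframe            # placeholder
    expectation_suite_name: my_dataframe_suite # matches the .json in expectations/
```

The matching .json in expectations/ holds the expectation suite the checkpoint references via expectation_suite_name.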

Both files are loaded at runtime.
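A minimal sketch of that runtime loading, assuming the placeholder suite and checkpoint names from above (they are illustrative, not my real names):

```python
# Hedged sketch: load the profiled suite and execute its checkpoint by name.
# "my_dataframe_suite" / "my_dataframe_checkpoint" are placeholders.
def run_quality_checks(context):
    # The expectation-suite JSON from expectations/ is loaded by name ...
    suite = context.get_expectation_suite("my_dataframe_suite")
    # ... and the checkpoint YAML from checkpoints/ is loaded and run by name.
    result = context.run_checkpoint(checkpoint_name="my_dataframe_checkpoint")
    return suite, result
```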
Maybe this diagram helps a bit:

On your storage account you have to activate the static website like this:

Then you can access the website via the “Primary endpoint” URL (see pic (4)).
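If you prefer the CLI over the portal screenshots, the same activation can be done with the Azure CLI. The account name below is a placeholder for your storage account:

```shell
# Placeholder storage-account name -- replace with your own.
ACCOUNT="mystorageacct"

# Enable the static-website feature and set the landing page
# (GX Data Docs generate an index.html).
if command -v az >/dev/null; then
  az storage blob service-properties update \
    --account-name "$ACCOUNT" \
    --static-website true \
    --index-document index.html
else
  echo "az CLI not available, skipping"
fi
```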

Your “root_dir” seems to be a Volume (is Unity Catalog enabled?).
I do not have Unity Catalog enabled, and my “root_dir” (= context_root_dir) is a mount point to ADLS.
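In case it helps, such a mount is created once per workspace roughly like this. Everything here (container, account, secret scope and key names, tenant ID) is a placeholder; dbutils only exists inside a Databricks notebook or job:

```python
# Hedged sketch of mounting an ADLS Gen2 container in Databricks
# (no Unity Catalog Volumes). All names are placeholders.
def mount_adls(dbutils):
    """Mount an ADLS Gen2 container via a service principal.

    dbutils is passed in because it only exists inside Databricks.
    """
    configs = {
        "fs.azure.account.auth.type": "OAuth",
        "fs.azure.account.oauth.provider.type":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        # Service-principal credentials from a Databricks secret scope (placeholders).
        "fs.azure.account.oauth2.client.id":
            dbutils.secrets.get(scope="kv", key="sp-client-id"),
        "fs.azure.account.oauth2.client.secret":
            dbutils.secrets.get(scope="kv", key="sp-client-secret"),
        # <tenant-id> is a placeholder for your Azure AD tenant.
        "fs.azure.account.oauth2.client.endpoint":
            "https://login.microsoftonline.com/<tenant-id>/oauth2/token",
    }
    dbutils.fs.mount(
        source="abfss://dataquality@mystorageacct.dfs.core.windows.net/",
        mount_point="/mnt/dataquality",
        extra_configs=configs,
    )
```

After mounting, /dbfs/mnt/dataquality/DataQuality/GX/ can serve as the context_root_dir.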

Can you send what’s inside the context variable at the end?
print(context)