Empty DataDocs in Databricks

I'm using the following guide for the installation in Databricks: Get started with Great Expectations and Databricks | Great Expectations

It seems to work, since I was able to connect and register my Unity Catalog assets, but when I go to dbfs/gx/uncommitted/datadocs/ it is empty; nothing has been generated, so I'm not able to open the suite either. Can anyone point out if I'm doing something wrong?

I was also wondering if, instead of using DBFS, I could use my DevOps repo to keep those config files, but when I set that context path it doesn't create the folder structure.

Thanks

Make sure you are actually updating your Data Docs after the checkpoint runs, so include this in your checkpoint configuration:

action_list:
  - name: store_validation_result
    action:
      class_name: StoreValidationResultAction
  - name: update_data_docs
    action:
      class_name: UpdateDataDocsAction

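In Python, the same idea looks roughly like the sketch below. This is a minimal sketch, assuming GX 0.16+; the checkpoint name and suite name are placeholders, and context_root_dir and batch_request come from the earlier steps of the Databricks guide:

import great_expectations as gx

# context_root_dir and batch_request are assumed to exist from the guide's earlier steps.
context = gx.get_context(context_root_dir=context_root_dir)

# Hypothetical checkpoint and suite names; replace them with your own.
checkpoint = context.add_or_update_checkpoint(
    name="my_databricks_checkpoint",
    validations=[
        {
            "batch_request": batch_request,
            "expectation_suite_name": "my_suite",
        }
    ],
    action_list=[
        {
            "name": "store_validation_result",
            "action": {"class_name": "StoreValidationResultAction"},
        },
        {
            "name": "update_data_docs",
            "action": {"class_name": "UpdateDataDocsAction"},
        },
    ],
)

checkpoint.run()

# If the docs still look empty after a run, you can also force a rebuild explicitly.
context.build_data_docs()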
The repo should be working; I think the path should start with /Workspace/Repos/… (it can be handy to check where it is with dbutils.fs.ls("file:" + context_dir), as in the quick check below).
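For example, in a notebook cell (the path below is just a placeholder for your own repo):

# Hypothetical repo path; adjust the user and repo names to your setup.
context_dir = "/Workspace/Repos/<user>/<repo>/gx"

# The "file:" prefix tells dbutils to use the local file API rather than DBFS.
display(dbutils.fs.ls("file:" + context_dir))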

One thing to mention already about using a repo:
The context cannot be loaded in a job running from Git (you will see errors around .internal/commit), because great_expectations opens some files in write mode with open(file, 'w'), which is currently not allowed within a Databricks repo. A workaround is to copy the whole context directory to a temporary location in your workspace folder.
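A minimal sketch of that workaround, assuming the source and target paths below are placeholders for your own repo and workspace layout:

import shutil
import great_expectations as gx

# Hypothetical paths; adjust to your own repo and workspace folders.
repo_context_dir = "/Workspace/Repos/<user>/<repo>/gx"           # read-only when run from Git
writable_context_dir = "/Workspace/Users/<user>/tmp_gx_context"  # writable copy

# Copy the whole context directory so GX can open its files in write mode.
shutil.copytree(repo_context_dir, writable_context_dir, dirs_exist_ok=True)

# Load the Data Context from the writable copy.
context = gx.get_context(context_root_dir=writable_context_dir)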

Hi Tim,

Many thanks for your help, it works now.

I've noticed that when setting the context as context = gx.get_context(context_root_dir=context_root_dir), the data context path was being set to a tmp location.

So I did this instead: context = gx.get_context(). It detects my root folder, and now it's creating the Data Docs.
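For reference, a minimal sketch of that final approach:

import great_expectations as gx

# Let GX discover the project root on its own instead of passing an
# explicit context_root_dir (which resolved to a tmp location here).
context = gx.get_context()

# Confirm the context points at the expected gx/ folder.
print(context.root_directory)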