Can't see Great Expectations YAML file

I’ve deployed Great Expectations in Databricks, but I’m unable to locate the Great Expectations YAML file. All the configurations I’m making are within the notebook. How can I make edits to renderings and other low-level customizations if I don’t have access to the YAML file?

The files do exist in DBFS, and you can access them either from a notebook using shell commands (e.g. !ls) or from your local machine using the Databricks CLI.

The GX project files are stored in /dbfs/gx/.
The files are persistent, but I don't remember exactly how that works; each cluster might have its own storage, so please check that.
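For example, a quick way to list those files from a notebook cell (a sketch that assumes the /dbfs/gx/ project root mentioned above; dbutils is available by default in Databricks notebooks):

# List the GX project files from a notebook cell.
for entry in dbutils.fs.ls("dbfs:/gx/"):
    print(entry.path)

# Roughly equivalent from a local machine with the Databricks CLI:
#   databricks fs ls dbfs:/gx/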

What specifically were you not able to set up using Python?

Link to the CLI: GitHub - databricks/cli: Databricks CLI


Thanks for the quick reply. I’ve set up a somewhat restricted environment. Unfortunately, I don’t have access to DBFS and CLI due to our organization’s security policy. As a workaround, I’m creating a volume on S3 and saving the files there.
Below is my data context initialization:
from great_expectations.data_context import BaseDataContext
from great_expectations.data_context.types.base import DataContextConfig, FilesystemStoreBackendDefaults

# Unity Catalog volume used as the GX project root
ge_root_dir = "/Volumes/silver_sb/test2/gxtest3"
data_context_config = DataContextConfig(
    store_backend_defaults=FilesystemStoreBackendDefaults(root_directory=ge_root_dir),
    anonymous_usage_statistics={"enabled": False},
)
context = BaseDataContext(project_config=data_context_config)
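
A minimal sketch to confirm the stores actually write into the Volume (assuming a GX version where BaseDataContext still exposes create_expectation_suite; newer versions use add_expectation_suite, and the suite name here is just a placeholder):

# Create a throwaway suite so the expectation store writes a file under the Volume.
context.create_expectation_suite(expectation_suite_name="smoke_test_suite", overwrite_existing=True)
print(context.list_expectation_suite_names())
# A smoke_test_suite.json file should then appear under <ge_root_dir>/expectations/.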

Attached is a picture of my directory structure.


Alright, that makes sense.
Overall, storing the configurations outside of DBFS seems like a better way to manage and edit them when Python alone isn't enough and you need direct access to the YAML files.
Also, having the configurations under version control (Git) might be useful.
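
For example, here is a minimal sketch of dumping the in-memory project config to a great_expectations.yml under the Volume root so it can be read, edited, and committed to Git (using the ge_root_dir and data_context_config from your snippet, and assuming your GX version exposes DataContextConfig.to_yaml_str(); the file location is just an example):

import os

# Write the programmatic config out as YAML for inspection and version control.
yaml_path = os.path.join(ge_root_dir, "great_expectations.yml")  # example location
with open(yaml_path, "w") as f:
    f.write(data_context_config.to_yaml_str())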

If this works, feel free to mark the issue as solved.

Actually, that's the issue: I installed Great Expectations through Python from a Databricks notebook, but that way I'm not able to view the Great Expectations YAML file in the directory.