Metric/Evaluation Parameter store using Databricks delta tables

Hi GX people!

I am currently using Great Expectations in Databricks. I would like to set up an Evaluation Parameter Store and/or Metric Store backed by native Delta tables in a Unity Catalog database. Does anyone know whether this is possible? So far it looks like I might have to construct a custom SQLAlchemy engine, but I'm having difficulty understanding whether that would work and how/where to plug it into Great Expectations. A rough sketch of the direction I was considering is below.
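For context, this is roughly what I had in mind: build a SQLAlchemy engine against a Databricks SQL warehouse and then somehow point a GX store at it. All the workspace values here are placeholders, and I'm not sure this is even the right approach:

from sqlalchemy import create_engine, text

# Placeholder workspace details -- these would come from my Databricks workspace
databricks_api_token = "<personal-access-token>"
host = "<workspace-hostname>"
port = 443
database = "<database>"
http_path = "<sql-warehouse-http-path>"
catalog = "<unity-catalog>"
schema = "<schema>"

# SQLAlchemy engine using the Databricks dialect from databricks-sql-connector
engine = create_engine(
    f"databricks://token:{databricks_api_token}@{host}:{port}/{database}"
    f"?http_path={http_path}&catalog={catalog}&schema={schema}"
)

# Quick connectivity check against the warehouse
with engine.connect() as conn:
    print(conn.execute(text("SELECT 1")).scalar())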

Any suggestions are welcome!

Best regards,

Tim

Any help would be welcome ^^

Hi @Tim, thanks for reaching out!

I dug into this, and creating a MetricStore backed by a Databricks table is not currently supported. At the moment, the only officially tested backend data store for MetricStores is Postgres, but I did check whether it might work with Databricks even though it isn't under test.
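For reference, the documented setup against the tested Postgres backend looks like the following (the connection details here are placeholders):

pg_connection_string = "postgresql+psycopg2://gx_user:gx_password@localhost:5432/metrics_db"

context.add_store(
    store_name="metrics_store",
    store_config={
        "class_name": "MetricStore",
        "store_backend": {
            "class_name": "DatabaseStoreBackend",
            "connection_string": pg_connection_string,
        },
    },
)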

In theory, you'd add the MetricStore to your context as shown below, providing a Databricks connection string. However, this throws an error when the backend data store is created.

dbx_connection_string = f"databricks://token:{databricks_api_token}@{host}:{port}/{database}?http_path={http_path}&catalog={catalog}&schema={schema}"

context.add_store(
    store_name="dbx_metrics_store",
    store_config={
        "class_name": "MetricStore",
        "store_backend": {
            "class_name": "DatabaseStoreBackend",
            "connection_string": dbx_connection_string,
        },
    },
)
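For completeness, if the store creation did succeed, the MetricStore would normally be populated by adding a StoreMetricsAction to a Checkpoint's action list, roughly as sketched below. The checkpoint name, suite name, and batch_request are placeholders, and the requested metric names are just examples from the MetricStore docs; with the Databricks-backed store above, you'd hit the error before ever getting to this step.

checkpoint_config = {
    "name": "dbx_metrics_checkpoint",
    "class_name": "Checkpoint",
    "validations": [
        {
            "batch_request": batch_request,  # assumed to be defined elsewhere
            "expectation_suite_name": "my_suite",
        }
    ],
    "action_list": [
        {
            "name": "store_metrics",
            "action": {
                "class_name": "StoreMetricsAction",
                "target_store_name": "dbx_metrics_store",
                # Metric names keyed by expectation suite name; "*" applies to all suites
                "requested_metrics": {
                    "*": [
                        "statistics.evaluated_expectations",
                        "statistics.successful_expectations",
                    ]
                },
            },
        }
    ],
}

context.add_checkpoint(**checkpoint_config)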

I've pinged Engineering to double-check there isn't a workaround I'm unaware of (I'll update the thread if there's anything useful to pass on). For now, though, I'd consider the answer to your question to be that it's not currently possible.

Hi Rachel,

Thank you for the attempt and for following up with Engineering! I will keep track of this thread and will be eagerly waiting for a possible solution :slight_smile:

Best regards,

Tim