I am using Databricks with Great Expectations version 1.0.1.
I want to use a single checkpoint to validate two different data assets, each with its own expectation suite. I have the following:
During checkpoint.run() I get the following error:
BuildBatchRequestError: Bad input to build_batch_request: options must contain exactly 1 key, 'dataframe'.
When using dataframes as the input data, the checkpoint's run method always requires you to pass the dataframe to be validated via batch parameters. That dataframe is then used as the batch that gets validated. Unfortunately, the Checkpoint can't save the dataframe, nor does it provide a way to retrieve it later.
My guess is that it also can't validate both datasets; it will just pass the same dataframe to both validation definitions.
If possible, consider using Databricks SQL instead. That would let you perform the exact workflow you describe here.
If I run either checkpoint.run() or validation_definition.run() without any parameters, I get this error: BuildBatchRequestError: Bad input to build_batch_request: options must contain exactly 1 key, 'dataframe'.
If I run them with the parameter batch_parameters = {'dataframe': df}, I get this error: TypeError: ValidationDefinition.run() takes 1 positional argument but 2 were given. So what is the correct format?
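That TypeError is the signature of passing batch_parameters positionally: in Great Expectations 1.x, run() accepts it only as a keyword argument, so run({'dataframe': df}) fails while run(batch_parameters={'dataframe': df}) does not. A minimal stand-in class (not the real GX class, just an illustration of the keyword-only signature) shows the difference:

```python
# Stand-in mimicking the keyword-only signature of ValidationDefinition.run()
# in Great Expectations 1.x (hypothetical simplification, not the real class).
class ValidationDefinition:
    def run(self, *, batch_parameters=None):
        # In real GX this would build a batch from the dataframe and validate it;
        # here we just echo the parameters back.
        return batch_parameters


vd = ValidationDefinition()

# Positional call: raises "run() takes 1 positional argument but 2 were given",
# because batch_parameters is keyword-only.
try:
    vd.run({"dataframe": "my_df"})
except TypeError as exc:
    print(f"TypeError: {exc}")

# Keyword call: accepted.
result = vd.run(batch_parameters={"dataframe": "my_df"})
print(result)
```

So the fix for the second error is simply to spell out the keyword: validation_definition.run(batch_parameters={"dataframe": df}).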