Create a Suite from a JSON/YAML file

I can’t find any relevant resources on how to do this with Great Expectations version 0.18.12. Has anyone done this already, or can anyone suggest documentation?

I was looking to implement something along the lines of:

  ...
  print("Expectations init check:")
  self.context.add_or_update_expectation_suite("my_expectation_suite")
  
  # Pass in expectations file
  self.context.create_expectation_suite_from_file(
      "my_expectation_suite",
      expectations_file
  )
  
  validator = self.context.get_validator(
      batch_request=self.batch_request,
      expectation_suite_name="my_expectation_suite",
  )
  
  return validator

Welcome to GX Discourse!

Could you describe your use case in more detail?

In a file-backed GX project, Expectation Suites are stored as JSON files under the `expectations/` directory, and you can edit those directly.
Beyond that, I’m not aware that this feature would exist.
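To illustrate the “edit the files directly” route, here is a minimal sketch of writing a suite straight to disk. The path `gx/expectations/`, the suite name, and the single expectation are all illustrative assumptions, not taken from your project:

```python
import json
from pathlib import Path

# Sketch of the on-disk suite format a file-backed GX 0.18 project uses;
# the suite name and expectation below are made-up examples.
suite = {
    "expectation_suite_name": "suite_new",
    "expectations": [
        {
            "expectation_type": "expect_column_values_to_not_be_null",
            "kwargs": {"column": "id"},
            "meta": {},
        }
    ],
    "meta": {"great_expectations_version": "0.18.12"},
}

# Write the suite where a file data context would look for it.
path = Path("gx") / "expectations" / "suite_new.json"
path.parent.mkdir(parents=True, exist_ok=True)
path.write_text(json.dumps(suite, indent=2))
```

After that, a file data context rooted at the project directory should pick the suite up by name.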

Hello,

I ran the expectations I wanted one by one to check a table’s columns and saved them to a JSON file so I could reuse them to check another table. I am trying to find a way to do this. I imported the JSON file and assume I have to pass it as an argument when creating the new suite:

  context_root_dir = 'path_name'
  
  context = FileDataContext.create(project_root_dir=context_root_dir)
  
  dataframe_datasource = context.sources.add_or_update_pandas(name="test_datasource")
  
  data_asset = dataframe_datasource.add_dataframe_asset(
      name="dataframe_datasource",
      dataframe=df_spark,
  )
  
  my_batch_request = data_asset.build_batch_request()
  
  # The json file is imported with the name 'schema'
  expectation_suite_name = "suite_new"
  
  context.add_or_update_expectation_suite(
      expectation_suite_name=expectation_suite_name,
      expectations=schema,
  )

The call does not raise an error, but the expectations found in `schema` are not added:

  {
      "expectation_suite_name": "suite_new",
      "ge_cloud_id": null,
      "expectations": [
          {},
          {},
          {},
          {},
          {}
      ],
      "data_asset_type": null,
      "meta": {
          "great_expectations_version": "0.18.9"
      }
  }

All the above are run in Databricks environment.
I have investigated for days without finding relevant material, and the manual did not help! Could you please advise on this as soon as possible?
Thank you!