Migration From V0 to V1

Hi GX Community,

I am using GX 0.15.43 and want to migrate to GX Core V1.

We are using GX in Databricks and instantiating a Data Context without a YAML file. Below is the sample code:

import great_expectations as ge
from great_expectations.core import ExpectationSuite
from great_expectations.core.batch import RuntimeBatchRequest
from great_expectations.data_context import BaseDataContext
from great_expectations.data_context.types.base import (
    FilesystemStoreBackendDefaults,
    DataContextConfig,
    DatasourceConfig,
)

expectation_suite_config = {
    "expectation_suite_name": "my_expectation_suite",
    "expectations": [  # List of expectations
        {
            "expectation_type": "expect_column_values_to_not_be_null",
            "kwargs": {
                "column": "my_column",
                "result_format": {"result_format": "COMPLETE"},
            },
        }
    ],
}

my_expectation_suite = ExpectationSuite(**expectation_suite_config)

# Define the DataContext configuration
data_context_config = DataContextConfig(
    plugins_directory=None,
    config_variables_file_path=None,
    datasources={
        "my_spark_datasource": DatasourceConfig(
            class_name="Datasource",
            execution_engine={
                "class_name": "SparkDFExecutionEngine",
                "force_reuse_spark_context": True,
            },
            data_connectors={
                "spark_runtime_dataconnector": {
                    "class_name": "RuntimeDataConnector",
                    "module_name": "great_expectations.datasource.data_connector",
                    "batch_identifiers": ["batch_name"],
                },
            },
        )
    },
    store_backend_defaults=FilesystemStoreBackendDefaults(root_directory="/"),
)

from pyspark.sql import SparkSession

# Create a SparkSession
spark = SparkSession.builder.appName("SparkDataFrame").getOrCreate()

# Define the data
data = [("John", 30), ("Anna", 25), ("Peter", 32)]

# Create the DataFrame
df = spark.createDataFrame(data, ["name", "age"])

# Show the DataFrame
df.show()

batch_request = RuntimeBatchRequest(
    datasource_name="my_spark_datasource",
    data_connector_name="spark_runtime_dataconnector",
    data_asset_name="my_asset",
    runtime_parameters={"batch_data": df},
    batch_identifiers={"batch_name": "batch_run"},
)

context = ge.get_context(project_config=data_context_config)
batch_validator = context.get_validator(
    batch_request=batch_request,
    expectation_suite=my_expectation_suite,
)
validation_result = batch_validator.validate()
print(validation_result)

I am facing challenges in migrating the DataContextConfig object to GX V1. I am following this document for the migration: GX V0 to V1 Migration Guide | Great Expectations. I still haven't found a solution.

Please help.

Hi there

It does not appear that you are following the migration guide, as I am seeing several references to older, deprecated methods in the code you have submitted. I would recommend following our migration guide or using our latest documentation, which contains helpful code snippets to complete your migration.
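
For instance, here is a minimal sketch of what your flow might look like in GX 1.x using the Fluent API. The datasource, asset, and batch definition names are carried over from your code as examples; please verify the exact calls against the current docs:

import great_expectations as gx
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("SparkDataFrame").getOrCreate()
df = spark.createDataFrame([("John", 30), ("Anna", 25), ("Peter", 32)], ["name", "age"])

# An in-memory ("ephemeral") context replaces BaseDataContext + DataContextConfig
context = gx.get_context(mode="ephemeral")

# The Fluent API replaces DatasourceConfig and the RuntimeDataConnector
data_source = context.data_sources.add_spark(name="my_spark_datasource")
data_asset = data_source.add_dataframe_asset(name="my_asset")
batch_definition = data_asset.add_batch_definition_whole_dataframe("my_batch_definition")

# Suites are built from expectation classes instead of config dicts
suite = gx.ExpectationSuite(name="my_expectation_suite")
suite.add_expectation(gx.expectations.ExpectColumnValuesToNotBeNull(column="my_column"))

# The dataframe is passed at runtime as a batch parameter
batch = batch_definition.get_batch(batch_parameters={"dataframe": df})
results = batch.validate(suite, result_format="COMPLETE")
print(results)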

Thank you for the reply.
I am facing a challenge in finding the code snippet for the below. I don't see any documentation on migrating the DataContextConfig object.

data_context_config = DataContextConfig(
    plugins_directory=None,
    config_variables_file_path=None,
    datasources={
        "my_spark_datasource": DatasourceConfig(
            class_name="Datasource",
            execution_engine={
                "class_name": "SparkDFExecutionEngine",
                "force_reuse_spark_context": True,
            },
            data_connectors={
                "spark_runtime_dataconnector": {
                    "class_name": "RuntimeDataConnector",
                    "module_name": "great_expectations.datasource.data_connector",
                    "batch_identifiers": ["batch_name"],
                },
            },
        )
    },
    store_backend_defaults=FilesystemStoreBackendDefaults(root_directory="/"),
)

If you have any reference, please share.
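
For reference, there is no direct V1 counterpart to DataContextConfig; the block above collapses into a context mode plus a Fluent datasource. A hedged sketch of the mapping follows (verify names and signatures against the current GX 1.x docs):

import great_expectations as gx

# Replaces DataContextConfig + FilesystemStoreBackendDefaults:
# mode="ephemeral" keeps all stores in memory; mode="file" with a
# project_root_dir persists them to disk instead.
context = gx.get_context(mode="ephemeral")

# Replaces the DatasourceConfig with SparkDFExecutionEngine and the
# RuntimeDataConnector; explicit batch_identifiers are no longer needed.
data_source = context.data_sources.add_spark(name="my_spark_datasource")
asset = data_source.add_dataframe_asset(name="my_asset")
batch_definition = asset.add_batch_definition_whole_dataframe("my_batch_definition")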