PySpark - ValueError: Unrecognized spark type: DecimalType(20,0)

Dear Team,
I am trying to use the expect_column_values_to_be_of_type expectation mentioned here for a DecimalType column with precision and scale.

However, in PySpark I am getting the following error while testing it.
Is there a way to validate a DecimalType column with specific precision and scale values?
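
For reference, here is roughly how I am invoking the expectation. The DataFrame setup below is just a placeholder to make the example self-contained; the kwargs match the expectation_config in the result further down:

from decimal import Decimal
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, DecimalType
from great_expectations.dataset import SparkDFDataset

# Hypothetical example data; the real table just needs a DecimalType(20,0) column.
spark = SparkSession.builder.getOrCreate()
schema = StructType([StructField("project_id", DecimalType(20, 0))])
df = spark.createDataFrame([(Decimal("12345"),)], schema)

ge_df = SparkDFDataset(df)
result = ge_df.expect_column_values_to_be_of_type(
    column="project_id",
    type_="DecimalType(20,0)",  # passing precision/scale is what raises the ValueError
    result_format={"result_format": "SUMMARY"},
)
print(result)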

I am using GE version 0.14.12. Could you please let me know how to make it work?

Let me know if you need any further information.

{
    "success": False,
    "expectation_config": {
        "expectation_type": "expect_column_values_to_be_of_type",
        "meta": {},
        "kwargs": {
            "column": "project_id",
            "type_": "DecimalType(20,0)",
            "result_format": {
                "result_format": "SUMMARY"
            }
        }
    },
    "meta": {},
    "exception_info": {
        "raised_exception": True,
        "exception_message": "ValueError: Unrecognized spark type: DecimalType(20,0)",
        "exception_traceback": "Traceback (most recent call last):\n  File "/home/spark/.local/lib/python3.7/site-packages/great_expectations/dataset/sparkdf_dataset.py", line 1196, in expect_column_values_to_be_of_type\n    success = issubclass(col_type, getattr(sparktypes, type_))\nAttributeError: module \"pyspark.sql.types\" has no attribute \"DecimalType(20,0)\"\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n  File "/home/spark/.local/lib/python3.7/site-packages/great_expectations/data_asset/data_asset.py", line 275, in wrapper\n    return_obj = func(self, **evaluation_args)\n  File "/home/spark/.local/lib/python3.7/site-packages/great_expectations/dataset/sparkdf_dataset.py", line 1201, in expect_column_values_to_be_of_type\n    raise ValueError(f"Unrecognized spark type: {type_
        }")\nValueError: Unrecognized spark type: DecimalType(20,0)\n"
    },
    "result": {}
},