PySpark issue with expect_column_value_z_scores_to_be_less_than

Dear Team,
I am trying to use the expect_column_value_z_scores_to_be_less_than expectation mentioned here.

However, when testing it in PySpark I get the following error.
I am using Great Expectations version 0.14.12. Could you please let me know how to make it work?

Let me know if you need any further information.

 {
    "exception_info": {
        "exception_message": "'SparkDFDataset' object has no attribute 'expect_column_value_z_scores_to_be_less_than'",
        "exception_traceback": "Traceback (most recent call last):\n  File \"/Users/m_675575/opt/miniconda3/envs/dev379/lib/python3.7/site-packages/great_expectations/data_asset/data_asset.py\", line 938, in validate\n    expectation_method = getattr(self, expectation.expectation_type)\nAttributeError: 'SparkDFDataset' object has no attribute 'expect_column_value_z_scores_to_be_less_than'\n",
        "raised_exception": true
    },
    "expectation_config": {
        "expectation_type": "expect_column_value_z_scores_to_be_less_than",
        "kwargs": {
            "column": "wk_kritisch_doc",
            "threshold": 5.3178400729279796
        },
        "meta": {}
    },
    "meta": {},
    "result": {},
    "success": false
},
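In the meantime, below is a minimal workaround sketch I am considering, assuming the V2 API (SparkDFDataset) and a numeric column. The column name wk_kritisch_doc and the threshold come from my expectation config above; the input path, the use of the population standard deviation, and the one-sided check (z-score < threshold rather than |z-score| < threshold) are my own assumptions, not something I took from the library docs.

from pyspark.sql import SparkSession, functions as F
from great_expectations.dataset import SparkDFDataset

spark = SparkSession.builder.getOrCreate()
df = spark.read.parquet("path/to/data")  # hypothetical input path

column = "wk_kritisch_doc"
threshold = 5.3178400729279796

# Compute the mean and (population) standard deviation of the column;
# whether the built-in expectation uses population or sample std is an assumption here.
stats = df.select(
    F.mean(F.col(column)).alias("mean"),
    F.stddev_pop(F.col(column)).alias("std"),
).collect()[0]

# Add a z-score column and validate it with an expectation that SparkDFDataset does support.
df_with_z = df.withColumn(
    f"{column}_zscore", (F.col(column) - F.lit(stats["mean"])) / F.lit(stats["std"])
)

ge_df = SparkDFDataset(df_with_z)
result = ge_df.expect_column_values_to_be_between(
    f"{column}_zscore",
    max_value=threshold,  # one-sided check: z-score < threshold
)
print(result.success)

If this expectation is only supported through the newer Validator / ExecutionEngine API rather than SparkDFDataset, please let me know and I will switch to that instead of the workaround above.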