What are minimum Spark and PySpark versions supported by GE?

e.g., will GE 0.18.x run on Spark 2.4?
If not, which version will? And how can one tell in general?

Is it pyspark>=2.3.2, as currently stated in great_expectations/reqs/requirements-dev-spark.txt on the develop branch of the great-expectations/great_expectations GitHub repo?

Hey @Alex, yup, that’s right. For Spark, we always test against the latest version that meets the minimum requirement, in this case pyspark>=2.3.2. So yes, GE 0.18 will run on Spark 2.4.
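
If it helps, here is a minimal sketch of how you could check your installed PySpark against that minimum at runtime. The `(2, 3, 2)` minimum comes from the requirements file discussed above; the helper function name is my own, and for production code you would likely prefer `packaging.version.Version` over this naive tuple comparison (which ignores pre-release suffixes):

```python
from importlib.metadata import version, PackageNotFoundError

# Minimum PySpark release stated in requirements-dev-spark.txt (pyspark>=2.3.2).
MIN_PYSPARK = (2, 3, 2)

def meets_minimum(installed: str, minimum: tuple = MIN_PYSPARK) -> bool:
    """Naive check: compare the first three numeric release components."""
    parts = tuple(
        int(p) for p in installed.split(".")[:3] if p.isdigit()
    )
    return parts >= minimum

def installed_pyspark_ok() -> bool:
    """Return True if an installed PySpark meets the minimum, False otherwise."""
    try:
        return meets_minimum(version("pyspark"))
    except PackageNotFoundError:
        return False

# Examples:
# meets_minimum("2.4.0")  -> True  (so Spark 2.4 clears the bar)
# meets_minimum("2.3.1")  -> False
```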