How to use PySpark to access Hive and verify data

Is there a complete example? I have no idea what the standard approach is.

import great_expectations as ge
spark = ge.core.util.get_or_create_spark_application()

from pyspark import SparkConf
from pyspark.sql import SparkSession
conf = SparkConf()  # conf was undefined in the snippet; it should be a SparkConf
spark = SparkSession.builder.config(conf=conf).enableHiveSupport().getOrCreate()
Are these two approaches different?