How to instantiate a Data Context on a Databricks Spark cluster

@here, I have initialised the Data Context using the BaseDataContext class, based on the above example. I can successfully run the test suite, but I am unable to see the test suite files and docs in the S3 bucket, even though I have configured the stores validations_S3_store and expectations_S3_store, and an s3_site for the docs.
I need to get the list of configured stores, but BaseDataContext doesn't seem to offer a way to list them.
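For reference, here is a sketch of the store configuration involved, written out in the equivalent great_expectations.yml form (the bucket name and prefixes are placeholders, not my real values):

```yaml
stores:
  expectations_S3_store:
    class_name: ExpectationsStore
    store_backend:
      class_name: TupleS3StoreBackend
      bucket: my-ge-bucket        # placeholder
      prefix: expectations
  validations_S3_store:
    class_name: ValidationsStore
    store_backend:
      class_name: TupleS3StoreBackend
      bucket: my-ge-bucket        # placeholder
      prefix: validations

expectations_store_name: expectations_S3_store
validations_store_name: validations_S3_store

data_docs_sites:
  s3_site:
    class_name: SiteBuilder
    store_backend:
      class_name: TupleS3StoreBackend
      bucket: my-ge-bucket        # placeholder
      prefix: data_docs
```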

My env: Databricks + AWS

1. How can I make sure the stores are configured correctly?
2. If an error occurs in the backend while uploading these files to S3, how can I see that information?
3. Is it possible to initialise DataContext instead of the BaseDataContext class with a project_config?
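Regarding question 2, this is a stdlib-only sketch of what I am trying, assuming the relevant log output comes from the `great_expectations` and `botocore` loggers (botocore is the layer boto3 uses for the actual S3 calls, so failed uploads should show up there at DEBUG level):

```python
import logging
import sys

# Send DEBUG-level logs to stdout so Databricks notebook cells display them.
# Assumption: S3 store activity is reported by the "great_expectations" and
# "botocore" loggers (botocore logs each S3 request and any error response).
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(
    logging.Formatter("%(asctime)s %(name)s %(levelname)s %(message)s")
)

for name in ("great_expectations", "botocore"):
    logger = logging.getLogger(name)
    logger.setLevel(logging.DEBUG)
    logger.addHandler(handler)
```

With this in place, I would expect any S3 upload failure to be visible in the cell output instead of being swallowed silently.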

Thanks
Dinakar S