Using ADLS instead of DBFS in Azure Databricks for all GX artefacts, especially data docs

Thank you, @rachel.house, for the hints.
Based on these, I figured out the following settings, which let me write the GX artefacts to DBFS while publishing the data docs to Azure Blob Storage:

  context_root_dir = f"/dbfs{project.MNT_PATH}/GX/"

  project_config = DataContextConfig(
    ## Local storage backend
    store_backend_defaults=FilesystemStoreBackendDefaults(
      root_directory=context_root_dir
    ),
    ## Data docs site storage
    data_docs_sites={
      "az_site": {
        "class_name": "SiteBuilder",
        "store_backend": {
          "class_name": "TupleAzureBlobStoreBackend",
          "container":  "\$web",
          "connection_string":  "DefaultEndpointsProtocol=https;AccountName=<AccountName>;AccountKey=<AccountKey>;EndpointSuffix=core.windows.net",
        },
        "site_index_builder": {
          "class_name": "DefaultSiteIndexBuilder",
          "show_cta_footer": True,
        },
      }
    },
  )

  context = gx.get_context(project_config=project_config)```

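For completeness, a minimal sketch of how I then publish the docs (assuming the `context` created above; the validation/checkpoint step that produces results is omitted here):

```
# Build all data docs sites defined in project_config. With the
# TupleAzureBlobStoreBackend above, the generated HTML is written to the
# $web container, so it is served from the storage account's static
# website endpoint.
context.build_data_docs()
```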
But it is good to know your approach as well!
Greetings and Kudos!