I'm trying to reproduce the steps from the GX and Airflow tutorial, with no success

I executed all the steps from this tutorial, Orchestrate Great Expectations with Airflow | Astronomer Documentation, but the gx_tutorial DAG is failing. Has anybody else experienced this problem, and what might be the potential causes? It seems that the table cannot be created:

2024-03-29, 08:24:38 None graph_data None admin [('dag_id', 'gx_tutorial')]
2024-03-29, 07:39:52 gx_validate_pg task 2024-03-29 07:23:29.407087+00:00 admin [('dag_id', 'gx_tutorial'), ('task_id', 'gx_validate_pg'), ('execution_date', '2024-03-29T07:23:29.407087+00:00'), ('map_index', '-1')]
2024-03-29, 07:39:43 create_table_pg failed 2024-03-29 07:39:40.313904+00:00 airflow None

If I run an airflow test locally, I get this issue:

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Scripts\airflow.exe\__main__.py", line 4, in <module>
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\airflow\__init__.py", line 52, in <module>
    from airflow import configuration, settings
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\airflow\configuration.py", line 2340, in <module>
    secrets_backend_list = initialize_secrets_backends()
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\airflow\configuration.py", line 2254, in initialize_secrets_backends
    secrets_backend_cls = import_string(class_name)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\airflow\utils\module_loading.py", line 37, in import_string
    module = import_module(module_path)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\importlib\__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\airflow\secrets\metastore.py", line 29, in <module>
    from airflow.utils.session import NEW_SESSION, provide_session
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\airflow\utils\session.py", line 24, in <module>
    from airflow import settings
  File "C:\Users\user\AppData\Local\Programs\Python\Python312\Lib\site-packages\airflow\settings.py", line 57, in <module>
    TIMEZONE = pendulum.tz.timezone("UTC")
TypeError: 'module' object is not callable

I have the latest Python, Airflow, and pendulum versions. The example DAGs are working correctly. I suspect it's something related to the Postgres connection, but I can't find out what it is.
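For what it's worth, the `TypeError` at the bottom of the traceback is the kind of error you get when a name that used to be a function is now a module: Airflow's `settings.py` calls `pendulum.tz.timezone("UTC")`, and with newer pendulum releases that attribute can resolve to a submodule rather than a callable (so this may be a pendulum/Airflow version mismatch rather than a Postgres problem). A minimal, self-contained reproduction of that error class:

```python
import types

# Simulate a name that used to be a function but is now a submodule:
# calling the module object raises the same TypeError as in the traceback.
timezone = types.ModuleType("timezone")

try:
    timezone("UTC")
except TypeError as exc:
    print(exc)  # → 'module' object is not callable
```

If that is the cause, pinning an older pendulum (or moving to an Airflow release that supports the newer one) is the usual workaround.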

Hi all. I finally resolved all the issues and ran the tutorial successfully. Actually, I downgraded the Astronomer runtime version in the Dockerfile in order to avoid the deprecated PostgresOperator in Apache Airflow.

This is what I put in the Dockerfile: a `FROM` line pointing to the Quay image.
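The exact image tag was posted as a link and didn't survive, so I can't quote it here; purely as an illustration of the shape of the line (the repository is Astronomer's runtime image on Quay, the version tag below is an assumption, not the one from the original post):

```dockerfile
# Illustrative only: the original post's exact astro-runtime tag is unknown.
FROM quay.io/astronomer/astro-runtime:9.6.0
```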

I had problems with the TCP connection as well. It turned out that the host to put in the Airflow Connection should be `postgres`, not `localhost`.
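That's because inside the Compose network, containers reach each other by service name, not by `localhost`. As a hedged sketch (the connection id and credentials below are assumptions, not the tutorial's exact values), the connection can be supplied to Airflow as a URI through an `AIRFLOW_CONN_<CONN_ID>` environment variable, with the service name as the host:

```python
from urllib.parse import quote


def pg_conn_uri(user: str, password: str, host: str, port: int, schema: str) -> str:
    """Build an Airflow-style Postgres connection URI.

    Airflow accepts connections defined as URIs via AIRFLOW_CONN_<CONN_ID>
    environment variables; credentials are percent-encoded to stay URI-safe.
    """
    return f"postgres://{quote(user)}:{quote(password)}@{host}:{port}/{schema}"


# Host is the Docker Compose service name "postgres", not "localhost".
print(pg_conn_uri("postgres", "postgres", "postgres", 5432, "postgres"))
# → postgres://postgres:postgres@postgres:5432/postgres
```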


Good to hear that you got the problem solved!
Feel free to also mark this thread as solved to help others with the same issue.