I am using an InferredAssetS3DataConnector to validate data in S3 with GX version 0.18.19, and I was able to get a successful manual checkpoint run. I'm now trying to set up an Airflow operator to run these validations daily in my local MWAA environment, which runs on Docker (eventually I'll want to figure out how to implement this in prod as well). I get this error when I try to run the operator:
File "/usr/local/airflow/.local/lib/python3.11/site-packages/botocore/auth.py", line 418, in add_auth
raise NoCredentialsError()
botocore.exceptions.NoCredentialsError: Unable to locate credentials
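For reference, the task in my DAG is set up roughly like this (the DAG id, checkpoint name, and context path below are placeholders, not my exact values):

```python
# Sketch of the daily validation DAG task (names/paths are placeholders).
from datetime import datetime

from airflow import DAG
from great_expectations_provider.operators.great_expectations import (
    GreatExpectationsOperator,
)

with DAG(
    dag_id="daily_s3_validation",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Runs the checkpoint that already succeeds when triggered manually
    gx_validate = GreatExpectationsOperator(
        task_id="run_s3_checkpoint",
        data_context_root_dir="/usr/local/airflow/great_expectations",
        checkpoint_name="my_s3_checkpoint",
    )
```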
I have my AWS credentials in config_variables.yml and have also added them directly to the data connector via the boto3_options parameter. Do you have any suggestions on how I should be passing my AWS credentials to the operator so it can connect to the data stored in S3? I've also tried using a pre-existing AWS connection, but then I get the error below even after adding an s3_path parameter to the connection:
ValueError: No s3_path given in params.
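In case it helps, here is roughly how the datasource and data connector are configured in great_expectations.yml (the datasource name, bucket, prefix, and regex are placeholders); the credential values are substituted from config_variables.yml via `${...}`:

```yaml
# Datasource section of great_expectations.yml (names are placeholders)
datasources:
  my_s3_datasource:
    class_name: Datasource
    module_name: great_expectations.datasource
    execution_engine:
      class_name: PandasExecutionEngine
    data_connectors:
      my_inferred_data_connector:
        class_name: InferredAssetS3DataConnector
        bucket: my-bucket
        prefix: data/
        default_regex:
          pattern: (.*)\.csv
          group_names:
            - data_asset_name
        # Credentials pulled from config_variables.yml
        boto3_options:
          aws_access_key_id: ${AWS_ACCESS_KEY_ID}
          aws_secret_access_key: ${AWS_SECRET_ACCESS_KEY}
```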