In order to connect to S3, I have to have a config file in ~/.aws/config that looks like the following:
[profile my_profile]
output = json
region = us-east-1
role_arn = arn:aws:iam::yyyyyyyyyyyy:role/my_awesome_role
[profile my_profile_exec]
output = json
region = us-east-1
role_arn = arn:aws:iam::xxxxxxxxxxxx:role/my_awesome_role
source_profile = my_profile
PLEASE NOTE THE ARNs IN THE TWO PROFILES ARE DIFFERENT.
The credentials I generate (aws_access_key_id, aws_secret_access_key, and aws_session_token), which live in ~/.aws/credentials, are for the profile my_profile.
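For reference, the corresponding ~/.aws/credentials entry looks roughly like this (the values below are placeholders, not real keys):
[my_profile]
aws_access_key_id = <access-key-id>
aws_secret_access_key = <secret-access-key>
aws_session_token = <session-token>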
With all that done, when I run the AWS CLI command: aws s3 ls --profile=my_profile_exec, I get proper results.
In pure Python, the connection works as well and looks like this (abbreviated):
>>> session = boto3.Session(profile_name="my_profile_exec")
>>> s3_client = session.client("s3")
>>> # do stuff with s3_client
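For example, a simple listing succeeds; here's a minimal sketch of that (the bucket name and prefix are placeholders):

```python
import boto3

# Same profile chain the AWS CLI uses: my_profile_exec assumes the role via my_profile
session = boto3.Session(profile_name="my_profile_exec")
s3_client = session.client("s3")

# Placeholder bucket/prefix, just to show the client works end to end
resp = s3_client.list_objects_v2(Bucket="bucket", Prefix="path/to/")
for obj in resp.get("Contents", []):
    print(obj["Key"])
```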
I'm trying to get this to work with the DuckDB CLI but am having no luck at all. Here's about the closest I've been able to come to figuring it out:
D CREATE SECRET secret (
TYPE s3,
PROVIDER CREDENTIAL_CHAIN,
CHAIN 'config',
PROFILE 'my_profile_exec');
|---------|
| Success |
| boolean |
|---------|
| true    |
|---------|
D SELECT * FROM read_csv('s3://bucket/path/to/file');
HTTP Error: HTTP GET error on 'https://bucket/path/to/file' (HTTP 400)
I've searched high and low on the internet but haven't run into an example of someone with a setup like this.
Here's my environment. If anyone can help out, it'd be greatly appreciated:
DuckDB CLI v1.1.3 (19864453f7)
Based on the HTTP 400 error you received, it seems that the DuckDB Secrets Manager isn't applying the declared secret as expected.
First, ensure that the defined secret matches the expected configuration of your ~/.aws/config file. You can review the declared fields using the following command:
D FROM duckdb_secrets();
Carefully check the output to confirm that the secret is configured correctly.
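If you want to sanity-check the same thing outside the CLI, here's a minimal sketch using DuckDB's Python API (same secret definition as yours; the INSTALL/LOAD lines may be redundant if extension autoloading is enabled):

```python
import duckdb

con = duckdb.connect()
# httpfs provides S3 access; the aws extension provides the CREDENTIAL_CHAIN provider
for stmt in ("INSTALL httpfs", "LOAD httpfs", "INSTALL aws", "LOAD aws"):
    con.execute(stmt)

con.execute("""
    CREATE OR REPLACE SECRET secret (
        TYPE s3,
        PROVIDER CREDENTIAL_CHAIN,
        CHAIN 'config',
        PROFILE 'my_profile_exec'
    );
""")

# Shows what the secrets manager actually resolved (key id, region, scope, ...)
print(con.sql("FROM duckdb_secrets();"))
```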
If a field is missing or wrong, define REGION (or any other field) explicitly when creating the secret, using your bucket's region (us-east-1 in your config), as described in the documentation:
D CREATE OR REPLACE SECRET secret (
TYPE s3,
PROVIDER CREDENTIAL_CHAIN,
CHAIN 'config',
PROFILE 'my_profile_exec',
REGION 'us-east-1'
);
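An HTTP 400 from S3 is often a region mismatch, so it's worth confirming the bucket's actual region under the same profile. A quick boto3 sketch (the bucket name is a placeholder):

```python
import boto3

# Same profile the DuckDB secret points at; "bucket" is a placeholder
session = boto3.Session(profile_name="my_profile_exec")
s3 = session.client("s3")

# LocationConstraint is None for us-east-1, otherwise the region name
location = s3.get_bucket_location(Bucket="bucket")["LocationConstraint"]
print(location or "us-east-1")
```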
Next, verify that the declared secret is the one DuckDB actually picks for the CSV file's S3 path. You can check this with the following command:
```plsql
D FROM which_secret('s3://bucket/path/to/file', 's3');
```
If the secret's name shows up, your setup is configured correctly and you should be able to read the CSV file.
Otherwise, restrict the secret's scope to the bucket (or prefix) you are reading, using this instantiation:
D CREATE OR REPLACE SECRET secret (
TYPE s3,
PROVIDER CREDENTIAL_CHAIN,
CHAIN 'config',
PROFILE 'my_profile_exec',
REGION 'us-east-1',
SCOPE 's3://bucket/path/to/file'
);
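Once which_secret resolves to this secret for your path, the read should behave the same from Python as from the CLI. An end-to-end sketch under the same assumptions (bucket, path, region, and scope are placeholders to adjust):

```python
import duckdb

con = duckdb.connect()
for stmt in ("INSTALL httpfs", "LOAD httpfs", "INSTALL aws", "LOAD aws"):
    con.execute(stmt)

# Same scoped secret as above; adjust REGION and SCOPE to your bucket
con.execute("""
    CREATE OR REPLACE SECRET secret (
        TYPE s3,
        PROVIDER CREDENTIAL_CHAIN,
        CHAIN 'config',
        PROFILE 'my_profile_exec',
        REGION 'us-east-1',
        SCOPE 's3://bucket'
    );
""")

# Read the CSV through the scoped secret
print(con.sql("SELECT * FROM read_csv('s3://bucket/path/to/file') LIMIT 5"))
```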