
Why do I get the S3ServiceException error when loading AWS Redshift from S3?

I'm getting an error when trying to load a table in Redshift from a CSV file in S3. The error is:

   error:  S3ServiceException:All access to this object has been disabled,Status 403,Error AllAccessDisabled,Rid FBC64D9377CF9763,ExtRid o1vSFuV8SMtYDjkgKCYZ6VhoHlpzLoBVyXaio6hdSPZ5JRlug+c9XNTchMPzNziD,CanRetry 1
  code:      8001
  context:   Listing bucket=amazonaws.com prefix=els-usage/simple.txt
  query:     1122
  location:  s3_utility.cpp:540
  process:   padbmaster [pid=6649]

The copy statement used is:

copy public.simple from 's3://amazonaws.com/mypath/simple.txt' CREDENTIALS 'aws_access_key_id=xxxxxxx;aws_secret_access_key=xxxxxx' delimiter ',';

As this is my first attempt at using Redshift and S3, I've kept the simple.txt file (and its destination table) to a single-field record. I've run the copy in both Aginity Workbench and SQL Workbench with the same results.

I've clicked the link in the S3 file's property tab and it downloads the simple.txt file - so it appears the input file is accessible. Just to be sure, I've given it public access.

Unfortunately, I don't see any additional information in the Redshift Loads tab that would help with debugging this.

Can anyone see anything I'm doing incorrectly?

asked Oct 21 '22 by Todd

1 Answer

Removing the amazonaws.com from the URL fixed the problem. The S3 URI in a COPY statement must have the form s3://&lt;bucket&gt;/&lt;key&gt;; with amazonaws.com included, Redshift treated it as the bucket name (note bucket=amazonaws.com in the error context) and S3 returned the 403 AllAccessDisabled error. The resulting COPY statement is now:

copy public.simple from 's3://mypath/simple.txt' CREDENTIALS 'aws_access_key_id=xxxxxxx;aws_secret_access_key=xxxxxx' delimiter ',';
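To see why the original URI failed, it helps to look at how an s3:// URI splits into bucket and key. This is a minimal sketch using Python's standard urlparse, assuming Redshift splits the URI the same way (the first path segment after s3:// is the bucket, the rest is the key):

```python
from urllib.parse import urlparse

def split_s3_uri(uri):
    """Split an s3:// URI into (bucket, key): the host part is the
    bucket name, and everything after the first slash is the key."""
    parsed = urlparse(uri)
    return parsed.netloc, parsed.path.lstrip("/")

# Broken URI: "amazonaws.com" lands in the bucket position, which
# matches the "bucket=amazonaws.com" seen in the error context.
print(split_s3_uri("s3://amazonaws.com/mypath/simple.txt"))
# → ('amazonaws.com', 'mypath/simple.txt')

# Corrected URI: the real bucket name comes first.
print(split_s3_uri("s3://mypath/simple.txt"))
# → ('mypath', 'simple.txt')
```

The domain amazonaws.com never belongs in the URI: s3:// is a scheme understood by Redshift, not an HTTP URL pointing at the S3 endpoint.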
answered Nov 15 '22 by Todd