 

Export from Aurora Serverless into S3

I am currently trying to export a .csv file from AWS Aurora Serverless (MySQL engine) to AWS S3. This works perfectly fine with an instance-backed Aurora cluster, but not with the serverless one. I set up the IAM policies for S3 as described in the documentation and created an IAM user with full access to S3 and RDS. The parameter group used by the serverless cluster has the ARN of that IAM user in the LOAD DATA FROM S3 and SELECT INTO OUTFILE S3 fields.

The code I am using to export to S3:

SELECT * FROM TABLE WHERE ID = '6838' INTO OUTFILE S3 's3://bucketname/file.csv';

I've read the documentation here: https://docs.aws.amazon.com/de_de/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Integrating.SaveIntoS3.html

The error message: Access denied; you need (at least one of) the SELECT INTO S3 privilege(s) for this operation

I expect Aurora Serverless to export as smoothly as the instance-backed Aurora does. The instance-backed Aurora uses the same IAM user that was created for the serverless cluster, and it works fine.

EDIT: It seems that AWS Aurora Serverless does not support importing from or exporting to S3 at all. (https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/aurora-serverless.html#aurora-serverless.limitations)

psk98 asked Oct 09 '19


People also ask

What is the difference between Amazon Aurora and Aurora Serverless?

Amazon Aurora and Aurora Serverless are two distinct offerings from AWS. With standard Amazon Aurora you provision and manage the compute capacity for your database yourself, while Aurora Serverless is a configuration that scales capacity automatically based on your application's demand.

Does Aurora use S3?

If you don't specify a region value, then Aurora loads your file from Amazon S3 in the same region as your DB cluster. bucket-name – The name of the Amazon S3 bucket that contains the data to load. Object prefixes that identify a virtual folder path are supported.


2 Answers

Actually, you can only load your data into a provisioned Aurora cluster, not a serverless one. I had the same issue. As a workaround, you might consider loading your data into an Amazon RDS MySQL database instead, and letting a Lambda function handle the data injection from S3 into RDS MySQL.
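The Lambda workaround could look roughly like this. This is a hypothetical sketch, not the answerer's actual code: the bucket, table, column names, and connection details are placeholders, and it assumes the function is triggered by an S3 PUT event with pymysql bundled into the deployment package.

```python
import csv
import io


def build_insert(table, columns):
    """Build a parameterized INSERT statement for the given columns."""
    placeholders = ", ".join(["%s"] * len(columns))
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES ({placeholders})"


def lambda_handler(event, context):
    # boto3 ships with the Lambda runtime; pymysql must be bundled.
    import boto3
    import pymysql

    # The S3 PUT event tells us which object was uploaded.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Stream the CSV out of S3 and parse it.
    body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"]
    rows = list(csv.reader(io.TextIOWrapper(body, encoding="utf-8")))

    # Credentials would normally come from environment variables or
    # Secrets Manager; these names are placeholders.
    conn = pymysql.connect(host="my-rds-endpoint", user="loader",
                           password="...", database="mydb")
    try:
        with conn.cursor() as cur:
            cur.executemany(build_insert("mytable", ["ID", "col1", "col2"]), rows)
        conn.commit()
    finally:
        conn.close()
```

Using `executemany` with a parameterized statement keeps the inserts batched and avoids building SQL strings from untrusted CSV values.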

Linda Naoui answered Oct 22 '22


Did you complete this step (assuming you are issuing that statement from the non-master DB account):

The database user that issues the SELECT INTO OUTFILE S3 statement must be granted the SELECT INTO S3 privilege to issue the statement. The master user name for a DB cluster is granted the SELECT INTO S3 privilege by default. You can grant the privilege to another user by using the following statement.

GRANT SELECT INTO S3 ON *.* TO 'user'@'domain-or-ip-address'
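To confirm the grant actually took effect, you can run SHOW GRANTS for the account and look for SELECT INTO S3 in the output. A minimal sketch of that check (the pymysql client and connection details are assumptions, not part of the answer):

```python
def has_select_into_s3(grant_rows):
    """Return True if any SHOW GRANTS line mentions the SELECT INTO S3 privilege."""
    return any("SELECT INTO S3" in row.upper() for row in grant_rows)


def check_current_user(host, user, password):
    # Hypothetical connection details; pymysql must be installed.
    import pymysql

    conn = pymysql.connect(host=host, user=user, password=password)
    try:
        with conn.cursor() as cur:
            cur.execute("SHOW GRANTS FOR CURRENT_USER()")
            rows = [r[0] for r in cur.fetchall()]
        return has_select_into_s3(rows)
    finally:
        conn.close()
```

If the check returns False for the account issuing the export, the GRANT statement above still needs to be run by the master user.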
Ashaman Kingpin answered Oct 22 '22