How to upload an image to AWS S3 using GraphQL?

I'm uploading a base64 string, but GraphQL hangs. If I slice the string to fewer than 50,000 characters, it works; beyond 50,000 characters, GraphQL never reaches the resolve function, yet gives no error. On smaller strings, it works just fine.

const file = e.target.files[0];
const reader = new FileReader();

// Read the selected file as a base64-encoded data URL,
// then send it to the server as a GraphQL string variable.
reader.readAsDataURL(file);
reader.onloadend = () => {
  const imageArray = reader.result;
  this.context.fetch('/graphql', {
    body: JSON.stringify({
      query: `mutation s3Upload($img: String!) {
        s3Upload(file: $img) {
          logo
        }
      }`,
      variables: {
        img: imageArray,
      },
    }),
  }).then(response => response.json())
  .then(({ data }) => {
    console.log(data);
  });
};

// graphql-js imports, aliased to match the names used below
import {
  GraphQLObjectType as ObjectType,
  GraphQLNonNull as NonNull,
  GraphQLString as StringType,
} from 'graphql';

// The output type must be declared before the mutation that references it
const S3Type = new ObjectType({
  name: 'S3',
  fields: {
    logo: { type: StringType },
  },
});

const s3Upload = {
  type: S3Type,
  args: {
    file: { type: new NonNull(StringType) },
  },
  resolve: (root, args, { user }) => upload(root, args, user),
};
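
The upload helper called by the resolver isn't shown in the question. For context, a minimal sketch of what it might do, assuming the AWS SDK for JavaScript (v2) and a hypothetical bucket name - it decodes the base64 data URL and writes the bytes to S3:

import AWS from 'aws-sdk';

const s3 = new AWS.S3({ region: 'us-east-1' }); // assumed region

// Hypothetical implementation of the upload helper referenced above.
async function upload(root, args, user) {
  // args.file is the base64 data URL sent by the client
  const matches = args.file.match(/^data:(.+);base64,(.+)$/);
  if (!matches) throw new Error('Expected a base64 data URL');
  const [, mimeType, base64Data] = matches;
  const buffer = Buffer.from(base64Data, 'base64');

  const key = `logos/${user.id}`; // assumed key scheme
  await s3.putObject({
    Bucket: 'my-uploads-bucket', // assumed bucket name
    Key: key,
    Body: buffer,
    ContentType: mimeType,
  }).promise();

  // Return the shape expected by S3Type
  return { logo: `https://my-uploads-bucket.s3.amazonaws.com/${key}` };
}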
asked Aug 11 '17 by AstroBoogie


2 Answers

The correct approach here is to perform an actual S3 upload via a complex type using AWS AppSync - what you illustrate here looks more like an attempt to save a base64-encoded image as a string to a field in what I can only assume is a DynamoDB table entry. For this to work, though, you need to modify your mutation so that the file field is not a String!, but an S3ObjectInput.

There are a few moving parts under the hood that you need to have in place before this "just works" (TM). First, make sure you have an appropriate input and type for an S3 object defined in your GraphQL schema:

enum Visibility {
    public
    private
}

input S3ObjectInput {
    bucket: String!
    region: String!
    localUri: String
    visibility: Visibility
    key: String
    mimeType: String
}

type S3Object {
    bucket: String!
    region: String!
    key: String!
}

The S3ObjectInput type, of course, is for use when uploading a new file - either by way of creating or updating a model within which said S3 object metadata is embedded. It can be handled in the request resolver of a mutation via the following:

{
    "version": "2017-02-28",
    "operation": "PutItem",
    "key": {
        "id": $util.dynamodb.toDynamoDBJson($ctx.args.input.id)
    },

    ## Convert the input to DynamoDB attribute values, then replace the raw
    ## file input with a complex S3 object so AppSync performs the upload.
    #set( $attribs = $util.dynamodb.toMapValues($ctx.args.input) )
    #set( $file = $ctx.args.input.file )
    #set( $attribs.file = $util.dynamodb.toS3Object($file.key, $file.bucket, $file.region, $file.version) )

    "attributeValues": $util.toJson($attribs)
}

This assumes that the S3 file object is a child field of a model attached to a DynamoDB data source. Note that the call to $util.dynamodb.toS3Object() sets up the complex S3 object file, which is a field of the model with a type of S3ObjectInput. Setting up the request resolver in this way handles the upload of a file to S3 (when all the credentials are set up correctly - we'll touch on that in a moment), but it doesn't address how to get the S3Object back.

This is where a field-level resolver attached to a local data source becomes necessary. In essence, you need to create a local data source in AppSync and connect it to the model's file field in the schema with the following request and response resolvers:

## Request Resolver ##
{
    "version": "2017-02-28",
    "payload": {}
}

## Response Resolver ##
$util.toJson($util.dynamodb.fromS3ObjectJson($context.source.file))

This resolver simply tells AppSync that we want to take the JSON string that is stored in DynamoDB for the file field of the model and parse it into an S3Object - this way, when you do a query of the model, instead of returning the string stored in the file field, you get an object containing the bucket, region, and key properties that you can use to build a URL to access the S3 Object (either directly via S3 or using a CDN - that's really dependent on your configuration).
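
With that in place, a query against the model returns the parsed object instead of the raw string. As a hypothetical illustration (the getUserProfile query and its model are assumptions, not part of the question):

query GetUserProfile {
    getUserProfile(id: "some-id") {
        id
        file {
            bucket
            region
            key
        }
    }
}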

Do make sure you have credentials set up for complex objects, however (told you I'd get back to this). I'll use a React example to illustrate this - when defining your AppSync parameters (endpoint, auth, etc.), there is an additional property called complexObjectsCredentials that needs to be defined to tell the client what AWS credentials to use to handle S3 uploads, e.g.:

// Assumed imports: the AppSync JS SDK and Amplify Auth, with the AppSync
// configuration (endpoint, region) coming from a local config module.
import AWSAppSyncClient, { AUTH_TYPE } from 'aws-appsync';
import { Auth } from 'aws-amplify';
import AppSync from './AppSync';

const client = new AWSAppSyncClient({
    url: AppSync.graphqlEndpoint,
    region: AppSync.region,
    auth: {
        type: AUTH_TYPE.AWS_IAM,
        credentials: () => Auth.currentCredentials(),
    },
    // Credentials the client uses to perform the actual S3 upload
    complexObjectsCredentials: () => Auth.currentCredentials(),
});

Assuming all of these things are in place, S3 uploads and downloads via AppSync should work.
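
To tie it together, here is a minimal sketch of invoking such a mutation from the client - the createUserProfile mutation, its input shape, and the bucket/region values are assumptions for illustration:

import gql from 'graphql-tag';

// Hypothetical mutation; adjust to match your actual schema.
const CreateUserProfile = gql`
  mutation CreateUserProfile($input: CreateUserProfileInput!) {
    createUserProfile(input: $input) {
      id
      file { bucket region key }
    }
  }
`;

// `file` is a File object from an <input type="file"> element; the client
// uploads the localUri content to S3 using complexObjectsCredentials.
client.mutate({
  mutation: CreateUserProfile,
  variables: {
    input: {
      id: 'some-id',
      file: {
        bucket: 'my-uploads-bucket', // assumed bucket name
        region: 'us-east-1',         // assumed region
        key: `uploads/${file.name}`,
        localUri: file,
        mimeType: file.type,
      },
    },
  },
}).then(({ data }) => console.log(data));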

answered Oct 19 '22 by hatboyzero


AWS AppSync (https://aws.amazon.com/appsync/) provides this through functionality known as "Complex Objects", where you can define types for the S3 object and its input:

type S3Object {
    bucket: String!
    key: String!
    region: String!
}

input S3ObjectInput {
    bucket: String!
    key: String!
    region: String!
    localUri: String
    mimeType: String
}

You can then embed this object as part of another type:

type UserProfile {
    id: ID!
    name: String
    file: S3Object
}

And then specify a mutation to add it:

type Mutation {
    addUser(id: ID!, name: String, file: S3ObjectInput): UserProfile!
}

Your client operations would need to specify the appropriate bucket, key (with file extension), region, etc.
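
For example, a client mutation might look like the following (bucket, key, region, and file path are placeholder values):

mutation AddUser {
    addUser(
        id: "user-1"
        name: "Jane Doe"
        file: {
            bucket: "my-uploads-bucket"
            key: "profile-pics/user-1.jpg"
            region: "us-east-1"
            localUri: "/path/to/user-1.jpg"
            mimeType: "image/jpeg"
        }
    ) {
        id
        name
        file {
            bucket
            key
            region
        }
    }
}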

More here: https://docs.aws.amazon.com/appsync/latest/devguide/building-a-client-app-react.html#complex-objects

answered Oct 19 '22 by Richard