
How to upload in-memory file data to Google Cloud Storage using Node.js?

I am reading an image from a URL and processing it. I need to upload this data to a file in Cloud Storage. Currently I am writing the data to a temporary file, uploading that file, and then deleting it. Is there a way I can upload the data directly to Cloud Storage?

static async uploadDataToCloudStorage(rc : RunContextServer, bucket : string, path : string, data : any, mimeVal : string | false) : Promise<string> {
  if(!mimeVal) return ''

  const extension = mime.extension(mimeVal),
        filename  = await this.getFileName(rc, bucket, extension, path),
        modPath   = (path) ? (path + '/') : ''

  // writeFileSync is synchronous and returns undefined, so there is nothing to await
  fs.writeFileSync(`/tmp/$(unknown).${extension}`, data, 'binary')

  const fileUrl = await this.upload(rc, bucket,
                          `/tmp/$(unknown).${extension}`,
                          `${modPath}$(unknown).${extension}`)

  // unlinkSync is also synchronous
  fs.unlinkSync(`/tmp/$(unknown).${extension}`)

  return fileUrl
}

static async upload(rc : RunContextServer, bucketName: string, filePath : string, destination : string) : Promise<string> {
  const bucket : any = cloudStorage.bucket(bucketName),
        data   : any = await bucket.upload(filePath, {destination})

  return data[0].metadata.name
}
Akash asked Jul 06 '17

3 Answers

Yes, it's possible to retrieve an image from a URL, edit it, and upload it to Google Cloud Storage (or Firebase Storage) using Node.js, without ever saving the file locally.

This builds on Akash's answer with a complete function that worked for me, including the image manipulation step.

Steps

  • Use axios to retrieve a stream of the image from a remote URL.
  • Use sharp to make your changes to the image.
  • Use the Google Cloud Storage library to create a file and save the image data to it in Google Cloud Storage.

If you are a Firebase user using Firebase Storage, you must still use this library; the Firebase web SDK for storage does not work in Node. If you created your storage bucket in Firebase, you can still access it through the Google Cloud Storage console. They are the same thing.

const axios = require('axios');
const sharp = require('sharp');
const { Storage } = require('@google-cloud/storage');

const processImage = (imageUrl) => {
    return new Promise((resolve, reject) => {

        // Your Google Cloud Platform project ID
        const projectId = '<project-id>';

        // Creates a client
        const storage = new Storage({
            projectId: projectId,
        });

        // Configure axios to receive a response type of stream, and get a readableStream of the image from the specified URL
        axios({
            method:'get',
            url: imageUrl,
            responseType:'stream'
        })
        .then((response) => {

            // Create the image manipulation function
            var transformer = sharp()
            .resize(300)
            .jpeg();

            const gcFile = storage.bucket('<bucket-path>').file('my-file.jpg')

            // Pipe the axios response data through the image transformer and to Google Cloud
            response.data
            .pipe(transformer)
            .pipe(gcFile.createWriteStream({
                resumable  : false,
                validation : false,
                contentType: "auto",
                metadata   : {
                    // object metadata uses camelCase keys
                    cacheControl: 'public, max-age=31536000'}
            }))
            .on('error', (error) => { 
                reject(error) 
            })
            .on('finish', () => { 
                resolve(true)
            });
        })
        .catch(err => {
            reject(new Error('Image transfer error: ' + err.message));
        });
    })
}

processImage("<url-to-image>")
.then(res => {
  console.log("Complete.", res);
})
.catch(err => {
  console.log("Error", err);
});
Matthew Rideout answered Nov 04 '22


The data can be uploaded without writing to a file by using Node.js streams.

const stream     = require('stream'),
      dataStream = new stream.PassThrough(),
      gcFile     = cloudStorage.bucket(bucketName).file(fileName)

dataStream.push('content-to-upload')
dataStream.push(null)

await new Promise((resolve, reject) => {
  dataStream.pipe(gcFile.createWriteStream({
    resumable  : false,
    validation : false,
    metadata   : {cacheControl: 'public, max-age=31536000'}
  }))
  .on('error', (error : Error) => { 
    reject(error) 
  })
  .on('finish', () => { 
    resolve(true)
  })
})
Akash answered Nov 04 '22


This thread is old, but in the current API the File object works with streams.

So you can do something like this to upload a JSON file from memory:

const { Readable } = require("stream")
const { Storage } = require('@google-cloud/storage');

const bucketName = '...';
const filePath = 'test_file_from_memory.json';
const storage = new Storage({
  projectId: '...',
  keyFilename: '...'
});
(() => {
  const json = {
    prop: 'one',
    att: 2
  };
  const file = storage.bucket(bucketName).file(filePath);
  Readable.from(JSON.stringify(json))
    .pipe(file.createWriteStream({
      metadata: {
        contentType: 'application/json'
      }
    }).on('error', (error) => {
      console.log('error', error)
    }).on('finish', () => {
      console.log('done');
    }));
})();

Source: https://googleapis.dev/nodejs/storage/latest/File.html#createWriteStream

Ezequiel Alanis answered Nov 04 '22