 

How to upload an image to Google Cloud Storage from an image url in Node?

Given an image URL, how can I upload that image to Google Cloud Storage for image processing using Node.js?

asked Apr 16 '16 by Harris Lummis

4 Answers

I used the request library and the @google-cloud/storage library for this. The code below is in TypeScript. Regards

import {Storage} from '@google-cloud/storage';
import request from 'request';

// Wrapped in a class here so the snippet compiles; the class name is only illustrative.
class StorageService {
    private _storage: Storage;

    constructor() {
        // example of json path: ../../config/google-cloud/google-storage.json
        this._storage = new Storage({keyFilename: 'JSON Config Path'});
    }

    public saveFileFromUrl(path: string): Promise<string> {
        return new Promise<string>((resolve, reject) => {
            // encoding: null makes request return the body as a Buffer
            request({url: path, encoding: null}, (err, res, buffer) => {
                if (err || res.statusCode !== 200) {
                    return reject(err || new Error(`Unexpected status code ${res.statusCode}`));
                }
                const bucketName = 'bucket_name';
                const destination = `bucket location and file name`; // example: 'test/image.jpg'
                const file = this._storage.bucket(bucketName).file(destination);
                // make the uploaded object public and gzip it
                file.save(buffer, {public: true, gzip: true})
                    .then(() => resolve(`${bucketName}/${destination}`))
                    .catch(err => reject(err));
            });
        });
    }
}
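
A hypothetical call, assuming the class wrapper above (the URL is just an example):

new StorageService().saveFileFromUrl('https://example.com/photo.jpg')
    .then(path => console.log(`uploaded to ${path}`))
    .catch(err => console.error(err));
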
answered Oct 05 '22 by Amn

It's a 2-step process:

  • Download the file locally using request or fetch.
  • Upload it to GCS with the official library.

    var fs = require('fs');
    var gcloud = require('gcloud');
    
    // Authenticating on a per-API-basis. You don't need to do this if you auth on a
    // global basis (see Authentication section above).
    
    var gcs = gcloud.storage({
      projectId: 'my-project',
      keyFilename: '/path/to/keyfile.json'
    });
    
    // Create a new bucket.
    gcs.createBucket('my-new-bucket', function(err, bucket) {
      if (!err) {
        // "my-new-bucket" was successfully created.
      }
    });
    
    // Reference an existing bucket.
    var bucket = gcs.bucket('my-existing-bucket');                
    var localReadStream = fs.createReadStream('/photos/zoo/zebra.jpg');
    var remoteWriteStream = bucket.file('zebra.jpg').createWriteStream();
    localReadStream.pipe(remoteWriteStream)
      .on('error', function(err) {})
      .on('finish', function() {
        // The file upload is complete.
      });
    

If you would like to save the file as a JPEG image, you will need to create the remote write stream with the content type and any custom metadata:

var image = bucket.file('zebra.jpg');
localReadStream.pipe(image.createWriteStream({
    metadata: {
      contentType: 'image/jpeg',
      metadata: {
        custom: 'metadata'
      }
    }
}))

I found this while digging through this documentation.
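
For reference, a rough sketch of the same two-step idea with node-fetch (v2) and the current @google-cloud/storage package; the function name, key file path, and temp path below are placeholders, not part of the original answer:

const fs = require('fs');
const fetch = require('node-fetch'); // v2 API
const {Storage} = require('@google-cloud/storage');

const storage = new Storage({keyFilename: '/path/to/keyfile.json'});

async function downloadThenUpload(url, bucketName, destination) {
  // Step 1: download the image to a local temp file.
  const res = await fetch(url);
  if (!res.ok) throw new Error('Download failed with status ' + res.status);
  const tmpPath = '/tmp/upload.tmp';
  await new Promise((resolve, reject) => {
    res.body.pipe(fs.createWriteStream(tmpPath))
      .on('finish', resolve)
      .on('error', reject);
  });

  // Step 2: upload the local file with the official client.
  await storage.bucket(bucketName).upload(tmpPath, {
    destination: destination,
    metadata: {contentType: res.headers.get('content-type')},
  });
}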

answered Oct 09 '22 by Yevgen Safronov

To add onto Yevgen Safronov's answer, we can pipe the request into the write stream without explicitly downloading the image into the local file system.

const request = require('request');
const storage = require('@google-cloud/storage')();

function saveToStorage(attachmentUrl, bucketName, objectName) {
  const req = request(attachmentUrl);
  req.pause();
  req.on('response', res => {

    // Don't set up the pipe to the write stream unless the status is ok.
    // See https://stackoverflow.com/a/26163128/2669960 for details.
    if (res.statusCode !== 200) {
      return;
    }

    const writeStream = storage.bucket(bucketName).file(objectName)
      .createWriteStream({

        // Tweak the config options as desired.
        gzip: true,
        public: true,
        metadata: {
          contentType: res.headers['content-type']
        }
      });
    req.pipe(writeStream)
      .on('finish', () => console.log('saved'))
      .on('error', err => {
        writeStream.end();
        console.error(err);
      });

    // Resume only when the pipe is set up.
    req.resume();
  });
  req.on('error', err => console.error(err));
}
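
For context, a hypothetical invocation (the URL, bucket, and object names are placeholders):

saveToStorage('https://example.com/photos/zebra.jpg', 'my-existing-bucket', 'images/zebra.jpg');

Note that the require('@google-cloud/storage')() call above works with the pre-2.0 client; from v2 onwards the client is created with new Storage() from the named export instead.
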
answered Oct 09 '22 by Kevin Lee

In case of handling image uploads from a remote URL, and in reference to the latest library in the Google docs: instead of storing the buffer of the image locally, we can send it directly to storage.

function sendUploadUrlToGCS(req, res, next) {
  if (!req.body.url) {
    return next();
  }

  var gcsname = Date.now() + '_name.jpg';
  var file = bucket.file(gcsname);

  // encoding: null makes request return the body as a Buffer
  return request({url: req.body.url, encoding: null}, function(err, response, buffer) {
    if (err || response.statusCode !== 200) {
      return next(err || new Error('Failed to download the image'));
    }

    req.file = {};
    var stream = file.createWriteStream({
      metadata: {
        contentType: response.headers['content-type']
      }
    });

    stream.on('error', function(err) {
       req.file.cloudStorageError = err;
       console.log(err);
       next(err);
    });

    stream.on('finish', function() {
      req.file.cloudStorageObject = gcsname;
      req.file.cloudStoragePublicUrl = getPublicUrl(gcsname);
      next();
    });

    stream.end(buffer);
  });
}
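
The snippet assumes that request, bucket, and getPublicUrl are defined elsewhere; a minimal sketch of that setup, with placeholder names, might be:

var request = require('request');
var {Storage} = require('@google-cloud/storage');

var storage = new Storage({keyFilename: '/path/to/keyfile.json'});
var bucket = storage.bucket('my-bucket'); // placeholder bucket name

// Hypothetical helper: builds the public URL of an object in the bucket.
function getPublicUrl(filename) {
  return 'https://storage.googleapis.com/' + bucket.name + '/' + filename;
}
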
answered Oct 09 '22 by Muthu Rg