
How can I configure a structure for backing up elasticsearch data on Google Compute Engine?

I have an Elasticsearch environment configured on GCE (Google Compute Engine) with two nodes, therefore two VMs, and I need to create a backup strategy for it. My first thought was to use the Elasticsearch snapshot API to back up all my data to a given storage location, as the API supports a few ways to store the snapshot:

  • Shared filesystem, such as a NAS
  • Amazon S3
  • HDFS (Hadoop Distributed File System)
  • Azure Cloud

I tried to use the shared filesystem option, but it requires that the storage location be shared between nodes. Is there a way I can do this on GCE?

curl -XPUT http://x.x.x.x:9200/_snapshot/backup -d '{
    "type": "fs",
    "settings": {
        "compress": true,
        "location": "/elasticsearch/backup"
    }
}'

nested: RepositoryVerificationException[[backup] store location [/elasticsearch/backup] is not shared between node

I know there is an AWS plugin for Elasticsearch for storing backups. Is there any plugin for Google Cloud Storage? Is it possible to do that?

If none of the alternatives above are possible, is there any other recommended strategy for backing up my data?

Edmar Miyake Avatar asked Feb 04 '15 14:02



1 Answer

Elasticsearch now has a repository plugin for Google Cloud Storage (`repository-gcs`), so this is natively supported.
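As a rough sketch of how that looks (assuming a recent Elasticsearch version; the host `x.x.x.x` is taken from the question, and the bucket name `my-es-backups` is a placeholder you would replace with your own GCS bucket):

```shell
# Install the GCS repository plugin on every node, then restart each node
sudo bin/elasticsearch-plugin install repository-gcs

# Register a snapshot repository backed by a GCS bucket
# (the bucket must already exist and the nodes need credentials to access it)
curl -XPUT "http://x.x.x.x:9200/_snapshot/gcs_backup" \
  -H 'Content-Type: application/json' -d '{
    "type": "gcs",
    "settings": {
        "bucket": "my-es-backups",
        "compress": true
    }
}'

# Take a snapshot into that repository
curl -XPUT "http://x.x.x.x:9200/_snapshot/gcs_backup/snapshot_1?wait_for_completion=true"
```

Because the bucket is shared object storage rather than a filesystem, this avoids the "store location is not shared between nodes" verification error you hit with the `fs` repository type.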

Will Hayworth Avatar answered Oct 31 '22 17:10
