I know that saved Kibana dashboards (i.e., the JSON representation of a dashboard) are stored in, or at least associated with, a particular Elasticsearch instance. If I save a dashboard while Kibana is attached to one Elasticsearch server and then point Kibana at a different server, the saved dashboard disappears; switching back to the original server address brings it back.
My question, then, is: where exactly within the Elasticsearch installation directory are the dashboards saved? I would rather run a script that automatically loads my pre-created Kibana dashboards than copy and paste JSON through the web console every time I start up a new Elasticsearch instance.
Thank you for the help.
According to this Google Groups post, the dashboards are saved into the kibana-int index with a _type of dashboard and an _id of whatever I named the dashboard. So, to load my dashboards into a new Elasticsearch instance, do I just need to execute a PUT into this index through curl? Is there a better way to do this?
Kibana stores its objects as documents in the .kibana index in Elasticsearch. The name of this index can be changed via the kibana.index configuration setting (starting with Kibana 4.2; prior to that, this setting was named kibana_index).
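You can confirm what is stored there by searching the index for documents of type dashboard. A standard-library sketch, where the host and index name are assumptions (adjust the index to match your kibana.index setting):

```python
import json
import urllib.request

ES_HOST = "localhost"     # assumption: your Elasticsearch host
KIBANA_INDEX = ".kibana"  # default; matches the kibana.index setting

def list_dashboard_ids(host=ES_HOST, index=KIBANA_INDEX):
    """Return the _id of every saved dashboard in the Kibana index."""
    url = "http://%s:9200/%s/dashboard/_search?size=100" % (host, index)
    with urllib.request.urlopen(url) as resp:
        hits = json.load(resp)["hits"]["hits"]
    return [hit["_id"] for hit in hits]

# list_dashboard_ids()  # requires a live cluster to run
```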
To open the dashboards, launch the Kibana web interface by pointing your browser to port 5601. For example, http://localhost:5601. Replace localhost with the name of the Kibana host. If you're using an Elastic Cloud instance, log in to your cloud account, then navigate to the Kibana endpoint in your deployment.
Edit the Kibana configuration file: the configuration file kibana.yml is located in the config subdirectory of <KIBANA_INSTALL_DIR>.
Yes, the Kibana dashboards are saved in Elasticsearch, under the kibana-int index by default (you can override that in the config.js file). If you want to move your Kibana dashboards to another ES cluster, you have two options:
EDIT: For the second option, if you feel more comfortable with Python, you can use the python elasticsearch library and its reindex helper: https://elasticsearch-py.readthedocs.org/en/latest/helpers.html#elasticsearch.helpers.reindex
In fact, it's very easy. Copy two folders:
1) .\elasticsearch\data\nodes\0\indices\.kibana
2) .\elasticsearch\data\nodes\0\indices\kibana-int
and paste them into the same location in the new Elasticsearch installation.
Here's a standalone Python script that copies Kibana dashboards from one Elasticsearch host to another.
#!/bin/env python
"""Migrate all the Kibana dashboards from SOURCE_HOST to DEST_HOST.

This script may be run repeatedly, but any dashboard changes on
DEST_HOST will be overwritten if so.
"""
import urllib2, urllib, json

SOURCE_HOST = "your-old-es-host"
DEST_HOST = "your-new-es-host"


def http_post(url, data):
    request = urllib2.Request(url, data)
    return urllib2.urlopen(request).read()


def http_put(url, data):
    opener = urllib2.build_opener(urllib2.HTTPHandler)
    request = urllib2.Request(url, data)
    request.get_method = lambda: 'PUT'
    return opener.open(request).read()


if __name__ == '__main__':
    old_dashboards_url = "http://%s:9200/kibana-int/_search" % SOURCE_HOST
    # All the dashboards (assuming we have fewer than 9999) from
    # kibana, ignoring those with _type: temp.
    old_dashboards_query = """{
        "size": 9999,
        "query": {"filtered": {"filter": {"type": {"value": "dashboard"}}}}
    }"""
    old_dashboards_results = json.loads(http_post(old_dashboards_url, old_dashboards_query))
    old_dashboards_raw = old_dashboards_results['hits']['hits']

    old_dashboards = {}
    for doc in old_dashboards_raw:
        old_dashboards[doc['_id']] = doc['_source']

    for id, dashboard in old_dashboards.iteritems():
        put_url = "http://%s:9200/kibana-int/dashboard/%s" % (DEST_HOST, urllib.quote(id))
        print http_put(put_url, json.dumps(dashboard))