I'm trying to understand Helm, and I'm hoping someone can ELI5 this to me or help me out.
So I ran the command below:
helm repo add coreos https://s3-eu-west-1.amazonaws.com/coreos-charts/stable/
Then I installed kube-prometheus using:
helm install coreos/kube-prometheus --name kube-prometheus -f values.yaml --namespace monitoringtest
Everything works fine, but now I'm trying to add some custom dashboards from JSON files and I'm struggling to understand how to do it.
I was following this: https://blogcodevalue.wordpress.com/2018/09/16/automate-grafana-dashboard-import-process/
In my values.yaml I added the following:
serverDashboardConfigmaps:
  - example-dashboards
I understand that if I run:
helm upgrade --install kube-prometheus -f values.yaml --namespace monitoringtest coreos/kube-prometheus
that should cause Grafana to pick up the ConfigMap below, called example-dashboards, and load the *.json files from the custom-dashboards folder.
apiVersion: v1
kind: ConfigMap
metadata:
  name: example-dashboards
data:
{{ (.Files.Glob "custom-dashboards/*.json").AsConfig | indent 2 }}

# Or
#
# data:
#   custom-dashboard.json: |-
# {{ (.Files.Get "custom.json") | indent 4 }}
#
# The filename (and consequently the key under data) must be in the format `xxx-dashboard.json` or `xxx-datasource.json`
# for them to be picked up.
Now, two questions:
How do I add the above ConfigMap to this Helm release?
Where is this custom-dashboards folder located? Is it on my laptop and then sent to Grafana? Do I need to copy all of https://s3-eu-west-1.amazonaws.com/coreos-charts/stable/ onto my laptop?
Sorry for spelling everything out, but I'm just trying to understand this.
To import a dashboard manually, start Grafana, access it via http://localhost:3000/, choose the + icon in the side menu, and then choose Import. You can upload a dashboard JSON file (e.g. one downloaded for the statistics you need to view), paste a dashboard URL, or paste dashboard JSON text directly into the text area. Alternatively, paste the ID of a published dashboard and click Load, then select Prometheus as the data source and click Import.
In the latest version of the kube-prometheus-stack chart (as of 2021), according to this answer on GitHub, you should just create a ConfigMap with the dashboard data and the right labels, and it will be picked up by the sidecar in the Grafana pod.
Example:
apiVersion: v1
kind: ConfigMap
metadata:
  name: grafana-dashboards-custom-1
  namespace: monitoring
  labels:
    grafana_dashboard: "1"
    prometheus: my-value
    release: prometheus
data:
  app-status.json: |-
    {
      "annotations": {
        "list": [
          {
comes from this helm chart value:
prometheus:
  prometheusSpec:
    serviceMonitorSelector:
      matchLabels:
        prometheus: my-value
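As a quick sanity check, here is a minimal sketch of applying the ConfigMap and confirming the label the sidecar looks for (assuming the manifest above is saved as grafana-dashboards-custom-1.yaml, a name chosen just for this example):
# Apply the dashboard ConfigMap; the Grafana sidecar watches for ConfigMaps
# carrying the grafana_dashboard: "1" label and loads their JSON automatically.
kubectl apply -f grafana-dashboards-custom-1.yaml
# Verify the labels are present so the sidecar will pick the dashboard up.
kubectl get configmap grafana-dashboards-custom-1 -n monitoring --show-labels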
You can find a good example of how to do this in the charts for prometheus-operator here:
https://github.com/helm/charts/tree/master/stable/prometheus-operator/templates/grafana
It is a ConfigMapList that picks up all the JSON files from a given directory and stores them in ConfigMaps, which are then read by Grafana.
{{- $files := .Files.Glob "dashboards/*.json" }}
{{- if $files }}
apiVersion: v1
kind: ConfigMapList
items:
{{- range $path, $fileContents := $files }}
{{- $dashboardName := regexReplaceAll "(^.*/)(.*)\\.json$" $path "${2}" }}
- apiVersion: v1
  kind: ConfigMap
  metadata:
    name: {{ printf "%s-%s" (include "prometheus-operator.fullname" $) $dashboardName | trunc 63 | trimSuffix "-" }}
    namespace: {{ template "prometheus-operator.namespace" . }}
    labels:
      {{- if $.Values.grafana.sidecar.dashboards.label }}
      {{ $.Values.grafana.sidecar.dashboards.label }}: "1"
      {{- end }}
      app: {{ template "prometheus-operator.name" $ }}-grafana
{{ include "prometheus-operator.labels" $ | indent 6 }}
  data:
    {{ $dashboardName }}.json: {{ $.Files.Get $path | toJson }}
{{- end }}
{{- end }}
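Note that .Files.Glob can only read files packaged inside the chart itself, so the dashboards/ (or custom-dashboards/) directory has to live in the chart directory next to templates/, not anywhere else on your laptop. A rough sketch of how that could look if you fetch the chart locally and add your own template (the file names below are just examples, not part of the upstream chart):
helm fetch coreos/kube-prometheus --untar   # download and unpack the chart into ./kube-prometheus
kube-prometheus/
├── Chart.yaml
├── values.yaml
├── custom-dashboards/                      # put your *.json dashboard files here
│   └── my-dashboard.json
└── templates/
    └── custom-dashboards-configmap.yaml    # the ConfigMap template from the question
# Then install/upgrade from the local copy instead of the remote repo:
helm upgrade --install kube-prometheus ./kube-prometheus -f values.yaml --namespace monitoringtest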
Keep in mind that the size of a ConfigMap is limited: https://stackoverflow.com/a/53015758/4252480