
How to ship logs from pods on Kubernetes running on top of GCP to elasticsearch/logstash?

I run new modules of my system in Google-Container-Engine. I would like to bring stdout and stderr from them (running in pods) to my centralised logstash. Is there an easy way to forward logs from pods to external logging service, e.g., logstash or elasticsearch?

asked Oct 11 '16 by Skarab


People also ask

How do you get logs from Kubernetes container?

You can see the logs of a particular container by running the command kubectl logs <pod name>. For example, this shows the Nginx logs generated in an Nginx container. If you want to access the logs of a crashed instance, you can use --previous. This method works for clusters with a small number of containers and instances.
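A minimal sketch, assuming a pod named nginx-6d4cf56db6-abc12 (a hypothetical name):

    kubectl logs nginx-6d4cf56db6-abc12              # logs of the current container
    kubectl logs --previous nginx-6d4cf56db6-abc12   # logs of the previously crashed instance
    kubectl logs -f nginx-6d4cf56db6-abc12           # stream (follow) the logs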

How do you aggregate Kubernetes logs?

The first option is to deploy tools like Fluentd, Filebeat, or Logstash to gather the logs from all nodes in your local cluster. One instance of a preferred application will then be deployed per node and will gather logs from all containers on the node. Typically, this is done using ReplicaSet or ReplicationController.
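A quick way to confirm this, assuming the collector pods carry the label name=fluentd-elasticsearch in the kube-system namespace (as in the DaemonSet shown in the answer below), is to list them together with the node each one runs on:

    kubectl get pods -n kube-system -l name=fluentd-elasticsearch -o wide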


1 Answer

I decided to log directly to elasticsearch, an external virtual machine that can be accessed at elasticsearch.c.my-project.internal (I am on Google-Cloud-Platform). It is quite easy:

  1. Set up a Service of type ExternalName named elasticsearch-logging that points to the elasticsearch instance (a quick DNS check for this name is sketched below, after both steps):

    apiVersion: v1
    kind: Service
    metadata:
      name: elasticsearch-logging
      namespace: kube-system
      labels:
        k8s-app: elasticsearch
        kubernetes.io/name: "elasticsearch"
    spec:
      type: ExternalName
      externalName: elasticsearch.c.my-project.internal
      ports:
        - port: 9200
          targetPort: 9200
    
  2. Deploy fluentd-elasticsearch as a DaemonSet. fluentd-elasticsearch automatically connects to the service named elasticsearch-logging (this follows the standard fluentd-elasticsearch DaemonSet definition):

    apiVersion: extensions/v1beta1
    kind: DaemonSet
    metadata:
      name: fluentd-elasticsearch
      namespace: kube-system
      labels:
        tier: monitoring
        app: fluentd-logging
        k8s-app: fluentd-logging
    spec:
      template:
        metadata:
          labels:
            name: fluentd-elasticsearch
        spec:
          containers:
            - name: fluentd-elasticsearch
              image: gcr.io/google_containers/fluentd-elasticsearch:1.19
              volumeMounts:
              - name: varlog
                mountPath: /var/log
              - name: varlibdockercontainers
                mountPath: /var/lib/docker/containers
                readOnly: true
          terminationGracePeriodSeconds: 30
          volumes:
          # host directories that contain the node's container log files,
          # mounted read-only above so fluentd can tail them
          - name: varlog
            hostPath:
              path: /var/log
          - name: varlibdockercontainers
            hostPath:
              path: /var/lib/docker/containers
    

Use kubectl logs fluentd-elasticsearch-... to check whether you were able to connect to the elasticsearch instance.
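You can also check that the ExternalName service resolves from inside the cluster; a quick sketch (the busybox image and the dns-test pod name are only for illustration):

    kubectl run dns-test --rm -i --tty --image=busybox --restart=Never -- \
        nslookup elasticsearch-logging.kube-system.svc.cluster.local

The lookup should return the CNAME elasticsearch.c.my-project.internal that the service points to.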

  3. Now, you can access Kibana and see the logs.
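How you reach Kibana depends on where it runs. If it is deployed inside the cluster (the service name kibana-logging below is only an assumption), one option is kubectl proxy; if it runs next to the external elasticsearch VM, open its URL on port 5601 directly:

    kubectl proxy
    # then browse to:
    # http://localhost:8001/api/v1/namespaces/kube-system/services/kibana-logging/proxy/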
answered Sep 23 '22 by Skarab