Set up Fluentd, Elasticsearch, and Kibana in Kubernetes

This document describes how to install Fluentd, Elasticsearch, and Kibana to search logs in Kubernetes.

Prerequisites

- A running Kubernetes cluster
- kubectl
- Helm

Contents

- Install Elasticsearch and Kibana
- Install Fluentd
- Install Dapr with JSON-formatted logs
- Search logs
- References

Install Elasticsearch and Kibana

  1. Create a namespace for monitoring tools
kubectl create namespace dapr-monitoring
  2. Add the Elastic Helm repo
helm repo add elastic https://helm.elastic.co
helm repo update
  3. Install Elasticsearch using Helm
helm install elasticsearch elastic/elasticsearch -n dapr-monitoring

If you are using minikube or want to disable persistent volumes for development purposes, use the following command instead.

helm install elasticsearch elastic/elasticsearch -n dapr-monitoring --set persistence.enabled=false --set replicas=1
  4. Install Kibana
helm install kibana elastic/kibana -n dapr-monitoring
  5. Validation

Ensure that Elasticsearch and Kibana are running in your Kubernetes cluster.

kubectl get pods -n dapr-monitoring
NAME                            READY   STATUS    RESTARTS   AGE
elasticsearch-master-0          1/1     Running   0          6m58s
kibana-kibana-95bc54b89-zqdrk   1/1     Running   0          4m21s

Install Fluentd

  1. Install the config map and Fluentd as a daemonset

Note: If Fluentd is already running in your cluster, enable the nested JSON parser so that it can parse JSON-formatted logs from Dapr (a sketch of such a filter is shown after these steps).

kubectl apply -f ./fluentd-config-map.yaml
kubectl apply -f ./fluentd-dapr-with-rbac.yaml
  2. Ensure that Fluentd is running as a daemonset
kubectl get pods -n kube-system -w
NAME                          READY   STATUS    RESTARTS   AGE
coredns-6955765f44-cxjxk      1/1     Running   0          4m41s
coredns-6955765f44-jlskv      1/1     Running   0          4m41s
etcd-m01                      1/1     Running   0          4m48s
fluentd-sdrld                 1/1     Running   0          14s
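
The nested JSON parser mentioned in the note above is typically a filter in the Fluentd configuration. The snippet below is only a minimal sketch of such a filter, not the exact contents of fluentd-config-map.yaml; the ConfigMap name, match pattern, and key name depend on your Fluentd setup.

apiVersion: v1
kind: ConfigMap
metadata:
  name: fluentd-config        # hypothetical name; use whatever your daemonset mounts
  namespace: kube-system
data:
  fluent.conf: |
    # Parse the JSON payload that Dapr writes into the container "log" field
    <filter kubernetes.**>
      @type parser
      key_name log
      reserve_data true
      <parse>
        @type json
      </parse>
    </filter>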

Install Dapr with JSON-formatted logs

  1. Install Dapr with JSON-formatted logs enabled
helm install dapr dapr/dapr --namespace dapr-system --set global.logAsJson=true
  2. Enable JSON-formatted logs in the Dapr sidecar

Add the dapr.io/log-as-json: "true" annotation to your deployment YAML.

Example:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: pythonapp
  labels:
    app: python
spec:
  replicas: 1
  selector:
    matchLabels:
      app: python
  template:
    metadata:
      labels:
        app: python
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "pythonapp"
        dapr.io/log-as-json: "true"
...
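
After adding the annotation, re-apply your deployment so the Dapr sidecar is injected with JSON-formatted logging enabled. The file name below is only an example; use the path of your own manifest.

kubectl apply -f ./pythonapp.yaml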

Search logs

Note: Elasticsearch takes some time to index the logs that Fluentd sends.

  1. Port-forward to svc/kibana-kibana
$ kubectl port-forward svc/kibana-kibana 5601 -n dapr-monitoring
Forwarding from 127.0.0.1:5601 -> 5601
Forwarding from [::1]:5601 -> 5601
Handling connection for 5601
Handling connection for 5601
  2. Browse http://localhost:5601

  3. Click Management -> Index Management

(Screenshot: Kibana Management -> Index Management)

  4. Wait until dapr-* is indexed (you can also check this from the command line; see the commands after this step).

(Screenshot: dapr-* index in Index Management)
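
If you prefer to verify the index from the command line instead of the Kibana UI, you can query Elasticsearch directly. This assumes the default service name elasticsearch-master created by the Elastic Helm chart; run the port-forward in a separate terminal.

kubectl port-forward svc/elasticsearch-master 9200 -n dapr-monitoring
curl "http://localhost:9200/_cat/indices/dapr-*?v"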

  5. Once dapr-* is indexed, click Kibana -> Index Patterns and then Create Index Pattern.

(Screenshot: Create Index Pattern)

  6. Define the index pattern: type dapr* in the index pattern field.

(Screenshot: define index pattern)

  7. Select the time filter field: @timestamp.

(Screenshot: time filter field selection)

  8. Confirm that scope, type, app_id, level, etc. are being indexed.

Note: If you cannot find an indexed field yet, please wait; how long indexing takes depends on the volume of data and the resources available to Elasticsearch.

(Screenshot: indexed fields)

  9. Click the Discover icon and search for scope:*.

Note: It can take some time for logs to become searchable, depending on the data volume and available resources.

(Screenshot: Discover view)
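
Once the fields above are indexed, you can narrow the search beyond scope:*. For example, assuming your application uses the pythonapp app id from the deployment example earlier, a query in the Discover search bar might look like:

app_id:pythonapp AND level:info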

References