Configuring the ELK services

You need to configure the different services so they can communicate with each other.

The ELK stack is composed of four services:

  • Filebeat: Bridge between the Remote Engine Gen2 and the ELK stack, more specifically Logstash.
  • Logstash: Data processing pipeline that ingests the data, transforms it, and sends it to Elasticsearch.
  • Elasticsearch: Search and analytics engine.
  • Kibana: User interface to visualize your Elasticsearch data and navigate the Elastic Stack.
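
If you run the ELK stack with Docker Compose, the four services can be declared in a single docker-compose.yml attached to the same default network, so that they reach each other by service name (elasticsearch, logstash, kibana, filebeat). The following file is only a minimal sketch: the image versions, volume paths, and port mappings are assumptions to adapt to your own setup.

  services:
    elasticsearch:
      image: docker.elastic.co/elasticsearch/elasticsearch:7.17.0   # example version
      environment:
        - discovery.type=single-node
      ports:
        - "9200:9200"
    logstash:
      image: docker.elastic.co/logstash/logstash:7.17.0
      volumes:
        - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf:ro
      ports:
        - "5044:5044"
    kibana:
      image: docker.elastic.co/kibana/kibana:7.17.0
      environment:
        - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
      ports:
        - "5601:5601"
    filebeat:
      image: docker.elastic.co/beats/filebeat:7.17.0
      user: root   # Filebeat needs to read the Docker log files and socket
      volumes:
        - ./filebeat/filebeat.yml:/usr/share/filebeat/filebeat.yml:ro
        - /var/lib/docker/containers:/var/lib/docker/containers:ro
        - /var/run/docker.sock:/var/run/docker.sock:ro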

To configure the link between each service:

Procedure

  1. In the filebeat/filebeat.yml file, set the location of your Logstash server. In this example it is logstash:5044, because Filebeat and Logstash run in the same Docker network and can reach each other by service name.
    filebeat.autodiscover:
      providers:
        - type: docker
          templates:
            - condition:
                contains:
                  docker.container.labels.filebeat_ingest: "true"
              config:
                - type: container
                  paths:
                    - /var/lib/docker/containers/${data.docker.container.id}/*.log
                  json.keys_under_root: true
                  json.add_error_key: true
                  json.message_key: message
                  ignore_older: 10m
    
    output:
      logstash:
        hosts: [ 'logstash:5044' ]

    This configuration enables Filebeat to consume the Remote Engine Gen2 logs and send them to Logstash for parsing.

    The filebeat/filebeat.yml file also defines how the logs are discovered. In this case, Filebeat looks for the logs of containers that carry the label filebeat_ingest: "true".
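
    For Filebeat to pick a container up, that container must carry the filebeat_ingest label. The snippet below is a hypothetical illustration of how the label could be declared when the Remote Engine Gen2 containers are started with Docker Compose; the service name remote-engine-gen2 is a placeholder for your actual service.

    services:
      remote-engine-gen2:            # placeholder service name
        labels:
          filebeat_ingest: "true"    # matches the autodiscover condition above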

  2. In the logstash.conf file, make sure that the specified input port corresponds to your Logstash server, and that it matches the port set in the filebeat.yml file.
  3. Still in logstash.conf, set the output to elasticsearch and specify the host and port of your Elasticsearch server, 9200 in this example.
    input {
      beats {
        port => 5044
      }
    }
    
    output {
        elasticsearch {
          hosts => '${ELASTICSEARCH_HOSTS:elasticsearch:9200}'
          index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      }
    }

    This simple input and output configuration lets Logstash receive the events sent by Filebeat on port 5044 and index them into Elasticsearch at the specified host and port, where Kibana can then read and visualize them.
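
    To verify that events are flowing through the pipeline, you can temporarily add a stdout output next to the elasticsearch one in logstash.conf; every processed event is then printed to the Logstash container logs. This is an optional debugging aid, not part of the required configuration:

    output {
      # keep the existing elasticsearch output and add:
      stdout {
        codec => rubydebug
      }
    }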
