Filebeat: collecting logs from multiple directories

Because of business needs, a single node on our servers runs several services, covering both the front end and the back end, so we use Filebeat to collect these logs and build a separate index for each of them. The configuration is as follows (for reference only):

Input configuration:

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/nginx/*.log
  fields:
    log_type: "nginx"

  # These options assume nginx writes its access log as JSON (one object per line).
  json.keys_under_root: true
  json.overwrite_keys: true
- type: log
  enabled: true
  paths:
    - /var/log/elasticsearch/elasticsearch.log
  fields:
    log_type: "es"

  # Lines starting with whitespace (e.g. stack trace continuations) are
  # appended to the previous line, so a multi-line entry becomes one event.
  multiline.pattern: '^\s'
  multiline.negate: false
  multiline.match: after

- type: log
  enabled: true
  paths:
    - /data/ruoyi/*.log
  fields:
    log_type: "ruoyi"

  # Same multiline handling as above: indented lines are appended to the
  # previous line.
  multiline.pattern: '^\s'
  multiline.negate: false
  multiline.match: after
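
Note that the json options on the nginx input only take effect if nginx itself is configured (via its log_format / access_log directives) to write each access log entry as a single JSON object per line. A rough sketch of what such a line might look like; the field names here are only an assumption and depend entirely on the log_format you define:

{"time_local": "02/Jul/2021:15:08:00 +0800", "remote_addr": "192.168.53.10", "request": "GET /index.html HTTP/1.1", "status": 200, "body_bytes_sent": 512}

keys_under_root then promotes the decoded keys to the top level of the event, and overwrite_keys lets them replace fields of the same name that Filebeat adds itself.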

Output configuration:

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["192.168.53.21:9200","192.168.53.22:9200"]
  index: "nginx-%{+yyyy.MM}"
  indices:
    - index: "es-log"
      when.contains:
        fields:
          log_type: "es"
    - index: "ruoyi-log"
      when.contains:
        fields:
          log_type: "ruoyi"

To explain briefly: events are distinguished by field, namely the custom log_type field added to each input, and a different index is created for each value. The index setting directly under output.elasticsearch (right below hosts) is the default: anything that does not match the two conditions listed under indices is written to the nginx index.
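One caveat worth noting for Filebeat 7.x: a custom index name under output.elasticsearch is ignored while index lifecycle management is enabled, and it also requires a template name and pattern to be set. A minimal sketch of the extra settings; the template name and pattern values below are only placeholders, adjust them to your own naming scheme:

# Needed (on 7.x) for the custom index / indices settings above to take effect
setup.ilm.enabled: false
setup.template.name: "filebeat"
setup.template.pattern: "filebeat-*"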
