Logstash: replacing strings, parsing JSON data, converting data types, and extracting the log time

In some cases a log file contains JSON-like text, except that it uses single quotes instead of double quotes, in the format shown below. From these log lines we want to extract the fields with the correct names and types.

{'usdCnyRate': '6.728', 'futureIndex': '463.36', 'timestamp': '1532933162361'}
{'usdCnyRate': '6.728', 'futureIndex': '463.378', 'timestamp': '1532933222335'}
{'usdCnyRate': '6.728', 'futureIndex': '463.38', 'timestamp': '1532933348347'}
{'usdCnyRate': '6.728', 'futureIndex': '463.252', 'timestamp': '1532933366866'}
{'usdCnyRate': '6.728', 'futureIndex': '463.31', 'timestamp': '1532933372350'}
{'usdCnyRate': '6.728', 'futureIndex': '463.046', 'timestamp': '1532933426899'}
{'usdCnyRate': '6.728', 'futureIndex': '462.806', 'timestamp': '1532933432346'}
{'usdCnyRate': '6.728', 'futureIndex': '462.956', 'timestamp': '1532933438353'}
{'usdCnyRate': '6.728', 'futureIndex': '462.954', 'timestamp': '1532933456796'}
{'usdCnyRate': '6.728', 'futureIndex': '462.856', 'timestamp': '1532933492411'}
{'usdCnyRate': '6.728', 'futureIndex': '462.776', 'timestamp': '1532933564378'}
{'usdCnyRate': '6.728', 'futureIndex': '462.628', 'timestamp': '1532933576849'}
{'usdCnyRate': '6.728', 'futureIndex': '462.612', 'timestamp': '1532933588338'}
{'usdCnyRate': '6.728', 'futureIndex': '462.718', 'timestamp': '1532933636808'}

If we treat these lines as JSON and parse them directly with the Logstash json filter plugin, it fails with an error like this:

[WARN ] 2018-07-31 10:20:12.708 [Ruby-0-Thread-5@[main]>worker1: :1] json - Error parsing json {:source=>"message", :raw=>"{'usdCnyRate': '6.728', 'futureIndex': '462.134', 'timestamp': '1532933714371'}", :exception=>#<LogStash::Json::ParserError: Unexpected character (''' (code 39)): was expecting double-quote to start field name at [Source: (byte[])"{'usdCnyRate': '6.728', 'futureIndex': '462.134', 'timestamp': '1532933714371'}"; line: 1, column: 3]>}
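
For reference, parsing "directly" here just means pointing the json filter at the message field with no preprocessing; a minimal filter section that reproduces the error above would look something like this:

filter {
    json {
        source => "message"
    }
}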

The simplest approach here, in my view, is to replace the single quotes with double quotes, which is done with the Logstash mutate filter's gsub option. Pay close attention to the mutate/gsub block in the configuration below: it performs the string replacement, and the json block right after it parses the rewritten message. We also need to convert usdCnyRate and futureIndex to float (the mutate/convert block) and convert timestamp to a date stored in a new field, logdate (the date block), which uses the Logstash date filter plugin.

input {
    file {
        path => "/usr/share/logstash/wb.cond/test.log"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}
filter {
    # replace every single quote in the raw line with a double quote
    # so that the message becomes valid JSON
    mutate {
        gsub => [
            "message", "'", '"'
        ]
    }
    # parse the rewritten message into top-level fields
    json {
        source => "message"
    }
    # the parsed values are strings; convert the numeric ones to float
    mutate {
        convert => {
            "usdCnyRate" => "float"
            "futureIndex" => "float"
        }
    }
    # timestamp is epoch milliseconds; parse it into a new logdate field
    date {
        match => [ "timestamp", "UNIX_MS" ]
        target => "logdate"
    }
}
output {
    stdout {
        codec => rubydebug
    }
}
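
To try this out, save the configuration to a file (the name and location below are just an assumption, e.g. wb.cond/test.conf under the Logstash installation directory to match the log path above) and start Logstash with the -f flag:

bin/logstash -f wb.cond/test.conf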

With the configuration above, the fields and their types are parsed out of the log file correctly:

{
        "message" => "{\"usdCnyRate\": \"6.728\", \"futureIndex\": \"463.378\", \"timestamp\": \"1532933222335\"}",
     "@timestamp" => 2018-07-31T10:48:48.600Z,
           "host" => "logstashvm0",
           "path" => "/usr/share/logstash/wb.cond/test.log",
       "@version" => "1",
        "logdate" => 2018-07-30T06:47:02.335Z,
     "usdCnyRate" => 6.728,
      "timestamp" => "1532933222335",
    "futureIndex" => 463.378
}
{
        "message" => "{\"usdCnyRate\": \"6.728\", \"futureIndex\": \"463.252\", \"timestamp\": \"1532933366866\"}",
     "@timestamp" => 2018-07-31T10:48:48.602Z,
           "host" => "logstashvm0",
           "path" => "/usr/share/logstash/wb.cond/test.log",
       "@version" => "1",
        "logdate" => 2018-07-30T06:49:26.866Z,
     "usdCnyRate" => 6.728,
      "timestamp" => "1532933366866",
    "futureIndex" => 463.252
}
{
        "message" => "{\"usdCnyRate\": \"6.728\", \"futureIndex\": \"463.31\", \"timestamp\": \"1532933372350\"}",
     "@timestamp" => 2018-07-31T10:48:48.602Z,
           "host" => "logstashvm0",
           "path" => "/usr/share/logstash/wb.cond/test.log",
       "@version" => "1",
        "logdate" => 2018-07-30T06:49:32.350Z,
     "usdCnyRate" => 6.728,
      "timestamp" => "1532933372350",
    "futureIndex" => 463.31
}
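
Note that each event still keeps the raw message string and the original timestamp string alongside the parsed fields. If you do not need them in the final output, they can be dropped with the remove_field option that all filter plugins support; a minimal sketch (not part of the configuration above):

    mutate {
        remove_field => [ "message", "timestamp" ]
    }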