php - How to solve data loss when Logstash writes data to Elasticsearch
欧阳克 2017-07-01 09:11:55

My log file contains more than 600 entries, but only a little over 300 of them end up in Elasticsearch.
Does anyone know what could cause this?
This is my configuration:
input {
  file {
    path           => ["/usr/local/20170730.log"]
    type           => "log_test_events"
    tags           => ["log_tes_events"]
    start_position => "beginning"
    sincedb_path   => "/data/logstash/sincedb/test.sincedb"
    codec          => "json"
    close_older    => "86400"   # 1 day
    ignore_older   => "86400"
  }
  beats { port => 5044 }
}
filter {

urldecode {
    all_fields => true
}

}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logstash_%{event_date}"
  }
  stdout { codec => json }
}
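One way to see which events Elasticsearch rejects is to print every event in a readable form and watch the Logstash log for errors from the elasticsearch output while the file is re-read. A minimal diagnostic sketch, assuming the same pipeline as above and using Logstash's rubydebug codec:

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logstash_%{event_date}"
  }
  # rubydebug prints each event as a readable hash, which makes it easier
  # to spot events whose fields do not match the existing index mapping
  stdout { codec => rubydebug }
}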


Replies (1)
过去多啦不再A梦

When Logstash reads the log, Elasticsearch's dynamic mapping creates the field types automatically from the format of the data. For example, if a field a sometimes holds an int and sometimes a string, and the first value indexed happens to be a number, the field is mapped as an integer type, and later documents where that field is a string fail to be indexed.
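For example (hypothetical log lines, using the event_id field from the template below): if the first line indexed is

{"event_id": 1001, "event_date": "20170730"}

then dynamic mapping types event_id as a number, and a later line such as

{"event_id": "reg_1002", "event_date": "20170730"}

is rejected with a mapping error, which is why only part of the log reaches Elasticsearch.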

Modify the configuration to use an explicit mapping template:
output {
  elasticsearch {
    hosts              => "localhost:9200"
    index              => "test1"
    manage_template    => true
    template_overwrite => true
    template           => "/usr/local/logstash/templates/stat_day.json"
  }
}

Format of the stat_day.json template:

{
  "order": 1,
  "template": "test1",
  "mappings": {
    "log_test": {
      "properties": {
        "event_id": { "type": "string" }
      }
    }
  }
}
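If the daily index pattern from the question (logstash_%{event_date}) is kept instead of the fixed test1 index, the "template" field has to be a wildcard pattern that matches those indices, otherwise the mapping is never applied to them. A sketch of that variant (the field name event_id is kept from the template above):

{
  "order": 1,
  "template": "logstash_*",
  "mappings": {
    "log_test": {
      "properties": {
        "event_id": { "type": "string" }
      }
    }
  }
}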
