Re: syslog / logstash problem with timestamp




> > {"index"=>{"_index"=>"%{[@metadata][comline]}-%{[@metadata][version]}",  
> > "_type"=>"doc", "_id"=>"U1XLXGkBpfl5FoHeY4J8", "status"=>400,  
> > "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to  
> > parse field [timestamp] of type [date]",  
> > "caused_by"=>{"type"=>"illegal_argument_exception",  
> > "reason"=>"Invalid format: \"Mar  8 11:13:54\""}}}}}
> [2019-03-08T11:13:47,125][WARN ][logstash.outputs.elasticsearch] Could  
> not index event to Elasticsearch. {:status=>400, :action=>["index",  
> {:_id=>nil,  
> :_index=>"%{[@metadata][comline]}-%{[@metadata][version]}",  
> :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x3af3f839>],  
> :response=>{"index"=>{"_index"=>"%{[@metadata][comline]}-%{[@metadata][version]}", "_type"=>"doc", "_id"=>"VFXLXGkBpfl5FoHeY4Ly", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [timestamp] of type [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"Mar  8  
> 11:13:54\""}}}}}
> [2019-03-08T11:13:47,202][WARN ][logstash.outputs.elasticsearch] Could  
> not index event to Elasticsearch. {:status=>400, :action=>["index",  
> {:_id=>nil,  
> :_index=>"%{[@metadata][comline]}-%{[@metadata][version]}",  
> :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x4fedebdc>],  
> :response=>{"index"=>{"_index"=>"%{[@metadata][comline]}-%{[@metadata][version]}", "_type"=>"doc", "_id"=>"VVXLXGkBpfl5FoHeZII_", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [timestamp] of type [date]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"Mar  8  
> 11:13:54\""}}}}}


To be pedantic, that's not a logstash error; it's an elasticsearch
error - elasticsearch is rejecting the document because the [timestamp]
field doesn't match its date mapping.

What logstash filters do you have for syslog messages? Make sure you
have a filter in place that does something like:

   filter {
     if [type] == "syslog" {
       grok {
         match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
         add_field => {
           "received_at"   => "%{@timestamp}"
           "received_from" => "%{host}"
         }
       }
       date {
         # syslog timestamps have no fractional seconds, so don't
         # include .SSS; "MMM  d" (two spaces) matches space-padded
         # single-digit days like "Mar  8"
         match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
         timezone => "UTC"
       }
     }
   }

That will break the message apart into fields and parse the timestamp
into a proper date. You will probably have to play with it to get it
exactly how you want it - the actual content of the syslog message as
seen by logstash often depends on its source (syslog, filebeat, redis,
etc.) - pay particular attention to the format of the timestamps.
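The reason the date filter needs two patterns is that classic BSD
syslog pads a single-digit day with a space ("Mar  8" vs "Mar 18").
A quick Python sketch (illustration only - the real fix belongs in the
Logstash config above, and the function name here is made up) shows
what a normalized timestamp would look like:

```python
from datetime import datetime

def normalize_syslog_ts(ts: str, year: int = 2019) -> str:
    """Parse a BSD syslog timestamp like 'Mar  8 11:13:54' (note the
    space-padded single-digit day) and return an ISO 8601 string that
    elasticsearch's default date mapping will accept."""
    # Python's strptime treats whitespace in the format loosely, so one
    # pattern covers both 'Mar  8' and 'Mar 18'; Logstash's date filter
    # (Joda-based) needs the explicit "MMM  d" / "MMM dd" pair instead.
    dt = datetime.strptime(ts, "%b %d %H:%M:%S").replace(year=year)
    return dt.isoformat()

print(normalize_syslog_ts("Mar  8 11:13:54"))  # 2019-03-08T11:13:54
print(normalize_syslog_ts("Mar 18 11:13:54"))  # 2019-03-18T11:13:54
```

Note that syslog timestamps carry no year at all, which is why the
sketch has to supply one - logstash's date filter does the same thing
by assuming the current year.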

P.





_______________________________________________
CentOS mailing list
CentOS@xxxxxxxxxx
https://lists.centos.org/mailman/listinfo/centos


