The csv filter is not useful in your context. Its goal is to parse incoming CSV data, which is not what you have. What you need instead is to parse the log lines with a grok filter first; only then can you send them properly to the csv output:
filter {
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:TIMESTAMP}\]\[%{LOGLEVEL:LOGLEVEL} \]\[%{DATA:QUERY}\] \[%{WORD:QUERY1}\] \[%{WORD:INDEX}\]\[%{INT:SHARD}\] took\[%{BASE10NUM:TOOK}ms\], took_millis\[%{BASE10NUM:took_millis}\], types\[%{DATA:types}\], stats\[%{DATA:stats}\], search_type\[%{DATA:search_type}\], total_shards\[%{INT:total_shards}\], source\[%{DATA:source}\], extra_source\[%{DATA:extra_source}\]" }
  }
}
output {
  csv {
    fields => ["TIMESTAMP","LOGLEVEL","QUERY","QUERY1","INDEX","SHARD","TOOK","took_millis","types","stats","search_type","total_shards","source","extra_source"]
    path => "F:/logstash-5.1.1/logstash-5.1.1/finaloutput1"
    spreadsheet_safe => false
  }
}
Note that the square brackets need to be escaped in the grok pattern (they are regex metacharacters), and the names in the csv output's fields list must match the field names captured by grok (e.g. INDEX and source, not INDEX-NAME and source_query), otherwise those columns come out empty.
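For reference, here's the kind of slow-log line the pattern above expects (a made-up sample; also note that %{WORD:QUERY1} won't match node names containing hyphens, so adjust the pattern if yours do):

  [2017-01-14T10:59:58,591][WARN ][index.search.slowlog.query] [node1] [myindex][0] took[142.3ms], took_millis[142], types[doc], stats[], search_type[QUERY_THEN_FETCH], total_shards[5], source[{"query":{"match_all":{}}}], extra_source[]

which would produce a CSV row along the lines of:

  "2017-01-14T10:59:58,591",WARN,index.search.slowlog.query,node1,myindex,0,142.3,142,doc,,QUERY_THEN_FETCH,5,"{""query"":{""match_all"":{}}}",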
Note: this doesn't yet work on Logstash 5.1.1 because of an open issue in the csv output plugin. It should get fixed soon, but in the meantime the above works on Logstash 2.4.
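Until that's fixed, one possible stopgap (not from the original answer, and it doesn't quote values for you, so fields containing commas will break the columns) is to write a comma-separated line yourself with the standard file output and a line codec; the path below is just an example:

  output {
    file {
      path => "F:/logstash-5.1.1/finaloutput1.csv"
      codec => line {
        format => "%{TIMESTAMP},%{LOGLEVEL},%{QUERY},%{QUERY1},%{INDEX},%{SHARD},%{TOOK},%{took_millis},%{types},%{stats},%{search_type},%{total_shards},%{source},%{extra_source}"
      }
    }
  }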