Enhancing Kibana
We may find the need to parse additional logs into Kibana for later use.
How it works
The Logstash config file is generated as follows:

- The logsearch-4cf packaging runs the build script, which executes `rake build`.
- The logsearch-4cf `rake build` renders the main template file and creates `logstash-filters-default.conf`. It follows the `parsing-rules` order to accomplish its task.
- The logsearch property `logstash_parser.filters` takes a list of files that must be present on disk; these contain Logstash filters.
- The standard logsearch-4cf `logstash-filters-default.conf` is added to `logstash_parser.filters`.
- We add the path of our own `custom-filters` file to `logstash_parser.filters`.
- With the help of our PR we can add content to the `custom-filters` file by filling in the property `logstash_parser.custom_filters`.
- The logsearch `parser_ctl` gathers all the files, including the ones from `logstash_parser.filters`, to create `/var/vcap/jobs/parser/config/logstash.conf` (sketched below), which is then used by Logstash on each run.
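For orientation, here is a rough sketch of how the generated `logstash.conf` is laid out. The input and output blocks below are illustrative placeholders, not the real ones rendered by the parser job's templates:

```
input {
  # illustrative placeholder; the real input is rendered by the parser job
  stdin { }
}

filter {
  # 1. filters from logstash-filters-default.conf (rendered by `rake build`)
  # 2. filters from our custom-filters file (logstash_parser.custom_filters)
}

output {
  # illustrative placeholder; the real job ships events to Elasticsearch
  stdout { codec => rubydebug }
}
```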
Running Logstash
- Download `logstash@5.0.0` (you will need Java 8).
- Install the translate plugin: `logstash-plugin install logstash-filter-translate`
- Copy `/var/vcap/jobs/parser/config/logstash.conf` locally, or use the minimal config `logsearch_logstash.conf` (see the sketch after this list).
- Edit the config file and point it at the log input file.
- Run Logstash: `logstash -f ../../logstash/logsearch_logstash.conf`
- You may find the Grok debugger useful.
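A minimal sketch of such a local config — the input path is a placeholder, and the `rubydebug` stdout output is only there so you can inspect parsed events while developing filters:

```
input {
  file {
    # placeholder: point this at the log file you want to replay
    path => "/tmp/sample.log"
  }
}

filter {
  # paste the filters under development here
}

output {
  # print each parsed event for inspection
  stdout { codec => rubydebug }
}
```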
Patience
Working with the above can be a tedious process, but a few tricks save a lot of time.

- To restart reading from the beginning of the file, run `rm data/plugins/inputs/file/.sincedb_*` and restart Logstash.
- To start reading the log file from the beginning every time you restart Logstash, configure the file input like this:

```
file {
  path => "/Users/colin/Documents/Boulot/gds/logstash/nginx_access.log"
  start_position => "beginning"
  sincedb_path => "/dev/null"
}
```
If you combine this with automatic config reloading (https://www.elastic.co/guide/en/logstash/current/reloading-config.html) you can save a lot of time.
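For example, assuming the Logstash 5.x flag described on that page, you can start Logstash with automatic reloading so edited filters are picked up without a restart:

```
logstash -f ../../logstash/logsearch_logstash.conf --config.reload.automatic
```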
Working example
Now, let’s say we’ve got a log like:
```
May 11 13:24:09 57n27d4b-60p0-42ly-a00a-f913j7b8841f custom_nginx_access: 10.0.1.217 - - [11/May/2017:13:24:09 +0000] "GET /healthz HTTP/1.1" 200 3 "-" "ELB-HealthChecker/1.0"
```
By adding a few extra lines to our Logstash configuration file, we can specify what these fields actually are:
```
if [@source][component] == "custom_nginx_access" {
  grok {
    match => {
      "@message" =>
        '%{IPORHOST:[nginx][clientip]} - - \[%{HTTPDATE:[nginx][timestamp]}\] "%{WORD:[nginx][verb]} %{URIPATHPARAM:[nginx][request]} HTTP/%{NUMBER:[nginx][httpversion]}" %{NUMBER:[nginx][response]} (?:%{NUMBER:[nginx][bytes]}|-) (?:"(?:%{URI:[nginx][referrer]}|-)"|%{QS:[nginx][referrer]}) %{QS:[nginx][agent]}'
    }
  }
}
```
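As an optional addition (not part of the filter above), a `date` filter inside the same `if` block would promote the captured nginx timestamp to the event's `@timestamp`, so Kibana orders entries by request time rather than ingestion time:

```
date {
  # parse e.g. "11/May/2017:13:24:09 +0000" into @timestamp
  match => ["[nginx][timestamp]", "dd/MMM/yyyy:HH:mm:ss Z"]
}
```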
After a restart, a successfully parsed entry in Kibana will look like this:
Key | Value
---|---
nginx.agent | "ELB-HealthChecker/1.0"
nginx.bytes | 566
nginx.clientip | 10.0.16.6
nginx.httpversion | 1.1
nginx.request | /info
nginx.response | 200
nginx.response_time | 0.029
nginx.timestamp | 11/May/2017:14:59:04 +0000
nginx.vcap_request_id | 9ed66476-764d-486e-b52c-05280929f726
nginx.verb | GET
nginx.x_forwarded_for | 10.0.0.116