Logstash is a data pipeline that processes events and logs coming from a variety of systems. It is completely plugin based: with the help of input and output plugins it can consume data from many sources and, after processing, ship it to many destinations. For our use case, we'll take Logstash Forwarder as the input and Elasticsearch as the output. Kibana can then be attached to Elasticsearch for analytics and dashboards.
Logstash Forwarder: Logstash Forwarder is a log shipper that ships logs to Logstash. It was earlier known as Lumberjack and is written in Go.
Here is how to set up the integration described above:
1. Name the host: IP-based certificates generally cause issues with logstash-forwarder, so give the host a name. For this, make a host entry in /etc/hosts:
127.0.0.1 maverick.logstash.com
2. Create certificates: Logstash Forwarder requires certificates in order to communicate with Logstash. The paths of these certificates need to go into both the logstash-forwarder configuration and the Logstash configuration.
Use the following command to generate the key pair (you need OpenSSL installed):
openssl req -x509 -nodes -newkey rsa:2048 \
  -keyout /etc/pki/tls/private/logstash-forwarder/logstash-forwarder.key \
  -out /etc/pki/tls/certs/logstash-forwarder/logstash-forwarder.crt -days 365
Enter the name of the organisation, unit, etc. as prompted on the command line. Make sure you use the same host name during key creation as in step 1. The above command produces two files: a private key and a self-signed certificate. You need to copy these files to the Logstash host if logstash-forwarder and Logstash are not on the same machine.
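To skip the interactive prompts, the subject can also be passed on the command line, and the resulting certificate inspected to confirm the CN matches the host name from step 1. A minimal sketch, using throwaway paths under /tmp and a made-up organisation name (in practice, use the /etc/pki/tls paths above):

```shell
# Generate a key pair non-interactively, pinning the CN to the host
# name from step 1 (paths and -subj values here are examples only).
openssl req -x509 -nodes -newkey rsa:2048 \
  -subj "/O=Example/CN=maverick.logstash.com" \
  -keyout /tmp/logstash-forwarder.key \
  -out /tmp/logstash-forwarder.crt -days 365 2>/dev/null

# The certificate subject's CN should match the /etc/hosts entry
openssl x509 -in /tmp/logstash-forwarder.crt -noout -subject
```

If the CN printed here does not match the name in /etc/hosts, the forwarder will refuse the TLS connection later.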
Now add the certificate to the CA bundle:
openssl x509 -in /etc/pki/tls/certs/logstash-forwarder/logstash-forwarder.crt \
  -text >> /etc/pki/tls/certs/ca-bundle.crt
3. Install Logstash:
Download logstash archive:
Extract the archive:
tar -xzf logstash-1.5.2.tar.gz
Add a configuration file for Logstash, named logstash.conf, in logstash-1.5.2/bin. A sample logstash.conf looks like this:
input {
  lumberjack {
    port => 5043
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder/logstash-forwarder.key"
  }
}
filter {
  grok {
    match => [ "message", "%{COMBINEDAPACHELOG}" ]
  }
}
output {
  elasticsearch {
    host => "10.1.40.222"
    protocol => "http"
  }
  stdout { codec => rubydebug }
}
The input here is lumberjack, i.e. logstash-forwarder, and the output is Elasticsearch. The filter plugin processes log events; here it uses the "grok" filter, which accepts patterns and, based on the pattern, breaks a log event into meaningful fields. For our example, we'll use the Apache log pattern (COMBINEDAPACHELOG), which is built into Logstash.
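To get a feel for what COMBINEDAPACHELOG extracts, the snippet below splits a made-up combined-format log line into a few of the fields grok produces. The whitespace-based awk logic is only an approximation for illustration; inside Logstash, grok does the full pattern match, but clientip, response, and bytes are the actual grok field names:

```shell
# A sample line in Apache "combined" log format (made-up values)
line='10.1.40.10 - frank [10/Jul/2015:13:55:36 +0530] "GET /index.html HTTP/1.1" 200 2326 "http://example.com/" "Mozilla/5.0"'

# Rough, whitespace-based approximation of a few fields that grok's
# COMBINEDAPACHELOG pattern would extract from this line:
clientip=$(echo "$line" | awk '{print $1}')   # grok field: clientip
response=$(echo "$line" | awk '{print $9}')   # grok field: response
bytes=$(echo "$line" | awk '{print $10}')     # grok field: bytes
echo "clientip=$clientip response=$response bytes=$bytes"
# -> clientip=10.1.40.10 response=200 bytes=2326
```

In the rubydebug output on stdout you will see these fields alongside the original message on each parsed event.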
Note that 10.1.40.222 is the IP of the host running Elasticsearch.
Logstash can be started using the following command:
./logstash -f logstash.conf
4. Install Logstash Forwarder: The best way to install it is to build it from source. For that, we need Go first.
Download Go:
Extract it:
tar -C /usr/local -xzf go1.4.2.linux-amd64.tar.gz
Add Go to the PATH by making an entry in /etc/profile:
export PATH=$PATH:/usr/local/go/bin
Download the logstash-forwarder code:
git clone git://github.com/elasticsearch/logstash-forwarder.git
Compile the source:
go build -o logstash-forwarder
Create the logstash-forwarder config, logstash-forwarder.conf, like this:
{
  "network": {
    "servers": ["maverick.logstash.com:5043"],
    "ssl_ca": "/etc/pki/tls/certs/logstash-forwarder/logstash-forwarder.crt",
    "ssl_key": "/etc/pki/tls/private/logstash-forwarder/logstash-forwarder.key"
  },
  "files": [
    {
      "paths": ["/tmp/access_log"],
      "fields": { "type": "apache" }
    }
  ]
}
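logstash-forwarder fails fast on malformed JSON, so it is worth validating the config file before starting the forwarder. A minimal sketch, writing the config above to a /tmp path and checking it with python3's stdlib JSON validator (point the command at your real logstash-forwarder.conf in practice):

```shell
# Write the forwarder config shown above to a temporary file
cat > /tmp/logstash-forwarder.conf <<'EOF'
{
  "network": {
    "servers": ["maverick.logstash.com:5043"],
    "ssl_ca": "/etc/pki/tls/certs/logstash-forwarder/logstash-forwarder.crt",
    "ssl_key": "/etc/pki/tls/private/logstash-forwarder/logstash-forwarder.key"
  },
  "files": [
    {
      "paths": ["/tmp/access_log"],
      "fields": { "type": "apache" }
    }
  ]
}
EOF

# python3 -m json.tool exits non-zero on invalid JSON
python3 -m json.tool < /tmp/logstash-forwarder.conf > /dev/null && echo "config OK"
```

A missing comma or stray brace will be reported here instead of as a cryptic startup failure from the forwarder.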
Run logstash-forwarder like this:
./logstash-forwarder -config logstash-forwarder.conf
5. Run Elasticsearch:
./elasticsearch
6. Once logstash-forwarder, Logstash, and Elasticsearch are all running, put Apache logs at the location defined in the logstash-forwarder config ("/tmp/access_log" in the above example). You'll find that every line of the log is put into Elasticsearch as a separate document.
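To exercise the pipeline end to end, you can append a line in combined format to the watched file by hand; the IP, URL, and user agent below are made-up sample values:

```shell
# Append a sample Apache combined-log line to the file the forwarder watches
echo '127.0.0.1 - - [10/Jul/2015:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 1043 "-" "curl/7.35.0"' >> /tmp/access_log

# Confirm the line landed in the file
tail -n 1 /tmp/access_log
```

Shortly after, the parsed event should appear in the rubydebug output of Logstash and as a new document in Elasticsearch.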