ElastiFlow is an analytical tool for monitoring a network using NetFlow and/or IPFIX as the data source. ElastiFlow uses Logstash, Elasticsearch, and Kibana to process the data: Logstash ships the data from the source to the database, Elasticsearch stores the data, and Kibana visualizes it.
In this post we will share how to configure ElastiFlow using a Mikrotik router (NetFlow v5) as the source and CentOS 7 for the services (Logstash, Elasticsearch, and Kibana).
In this example we will use a single server to run the whole stack (Logstash, Elasticsearch, and Kibana). First, make sure Java (JDK 8) is installed on the server. We assume the server's IP is 192.168.1.1 and that the server is connected to the internet. The package versions are described in the table below.
Be careful with the firewall: open the ports used by Logstash, Elasticsearch, and Kibana. To keep things simple, in this case we disable the firewall.
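On CentOS 7 the default firewall is firewalld; disabling it as described above might look like this (in production, opening only ports 9200, 5601, and the NetFlow UDP port is safer):

```shell
# Stop firewalld and prevent it from starting on boot
# (not recommended for production; shown only to match this tutorial)
systemctl stop firewalld
systemctl disable firewalld
```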
We will install Elasticsearch from the .rpm file; follow the instructions below.
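A sketch of the download and install, assuming Elasticsearch 6.3.2 (substitute the version from the table above):

```shell
# Download the Elasticsearch RPM from the official Elastic artifacts site
curl -O https://artifacts.elastic.co/downloads/elasticsearch/elasticsearch-6.3.2.rpm
# Install it with rpm
rpm -ivh elasticsearch-6.3.2.rpm
```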
Edit the /etc/elasticsearch/elasticsearch.yml file and apply the configuration below.
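The original command output was not preserved, but a minimal configuration for this setup would look roughly like the following (values are illustrative; adjust to your environment):

```yaml
# /etc/elasticsearch/elasticsearch.yml
cluster.name: elastiflow
network.host: 192.168.1.1   # bind to the server address so it is reachable
http.port: 9200
```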
Starting and Testing
This step starts Elasticsearch and makes it start automatically on boot with the chkconfig command.
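A sketch of these commands (on CentOS 7, chkconfig forwards to systemd, so `systemctl enable elasticsearch` is equivalent):

```shell
# Reload unit files, start Elasticsearch, and enable it on boot
systemctl daemon-reload
systemctl start elasticsearch
chkconfig elasticsearch on
```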
Open the Elasticsearch HTTP endpoint at http://192.168.1.1:9200/ to make sure that Elasticsearch is running on the server.
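This can also be checked from the command line:

```shell
# A JSON response containing the cluster name and version number
# means Elasticsearch is up
curl http://192.168.1.1:9200/
```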
We will install Logstash from the tarball, so we recommend using tmux/screen to keep the service running in the background. We use the tarball because it lets us run Logstash in the background, so we can run multiple Logstash instances on one server, each on a different port.
Download and Extract
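A sketch of the download and extraction, again assuming version 6.3.2 (substitute the version from the table above):

```shell
# Download and unpack the Logstash tarball into /opt
cd /opt
curl -O https://artifacts.elastic.co/downloads/logstash/logstash-6.3.2.tar.gz
tar -xzf logstash-6.3.2.tar.gz
```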
Set JVM Size
Edit config/jvm.options in your Logstash tarball directory and adjust the JVM configuration section.
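The heap values below are illustrative (the ElastiFlow project recommends at least 4 GB for a full pipeline; size it to your server's RAM):

```sh
# config/jvm.options -- set initial and maximum heap to the same size
-Xms4g
-Xmx4g
```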
Add and Update Plugin
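The ElastiFlow project requires a few Logstash plugins to be installed or updated; a sketch assuming the tarball path used above:

```shell
cd /opt/logstash-6.3.2
# Update the codec used to decode NetFlow records
bin/logstash-plugin update logstash-codec-netflow
# Update the UDP input that receives the flows
bin/logstash-plugin update logstash-input-udp
```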
Clone ElastiFlow from Github
This step clones ElastiFlow from GitHub into the /opt/ directory.
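A sketch of the clone (the ElastiFlow 3.x repository is robcowart/elastiflow):

```shell
# Clone the ElastiFlow project into /opt/elastiflow
cd /opt
git clone https://github.com/robcowart/elastiflow.git
```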
Copy Pipeline Configuration
From the cloned ElastiFlow project we will copy the Logstash pipeline configuration files. Follow the instructions below.
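A sketch of the copy, assuming the clone and tarball paths used above (the pipeline lives under `logstash/elastiflow` in the cloned project):

```shell
# Copy the ElastiFlow pipeline into the Logstash tarball's config directory
cp -a /opt/elastiflow/logstash/elastiflow /opt/logstash-6.3.2/config/
```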
Setup Environment Variable
The cloned ElastiFlow project provides an executable file that sets the server's environment variables. The file is located at “/opt/elastiflow/profile.d/elastiflow.sh”. To make the environment file easier to use, copy it to the Logstash directory.
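A sketch of the copy, assuming the tarball path used above:

```shell
# Keep the environment file next to the Logstash installation
cp /opt/elastiflow/profile.d/elastiflow.sh /opt/logstash-6.3.2/
```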
Then configure the file to match your server. Edit the contents as instructed below, and leave the rest at their default values.
*Run this file before starting the Logstash service.
Add the following lines to the end of the file.
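The original snippet was not preserved; as an illustration, the ElastiFlow 3.x environment file exposes variables along these lines (names and values are assumptions; check your version of elastiflow.sh for the exact variables):

```sh
# Illustrative exports in elastiflow.sh -- adjust to your server
export ELASTIFLOW_NETFLOW_IPV4_HOST=0.0.0.0      # listen on all interfaces
export ELASTIFLOW_NETFLOW_IPV4_PORT=2055         # NetFlow UDP port
export ELASTIFLOW_ES_HOST=192.168.1.1:9200       # Elasticsearch endpoint
```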
Configure Elasticsearch Indices
We will use a single Elasticsearch node, so we must configure Logstash to store data with only one shard and no replicas. We could skip this step, but if we don't configure it, the indices will be unhealthy because the default template uses 3 shards and 1 replica. Edit the contents as instructed below, and leave the rest at their default values.
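The exact file depends on the ElastiFlow version; in 3.x the index template ships alongside the pipeline configuration, and the relevant settings fragment would look roughly like this (an assumption based on the shard/replica values stated above):

```json
{
  "settings": {
    "index": {
      "number_of_shards": 1,
      "number_of_replicas": 0
    }
  }
}
```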
Before we start the Logstash service, we must source the environment file “elastiflow.sh”. Follow the instructions below.
*Be patient while waiting for the service to come up; Logstash takes a few minutes to start.
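A sketch of the start-up, assuming the paths used above (run it inside tmux/screen so it survives the session):

```shell
cd /opt/logstash-6.3.2
# Load the ElastiFlow environment variables into this shell first
. ./elastiflow.sh
# Start Logstash with the ElastiFlow pipeline configuration
bin/logstash -f config/elastiflow/conf.d/
```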
In this step we will configure the Mikrotik router to enable NetFlow and point it at our server. Follow the instructions below.
The steps are described below:
1. Click “IP” menu
2. Then click “Traffic Flow”
3. Check on enabled radio button
4. Click “Targets” to set the target to our Logstash server
5. Click “Plus/Add Sign” button
6. A new dialog will appear; fill “Dst. Address” and “Port” with the Logstash server's address and port
7. Click “Ok”
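The WinBox steps above can also be done from the RouterOS terminal; a sketch, assuming the Logstash UDP port is 2055:

```shell
# Enable Traffic Flow (NetFlow) and send v5 records to the Logstash server
/ip traffic-flow set enabled=yes
/ip traffic-flow target add dst-address=192.168.1.1 port=2055 version=5
```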
Checking Elasticsearch Data
This step verifies that data from the Mikrotik router is being stored in Elasticsearch. Open http://192.168.1.1:9200/_cat/indices; this URL lists the data by index. Make sure you can see an index named elastiflow-3.2.2-xxxx.
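The same check from the command line:

```shell
# The ?v flag adds column headers; look for an elastiflow-3.2.2-* index
curl "http://192.168.1.1:9200/_cat/indices?v"
```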
Download and Install
This step installs Kibana from the .rpm file downloaded from the Kibana download site.
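A sketch, again assuming version 6.3.2 (substitute the version from the table above):

```shell
# Download the Kibana RPM from the official Elastic artifacts site and install it
curl -O https://artifacts.elastic.co/downloads/kibana/kibana-6.3.2-x86_64.rpm
rpm -ivh kibana-6.3.2-x86_64.rpm
```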
Configure the Kibana service so it can be accessed from outside the server. Edit the kibana.yml file and match the contents below.
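A minimal configuration for this setup (values are illustrative; `elasticsearch.url` is the Kibana 6.x setting name):

```yaml
# /etc/kibana/kibana.yml
server.host: "192.168.1.1"                    # listen on the external address
elasticsearch.url: "http://192.168.1.1:9200"  # where Elasticsearch is running
```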
Then start Kibana with the commands below, and also make Kibana start automatically on boot.
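A sketch of these commands:

```shell
# Start Kibana now and enable it on boot
systemctl start kibana
systemctl enable kibana
```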
Upload index pattern
Kibana needs the index pattern for the NetFlow data; we can upload the pattern via the API.
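A sketch of the upload using the Kibana saved-objects API (the JSON filename under /opt/elastiflow/kibana/ varies by ElastiFlow version, so check the cloned project for the exact name; the `kbn-xsrf` header is required by Kibana):

```shell
# Create the elastiflow-* index pattern from the JSON shipped with ElastiFlow
curl -X POST "http://192.168.1.1:5601/api/saved_objects/index-pattern/elastiflow-*" \
  -H "Content-Type: application/json" -H "kbn-xsrf: true" \
  -d @/opt/elastiflow/kibana/elastiflow.index_pattern.json
```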
Load Vizualization and Dashboard
The cloned project includes visualizations and a dashboard that we can easily load into Kibana. Open the Kibana UI at http://192.168.1.1:5601/ and follow these steps to load them:
- Click “Management” menu on the left side.
- Click “Saved Objects”
- Click “Import”
- Then choose the elastiflow.dashboards.6.3.x.json file, which can be found at “/opt/elastiflow/kibana/elastiflow.dashboards.6.3.x.json”
- Click “Ok” and wait until the process is done.
Now you can access the visualizations from the Dashboard menu.