HTTP and DNS Export to ElasticSearch using NPROBE

One of these days I was trying to implement just the nProbe module from the ntop stack, as I didn't need the whole pack. nProbe is the NetFlow probe/collector within ntop. The idea was to ship all the NetFlow data to Logstash, have it indexed into Elasticsearch, and view it in Kibana.

Logstash has a netflow module that can be consulted here. If you have an open-source NetFlow collector, that basic Logstash module will work just fine without much effort; the free version of nProbe will also work, limited to a number of flows. None of that was useful to me unless I had the DNS and URL data. After paying USD 500 for the license I expected to see the URL and DNS info in Kibana... but nothing was showing up, and I had to dig a bit to discover the reason. Here is what I found.


Finding the problem

The first thing I did was send an email to support. As you can see in the shopping table, I was entitled to 5 days of support from them, so I used it. Here is their answer:

Hi Anderson,
In order to export additional fields you need to specify the fields to the nProbe template. A basic template for your use case should contain %HTTP_HOST and %DNS_QUERY. The resulting template option is:

Please check out "nprobe -H" for more details.

It is curious, but you actually have to specify each and every field you want nProbe to export, not just the exceptions... crazy. So I added the line below to the configuration file:
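A template option along these lines should do it. This is a sketch: the base flow fields here are assumptions for a typical setup, and only %HTTP_HOST and %DNS_QUERY come from the support reply (check nprobe -H for the full field list):

```
# Hypothetical nprobe.conf template line; the base flow fields are an
# assumption, %HTTP_HOST and %DNS_QUERY are the fields support mentioned.
-T="%IPV4_SRC_ADDR %IPV4_DST_ADDR %L4_SRC_PORT %L4_DST_PORT %PROTOCOL %IN_BYTES %IN_PKTS %HTTP_HOST %DNS_QUERY"
```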


The next logical step was to check whether nProbe was sending the data and whether the data was being received by Logstash. On the nProbe node, a simple tcpdump on the port set in the configuration file (in my case, port 2055) gives us that answer.
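The capture can be as simple as the one-liner below. The interface name (eth0) is an assumption; adjust it to your environment:

```shell
# Watch for NetFlow export packets on the collector port (2055 in this setup).
tcpdump -i eth0 -nn udp port 2055
```

Run it on the nProbe node to confirm the flows are leaving, and on the Logstash node to confirm they are arriving.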

All good here: the packets were flowing.

The next step is to check for errors in the logs. With the command journalctl -xf running on the Logstash node, I found the errors below.

[2019-09-16T19:57:16,214][WARN ][logstash.codecs.netflow ] Unsupported field in template 259 {:type=>57678, :length=>2}
[2019-09-16T19:57:16,215][WARN ][logstash.codecs.netflow ] Unsupported field in template 260 {:type=>57678, :length=>2}
[2019-09-16T19:57:16,216][WARN ][logstash.codecs.netflow ] Unsupported field in template 261 {:type=>57652, :length=>128}
[2019-09-16T19:57:16,217][WARN ][logstash.codecs.netflow ] Unsupported field in template 262 {:type=>57652, :length=>128}

It basically says that a template arrived and some of its fields are not recognized. Using Wireshark I was able to spot exactly which fields were not being recognized by Logstash. Ha! They were precisely the HTTP and DNS fields I wanted.
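Wireshark works well for this, but the unsupported field type IDs can also be scraped straight from the warnings themselves. A minimal sketch (the log lines are copied from the journal output above):

```python
import re

# Warning lines as emitted by the logstash netflow codec (copied from above).
log_lines = [
    "[2019-09-16T19:57:16,214][WARN ][logstash.codecs.netflow ] Unsupported field in template 259 {:type=>57678, :length=>2}",
    "[2019-09-16T19:57:16,216][WARN ][logstash.codecs.netflow ] Unsupported field in template 261 {:type=>57652, :length=>128}",
]

# Pull out the distinct NetFlow field type IDs the codec could not decode.
pattern = re.compile(r":type=>(\d+), :length=>(\d+)")
unsupported = sorted({int(m.group(1)) for line in log_lines
                      for m in [pattern.search(line)] if m})
print(unsupported)  # -> [57652, 57678]
```

These are the IDs you then look up in the Wireshark decode of the template packet.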

OK, I found the error. Cool. So the nProbe I purchased is sending the data to Logstash, but the netflow codec on the Logstash node is not ready to parse these fields. What to do next?

Fixing It

I found out that in the path /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-codec-netflow-4.2.1/lib/logstash/codecs/netflow there is a netflow.yaml file, which is the template file the error is referring to. We then need to add the following lines to this file, per what was observed in the Wireshark analysis.

- :string
- :http_url
- :uint16
- :http_ret_code
- :string
- :http_referer
- :string
- :http_ua
- :string
- :http_mime
- :string
- :http_host
- :string
- :dns_query
- :uint16
- :dns_query_id
- :uint16
- :dns_query_type
- :uint16
- :dns_ret_code
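Note that in netflow.yaml each type/name pair must be keyed by the numeric field type ID from the template warnings. The IDs 57652 and 57678 below come from the log output above; their pairing with these particular field names is an assumption, so confirm the exact ID-to-field mapping in your own Wireshark capture:

```yaml
# Each key is a NetFlow field type ID reported as unsupported by the codec.
# The ID-to-name mapping here is an assumption; verify it in Wireshark.
57652:
- :string
- :http_url
57678:
- :uint16
- :dns_query_id
```

After editing the file, restart Logstash (e.g. systemctl restart logstash) so the codec picks up the new definitions.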

Now we can see Logstash receiving the data as it should and processing it correctly. The next step is to build your nice dashboard in Kibana.
