Comments (7)
This is an index template. When applied to Elasticsearch it will apply the correct types for the fields passed in and generate the geohash used for visualization. I'm just not sure how to properly package it or apply it via Logstash in a scripted way. You just have to change the template name and index pattern to match the output index name passed to Elasticsearch. If you are using the defaults in the files here, you can just change them to pf and get it working. I was hoping to find a way for Logstash to feed the template into Elasticsearch the same way Filebeat does. I know it can do it, but I'm so new to this that I haven't figured it out yet.
Note: I applied this via the dev console in Kibana for lack of a better method (e.g. /app/kibana#/dev_tools/console). I am now using the create-template option in 7.4 to do so. Here is the result of that, which can likely be imported simply. I used the variable pf for this example to stay consistent. Keep in mind that this doesn't resolve the Splunk or Suricata geo data yet; I am still working on getting that to even grok correctly. I'll likely have time to look at it soon, but I am busy with work, so I have very little time to spend on this.
Edit: I found out how to make a proper template for Logstash to feed into Elasticsearch, and I'm working on putting one together. It would contain the data below, but would also encompass all the other field configurations. I'm unsure what should or should not be included there, but it likely only matters for large data sets (e.g. the number of shards and query fields are defined in templates). Regardless, this template can be applied to get filterlog data working right now. I am also looking at the grok for Snort, since it doesn't seem to work properly. I'm not sure what is wrong with it, but it looks like lines without ports don't process correctly, which I think just needs another match pattern added to the filter. There are still a large number of _grokparsefailure messages in this.
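For the portless-line case, one common approach is to give the grok filter a list of patterns; grok tries them in order and uses the first that matches, so a port-bearing pattern can be followed by a portless fallback. This is only a sketch with made-up field names and patterns, not pfelk's actual Snort grok:

```
filter {
  grok {
    # Patterns are tried top to bottom; the portless variant is the fallback.
    # "src_ip", "src_port", etc. are illustrative names, not pfelk's fields.
    match => {
      "message" => [
        "%{IP:src_ip}:%{INT:src_port} -> %{IP:dest_ip}:%{INT:dest_port}",
        "%{IP:src_ip} -> %{IP:dest_ip}"
      ]
    }
  }
}
```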
PUT _template/pf-location-template
{
  "index_patterns": ["pf-*"],
  "mappings": {
    "properties": {
      "destination": {
        "properties": {
          "geo": {
            "dynamic": true,
            "properties": {
              "ip": { "type": "ip" },
              "location": { "type": "geo_point" }
            }
          },
          "as": {
            "dynamic": true,
            "properties": {
              "ip": { "type": "ip" },
              "location": { "type": "geo_point" }
            }
          }
        }
      },
      "source": {
        "properties": {
          "geo": {
            "dynamic": true,
            "properties": {
              "ip": { "type": "ip" },
              "location": { "type": "geo_point" }
            }
          },
          "as": {
            "dynamic": true,
            "properties": {
              "ip": { "type": "ip" },
              "location": { "type": "geo_point" }
            }
          }
        }
      }
    }
  }
}
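If it helps, you can confirm the template was actually stored by querying it back from the same dev console:

```
GET _template/pf-location-template
```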
from pfelk.
Looks like this might fix it on the template side. I just had to get the index properly set up with dynamic mapping. I updated the template below. Let me know if this works and if there is a better way. The first attempt was close, but this one actually seems to be working properly. I will add the Snort stuff soon.
PUT _template/<TEMPLATENAME>
{
  "index_patterns": ["<INDEXPATTERN>*"],
  "mappings": {
    "properties": {
      "destination": {
        "properties": {
          "geo": {
            "dynamic": true,
            "properties": {
              "ip": { "type": "ip" },
              "latitude": { "type": "half_float" },
              "location": { "type": "geo_point" },
              "longitude": { "type": "half_float" }
            }
          }
        }
      },
      "source": {
        "properties": {
          "geo": {
            "dynamic": true,
            "properties": {
              "ip": { "type": "ip" },
              "latitude": { "type": "half_float" },
              "location": { "type": "geo_point" },
              "longitude": { "type": "half_float" }
            }
          }
        }
      }
    }
  }
}
Is there a template somewhere? It's not included in the .ndjson. I could go over it and fix some of the stuff.
Thanks, I'll give that one a try.
If you want to load your template with Logstash, you can use the template setting in the Elasticsearch output.
https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-template
Though my preferred method is to always upload my own templates using the Kibana dev console.
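As a rough sketch of that output setting (the host, file path, and template name here are placeholders, not pfelk's actual values), the elasticsearch output would look something like:

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "pf-%{+YYYY.MM.dd}"
    # Point Logstash at a template file on disk; it installs it on startup.
    template           => "/etc/logstash/templates/pf-template.json"
    template_name      => "pf-location-template"
    template_overwrite => true
  }
}
```

With template_overwrite enabled, Logstash replaces any existing template of that name each time it starts, which keeps the file on disk authoritative.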
Hi, thanks very much for sharing all that. The entire how-to plus the configuration files have been very helpful. I also had the geo data problem and have been breaking my head over it for weeks trying to figure it out. Mind you, I am totally new to all of the Elastic Stack components, so I am still trying to understand simple things, like why there are so many different files with numbers at the beginning of their names, some of them using the same Logstash stages (like the filter ones), and how Logstash reads all that. Are the numbers there to set the order in which the files get loaded, or just randomly chosen? Etc., etc.
So, as you can read from the above, I still have a lot to learn. I did, though, manage to fix the geo_point data problem while trying to solve a different one. In a lot of the posts I have found from people getting "No Geo Data" (or something of that sort) when trying to plot IP locations on the map, the Elastic folks often mention the issues that can occur from creating custom index names in the Logstash output phase, which apparently may result in Elasticsearch attaching the wrong template to the index, and thus the wrong data type for some fields. I still don't know exactly how that works, but there you go.
I have modified the input file like this:
input {
  syslog {
    type => "syslog"
    port => 5140
  }
}
Here is what I have done with the output:
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logstash-%{+YYYY.MM.dd}"
  }
}
Everything else is the same as suggested on the original post.
Now all is working on my system and I can plot the sources and destinations on the map.
Maybe that could give some indication as how to fix it on the provided configuration files.
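For anyone wanting to verify the same fix on their own setup, the Kibana dev console can report the resolved mapping for the location field; the wildcard on the field name avoids assuming pfelk's exact field layout:

```
GET logstash-*/_mapping/field/*location
```

If the right template was attached when the index was created, the field should come back with type geo_point rather than float.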
Utilizing the following template enables geo_point. I installed the template prior to creating the index pattern.