
Comments (7)

dpedestrian avatar dpedestrian commented on August 22, 2024 1

This is an index template. When applied to Elasticsearch it will apply the correct types for the fields passed in and generate a geohash for visualization. I'm just not sure how to properly package it or apply it via Logstash in a scripted way. You just have to change the template name and index pattern to match the output index name passed to Elasticsearch. If you are using the defaults in the files here, you can just change them to pf and get it working. I was hoping to find a way for Logstash to feed the template into Elasticsearch the same way Filebeat does. I know it can do it, but I'm so new to this that I haven't figured it out yet.

Note: I applied this via the dev page in Kibana for lack of a better method (e.g. /app/kibana#/dev_tools/console). I am now using the create-template option in 7.4 to do so. Here is the result, which can likely be imported directly. I used the variable pf for this example to stay consistent. Keep in mind that this doesn't resolve the Splunk or Suricata geo data yet; I am still working on getting that to even grok correctly. I'll likely have time to look at it soon. I am just busy with work, so I have very little time to spend on this.

Edit: I found out how to make a proper template for Logstash to feed into Elasticsearch, and I'm working on putting one together. It would contain the data below, but it would also encompass all the other field configurations. I'm unsure which of those should or should not be set, but they likely only matter for large data sets (e.g. the number of shards and query fields are defined in templates). Regardless, this template can be applied to get filterlog data working right now. I am also looking at the grok for Snort, since it doesn't seem to work properly. I'm not sure what is wrong with it, but it looks like portless lines don't process correctly; I think adding another match pattern to the filter should be enough. There are still a large number of _grokparsefailure messages in this.
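For the portless lines, grok accepts an array of patterns and tries them in order, so a fallback pattern without ports can catch the lines the first one misses. An illustrative sketch (these are generic patterns, not pfelk's actual Snort grok):

```
filter {
  grok {
    # Try the port-bearing pattern first, then fall back to a portless one.
    match => {
      "message" => [
        "%{IP:src_ip}:%{INT:src_port} -> %{IP:dst_ip}:%{INT:dst_port}",
        "%{IP:src_ip} -> %{IP:dst_ip}"
      ]
    }
  }
}
```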

PUT _template/pf-location-template
{
  "index_patterns": [
    "pf-*"
  ],
  "mappings": {
    "properties": {
      "destination": {
        "properties": {
          "geo": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "location": {
                "type": "geo_point"
              }
            }
          },
          "as": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "location": {
                "type": "geo_point"
              }
            }
          }
        }
      },
      "source": {
        "properties": {
          "geo": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "location": {
                "type": "geo_point"
              }
            }
          },
          "as": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "location": {
                "type": "geo_point"
              }
            }
          }
        }
      }
    }
  }
}
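Before uploading, a template body like the one above can be sanity-checked offline. A minimal Python sketch (the dict mirrors the pf-location-template mapping above, built with comprehensions since source/destination and geo/as are identical):

```python
import json

# Body of the pf-location-template above: index_patterns plus mappings.
template = {
    "index_patterns": ["pf-*"],
    "mappings": {
        "properties": {
            field: {
                "properties": {
                    sub: {
                        "dynamic": True,
                        "properties": {
                            "ip": {"type": "ip"},
                            "location": {"type": "geo_point"},
                        },
                    }
                    for sub in ("geo", "as")
                }
            }
            for field in ("source", "destination")
        }
    },
}

# Round-trip through JSON to confirm the body serializes cleanly,
# then verify every location field is a geo_point and every ip is an ip.
body = json.loads(json.dumps(template))
for field in ("source", "destination"):
    for sub in ("geo", "as"):
        props = body["mappings"]["properties"][field]["properties"][sub]["properties"]
        assert props["location"]["type"] == "geo_point"
        assert props["ip"]["type"] == "ip"
print("template OK")
```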

from pfelk.

dpedestrian avatar dpedestrian commented on August 22, 2024

Looks like this might fix it on the template side. I just had to get the index set up properly with dynamic mapping. I updated the template below. Let me know if this works and if there is a better way. The first attempt was close, but this one actually seems to be working properly. I will add the Snort stuff soon.

PUT _template/<TEMPLATENAME>
{
  "index_patterns": ["<INDEXPATTERN>*"],
  "mappings": {
    "properties": {
      "destination": {
        "properties": {
          "geo": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "latitude": {
                "type": "half_float"
              },
              "location": {
                "type": "geo_point"
              },
              "longitude": {
                "type": "half_float"
              }
            }
          }
        }
      },
      "source": {
        "properties": {
          "geo": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "latitude": {
                "type": "half_float"
              },
              "location": {
                "type": "geo_point"
              },
              "longitude": {
                "type": "half_float"
              }
            }
          }
        }
      }
    }
  }
}

from pfelk.

Sjaak01 avatar Sjaak01 commented on August 22, 2024

Is there a template somewhere? It's not included in the .ndjson. I could go over it and fix some of the stuff.

from pfelk.

Sjaak01 avatar Sjaak01 commented on August 22, 2024

Thanks, I'll give that one a try.

If you want to load your template with Logstash you can use the template setting in the Elastic output.
https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-template

Though my preferred method is to always upload my own templates using the Kibana dev console.
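A sketch of that approach, assuming the template JSON has been saved at /etc/logstash/pf-template.json (the path and names here are illustrative; the `template`, `template_name`, and `template_overwrite` options are documented settings of the Elasticsearch output plugin):

```
output {
  elasticsearch {
    hosts              => "localhost:9200"
    index              => "pf-%{+YYYY.MM.dd}"
    template           => "/etc/logstash/pf-template.json"
    template_name      => "pf-location-template"
    template_overwrite => true
  }
}
```

With this in place, Logstash installs the template itself on startup, so no manual step in the Kibana console is needed.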

from pfelk.

VanDudenStein avatar VanDudenStein commented on August 22, 2024

Hi, thanks very much for sharing all that. The entire how-to plus the configuration files have been very helpful. I also had the geo data problem and have been breaking my head over it for weeks trying to figure it out. Mind you, I am totally new to all of the Elastic Stack components, so I am still trying to understand simple things, like why there are so many different files with numbers at the beginning of their names, some of them using the same Logstash stages (like the filters), and how Logstash reads all that. Are the numbers there to set the order in which the files get loaded, or are they just randomly chosen? Etc., etc.

So, as you can read from the above, I still have a lot to learn. I did, though, manage to fix the geo_point data problem while I was trying to solve a different one. In a lot of the posts I have found from people getting "No Geo Data" (or something of that sort) when trying to plot IP locations on the map, the Elastic folks often mention issues that can occur from creating custom index names in the Logstash output phase, which apparently may result in Elasticsearch attaching the wrong template to the index and thus using the wrong data type for some fields. I still don't know exactly how that works, but there you go.

I have modified the input file like this:

input {
   syslog {
        type => "syslog"
        port => 5140
   }
}

Here is what I have done with the output:

output {
   elasticsearch {
       hosts => "localhost:9200"
       index => "logstash-%{+YYYY.MM.dd}"
   }
}

Everything else is the same as suggested on the original post.

Now all is working on my system and I can plot the sources and destinations on the map.
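If the right template was picked up, the location field should come back as geo_point. A quick way to check from the Kibana dev console (the geoip.location field name here assumes Logstash's default geoip target):

```
GET logstash-*/_mapping/field/geoip.location
```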

Maybe that could give some indication as to how to fix it in the provided configuration files.

from pfelk.

a3ilson avatar a3ilson commented on August 22, 2024

from pfelk.

a3ilson avatar a3ilson commented on August 22, 2024

Utilizing the following template enables geo_point. I installed the template prior to creating the index pattern.

from pfelk.
