
Use Logstash + Elasticsearch + Kibana to manage logs on OSSIM

junojuno

Hi everyone,
I have read the discussion "Raw Logger replacement for us opensource folks (Logstash/Kibana/Elasticsearch)" by HawtDogFlvrWtr. Thanks, HawtDogFlvrWtr, for the useful information.
However, I could not install it the way HawtDogFlvrWtr did, so here is how I installed it step by step:
1. Install Elasticsearch 1.1.1
2. Use logstash-1.4.2: unpack it and create a file log.conf
log.conf:

input {
  # Tail local log files; the file input needs file paths or globs, not bare directories
  file {
    type => "syslog"
    path => [ "/var/log/syslog", "/var/log/apache2/*.log", "/var/log/user.log", "/var/log/snort/*", "/var/log/messages", "/var/log/faillog" ]
  }
}

filter {
  if [type] == "syslog" {
    # Split the raw syslog line into timestamp, host, program, pid and message
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}", "received_from", "%{host}" ]
    }
    # Decode the syslog priority field into facility and severity
    syslog_pri {}
    # Use the timestamp from the log line itself as the event timestamp
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  # Index into the local Elasticsearch; also print each event for debugging
  elasticsearch { host => "localhost" }
  stdout { codec => rubydebug }
}
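Before starting Logstash it is worth validating the config file. A minimal check, assuming log.conf sits in /opt/logstash (--configtest only parses the config, it does not start the pipeline):

cd /opt/logstash
# Parse log.conf and report syntax errors without starting the pipeline
./bin/logstash -f log.conf --configtest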


3. Use kibana-latest; I run Kibana as a plugin of Elasticsearch:
- Copy kibana-latest into /usr/share/elasticsearch/plugins (note: you should use Elasticsearch 1.1.1, the .deb for Debian)
- Create a directory: _site
- Copy all Kibana files into _site
- You end up with: /usr/share/elasticsearch/plugins/kibana/_site/ (all files of the latest Kibana)
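The same steps as shell commands, a sketch assuming the Kibana archive was unpacked to ./kibana-latest (adjust the source path to wherever you extracted it):

# Create the plugin's _site directory and copy all Kibana files into it
mkdir -p /usr/share/elasticsearch/plugins/kibana/_site
cp -r kibana-latest/* /usr/share/elasticsearch/plugins/kibana/_site/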

4. Go to /opt/logstash and run: ./bin/logstash -f log.conf
5. Restart Elasticsearch: /etc/init.d/elasticsearch restart
Then wait a few minutes.

6. Open a browser: http://OSSIM_IP:9200/_plugin/kibana/
You can open the sample dashboard or the Logstash dashboard.
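If you want to confirm that Logstash is actually creating indices before opening the dashboard, a quick check (assuming Elasticsearch listens on port 9200; the _cat API is available since Elasticsearch 1.0):

# List all indices; you should see logstash-YYYY.MM.DD entries appear
curl 'http://localhost:9200/_cat/indices?v'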

If the Logstash dashboard shows:
"No results There were no results because no indices were found that match your selected time span"
you can fix it: install ntp on OSSIM, run dpkg-reconfigure tzdata to set the correct timezone, then restart ntp.
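That fix as shell commands, a sketch assuming a Debian-based OSSIM install:

# Install ntp, set the correct timezone, then restart ntp
apt-get install ntp
dpkg-reconfigure tzdata
/etc/init.d/ntp restart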

juno

Comments

  • Hi juno,

    I want to use Logstash with OSSIM too.
    I use Logstash with the OSSEC logs: not with the alerts, but with all logs via OSSEC's <logall> option.
    Did you change the dashboard widget for indexing the logs?

  • Old thread, but I just integrated the two.  Here are the broad strokes of how I did it.

    Install ELK on another VM/server.
    Go find and build mysqlbeat on the ELK server (a community beat for logstash.)
    Edit /etc/ossim/firewall_include and allow 3306 access to your desktop and the ELK server.
    Run ossim-reconfig on AlienVault to pick up the firewall change.
    Install MySQL Workbench on your desktop (if you haven't already.)
    Get the MySQL password from /etc/ossim/ossim_setup.conf
    Allow network access to MySQL (many threads online on how to do this.)
    Connect MySQL Workbench to MySQL and create a new read-only user for ELK (a GRANT sketch follows the query below.)
    Configure mysqlbeat to pull data from MySQL-

    select
        unix_timestamp( e.timestamp ) as timestamp_utc,
        unix_timestamp( date_sub( e.timestamp, interval 5 hour ) ) as timestamp_est,
        ifnull( inet_ntoa( conv( hex( e.ip_src ), 16, 10 ) ), '0.0.0.0' ) as src_ip, e.layer4_sport as src_port, e.src_hostname,
        ifnull( inet_ntoa( conv( hex( e.ip_dst ), 16, 10 ) ), '0.0.0.0' ) as dst_ip, e.layer4_dport as dst_port, e.dst_hostname,
        e.ip_proto,
        e.plugin_id, p.name as plugin_name,
        e.plugin_sid, s.name as plugin_sid_name,
        e.ossim_risk_a, e.ossim_risk_c,
        d.userdata1, d.userdata2, d.userdata3, d.userdata4, d.userdata5, d.userdata6, d.userdata7, d.userdata8,
        d.filename, d.username
    from       alienvault_siem.acid_event e
    inner join alienvault_siem.extra_data d on ( e.id         = d.event_id )
    inner join alienvault.plugin          p on ( e.plugin_id  = p.id )
    inner join alienvault.plugin_sid      s on ( e.plugin_sid = s.sid ) and ( p.id = s.plugin_id )
    where
        ( e.timestamp >= date_sub( utc_timestamp(), INTERVAL 5 MINUTE ) )
    order by e.timestamp desc
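
    For the read-only user, a minimal sketch (the name elk, the password, and the host wildcard are placeholders; restrict the host to the ELK server's address):

    -- Create a read-only user for mysqlbeat (name/password/host are examples)
    CREATE USER 'elk'@'%' IDENTIFIED BY 'changeme';
    GRANT SELECT ON alienvault_siem.* TO 'elk'@'%';
    GRANT SELECT ON alienvault.* TO 'elk'@'%';
    FLUSH PRIVILEGES;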

    Create a template that matches-

    {
      "mappings": {
        "siem_record": { 
          "dynamic": false,
          "_all":       { "enabled": false }, 
          "properties": { 
            "timestamp_utc":   { "type": "date" },
            "timestamp_est":   { "type": "date" },
            "src_ip":          { "type": "ip" },
            "src_port":        { "type": "integer" },
            "src_hostname":    { "type": "keyword" },
            "src_location":    { 
              "type": "geo_point", 
              "ignore_malformed": "true"
            },
            "dst_ip":          { "type": "ip" },
            "dst_port":        { "type": "integer" },
            "dst_hostname":    { "type": "keyword" },
            "dst_location":    {
                "type": "geo_point", 
                "ignore_malformed": "true"
            },
            "ip_proto":        { "type": "integer" },
            "plugin_id":       { "type": "integer" },
            "plugin_name":     { "type": "keyword" },
            "plugin_sid":      { "type": "integer" },
            "plugin_sid_name": { "type": "keyword" },
            "ossim_risk_a":    { "type": "integer" },
            "ossim_risk_c":    { "type": "integer" },
            "userdata1":       { "type": "text" },
            "userdata2":       { "type": "text" },
            "userdata3":       { "type": "text" },
            "userdata4":       { "type": "text" },
            "userdata5":       { "type": "text" },
            "userdata6":       { "type": "text" },
            "userdata7":       { "type": "text" },
            "userdata8":       { "type": "text" },
            "filename":        { "type": "text" },
            "username":        { "type": "keyword" }
          }
        }
      },
      "template": "siem_data-*"
    }
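
    Once Logstash has loaded this template, you can confirm Elasticsearch installed it (assuming Elasticsearch on localhost:9200):

    # Show the installed template; it should list the siem_data-* pattern and mappings
    curl 'http://localhost:9200/_template/siem_data?pretty'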

    Set mysqlbeat to run as often as the INTERVAL in the SQL WHERE clause.
    Configure logstash-

    input {
        beats { 
            port => 5044
        }
    }

    filter {
        # The query returns epoch seconds; convert both fields into real dates
        date {
            match => [ "timestamp_est", "UNIX" ]
            target => "timestamp_est"
        }
        date {
            match => [ "timestamp_utc", "UNIX" ]
            target => "timestamp_utc"
        }
    }


    output {
        elasticsearch { 
            hosts => "localhost:9200"
            index => "siem_data-%{+YYYY.MM.dd}"
            document_type => "siem_record"
            template => "/etc/logstash/siem_data.template.json"
            template_name => "siem_data"
            template_overwrite => true
        }
    }
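
    After the pipeline has run for a few minutes, a quick way to confirm events are flowing into the daily indices (index pattern per the output above):

    # Count documents across the siem_data indices; non-zero means it works
    curl 'http://localhost:9200/siem_data-*/_count?pretty'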

    I have not gotten the GEOIP stuff working yet, but that and ingesting netflow data are the next projects.

    Enjoy!
    Rus
