elasticsearch - Program to generate sample logs to feed to Logstash?


I have written a small Java program that generates dummy logs (it basically writes stuff to a txt file). I want to feed this data to the ELK stack: logstash should read the data from the txt file, and I want to visualize these changes in Kibana, just to get a feel for it.

What I want to do is vary the speed at which my program writes the dummy logs to the txt file so that I can see the changes in Kibana.

I have just started exploring the ELK stack and this might be the wrong way to do this kind of analysis, so please suggest better ways if there are any (considering I don't have actual logs to work with right now).

Edit (replying to @Val):

input {
    generator {
        message => '83.149.9.216 - - [17/May/2015:10:05:03 +0000] "GET /presentations/logstash-monitorama-2013/images/kibana-search.png HTTP/1.1" 200 203023 "http://semicomplete.com/presentations/logstash-monitorama-2013/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/32.0.1700.77 Safari/537.36"'
        count => 10
    }
}

So here is my logstash.conf:

input {
  stdin { }
}

filter {
  grok {
    match => {
      "message" => '%{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{DATA:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:response:int} (?:-|%{NUMBER:bytes:int}) %{QS:referrer} %{QS:agent}'
    }
  }

  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    locale => en
  }

  geoip {
    source => "clientip"
  }

  useragent {
    source => "agent"
    target => "useragent"
  }
}

output {
  stdout {
    codec => plain {
      charset => "ISO-8859-1"
    }
  }
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "apache_elk_example"
    template => "./apache_template.json"
    template_name => "apache_elk_example"
    template_overwrite => true
  }
}

Now, after starting Elasticsearch and Kibana, I do:

cat apache_logs | /usr/local/opt/logstash/bin/logstash -f apache_logs 

where apache_logs is fed by this java program:

public static void main(String[] args) {
    // TODO Auto-generated method stub
    try {
        PrintStream out = new PrintStream(new FileOutputStream("/Users/username/Desktop/user/apache_logs"));
        System.setOut(out);
    } catch (FileNotFoundException ex) {
        System.out.print("Exception");
    }
    while (true)
    //for(int i=0;i<5;++i)
    {
        System.out.println(generateRandomIPs() + //other log stuff);
        try {
            Thread.sleep(1000);    // 1000 milliseconds is one second.
        } catch (InterruptedException ex) {
            Thread.currentThread().interrupt();
        }
    }
}
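The question leaves out the body of generateRandomIPs() and the rest of the log line. Purely as an illustration, a helper along these lines (the method name comes from the question; everything else here is an assumption) would produce lines that the grok pattern above can match:

import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Locale;
import java.util.Random;

public class FakeApacheLog {

    private static final Random RANDOM = new Random();
    // Same timestamp layout the date filter above parses: dd/MMM/yyyy:HH:mm:ss Z
    private static final DateTimeFormatter APACHE_TS =
            DateTimeFormatter.ofPattern("dd/MMM/yyyy:HH:mm:ss Z", Locale.ENGLISH);

    static String generateRandomIPs() {
        return RANDOM.nextInt(256) + "." + RANDOM.nextInt(256) + "."
                + RANDOM.nextInt(256) + "." + RANDOM.nextInt(256);
    }

    // Builds one line in the combined log format the grok pattern expects.
    static String randomLogLine() {
        return String.format(Locale.ENGLISH,
                "%s - - [%s] \"GET /index.html HTTP/1.1\" 200 %d \"-\" \"dummy-agent/1.0\"",
                generateRandomIPs(),
                ZonedDateTime.now().format(APACHE_TS),
                RANDOM.nextInt(100000));
    }
}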

So here is my problem:

Kibana doesn't show me a real-time visualization. That is, it does not pick up new data as the java program feeds it into the apache_logs file; it only shows whatever data had already been written to apache_logs at the time I executed:

cat apache_logs | /usr/local/opt/logstash/bin/logstash -f apache_logs 

This might be a bit late, but I wrote up a small sample of what I meant.

I modified your java program to add a timestamp, like this:

import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.PrintStream;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.HashMap;
import java.util.Map;

import com.google.gson.Gson;

public class LogWriter {

    public static Gson gson = new Gson();

    public static void main(String[] args) {

        try {
            PrintStream out = new PrintStream(new FileOutputStream("/var/logstash/input/test2.log"));
            System.setOut(out);
        } catch (FileNotFoundException ex) {
            System.out.print("Exception");
        }

        Map<String, String> timestamper = new HashMap<>();

        while (true) {
            String format = LocalDateTime.now().format(DateTimeFormatter.ISO_DATE_TIME);

            timestamper.put("mytimestamp", format);
            System.out.println(gson.toJson(timestamper));
            try {
                Thread.sleep(1000);    // 1000 milliseconds is one second.
            } catch (InterruptedException ex) {
                Thread.currentThread().interrupt();
            }
        }
    }
}

This will write JSON like:

{"mytimestamp":"2016-06-10t10:42:16.299"} {"mytimestamp":"2016-06-10t10:42:17.3"} {"mytimestamp":"2016-06-10t10:42:18.301"} 

I set up logstash to read that file, parse it, and output it to stdout:

input {
  file {
     path => "/var/logstash/input/*.log"
     start_position => "beginning"
     ignore_older => 0
     sincedb_path => "/dev/null"
  }
}

filter {
   json {
      source => "message"
   }
}

output {
    file {
        path => "/var/logstash/out.log"
    }
    stdout { codec => rubydebug }
}

So it'll pick up the log, know when it was created, parse it, and create a new timestamp that represents when logstash saw the log. (The sincedb_path => "/dev/null" setting makes logstash forget its read position, so it re-reads the file from the beginning on every start, which is handy for testing.)

{
        "message" => "{\"mytimestamp\":\"2016-06-10T10:42:17.3\"}",
       "@version" => "1",
     "@timestamp" => "2016-06-10T09:42:17.687Z",
           "path" => "/var/logstash/input/test2.log",
           "host" => "pandaadb",
    "mytimestamp" => "2016-06-10T10:42:17.3"
}
{
        "message" => "{\"mytimestamp\":\"2016-06-10T10:42:18.301\"}",
       "@version" => "1",
     "@timestamp" => "2016-06-10T09:42:18.691Z",
           "path" => "/var/logstash/input/test2.log",
           "host" => "pandaadb",
    "mytimestamp" => "2016-06-10T10:42:18.301"
}

Here you can see how long it takes for a log to be seen and processed: around 300 milliseconds, which you can put down to the fact that your java writer is an asynchronous writer and does not flush right away.
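If you want the lines to reach the file faster, one option (a tweak I'm suggesting here, not something the original program does) is to construct the PrintStream with autoflush enabled, so the stream is flushed on every println:

import java.io.FileNotFoundException;
import java.io.FileOutputStream;
import java.io.PrintStream;

public class AutoFlushSetup {
    public static void main(String[] args) throws FileNotFoundException {
        // The second constructor argument turns on autoflush: the buffer is
        // flushed whenever println is invoked, so logstash sees lines sooner.
        PrintStream out = new PrintStream(
                new FileOutputStream("/var/logstash/input/test2.log"), true);
        System.setOut(out);
    }
}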

You can even make this a bit "cooler" by using the elapsed plugin, which will calculate the difference between those timestamps for you.
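If you'd rather do that arithmetic yourself, here is a rough sketch of the same calculation in Java, assuming the writer ran at UTC+1 (which matches the one-hour gap between mytimestamp and @timestamp in the sample output above):

import java.time.Duration;
import java.time.Instant;
import java.time.LocalDateTime;
import java.time.ZoneOffset;

public class LatencyDemo {
    public static void main(String[] args) {
        // mytimestamp written by the java program (local time, assumed UTC+1)
        Instant written = LocalDateTime.parse("2016-06-10T10:42:17.3")
                .toInstant(ZoneOffset.ofHours(1));
        // @timestamp assigned by logstash when it picked the line up (UTC)
        Instant seen = Instant.parse("2016-06-10T09:42:17.687Z");
        // prints PT0.387S, i.e. about 387 ms between write and pickup
        System.out.println(Duration.between(written, seen));
    }
}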

I hope this helps with your testing :) It's probably not the most advanced way of doing it, but it's easy to understand and pretty straightforward and fast.

Artur

