 

How should I use sql_last_value in logstash?

I'm quite unclear about what sql_last_value does when I give my statement like this:

statement => "SELECT * from mytable where id > :sql_last_value"

I roughly understand the reason for using it: instead of scanning the whole DB table on every run, it only picks up the records that were newly added. Correct me if I'm wrong.
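As far as I understand it (please correct me here too), the :sql_last_value placeholder simply gets replaced with the value persisted from the previous run, so the scheduled query effectively becomes something like this (1000 is just a made-up last-seen id, and 0 is what I believe the value starts at when tracking a numeric column):

-- later runs, assuming the last tracked id was 1000
SELECT * from mytable where id > 1000
-- very first run, before any value has been persisted
SELECT * from mytable where id > 0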

So what I'm trying to do is create the index using Logstash, like this:

input {
    jdbc {
        jdbc_connection_string => "jdbc:mysql://hostmachine:3306/db" 
        jdbc_user => "root"
        jdbc_password => "root"
        jdbc_validate_connection => true
        jdbc_driver_library => "/path/mysql_jar/mysql-connector-java-5.1.39-bin.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        schedule => "* * * * *"
        statement => "SELECT * from mytable where id > :sql_last_value"
        use_column_value => true
        tracking_column => "id"
        jdbc_paging_enabled => "true"
        jdbc_page_size => "50000"
    }
}

output {
    elasticsearch {
        #protocol => http
        index => "myindex"
        document_type => "message_logs"
        document_id => "%{id}"
        action => "index"
        hosts => ["http://myhostmachine:9402"]
    }
}

Once I do this, the documents don't get uploaded to the index at all. Where am I going wrong?

Any help would be appreciated.

asked Nov 01 '16 by Kulasangar




2 Answers

If you have a timestamp column in your table (e.g. last_updated), you should preferably use it instead of the ID column. That way, when a record gets updated you modify that timestamp as well, and the jdbc input plugin will pick the record up again (with the ID column, the value never changes, so an updated record would not get picked up).

input {
    jdbc {
        jdbc_connection_string => "jdbc:mysql://hostmachine:3306/db" 
        jdbc_user => "root"
        jdbc_password => "root"
        jdbc_validate_connection => true
        jdbc_driver_library => "/path/mysql_jar/mysql-connector-java-5.1.39-bin.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_paging_enabled => "true"
        jdbc_page_size => "50000"
        schedule => "* * * * *"
        statement => "SELECT * from mytable where last_updated > :sql_last_value"
    }
}
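Note that with this setup (no use_column_value), sql_last_value is, as far as I know, the timestamp of the previous run rather than a column value, so the scheduled query effectively resolves to something like the following (the timestamp is purely illustrative; on the very first run the value starts at the Unix epoch):

-- later runs: everything modified since the previous run
SELECT * from mytable where last_updated > '2016-11-01 17:00:00'
-- very first run, before any state has been persisted
SELECT * from mytable where last_updated > '1970-01-01 00:00:00'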

If you decide to stay with the ID column nonetheless, you should delete the $HOME/.logstash_jdbc_last_run file and try again.
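As a side note, rather than deleting that file by hand every time, the jdbc input also has last_run_metadata_path and clean_run settings which, as far as I know, control where that state lives and whether it is reused. A sketch reusing the settings from the question (the metadata path is just an example):

jdbc {
    jdbc_connection_string => "jdbc:mysql://hostmachine:3306/db"
    jdbc_user => "root"
    jdbc_password => "root"
    jdbc_driver_library => "/path/mysql_jar/mysql-connector-java-5.1.39-bin.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    schedule => "* * * * *"
    statement => "SELECT * from mytable where id > :sql_last_value"
    use_column_value => true
    tracking_column => "id"
    # where the sql_last_value state is persisted (defaults to $HOME/.logstash_jdbc_last_run)
    last_run_metadata_path => "/path/to/.my_jdbc_last_run"
    # ignore any previously persisted value and start from scratch on the next run
    clean_run => true
}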

answered Nov 15 '22 by Val


There are a few things to take care of:

  1. If you have run Logstash earlier without the schedule, then before running Logstash with the schedule, delete the file:

    $HOME/.logstash_jdbc_last_run
    

    In Windows, this file is found at:

    C:\Users\<Username>\.logstash_jdbc_last_run
    
  2. The "statement =>" in the Logstash config should have an "order by" on the tracking_column.

  3. The tracking_column should be set to the exact column name used in the statement.

Here is an example of the Logstash config file:

input {
    jdbc {
        # MySQL DB jdbc connection string to our database, softwaredevelopercentral
        jdbc_connection_string => "jdbc:mysql://localhost:3306/softwaredevelopercentral?autoReconnect=true&useSSL=false"
        # The user we wish to execute our statement as
        jdbc_user => "root"
        # The user password
        jdbc_password => ""
        # The path to our downloaded jdbc driver
        jdbc_driver_library => "D:\Programs\MySQLJava\mysql-connector-java-6.0.6.jar"
        # The name of the driver class for MySQL DB
        jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
        # Our query, scheduled every minute and ordered by the tracking column
        schedule => "* * * * *"
        statement => "SELECT * FROM student WHERE studentid > :sql_last_value ORDER BY studentid"
        use_column_value => true
        tracking_column => "studentid"
    }
}

output {
    stdout { codec => json_lines }
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "students"
        document_type => "student"
        document_id => "%{studentid}"
    }
}

To see a working example of this setup, you can check my blog post: http://softwaredevelopercentral.blogspot.com/2017/10/elasticsearch-logstash-kibana-tutorial.html

answered Nov 15 '22 by Aj Tech Developer