I have a problem with jdbc_driver_library. I'm using ELK_VERSION = 6.4.2 and I run the ELK stack in Docker.
When I run:
/opt/logstash# bin/logstash -f /etc/logstash/conf.d/mysql.conf
I'm getting an error:
error: com.mysql.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Driver path:
root@xxxxxxx:/etc/logstash/conectors# ls
mysql-connector-java-8.0.12.jar
root@xxxxxxxxxx:/etc/logstash/conectors#
mysql.conf:
input {
  jdbc {
    jdbc_driver_library => "/etc/logstash/conectors/mysql-connector-java-8.0.12.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "demouser"
    jdbc_password => "demopassword"
    statement => "SELECT id,name,city from ads"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    index => 'test'
    document_type => 'tes'
    document_id => '%{id}'
    hosts => ['http://localhost:9200']
  }
}
The whole error:
root@xxxxx:/opt/logstash# bin/logstash -f /etc/logstash/conf.d/mysql.conf
Sending Logstash logs to /opt/logstash/logs which is now configured via log4j2.properties
[2018-11-10T09:03:22,081][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-11-10T09:03:23,628][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.2"}
[2018-11-10T09:03:30,482][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-11-10T09:03:31,479][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-11-10T09:03:31,928][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-11-10T09:03:32,067][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-11-10T09:03:32,076][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-11-10T09:03:32,154][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2018-11-10T09:03:32,210][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-11-10T09:03:32,267][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-11-10T09:03:32,760][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x202f727c run>"}
[2018-11-10T09:03:32,980][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-11-10T09:03:33,877][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-10T09:03:34,315][ERROR][logstash.pipeline ] A plugin had an unrecoverable error. Will restart this plugin.
Pipeline_id:main
Plugin: <LogStash::Inputs::Jdbc jdbc_user=>"demouser", jdbc_password=><password>, statement=>"SELECT id,name,city from ads", jdbc_driver_library=>"/etc/logstash/conectors/mysql-connector-java-8.0.12.jar", jdbc_connection_string=>"jdbc:mysql://localhost:3306/mydb", id=>"233c4411c2434e93444c3f59eb9503f3a75cab4f85b0a947d96fa6773dac56cd", jdbc_driver_class=>"com.mysql.jdbc.Driver", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_cf5ab80c-91e4-4bc4-8d20-8c5a0f9f8077", enable_metric=>true, charset=>"UTF-8">, jdbc_paging_enabled=>false, jdbc_page_size=>100000, jdbc_validate_connection=>false, jdbc_validation_timeout=>3600, jdbc_pool_timeout=>5, sql_log_level=>"info", connection_retry_attempts=>1, connection_retry_attempts_wait_time=>0.5, parameters=>{"sql_last_value"=>1970-01-01 00:00:00 +0000}, last_run_metadata_path=>"/root/.logstash_jdbc_last_run", use_column_value=>false, tracking_column_type=>"numeric", clean_run=>false, record_last_run=>true, lowercase_column_names=>true>
Error: com.mysql.jdbc.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
Exception: LogStash::ConfigurationError
Stack: /opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:163:in `open_jdbc_connection'
/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/plugin_mixins/jdbc/jdbc.rb:221:in `execute_statement'
/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:277:in `execute_query'
/opt/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-input-jdbc-4.3.13/lib/logstash/inputs/jdbc.rb:263:in `run'
/opt/logstash/logstash-core/lib/logstash/pipeline.rb:409:in `inputworker'
/opt/logstash/logstash-core/lib/logstash/pipeline.rb:403:in `block in start_input'
When I build an image and use docker run, I get another error:
[2018-11-10T10:32:52,935][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/opt/logstash/data/queue"}
[2018-11-10T10:32:52,966][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/opt/logstash/data/dead_letter_queue"}
[2018-11-10T10:32:54,509][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
I get the same problem when I use PostgreSQL.
psql.conf:
input {
  jdbc {
    type => 'test'
    jdbc_driver_library => '/etc/logstash/postgresql-9.1-901-1.jdbc4.jar'
    jdbc_driver_class => 'org.postgresql.Driver'
    jdbc_connection_string => 'jdbc:postgresql://localhost:5432/mytestdb'
    jdbc_user => 'postgres'
    jdbc_password => 'xxxxxx'
    jdbc_page_size => '50000'
    statement => 'SELECT id, name, city FROM ads'
  }
}
Then I run:
/opt/logstash# bin/logstash -f /etc/logstash/conf.d/psql.conf
Error:
error: org.postgresql.Driver not loaded. Are you sure you've included the correct jdbc driver in :jdbc_driver_library?
I had the same issue, and the solution below fixed it for me.
For Logstash 6.2.x and above, copy the required driver jars into logstash_install_dir/logstash-core/lib/jars/ and don't set jdbc_driver_library in the config file at all.
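For example, with the install directory and jar from the question (a sketch only; adjust the paths to your own layout):
cp /etc/logstash/conectors/mysql-connector-java-8.0.12.jar /opt/logstash/logstash-core/lib/jars/
and then drop the jdbc_driver_library line from the input:
input {
  jdbc {
    # no jdbc_driver_library here; Logstash picks the jar up from logstash-core/lib/jars/
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "demouser"
    jdbc_password => "demopassword"
    statement => "SELECT id,name,city from ads"
  }
}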
I solved the problem:
First check your java version:
root@xxxxxx:/# java -version
openjdk version "1.8.0_181"
If you are using 1.8, you should use the JDBC42 version of the driver.
If you are using 1.7, you should use the JDBC41 version.
If you are using 1.6, you should use the JDBC4 version.
For PostgreSQL (postgresql-9.4-1203.jdbc42.jar):
jdbc_driver_library => '/path_to_jar/postgresql-9.4-1203.jdbc42.jar'
jdbc_driver_class => 'org.postgresql.Driver'
For MySQL (mysql-connector-java-5.1.46.jar):
jdbc_driver_library => "/path_to_jar/mysql-connector-java-5.1.46.jar"
jdbc_driver_class => "com.mysql.jdbc.Driver"
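As a quick sanity check (a generic verification step, not something specific to Logstash), you can confirm that the jar you point jdbc_driver_library at really contains the driver class you configured:
jar tf /path_to_jar/mysql-connector-java-5.1.46.jar | grep Driver      # should list com/mysql/jdbc/Driver.class
jar tf /path_to_jar/postgresql-9.4-1203.jdbc42.jar | grep Driver       # should list org/postgresql/Driver.class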