I'm writing a Kafka Streams application on my Windows development machine.
When I try to use the leftJoin and branch features of Kafka Streams, I get the error below when running the jar application:
Exception in thread "StreamThread-1" java.lang.UnsatisfiedLinkError: C:\Users\user\AppData\Local\Temp\librocksdbjni325337723194862275.dll: Can't find dependent libraries
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at org.rocksdb.NativeLibraryLoader.loadLibraryFromJar(NativeLibraryLoader.java:78)
at org.rocksdb.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:56)
at org.rocksdb.RocksDB.loadLibrary(RocksDB.java:64)
at org.rocksdb.RocksDB.<clinit>(RocksDB.java:35)
at org.rocksdb.Options.<clinit>(Options.java:22)
at org.apache.kafka.streams.state.internals.RocksDBStore.openDB(RocksDBStore.java:115)
at org.apache.kafka.streams.state.internals.Segment.openDB(Segment.java:38)
at org.apache.kafka.streams.state.internals.Segments.getOrCreateSegment(Segments.java:75)
at org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStore.put(RocksDBSegmentedBytesStore.java:72)
at org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStore.put(ChangeLoggingSegmentedBytesStore.java:54)
at org.apache.kafka.streams.state.internals.MeteredSegmentedBytesStore.put(MeteredSegmentedBytesStore.java:101)
at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:109)
at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:101)
at org.apache.kafka.streams.kstream.internals.KStreamJoinWindow$KStreamJoinWindowProcessor.process(KStreamJoinWindow.java:65)
at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
at org.apache.kafka.streams.kstream.internals.KStreamFlatMapValues$KStreamFlatMapValuesProcessor.process(KStreamFlatMapValues.java:43)
at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:44)
at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:70)
at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:197)
at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:641)
at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:368)
It seems like Kafka cannot find a DLL, but wait... I'm developing a Java application!
What could be the problem? And why doesn't this error show up when I perform simpler streaming operations, like just a filter?
UPDATE:
This problem arises only when a message is present in the broker. I'm using Kafka Streams version 0.10.2.1.
Here is the piece of code that raises the problem:
public class KafkaStreamsMainClass {

    private KafkaStreamsMainClass() {
    }

    public static void main(final String[] args) throws Exception {
        Properties streamsConfiguration = new Properties();
        streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams");
        streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-server:9092");
        streamsConfiguration.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "schema-registry:8082");
        streamsConfiguration.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 10 * 1000);
        streamsConfiguration.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
        streamsConfiguration.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
        streamsConfiguration.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class);

        KStreamBuilder builder = new KStreamBuilder();
        KStream<GenericRecord, GenericRecord> sourceStream = builder.stream(SOURCE_TOPIC);

        KStream<GenericRecord, GenericRecord> finishedFiltered = sourceStream
                .filter((GenericRecord key, GenericRecord value) -> value.get("endTime") != null);

        KStream<GenericRecord, GenericRecord>[] branchedStreams = sourceStream
                .filter((GenericRecord key, GenericRecord value) -> value.get("endTime") == null)
                .branch((GenericRecord key, GenericRecord value) -> value.get("firstField") != null,
                        (GenericRecord key, GenericRecord value) -> value.get("secondField") != null);

        branchedStreams[0] = finishedFiltered.join(branchedStreams[0],
                (GenericRecord value1, GenericRecord value2) -> {
                    return value1;
                }, JoinWindows.of(TimeUnit.SECONDS.toMillis(2)));

        branchedStreams[1] = finishedFiltered.join(branchedStreams[1],
                (GenericRecord value1, GenericRecord value2) -> {
                    return value1;
                }, JoinWindows.of(TimeUnit.SECONDS.toMillis(2)));

        KafkaStreams streams = new KafkaStreams(builder, streamsConfiguration);
        streams.setUncaughtExceptionHandler((Thread thread, Throwable throwable) -> {
            throwable.printStackTrace();
        });
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
I opened the rocksdbjni-5.0.1.jar archive downloaded by Maven and it does include the librocksdbjni-win64.dll library. So the native library is bundled, yet the loader seems to be trying to resolve it from outside the jar instead of using the bundled copy.
I'm developing on a Windows 7 machine.
Have you ever experienced this problem?
Recently I ran into this problem too. I managed to solve it in two steps:
1. Delete all librocksdbjni[...].dll files from your C:\Users\[your_user]\AppData\Local\Temp folder.
2. Add the rocksdb dependency to your project; this works for me: https://mvnrepository.com/artifact/org.rocksdb/rocksdbjni/5.0.1
Then compile your Kafka Streams app and run it. It should work!
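Step 1 can be scripted. The sketch below is a hypothetical helper (not part of the original answer) that deletes any leftover librocksdbjni files from the JVM temp directory named in the stack trace; a delete can still fail if another running JVM holds the library open.

```java
import java.io.File;

public class CleanRocksDbTemp {
    public static void main(String[] args) {
        // the same directory RocksDB extracts its native library into
        File tempDir = new File(System.getProperty("java.io.tmpdir"));
        // match the librocksdbjni... naming seen in the stack trace
        File[] leftovers = tempDir.listFiles(
                (dir, name) -> name.startsWith("librocksdbjni"));
        if (leftovers != null) {
            for (File lib : leftovers) {
                // delete() returns false if the file is locked by another JVM
                boolean gone = lib.delete();
                System.out.println((gone ? "deleted " : "could not delete ") + lib.getName());
            }
        }
    }
}
```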
I updated my kafka-streams project to the latest released version, 1.0.0.
This version suffers from this bug, but after patching it and uploading the patched build to our internal Artifactory server we were able to run our kafka-streams agent on both Windows and Linux. The upcoming 1.0.1 and 1.1.0 versions will contain the bug fix, so as soon as one of them is released we will switch to it instead of the patched version.
To sum up, the Kafka developers fixed this bug in the 1.0.1 release.
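Assuming a Maven build, picking up the fixed release once it is available is a one-line version bump:

```xml
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-streams</artifactId>
    <!-- per the answer above, 1.0.1 is the first release with the fix -->
    <version>1.0.1</version>
</dependency>
```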
My problem was permissions on the /tmp/ directory (CentOS).
RocksDB uses the java.io.tmpdir system property internally to decide where to extract the librocksdbjni file, usually something like /tmp/librocksdbjni2925599838907625983.so.
I solved it by setting a different temp directory, with appropriate permissions, in the kafka-streams app:
System.setProperty("java.io.tmpdir", "/opt/kafka-streams/tmp");
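To make the fix concrete, here is a minimal sketch; the directory name under the user's home is my own choice, any directory the process owner can write to will do. The property must be set before the first RocksDB store is opened, i.e. before KafkaStreams#start():

```java
import java.io.File;

public class TmpDirFix {
    public static void main(String[] args) {
        // example location: a dedicated temp dir under the user's home
        File tmp = new File(System.getProperty("user.home"), "kafka-streams-tmp");
        tmp.mkdirs(); // create the directory if it is missing
        // redirect native-library extraction away from /tmp
        System.setProperty("java.io.tmpdir", tmp.getAbsolutePath());
        // ...build and start the KafkaStreams topology after this point...
        System.out.println(System.getProperty("java.io.tmpdir"));
    }
}
```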