
java - UnsatisfiedLinkError on Lib rocks DB dll when developing with Kafka Streams

I'm writing a Kafka Streams application on my Windows development machine. If I try to use the leftJoin and branch features of Kafka Streams, I get the error below when running the jar:

Exception in thread "StreamThread-1" java.lang.UnsatisfiedLinkError: C:\Users\user\AppData\Local\Temp\librocksdbjni325337723194862275.dll: Can't find dependent libraries
    at java.lang.ClassLoader$NativeLibrary.load(Native Method)
    at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
    at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
    at java.lang.Runtime.load0(Runtime.java:809)
    at java.lang.System.load(System.java:1086)
    at org.rocksdb.NativeLibraryLoader.loadLibraryFromJar(NativeLibraryLoader.java:78)
    at org.rocksdb.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:56)
    at org.rocksdb.RocksDB.loadLibrary(RocksDB.java:64)
    at org.rocksdb.RocksDB.<clinit>(RocksDB.java:35)
    at org.rocksdb.Options.<clinit>(Options.java:22)
    at org.apache.kafka.streams.state.internals.RocksDBStore.openDB(RocksDBStore.java:115)
    at org.apache.kafka.streams.state.internals.Segment.openDB(Segment.java:38)
    at org.apache.kafka.streams.state.internals.Segments.getOrCreateSegment(Segments.java:75)
    at org.apache.kafka.streams.state.internals.RocksDBSegmentedBytesStore.put(RocksDBSegmentedBytesStore.java:72)
    at org.apache.kafka.streams.state.internals.ChangeLoggingSegmentedBytesStore.put(ChangeLoggingSegmentedBytesStore.java:54)
    at org.apache.kafka.streams.state.internals.MeteredSegmentedBytesStore.put(MeteredSegmentedBytesStore.java:101)
    at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:109)
    at org.apache.kafka.streams.state.internals.RocksDBWindowStore.put(RocksDBWindowStore.java:101)
    at org.apache.kafka.streams.kstream.internals.KStreamJoinWindow$KStreamJoinWindowProcessor.process(KStreamJoinWindow.java:65)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
    at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
    at org.apache.kafka.streams.kstream.internals.KStreamFlatMapValues$KStreamFlatMapValuesProcessor.process(KStreamFlatMapValues.java:43)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
    at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
    at org.apache.kafka.streams.kstream.internals.KStreamFilter$KStreamFilterProcessor.process(KStreamFilter.java:44)
    at org.apache.kafka.streams.processor.internals.ProcessorNode$1.run(ProcessorNode.java:48)
    at org.apache.kafka.streams.processor.internals.StreamsMetricsImpl.measureLatencyNs(StreamsMetricsImpl.java:188)
    at org.apache.kafka.streams.processor.internals.ProcessorNode.process(ProcessorNode.java:134)
    at org.apache.kafka.streams.processor.internals.ProcessorContextImpl.forward(ProcessorContextImpl.java:83)
    at org.apache.kafka.streams.processor.internals.SourceNode.process(SourceNode.java:70)
    at org.apache.kafka.streams.processor.internals.StreamTask.process(StreamTask.java:197)
    at org.apache.kafka.streams.processor.internals.StreamThread.runLoop(StreamThread.java:641)
    at org.apache.kafka.streams.processor.internals.StreamThread.run(StreamThread.java:368)

It seems like Kafka does not find a DLL, but wait...I'm developing a Java application!

What could be the problem? And why doesn't this error show up if I only do simpler streaming operations, like a filter?

UPDATE:

This problem arises only when a message is present in the broker. I'm using Kafka Streams version 0.10.2.1.

This is the piece of code that triggers the problem:

import java.util.Properties;
import java.util.concurrent.TimeUnit;

import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.JoinWindows;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.KStreamBuilder;

import io.confluent.kafka.serializers.AbstractKafkaAvroSerDeConfig;
// GenericAvroSerde is Confluent's Avro Serde; the exact package depends on the Confluent version in use.
import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;

public class KafkaStreamsMainClass {

    // Placeholder value; the real topic name is not shown in the question.
    private static final String SOURCE_TOPIC = "source-topic";

    private KafkaStreamsMainClass() {
    }

    public static void main(final String[] args) throws Exception {
        Properties streamsConfiguration = new Properties();
        streamsConfiguration.put(StreamsConfig.APPLICATION_ID_CONFIG, "kafka-streams");
        streamsConfiguration.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka-server:9092");
        streamsConfiguration.put(AbstractKafkaAvroSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, "schema-registry:8082");
        streamsConfiguration.put(StreamsConfig.COMMIT_INTERVAL_MS_CONFIG, 10 * 1000);
        streamsConfiguration.put(StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG, 0);
        streamsConfiguration.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
        streamsConfiguration.put(StreamsConfig.VALUE_SERDE_CLASS_CONFIG, GenericAvroSerde.class);
        KStreamBuilder builder = new KStreamBuilder();
        KStream<GenericRecord, GenericRecord> sourceStream = builder.stream(SOURCE_TOPIC);

        KStream<GenericRecord, GenericRecord> finishedFiltered = sourceStream
                .filter((GenericRecord key, GenericRecord value) -> value.get("endTime") != null);

        KStream<GenericRecord, GenericRecord>[] branchedStreams = sourceStream
                .filter((GenericRecord key, GenericRecord value) -> value.get("endTime") == null)
                .branch((GenericRecord key, GenericRecord value) -> value.get("firstField") != null,
                        (GenericRecord key, GenericRecord value) -> value.get("secondField") != null);

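        // The windowed joins below are stateful: Kafka Streams backs them with
        // RocksDB window stores, which is why the RocksDB native library gets
        // loaded here, while a plain stateless filter never touches RocksDB.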
        branchedStreams[0] = finishedFiltered.join(branchedStreams[0],
                (GenericRecord value1, GenericRecord value2) -> {
                    return value1;
                }, JoinWindows.of(TimeUnit.SECONDS.toMillis(2)));

        branchedStreams[1] = finishedFiltered.join(branchedStreams[1],
                (GenericRecord value1, GenericRecord value2) -> {
                    return value1;
                }, JoinWindows.of(TimeUnit.SECONDS.toMillis(2)));

        KafkaStreams streams = new KafkaStreams(builder, streamsConfiguration);
        streams.setUncaughtExceptionHandler((Thread thread, Throwable throwable) -> {
            throwable.printStackTrace();
        });
        streams.start();

        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }

}

I opened the rocksdbjni-5.0.1.jar archive downloaded by Maven and it does include the librocksdbjni-win64.dll library. It seems that RocksDB is trying to load the library from somewhere outside the jar instead of from the bundled copy.
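
A quick way to double-check that the native library really is bundled is to list the jar contents (assuming the JDK's jar tool is on the PATH; the findstr filter just narrows the output):

    jar tf rocksdbjni-5.0.1.jar | findstr librocksdbjni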

I'm developing on a Windows 7 machine.

Have you ever experienced this problem?



1 Answer


I recently ran into this problem too. I managed to solve it in two steps:

  1. Delete all librocksdbjni[...].dll files from the C:\Users\[your_user]\AppData\Local\Temp folder.
  2. Add a Maven dependency for RocksDB to your project; this one works for me: https://mvnrepository.com/artifact/org.rocksdb/rocksdbjni/5.0.1 (see the snippet below).
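
For reference, the corresponding pom.xml dependency would look something like this (5.0.1 is the version linked above; adjust it to match your setup):

    <dependency>
        <groupId>org.rocksdb</groupId>
        <artifactId>rocksdbjni</artifactId>
        <version>5.0.1</version>
    </dependency>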

Compile your Kafka Streams app and run it. It should work!

