Java KafkaAvroDecoder Class Code Examples


This article collects typical usage examples of the Java class io.confluent.kafka.serializers.KafkaAvroDecoder. If you have been wondering what KafkaAvroDecoder does, how to use it, or where to find examples, the curated code samples below should help.



KafkaAvroDecoder belongs to the io.confluent.kafka.serializers package. Nine code examples of the class are shown below, sorted by popularity by default. You can upvote the examples you find useful; your ratings help the system recommend better Java code examples.
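
Before diving into the examples, here is a minimal, hedged sketch of the core decode path: construct the decoder from a schema registry client and call fromBytes on the raw message payload. The registry URL, the "line" field name, and the fetchPayload helper are placeholders, not part of any real project:

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
import io.confluent.kafka.serializers.KafkaAvroDecoder;
import org.apache.avro.generic.GenericRecord;

public class KafkaAvroDecoderSketch {
    public static void main(String[] args) {
        // Placeholder registry URL; the second argument caps the client's identity map.
        SchemaRegistryClient client = new CachedSchemaRegistryClient("http://localhost:8081", 100);
        KafkaAvroDecoder decoder = new KafkaAvroDecoder(client);

        byte[] payload = fetchPayload(); // hypothetical helper: raw bytes of one Kafka message
        // fromBytes returns Object: a GenericRecord for record schemas,
        // or a primitive wrapper for primitive schemas.
        Object decoded = decoder.fromBytes(payload);
        if (decoded instanceof GenericRecord) {
            System.out.println(((GenericRecord) decoded).get("line")); // "line" is a placeholder field
        }
    }

    private static byte[] fetchPayload() {
        return new byte[0]; // stand-in for a real consumer fetch; empty bytes will not decode
    }
}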

Example 1: testKafkaLogAppender

import io.confluent.kafka.serializers.KafkaAvroDecoder; // import the required package/class
@Test
public void testKafkaLogAppender() {
    Properties consumerProps = new Properties();
    consumerProps.put("zookeeper.connect", zookeeper);
    consumerProps.put("group.id", "kafka-log-appender-test");
    consumerProps.put("auto.offset.reset", "smallest");
    consumerProps.put("schema.registry.url", schemaRegistry);

    Map<String, Integer> topicMap = new HashMap<String, Integer>();
    topicMap.put(topic, 1);

    ConsumerIterator<String, Object> iterator = Consumer.createJavaConsumerConnector(new ConsumerConfig(consumerProps))
            .createMessageStreams(topicMap, new StringDecoder(null), new KafkaAvroDecoder(new VerifiableProperties(consumerProps)))
            .get(topic).get(0).iterator();

    String testMessage = "I am a test message";
    logger.info(testMessage);

    MessageAndMetadata<String, Object> messageAndMetadata = iterator.next();
    GenericRecord logLine = (GenericRecord) messageAndMetadata.message();
    assertEquals(logLine.get("line").toString(), testMessage);
    assertEquals(logLine.get("logtypeid"), KafkaLogAppender.InfoLogTypeId);
    assertNotNull(logLine.get("source"));
    assertEquals(((Map<CharSequence, Object>) logLine.get("timings")).size(), 1);
    assertEquals(((Map<CharSequence, Object>) logLine.get("tag")).size(), 2);
}
 
Developer ID: elodina, Project: java-kafka, Code lines: 27, Source: KafkaLogAppenderTest.java
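
Because KafkaAvroDecoder is supplied as the value decoder to the old high-level consumer here, messageAndMetadata.message() already returns a decoded Object (cast to GenericRecord above) rather than raw bytes.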


Example 2: deserialize

import io.confluent.kafka.serializers.KafkaAvroDecoder; // import the required package/class
@Override
public String deserialize(byte[] message) {
    if (kafkaAvroDecoder == null) {
        SchemaRegistryClient schemaRegistry = new CachedSchemaRegistryClient(this.schemaRegistryUrl, this.identityMapCapacity);
        this.kafkaAvroDecoder = new KafkaAvroDecoder(schemaRegistry);
    }
    return this.kafkaAvroDecoder.fromBytes(message).toString();
}
 
Developer ID: seanpquig, Project: flink-streaming-confluent, Code lines: 9, Source: ConfluentAvroDeserializationSchema.java
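
The lazy initialization matters in Flink: a DeserializationSchema is serialized and shipped to the task managers, and a schema-registry client holds non-serializable state, so the decoder is created on the worker at first use rather than in the constructor. Below is a hedged sketch of wiring such a schema into a job; the two-argument constructor is assumed from the schemaRegistryUrl and identityMapCapacity fields above, and the hosts and topic are placeholders:

import java.util.Properties;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08;

public class FlinkAvroJobSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.setProperty("zookeeper.connect", "localhost:2181"); // required by the 0.8 connector
        props.setProperty("group.id", "flink-avro-demo");

        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        // Constructor arguments (registry URL, identity-map capacity) are assumed
        // from the fields referenced in the deserialize() example above.
        DataStream<String> records = env.addSource(new FlinkKafkaConsumer08<>(
                "test", new ConfluentAvroDeserializationSchema("http://localhost:8081", 1000), props));
        records.print();
        env.execute("confluent-avro-decode");
    }
}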


Example 3: processStream

import io.confluent.kafka.serializers.KafkaAvroDecoder; // import the required package/class
private static void processStream(JavaStreamingContext ssc, JavaSparkContext sc) {
  System.out.println("--> Processing stream");

  Map<String, String> props = new HashMap<>();
  props.put("bootstrap.servers", "localhost:9092");
  props.put("schema.registry.url", "http://localhost:8081");
  props.put("group.id", "spark");
  props.put("specific.avro.reader", "true");

  props.put("value.deserializer", "io.confluent.kafka.serializers.KafkaAvroDeserializer");
  props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

  Set<String> topicsSet = new HashSet<>(Collections.singletonList("test"));

  JavaPairInputDStream<String, Object> stream = KafkaUtils.createDirectStream(ssc, String.class, Object.class,
    StringDecoder.class, KafkaAvroDecoder.class, props, topicsSet);

  stream.foreachRDD(rdd -> {
    rdd.foreachPartition(iterator -> {
        while (iterator.hasNext()) {
          Tuple2<String, Object> next = iterator.next();
          Model model = (Model) next._2();
          System.out.println(next._1() + " --> " + model);
        }
      }
    );
  });
}
 
Developer ID: opencore, Project: kafka-spark-avro-example, Code lines: 29, Source: SparkStreaming.java
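
With specific.avro.reader set to true, the decoder deserializes into generated SpecificRecord classes instead of GenericRecord, which is why the value can be cast directly to Model. The value.deserializer and key.deserializer entries appear to be new-consumer settings; with this decoder-based createDirectStream overload, the StringDecoder and KafkaAvroDecoder class arguments are what actually take effect.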


Example 4: schemaRegistryKafkaReporterTest

import io.confluent.kafka.serializers.KafkaAvroDecoder; // import the required package/class
@Test(dataProvider = "kafkaMetrics")
public void schemaRegistryKafkaReporterTest(HashMap<String,String> tags, String metricKey, Double value,
    Double truncatedValue,
    long timestamp) {

  /* GIVEN: A Zookeeper instance, a Kafka broker, and the Schema Registry-based Kafka reporter we're testing */
  initializeSchemaRegistryReporter();
  SimpleConsumer kafkaConsumer = new SimpleConsumer(testKafkaHost, testKafkaPort, 10000, 1024000, "simpleConsumer");
  KafkaAvroDecoder decoder = new KafkaAvroDecoder(schemaRegistryClient);

  /* WHEN: A new metric is appended to the reporter's buffer and we tell the reporter to send its data */
  submitMetricToReporter(tags, metricKey, value, timestamp);

  /* WHEN: A Kafka consumer reads the latest message from the same topic on the Kafka server */
  byte[] bytes = fetchLatestRecordPayloadBytes(kafkaConsumer);
  
  /* WHEN: The latest message is decoded using the Schema Registry-based decoder */
  GenericRecord result = null;
  try {
    result = (GenericRecord) decoder.fromBytes(bytes);
  }
  catch (SerializationException e) {
    fail("Failed to deserialize message:" + e.getMessage());
  }

  /* THEN: The field values of the decoded record should be the same as those of the input fields. */
  assertThat(result).isNotNull();
  assertThat(result.get("prefix")).isEqualTo(TagsHelper.constructMetricPrefix(TagsHelper.DEFAULT_PREFIX, tags));
  assertThat(result.get("reportTime")).isEqualTo(timestamp);
  assertThat(((Map) result.get("metricValues")).get(metricKey)).isEqualTo(truncatedValue);
}
 
Developer ID: verisign, Project: storm-graphite, Code lines: 32, Source: BaseKafkaReporterTest.java
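
The Confluent wire format prefixes each Avro payload with a magic byte and a 4-byte schema ID, so the decoder can look up the writer schema in the registry from the raw bytes alone; the SimpleConsumer only needs to hand over the payload unmodified.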


Example 5: testCodahaleKafkaMetricsReporter

import io.confluent.kafka.serializers.KafkaAvroDecoder; // import the required package/class
@Test
public void testCodahaleKafkaMetricsReporter() {
    registry = new MetricRegistry();
    registry.counter("test_counter").inc();

    kafkaReporter = KafkaReporter.builder(registry,
            kafkaConnect,
            topic,
            schemaRegistry).build();

    kafkaReporter.report();

    Properties props = new Properties();
    props.put("zookeeper.connect", zkConnect);
    props.put("group.id", UUID.randomUUID().toString());
    props.put("auto.offset.reset", "smallest");
    props.put("zookeeper.session.timeout.ms", "30000");
    props.put("consumer.timeout.ms", "30000");
    props.put("schema.registry.url", schemaRegistry);

    ConsumerConnector consumer = Consumer.createJavaConsumerConnector(new ConsumerConfig(props));
    KafkaStream<String, Object> messageStream = consumer.createMessageStreamsByFilter(new Whitelist(topic),
            1,
            new StringDecoder(null),
            new KafkaAvroDecoder(new VerifiableProperties(props))).get(0);

    GenericRecord message = (GenericRecord) messageStream.iterator().next().message();
    assertNotNull(message);
}
 
Developer ID: elodina, Project: java-kafka, Code lines: 40, Source: KafkaReporterTest.java


Example 6: testTopicReporter

import io.confluent.kafka.serializers.KafkaAvroDecoder; // import the required package/class
@Test
public void testTopicReporter() {
    MetricsRegistry registry = new MetricsRegistry();
    Counter counter = registry.newCounter(KafkaReporterTest.class, "test-counter");
    counter.inc();

    Properties producerProps = new Properties();
    producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaConnect);
    producerProps.put("schema.registry.url", schemaRegistry);

    KafkaReporter reporter = new KafkaReporter(registry, producerProps, topic);
    reporter.start(1, TimeUnit.SECONDS);

    Properties props = new Properties();
    props.put("zookeeper.connect", zkConnect);
    props.put("group.id", UUID.randomUUID().toString());
    props.put("auto.offset.reset", "smallest");
    props.put("zookeeper.session.timeout.ms", "30000");
    props.put("consumer.timeout.ms", "30000");
    props.put("schema.registry.url", schemaRegistry);

    ConsumerConnector consumer = Consumer.createJavaConsumerConnector(new ConsumerConfig(props));
    KafkaStream<String, Object> messageStream = consumer.createMessageStreamsByFilter(new Whitelist(topic),
            1,
            new StringDecoder(null),
            new KafkaAvroDecoder(new VerifiableProperties(props))).get(0);

    GenericRecord message = (GenericRecord) messageStream.iterator().next().message();
    assertNotNull(message);

    reporter.shutdown();
}
 
Developer ID: elodina, Project: java-kafka, Code lines: 33, Source: KafkaReporterTest.java
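
The consumer.timeout.ms setting bounds the test: if no message arrives within 30 seconds, the stream iterator throws ConsumerTimeoutException instead of blocking forever.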


Example 7: initKafka

import io.confluent.kafka.serializers.KafkaAvroDecoder; // import the required package/class
private void initKafka() {
    schemaRegistryClient = new MockSchemaRegistryClient();
    kafkaAvroDecoder = new KafkaAvroDecoder(schemaRegistryClient);
    Properties defaultConfig = new Properties();
    defaultConfig.put(KafkaAvroDeserializerConfig.SCHEMA_REGISTRY_URL_CONFIG, "bogus");
    avroSerializer = new KafkaAvroSerializer(schemaRegistryClient);
}
 
Developer ID: pinterest, Project: secor, Code lines: 8, Source: SecorSchemaRegistryClientTest.java
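
MockSchemaRegistryClient keeps schemas in memory, so the serializer/decoder pair can round-trip Avro records in tests without a live Schema Registry; the "bogus" schema.registry.url is never actually contacted.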


Example 8: AvroSerde

import io.confluent.kafka.serializers.KafkaAvroDecoder; // import the required package/class
public AvroSerde(VerifiableProperties encoderProps, VerifiableProperties decoderProps) {
    encoder = new KafkaAvroEncoder(encoderProps);
    decoder = new KafkaAvroDecoder(decoderProps);
}
 
Developer ID: theduderog, Project: hello-samza-confluent, Code lines: 5, Source: AvroSerde.java
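
Pairing KafkaAvroEncoder and KafkaAvroDecoder in one serde lets a Samza job read and write registry-framed Avro symmetrically; both VerifiableProperties instances need schema.registry.url set for the registry lookups.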


Example 9: init

import io.confluent.kafka.serializers.KafkaAvroDecoder; // import the required package/class
protected void init(SecorConfig config) {
    decoder = new KafkaAvroDecoder(schemaRegistryClient);
}
 
Developer ID: pinterest, Project: secor, Code lines: 4, Source: SecorSchemaRegistryClient.java



Note: The io.confluent.kafka.serializers.KafkaAvroDecoder examples in this article were collected from GitHub, MSDocs, and other source-code and documentation platforms. The snippets come from open-source projects contributed by their authors, who retain copyright; consult each project's License before redistributing or reusing the code. Please do not republish this article without permission.

