Java RestClientException Class Code Examples


This article collects typical usage examples of the Java class io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException. If you are wondering what RestClientException does, how to use it, or where to find working examples, the curated code samples below should help.



The RestClientException class belongs to the io.confluent.kafka.schemaregistry.client.rest.exceptions package. Fourteen code examples of the class are shown below, ordered by popularity by default. You can upvote the examples you find useful; your feedback helps the system recommend better Java code examples.
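Before diving into the examples, here is a minimal, self-contained sketch of the most common way RestClientException surfaces: a registry lookup for a subject that does not exist. It is an illustrative sketch rather than code from the projects below; the registry URL http://localhost:8081 and the subject name no-such-subject-value are assumptions, while CachedSchemaRegistryClient, getLatestSchemaMetadata, getErrorCode() and getStatus() belong to the Confluent client API these examples target.

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException;

import java.io.IOException;

public class RestClientExceptionDemo {
    public static void main(String[] args) {
        // Assumed registry URL; replace with your own deployment.
        SchemaRegistryClient client = new CachedSchemaRegistryClient("http://localhost:8081", 100);
        try {
            client.getLatestSchemaMetadata("no-such-subject-value");
        } catch (RestClientException e) {
            // Error code 40401 / HTTP 404 typically means the subject has no registered schema yet.
            System.err.printf("Registry error %d (HTTP %d): %s%n",
                e.getErrorCode(), e.getStatus(), e.getMessage());
        } catch (IOException e) {
            System.err.println("Could not reach the schema registry: " + e.getMessage());
        }
    }
}

Distinguishing RestClientException (the registry answered with an error) from IOException (the registry could not be reached) is the recurring pattern in every example that follows.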

Example 1: registerSchema

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
/**
 * Register the schema with the kafka schema registry.
 * @param schemaName the name of the schema
 * @param schema the avro schema
 * @return the id for the schema returned from the schema registry.
 */
public int registerSchema(String schemaName, Schema schema) {
    int id;
    
    try {
        /*
         * "-value" appended to schema name per Kafka documentation.
         * See: http://docs.confluent.io/2.0.0/schema-registry/docs/api.html
         */
        id = client.register(schemaName + "-value", schema);
    } catch (RestClientException | IOException e) {
        LOG.error("registerSchema(): failed. Returning ID=0", e);
        id = 0;
    }

    LOG.debug("RegisterSchema(): Schema ID returned from registry for {} was {}", schemaName, id);
    
    return id;
}
 
Developer: oracle, Project: bdglue, Lines: 25, Source: KafkaRegistry.java
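The method above swallows the exception and returns 0 as a sentinel id. Below is a hedged, self-contained sketch of the same pattern against the plain Confluent client; the registry URL, the orders-value subject and the field names are assumptions, not part of the bdglue project.

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException;
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

import java.io.IOException;

public class RegisterValueSchema {
    public static void main(String[] args) {
        // Assumed registry URL and subject name; adjust for your environment.
        SchemaRegistryClient client = new CachedSchemaRegistryClient("http://localhost:8081", 100);
        Schema schema = SchemaBuilder.record("Order").fields()
            .requiredLong("ordertime")
            .requiredString("itemid")
            .endRecord();
        int id;
        try {
            // "-value" follows the registry's subject naming for value schemas, as in Example 1.
            id = client.register("orders-value", schema);
        } catch (RestClientException | IOException e) {
            id = 0; // same sentinel as Example 1; callers must treat 0 as "not registered"
        }
        System.out.println("Registered schema id: " + id);
    }
}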


Example 2: fetchSchemaMetadata

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
private SchemaMetadata fetchSchemaMetadata(
    AbstractStreamCreateStatement abstractStreamCreateStatement,
    SchemaRegistryClient schemaRegistryClient,
    String kafkaTopicName) throws IOException, RestClientException {

  if (abstractStreamCreateStatement.getProperties().containsKey(KsqlConstants.AVRO_SCHEMA_ID)) {
    int schemaId;
    try {
      schemaId = Integer.parseInt(StringUtil.cleanQuotes(
          abstractStreamCreateStatement.getProperties().get(KsqlConstants.AVRO_SCHEMA_ID).toString()));
    } catch (NumberFormatException e) {
      throw new KsqlException(String.format(
          "Invalid schema id property: %s.",
          abstractStreamCreateStatement.getProperties().get(KsqlConstants.AVRO_SCHEMA_ID).toString()));
    }
    return schemaRegistryClient.getSchemaMetadata(
        kafkaTopicName + KsqlConstants.SCHEMA_REGISTRY_VALUE_SUFFIX, schemaId);
  } else {
    return schemaRegistryClient.getLatestSchemaMetadata(
        kafkaTopicName + KsqlConstants.SCHEMA_REGISTRY_VALUE_SUFFIX);
  }
}
 
Developer: confluentinc, Project: ksql, Lines: 21, Source: AvroUtil.java


Example 3: shouldValidatePersistentQueryResultCorrectly

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
@Test
public void shouldValidatePersistentQueryResultCorrectly()
    throws IOException, RestClientException {
  SchemaRegistryClient schemaRegistryClient = mock(SchemaRegistryClient.class);
  KsqlTopic resultTopic = new KsqlTopic("testTopic", "testTopic", new KsqlAvroTopicSerDe());
  Schema resultSchema = SerDeUtil.getSchemaFromAvro(ordersAveroSchemaStr);
  PersistentQueryMetadata persistentQueryMetadata = new PersistentQueryMetadata("",
      null,
      null,
      "",
      null,
      DataSource.DataSourceType.KSTREAM,
      "",
      mock(KafkaTopicClient.class),
      resultSchema,
      resultTopic,
      null);
  org.apache.avro.Schema.Parser parser = new org.apache.avro.Schema.Parser();
  org.apache.avro.Schema avroSchema = parser.parse(ordersAveroSchemaStr);
  expect(schemaRegistryClient.testCompatibility(anyString(), EasyMock.isA(avroSchema.getClass())))
      .andReturn(true);
  replay(schemaRegistryClient);
  avroUtil.validatePersistentQueryResults(persistentQueryMetadata, schemaRegistryClient);
}
 
Developer: confluentinc, Project: ksql, Lines: 25, Source: AvroUtilTest.java


Example 4: registerSchema

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
private void registerSchema(SchemaRegistryClient schemaRegistryClient)
    throws IOException, RestClientException {
  String ordersAveroSchemaStr = "{"
                                + "\"namespace\": \"kql\","
                                + " \"name\": \"orders\","
                                + " \"type\": \"record\","
                                + " \"fields\": ["
                                + "     {\"name\": \"ordertime\", \"type\": \"long\"},"
                                + "     {\"name\": \"orderid\",  \"type\": \"long\"},"
                                + "     {\"name\": \"itemid\", \"type\": \"string\"},"
                                + "     {\"name\": \"orderunits\", \"type\": \"double\"},"
                                + "     {\"name\": \"arraycol\", \"type\": {\"type\": \"array\", \"items\": \"double\"}},"
                                + "     {\"name\": \"mapcol\", \"type\": {\"type\": \"map\", \"values\": \"double\"}}"
                                + " ]"
                                + "}";
  org.apache.avro.Schema.Parser parser = new org.apache.avro.Schema.Parser();
  org.apache.avro.Schema avroSchema = parser.parse(ordersAveroSchemaStr);
  schemaRegistryClient.register("orders-topic" + KsqlConstants.SCHEMA_REGISTRY_VALUE_SUFFIX,
                                avroSchema);

}
 
Developer: confluentinc, Project: ksql, Lines: 22, Source: KsqlResourceTest.java


Example 5: retrieveSchema

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
@Override
public Schema retrieveSchema(TableId table, String topic) {
  try {
    String subject = getSubject(topic);
    logger.debug("Retrieving schema information for topic {} with subject {}", topic, subject);
    SchemaMetadata latestSchemaMetadata = schemaRegistryClient.getLatestSchemaMetadata(subject);
    org.apache.avro.Schema avroSchema = new Parser().parse(latestSchemaMetadata.getSchema());
    return avroData.toConnectSchema(avroSchema);
  } catch (IOException | RestClientException exception) {
    throw new ConnectException(
        "Exception encountered while trying to fetch latest schema metadata from Schema Registry",
        exception
    );
  }
}
 
Developer: wepay, Project: kafka-connect-bigquery, Lines: 16, Source: SchemaRegistrySchemaRetriever.java


Example 6: shouldFailForInvalidResultAvroSchema

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
@Test
public void shouldFailForInvalidResultAvroSchema()
    throws IOException, RestClientException {
  SchemaRegistryClient schemaRegistryClient = mock(SchemaRegistryClient.class);
  KsqlTopic resultTopic = new KsqlTopic("testTopic", "testTopic", new KsqlAvroTopicSerDe());
  Schema resultSchema = SerDeUtil.getSchemaFromAvro(ordersAveroSchemaStr);
  PersistentQueryMetadata persistentQueryMetadata = new PersistentQueryMetadata("",
      null,
      null,
      "",
      null,
      DataSource.DataSourceType.KSTREAM,
      "",
      mock(KafkaTopicClient.class),
      resultSchema,
      resultTopic,
      null);
  expect(schemaRegistryClient.testCompatibility(anyString(), anyObject())).andReturn(false);
  replay(schemaRegistryClient);
  try {
    avroUtil.validatePersistentQueryResults(persistentQueryMetadata, schemaRegistryClient);
    fail();
  } catch (Exception e) {
    assertThat("Incorrect exception message", "Cannot register avro schema for testTopic since "
                                              + "it is not valid for schema registry.", equalTo(e.getMessage()));
  }
}
 
Developer: confluentinc, Project: ksql, Lines: 29, Source: AvroUtilTest.java
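The call mocked at the heart of this test, testCompatibility(subject, schema), can also be exercised against a real registry. The following is a hedged sketch: the registry URL, the testTopic-value subject and the candidate schema's fields are assumptions; testCompatibility itself is the standard SchemaRegistryClient method and returns false exactly in the situation the test simulates with EasyMock.

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException;
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

import java.io.IOException;

public class CompatibilityCheck {
    public static void main(String[] args) throws IOException, RestClientException {
        // Assumed registry URL and subject; adjust for your environment.
        SchemaRegistryClient client = new CachedSchemaRegistryClient("http://localhost:8081", 100);
        // Candidate value schema; the field names here are illustrative.
        Schema candidate = SchemaBuilder.record("TestRecord").fields()
            .requiredLong("id")
            .optionalString("note")
            .endRecord();
        // Returns false when the candidate violates the subject's configured compatibility level.
        boolean compatible = client.testCompatibility("testTopic-value", candidate);
        System.out.println("Compatible with latest registered schema: " + compatible);
    }
}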


Example 7: setUp

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
@Before
public void setUp() throws IOException, RestClientException {
  schemaRegistryClient = new MockSchemaRegistryClient();
  registerSchema(schemaRegistryClient);
  ksqlRestConfig = new KsqlRestConfig(TestKsqlResourceUtil.getDefaultKsqlConfig());
  KsqlConfig ksqlConfig = new KsqlConfig(ksqlRestConfig.getKsqlStreamsProperties());
  ksqlEngine = new KsqlEngine(ksqlConfig, new MockKafkaTopicClient(), schemaRegistryClient);
}
 
Developer: confluentinc, Project: ksql, Lines: 9, Source: KsqlResourceTest.java


Example 8: loadFromRegistry

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
/**
 * Loads and parses a schema for the specified subject from the schema registry
 * @param subject subject for which to fetch the latest version of a schema.
 * @return parsed avro schema
 * @throws SchemaRegistryException if there is an error with the registry client
 */
public Schema loadFromRegistry(String subject) throws SchemaRegistryException {
  try {
    SchemaMetadata metadata = registryClient.getLatestSchemaMetadata(subject);
    return registryClient.getByID(metadata.getId());
  } catch (IOException | RestClientException e) {
    throw new SchemaRegistryException(e);
  }
}
 
Developer: streamsets, Project: datacollector, Lines: 15, Source: AvroSchemaHelper.java


Example 9: getSchemaIdFromSubject

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
/**
 * Looks up schema id for the specified subject from the schema registry
 * @param subject subject for which schema Id must be looked up.
 * @return the schema id
 * @throws SchemaRegistryException if there is an error with the registry client
 */
public int getSchemaIdFromSubject(String subject) throws SchemaRegistryException {
  try {
    SchemaMetadata metadata = registryClient.getLatestSchemaMetadata(subject);
    return metadata.getId();
  } catch (IOException | RestClientException e) {
    throw new SchemaRegistryException(e);
  }
}
 
Developer: streamsets, Project: datacollector, Lines: 15, Source: AvroSchemaHelper.java
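Examples 8 and 9 wrap the same two-step lookup: fetch the latest SchemaMetadata for a subject, then use its id (and, in Example 8, getByID) to resolve the concrete Avro schema. The sketch below shows that flow without the AvroSchemaHelper wrapper; the registry URL and the orders-value subject are assumptions, while the client calls match those used in the two examples.

import io.confluent.kafka.schemaregistry.client.CachedSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaMetadata;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException;
import org.apache.avro.Schema;

import java.io.IOException;

public class LatestSchemaLookup {
    public static void main(String[] args) throws IOException, RestClientException {
        // Assumed registry URL and subject; adjust for your environment.
        SchemaRegistryClient client = new CachedSchemaRegistryClient("http://localhost:8081", 100);
        SchemaMetadata metadata = client.getLatestSchemaMetadata("orders-value");
        // Example 9 stops at the id; Example 8 also resolves the full schema by that id.
        int schemaId = metadata.getId();
        Schema schema = client.getByID(schemaId);
        System.out.println("Latest schema id " + schemaId + ": " + schema);
    }
}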


Example 10: fetchSchemaByKey

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
@Override
protected Schema fetchSchemaByKey(Integer key) throws SchemaRegistryException {
  try {
    return this.schemaRegistryClient.getByID(key);
  } catch (IOException | RestClientException e) {
    throw new SchemaRegistryException(e);
  }
}
 
Developer: apache, Project: incubator-gobblin, Lines: 9, Source: ConfluentKafkaSchemaRegistry.java


Example 11: getLatestSchemaByTopic

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
@Override
public Schema getLatestSchemaByTopic(String topic) throws SchemaRegistryException {
  String schemaName = topic + this.schemaNameSuffix;
  try {
    return new Schema.Parser().parse(this.schemaRegistryClient.getLatestSchemaMetadata(schemaName).getSchema());
  } catch (IOException | RestClientException e) {
    log.error("Failed to get schema for topic " + topic + "; subject " + schemaName);
    throw new SchemaRegistryException(e);
  }
}
 
Developer: apache, Project: incubator-gobblin, Lines: 11, Source: ConfluentKafkaSchemaRegistry.java


Example 12: register

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
@Override
public Integer register(Schema schema, String name) throws SchemaRegistryException {
  try {
    String schemaName = name + this.schemaNameSuffix;
    return this.schemaRegistryClient.register(schemaName, schema);
  } catch (IOException | RestClientException e) {
    throw new SchemaRegistryException(e);
  }
}
 
Developer: apache, Project: incubator-gobblin, Lines: 10, Source: ConfluentKafkaSchemaRegistry.java


Example 13: testConfluentAvroDeserializer

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
@Test
public void testConfluentAvroDeserializer() throws IOException, RestClientException {
  WorkUnitState mockWorkUnitState = getMockWorkUnitState();
  mockWorkUnitState.setProp("schema.registry.url", TEST_URL);

  Schema schema = SchemaBuilder.record(TEST_RECORD_NAME)
      .namespace(TEST_NAMESPACE).fields()
      .name(TEST_FIELD_NAME).type().stringType().noDefault()
      .endRecord();

  GenericRecord testGenericRecord = new GenericRecordBuilder(schema).set(TEST_FIELD_NAME, "testValue").build();

  SchemaRegistryClient mockSchemaRegistryClient = mock(SchemaRegistryClient.class);
  when(mockSchemaRegistryClient.getByID(any(Integer.class))).thenReturn(schema);

  Serializer<Object> kafkaEncoder = new KafkaAvroSerializer(mockSchemaRegistryClient);
  Deserializer<Object> kafkaDecoder = new KafkaAvroDeserializer(mockSchemaRegistryClient);

  ByteBuffer testGenericRecordByteBuffer =
      ByteBuffer.wrap(kafkaEncoder.serialize(TEST_TOPIC_NAME, testGenericRecord));

  KafkaSchemaRegistry<Integer, Schema> mockKafkaSchemaRegistry = mock(KafkaSchemaRegistry.class);
  KafkaDeserializerExtractor kafkaDecoderExtractor =
      new KafkaDeserializerExtractor(mockWorkUnitState,
          Optional.fromNullable(Deserializers.CONFLUENT_AVRO), kafkaDecoder, mockKafkaSchemaRegistry);

  ByteArrayBasedKafkaRecord mockMessageAndOffset = getMockMessageAndOffset(testGenericRecordByteBuffer);

  Assert.assertEquals(kafkaDecoderExtractor.decodeRecord(mockMessageAndOffset), testGenericRecord);
}
 
Developer: apache, Project: incubator-gobblin, Lines: 31, Source: KafkaDeserializerExtractorTest.java
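This test round-trips records because KafkaAvroSerializer writes the Confluent wire format: a single magic byte 0x0, a 4-byte big-endian schema id, then the Avro binary payload. As a hedged illustration, the helper below extracts that id from serialized bytes; the class name is illustrative and not part of any project above, and the input is assumed to come from a call like kafkaEncoder.serialize(...) in the test.

import java.nio.ByteBuffer;

public final class ConfluentWireFormat {
    /** Extracts the registry schema id from bytes produced by KafkaAvroSerializer. */
    public static int schemaIdOf(byte[] serialized) {
        ByteBuffer buffer = ByteBuffer.wrap(serialized);
        byte magic = buffer.get();
        if (magic != 0x0) {
            throw new IllegalArgumentException("Not Confluent wire format, magic byte = " + magic);
        }
        // This 4-byte id is what KafkaAvroDeserializer looks up on the registry client via getByID().
        return buffer.getInt();
    }
}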


Example 14: testConfluentAvroDeserializerForSchemaEvolution

import io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException; // import the required package/class
@Test
public void testConfluentAvroDeserializerForSchemaEvolution() throws IOException, RestClientException, SchemaRegistryException {
  WorkUnitState mockWorkUnitState = getMockWorkUnitState();
  mockWorkUnitState.setProp("schema.registry.url", TEST_URL);

  Schema schemaV1 = SchemaBuilder.record(TEST_RECORD_NAME)
      .namespace(TEST_NAMESPACE).fields()
      .name(TEST_FIELD_NAME).type().stringType().noDefault()
      .endRecord();

  Schema schemaV2 = SchemaBuilder.record(TEST_RECORD_NAME)
      .namespace(TEST_NAMESPACE).fields()
      .name(TEST_FIELD_NAME).type().stringType().noDefault()
      .optionalString(TEST_FIELD_NAME2).endRecord();

  GenericRecord testGenericRecord = new GenericRecordBuilder(schemaV1).set(TEST_FIELD_NAME, "testValue").build();

  SchemaRegistryClient mockSchemaRegistryClient = mock(SchemaRegistryClient.class);
  when(mockSchemaRegistryClient.getByID(any(Integer.class))).thenReturn(schemaV1);

  Serializer<Object> kafkaEncoder = new KafkaAvroSerializer(mockSchemaRegistryClient);
  Deserializer<Object> kafkaDecoder = new KafkaAvroDeserializer(mockSchemaRegistryClient);

  ByteBuffer testGenericRecordByteBuffer =
      ByteBuffer.wrap(kafkaEncoder.serialize(TEST_TOPIC_NAME, testGenericRecord));

  KafkaSchemaRegistry<Integer, Schema> mockKafkaSchemaRegistry = mock(KafkaSchemaRegistry.class);
  when(mockKafkaSchemaRegistry.getLatestSchemaByTopic(TEST_TOPIC_NAME)).thenReturn(schemaV2);

  KafkaDeserializerExtractor kafkaDecoderExtractor = new KafkaDeserializerExtractor(mockWorkUnitState,
      Optional.fromNullable(Deserializers.CONFLUENT_AVRO), kafkaDecoder, mockKafkaSchemaRegistry);
  when(kafkaDecoderExtractor.getSchema()).thenReturn(schemaV2);

  ByteArrayBasedKafkaRecord mockMessageAndOffset = getMockMessageAndOffset(testGenericRecordByteBuffer);

  GenericRecord received = (GenericRecord) kafkaDecoderExtractor.decodeRecord(mockMessageAndOffset);
  Assert.assertEquals(received.toString(), "{\"testField\": \"testValue\", \"testField2\": null}");

}
 
Developer: apache, Project: incubator-gobblin, Lines: 40, Source: KafkaDeserializerExtractorTest.java



Note: The io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException examples in this article were compiled from source-code and documentation platforms such as GitHub and MSDocs. The snippets are selected from open-source projects contributed by their respective developers, and copyright remains with the original authors; consult each project's license before distributing or reusing the code. Please do not republish this article without permission.

