
Java HCatException Class Code Examples


This article collects typical usage examples of the Java class org.apache.hive.hcatalog.common.HCatException. If you are wondering what HCatException does, how to use it, or what real-world usage looks like, the curated class examples below may help.



The HCatException class belongs to the org.apache.hive.hcatalog.common package. Below are 17 code examples of the class, ordered by popularity by default.

Example 1: flush

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
private void flush() throws HCatException {
  if (hCatRecordsBatch.isEmpty()) {
    return;
  }
  try {
    slaveWriter.write(hCatRecordsBatch.iterator());
    masterWriter.commit(writerContext);
  } catch (HCatException e) {
    LOG.error("Exception in flush - write/commit data to Hive", e);
    //abort on exception
    masterWriter.abort(writerContext);
    throw e;
  } finally {
    hCatRecordsBatch.clear();
  }
}
 
Developer: apache, Project: beam, Lines: 17, Source: HCatalogIO.java
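
For context: flush() is the commit step of HCatalog's two-phase write protocol. A master HCatWriter created from a WriteEntity produces a WriterContext, slave writers created from that context write the actual records, and the master then commits or aborts the whole write. The following is a minimal, self-contained sketch of that lifecycle (not the Beam code itself); the database and table names are placeholders.

import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;

import org.apache.hive.hcatalog.common.HCatException;
import org.apache.hive.hcatalog.data.HCatRecord;
import org.apache.hive.hcatalog.data.transfer.DataTransferFactory;
import org.apache.hive.hcatalog.data.transfer.HCatWriter;
import org.apache.hive.hcatalog.data.transfer.WriteEntity;
import org.apache.hive.hcatalog.data.transfer.WriterContext;

public class BatchWriteSketch {
  // Placeholder database/table names; replace with real ones.
  static void writeBatch(Iterator<HCatRecord> records) throws HCatException {
    WriteEntity entity = new WriteEntity.Builder()
        .withDatabase("default")
        .withTable("my_table")
        .build();
    Map<String, String> config = new HashMap<>();

    // The master writer owns the transaction; prepareWrite() produces a
    // WriterContext that can be shipped to slave writers on other workers.
    HCatWriter master = DataTransferFactory.getHCatWriter(entity, config);
    WriterContext context = master.prepareWrite();
    HCatWriter slave = DataTransferFactory.getHCatWriter(context);
    try {
      slave.write(records);    // slaves write the records
      master.commit(context);  // the master publishes them atomically
    } catch (HCatException e) {
      master.abort(context);   // on failure, roll back, as flush() does above
      throw e;
    }
  }
}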


Example 2: asFlinkTuples

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
/**
 * Specifies that the InputFormat returns Flink tuples instead of
 * {@link org.apache.hive.hcatalog.data.HCatRecord}.
 *
 * <p>Note: Flink tuples might only support a limited number of fields (depending on the API).
 *
 * @return This InputFormat.
 * @throws org.apache.hive.hcatalog.common.HCatException if a field of the output schema cannot be resolved
 */
public HCatInputFormatBase<T> asFlinkTuples() throws HCatException {

	// build type information
	int numFields = outputSchema.getFields().size();
	if (numFields > this.getMaxFlinkTupleSize()) {
		throw new IllegalArgumentException("Only up to " + this.getMaxFlinkTupleSize() +
				" fields can be returned as Flink tuples.");
	}

	TypeInformation[] fieldTypes = new TypeInformation[numFields];
	fieldNames = new String[numFields];
	for (String fieldName : outputSchema.getFieldNames()) {
		HCatFieldSchema field = outputSchema.get(fieldName);

		int fieldPos = outputSchema.getPosition(fieldName);
		TypeInformation fieldType = getFieldType(field);

		fieldTypes[fieldPos] = fieldType;
		fieldNames[fieldPos] = fieldName;

	}
	this.resultType = new TupleTypeInfo(fieldTypes);

	return this;
}
 
Developer: axbaretto, Project: flink, Lines: 35, Source: HCatInputFormatBase.java
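
In a Flink batch job, asFlinkTuples() is typically chained onto the input format before the DataSet is created. A minimal usage sketch, assuming the org.apache.flink.hcatalog.java.HCatInputFormat subclass and a table whose two projected columns are a String key and a long value; the database, table, and column names are placeholders.

import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.hcatalog.HCatInputFormatBase;
import org.apache.flink.hcatalog.java.HCatInputFormat;

public class HCatTupleReadSketch {
  public static void main(String[] args) throws Exception {
    ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

    // Project two columns and ask for Flink tuples instead of HCatRecords.
    HCatInputFormatBase<Tuple2<String, Long>> format =
        new HCatInputFormat<Tuple2<String, Long>>("mydb", "mytable")
            .getFields("key", "value")
            .asFlinkTuples();

    DataSet<Tuple2<String, Long>> data = env.createInput(format);
    data.print();
  }
}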


Example 3: insert

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
private void insert(Map<String, String> partitionSpec, Iterable<HCatRecord> rows) {
  WriteEntity entity = new WriteEntity.Builder()
      .withDatabase(databaseName)
      .withTable(tableName)
      .withPartition(partitionSpec)
      .build();

  try {
    HCatWriter master = DataTransferFactory.getHCatWriter(entity, config);
    WriterContext context = master.prepareWrite();
    HCatWriter writer = DataTransferFactory.getHCatWriter(context);
    writer.write(rows.iterator());
    master.commit(context);
  } catch (HCatException e) {
    throw new RuntimeException("An error occurred while inserting data to " + databaseName + "." + tableName, e);
  }
}
 
Developer: klarna, Project: HiveRunner, Lines: 18, Source: TableDataInserter.java
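
A hedged usage sketch for the method above, written as another method of the same class so that insert(...) and its databaseName/tableName/config fields are in scope; the partition column and the (id BIGINT, name STRING) row layout are invented for illustration.

import java.util.Arrays;
import java.util.HashMap;
import java.util.Map;

import org.apache.hive.hcatalog.data.DefaultHCatRecord;
import org.apache.hive.hcatalog.data.HCatRecord;

// Lives in the same class as insert(Map, Iterable) above.
void insertExampleRow() {
  Map<String, String> partitionSpec = new HashMap<>();
  partitionSpec.put("event_date", "2022-05-22");  // hypothetical partition column

  // One row for an assumed (id BIGINT, name STRING) table schema.
  HCatRecord row = new DefaultHCatRecord(Arrays.asList((Object) 1L, "alice"));

  insert(partitionSpec, Arrays.asList(row));
}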


Example 4: getReaderContext

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
private ReaderContext getReaderContext(long desiredSplitCount) throws HCatException {
  ReadEntity entity =
      new ReadEntity.Builder()
          .withDatabase(spec.getDatabase())
          .withTable(spec.getTable())
          .withFilter(spec.getFilter())
          .build();
  // pass the 'desired' split count as a hint to the API
  Map<String, String> configProps = new HashMap<>(spec.getConfigProperties());
  configProps.put(
      HCatConstants.HCAT_DESIRED_PARTITION_NUM_SPLITS, String.valueOf(desiredSplitCount));
  return DataTransferFactory.getHCatReader(entity, configProps).prepareRead();
}
 
Developer: apache, Project: beam, Lines: 14, Source: HCatalogIO.java
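
The returned ReaderContext is what the read side consumes: it reports how many splits were planned, and each split can be read independently, typically on a different worker. A minimal sketch, assuming ReaderContext.numSplits() (the accessor Beam's HCatalogIO uses for the planned split count):

import java.util.Iterator;

import org.apache.hive.hcatalog.common.HCatException;
import org.apache.hive.hcatalog.data.HCatRecord;
import org.apache.hive.hcatalog.data.transfer.DataTransferFactory;
import org.apache.hive.hcatalog.data.transfer.HCatReader;
import org.apache.hive.hcatalog.data.transfer.ReaderContext;

static long countAllRecords(ReaderContext context) throws HCatException {
  long count = 0;
  // Each split id in [0, numSplits()) could be handed to a separate worker.
  for (int split = 0; split < context.numSplits(); split++) {
    HCatReader reader = DataTransferFactory.getHCatReader(context, split);
    Iterator<HCatRecord> records = reader.read();
    while (records.hasNext()) {
      records.next();
      count++;
    }
  }
  return count;
}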


Example 5: start

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
@Override
public boolean start() throws HCatException {
  HCatReader reader =
      DataTransferFactory.getHCatReader(source.spec.getContext(), source.spec.getSplitId());
  hcatIterator = reader.read();
  return advance();
}
 
Developer: apache, Project: beam, Lines: 8, Source: HCatalogIO.java


Example 6: initiateWrite

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
@Setup
public void initiateWrite() throws HCatException {
  WriteEntity entity =
      new WriteEntity.Builder()
          .withDatabase(spec.getDatabase())
          .withTable(spec.getTable())
          .withPartition(spec.getPartition())
          .build();
  masterWriter = DataTransferFactory.getHCatWriter(entity, spec.getConfigProperties());
  writerContext = masterWriter.prepareWrite();
  slaveWriter = DataTransferFactory.getHCatWriter(writerContext);
}
 
Developer: apache, Project: beam, Lines: 13, Source: HCatalogIO.java


Example 7: processElement

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
@ProcessElement
public void processElement(ProcessContext ctx) throws HCatException {
  hCatRecordsBatch.add(ctx.element());
  if (hCatRecordsBatch.size() >= spec.getBatchSize()) {
    flush();
  }
}
 
Developer: apache, Project: beam, Lines: 8, Source: HCatalogIO.java


Example 8: validateHcatFieldFollowsPigRules

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
private static void validateHcatFieldFollowsPigRules(HCatFieldSchema hcatField)
    throws PigException {
  try {
    Type hType = hcatField.getType();
    switch (hType) {
    case BOOLEAN:
      if (!pigHasBooleanSupport) {
        throw new PigException("Incompatible type found in HCat table schema: "
            + hcatField, PigHCatUtil.PIG_EXCEPTION_CODE);
      }
      break;
    case ARRAY:
      validateHCatSchemaFollowsPigRules(hcatField.getArrayElementSchema());
      break;
    case STRUCT:
      validateHCatSchemaFollowsPigRules(hcatField.getStructSubSchema());
      break;
    case MAP:
      // only String map keys are supported; other key types are converted to String
      if (hcatField.getMapKeyType() != Type.STRING) {
        LOG.info("Converting non-String key of map " + hcatField.getName() + " from "
          + hcatField.getMapKeyType() + " to String.");
      }
      validateHCatSchemaFollowsPigRules(hcatField.getMapValueSchema());
      break;
    }
  } catch (HCatException e) {
    throw new PigException("Incompatible type found in hcat table schema: " + hcatField,
        PigHCatUtil.PIG_EXCEPTION_CODE, e);
  }
}
 
Developer: cloudera, Project: RecordServiceClient, Lines: 32, Source: PigHCatUtil.java


Example 9: get

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
private Object get(String name) {
  checkColumn(name);
  try {
    return row.get(name, schema);
  } catch (HCatException e) {
    throw new RuntimeException("Error getting value for " + name, e);
  }
}
 
Developer: klarna, Project: HiveRunner, Lines: 9, Source: TableDataBuilder.java


Example 10: column

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
private static HCatFieldSchema column(String name) {
  try {
    return new HCatFieldSchema(name, STRING, null);
  } catch (HCatException e) {
    throw new RuntimeException(e);
  }
}
 
Developer: klarna, Project: HiveRunner, Lines: 8, Source: TableDataBuilderTest.java


Example 11: finishBundle

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
@FinishBundle
public void finishBundle() throws HCatException {
  flush();
}
 
Developer: apache, Project: beam, Lines: 5, Source: HCatalogIO.java


Example 12: getReaderContext

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
/** Returns a ReaderContext instance for the passed datastore config params. */
static ReaderContext getReaderContext(Map<String, String> config) throws HCatException {
  return DataTransferFactory.getHCatReader(READ_ENTITY, config).prepareRead();
}
 
Developer: apache, Project: beam, Lines: 5, Source: HCatalogIOTestUtils.java


Example 13: getWriterContext

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
/** Returns a WriterContext instance for the passed datastore config params. */
private static WriterContext getWriterContext(Map<String, String> config) throws HCatException {
  return DataTransferFactory.getHCatWriter(WRITE_ENTITY, config).prepareWrite();
}
 
Developer: apache, Project: beam, Lines: 5, Source: HCatalogIOTestUtils.java


Example 14: writeRecords

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
/** Writes records to the table using the passed WriterContext. */
private static void writeRecords(WriterContext context) throws HCatException {
  DataTransferFactory.getHCatWriter(context).write(getHCatRecords(TEST_RECORDS_COUNT).iterator());
}
 
Developer: apache, Project: beam, Lines: 5, Source: HCatalogIOTestUtils.java


Example 15: loadHCatRecordItr

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
private static Iterator<HCatRecord> loadHCatRecordItr(ReaderContext readCntxt, int dataSplit) throws HCatException {
    HCatReader currentHCatReader = DataTransferFactory.getHCatReader(readCntxt, dataSplit);

    return currentHCatReader.read();
}
 
Developer: apache, Project: kylin, Lines: 6, Source: HiveTableReader.java


Example 16: buildTestSchema

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
private HCatSchema buildTestSchema() throws HCatException {
    HCatSchema hCatSchema = HCatSchemaUtils.getListSchemaBuilder().build();
    String[] fields = new String[] {
            "general_transaction_id",
            "program_year",
            "payment_publication_date",
            "submitting_applicable_manufacturer_or_applicable_gpo_name",
            "covered_recipient_type",
            "teaching_hospital_id",
            "teaching_hospital_name",
            "physician_profile_id",
            "physician_first_name",
            "physician_middle_name",
            "physician_last_name",
            "physician_name_suffix",
            "recipient_primary_business_street_address_line1",
            "recipient_primary_business_street_address_line2",
            "recipient_city",
            "recipient_state",
            "recipient_zip_code",
            "recipient_country",
            "recipient_province",
            "recipient_postal_code",
            "physician_primary_type",
            "physician_specialty",
            "physician_license_state_code1",
            "physician_license_state_code2",
            "physician_license_state_code3",
            "physician_license_state_code4",
            "physician_license_state_code5",
            "product_indicator",
            "name_of_associated_covered_drug_or_biological1",
            "name_of_associated_covered_drug_or_biological2",
            "name_of_associated_covered_drug_or_biological3",
            "name_of_associated_covered_drug_or_biological4",
            "name_of_associated_covered_drug_or_biological5",
            "ndc_of_associated_covered_drug_or_biological1",
            "ndc_of_associated_covered_drug_or_biological2",
            "ndc_of_associated_covered_drug_or_biological3",
            "ndc_of_associated_covered_drug_or_biological4",
            "ndc_of_associated_covered_drug_or_biological5",
            "name_of_associated_covered_device_or_medical_supply1",
            "name_of_associated_covered_device_or_medical_supply2",
            "name_of_associated_covered_device_or_medical_supply3",
            "name_of_associated_covered_device_or_medical_supply4",
            "name_of_associated_covered_device_or_medical_supply5",
            "applicable_manufacturer_or_applicable_gpo_making_payment_name",
            "applicable_manufacturer_or_applicable_gpo_making_payment_id",
            "applicable_manufacturer_or_applicable_gpo_making_payment_state",
            "applicable_manufacturer_or_applicable_gpo_making_payment_country",
            "dispute_status_for_publication",
            "total_amount_of_payment_usdollars",
            "date_of_payment",
            "number_of_payments_included_in_total_amount",
            "form_of_payment_or_transfer_of_value",
            "nature_of_payment_or_transfer_of_value",
            "city_of_travel",
            "state_of_travel",
            "country_of_travel",
            "physician_ownership_indicator",
            "third_party_payment_recipient_indicator",
            "name_of_third_party_entity_receiving_payment_or_transfer_of_value",
            "charity_indicator",
            "third_party_equals_covered_recipient_indicator",
            "contextual_information",
            "delay_in_publication_of_general_payment_indicator" };
    for (String field : fields) {
        hCatSchema.append(new HCatFieldSchema(field, TypeInfoFactory.stringTypeInfo, ""));
    }
    return hCatSchema;
}
 
Developer: mmiklavc, Project: hadoop-testing, Lines: 72, Source: CMSTopStateTest.java
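
With a schema like this in hand, a row can be built and populated by field name, which is where HCatException surfaces if a name does not match the schema. A small sketch, assuming the schema returned by buildTestSchema() above; the sample values are invented.

import org.apache.hive.hcatalog.common.HCatException;
import org.apache.hive.hcatalog.data.DefaultHCatRecord;
import org.apache.hive.hcatalog.data.schema.HCatSchema;

static DefaultHCatRecord sampleRecord(HCatSchema schema) throws HCatException {
  // One slot per column; every column in this schema is a string.
  DefaultHCatRecord record = new DefaultHCatRecord(schema.size());
  record.set("general_transaction_id", schema, "12345");  // set by field name
  record.set("program_year", schema, "2014");
  return record;
}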


Example 17: buildFlinkTuple

import org.apache.hive.hcatalog.common.HCatException; // import the required package/class
protected abstract T buildFlinkTuple(T t, HCatRecord record) throws HCatException; 
Developer: axbaretto, Project: flink, Lines: 2, Source: HCatInputFormatBase.java



Note: The org.apache.hive.hcatalog.common.HCatException examples in this article are drawn from open-source projects hosted on GitHub, MSDocs, and similar code and documentation platforms. Copyright in each snippet remains with its original authors; consult the corresponding project's license before redistributing or reusing the code. Please do not republish without permission.

