This article collects typical Java usage examples of the org.apache.parquet.io.api.Converter class. If you are unsure what the Converter class does, how to use it, or want working examples, the curated class examples below should help.
The Converter class belongs to the org.apache.parquet.io.api package. A total of 20 code examples of the Converter class are shown below, ordered by popularity by default.
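For orientation before the examples: Converter is the abstract callback type that parquet-mr's record assembly uses to push column values into an in-memory record. A GroupConverter handles one group level (start(), end(), and getConverter(fieldIndex) to dispatch to its children), while a PrimitiveConverter receives leaf values through addInt, addLong, addBinary and so on. The sketch below only illustrates this shape: the parquet-mr types used (Converter, GroupConverter, PrimitiveConverter, GroupType, Binary) are real, but the MapGroupConverter and StringLeafConverter classes and their record-building logic are hypothetical.

import java.util.HashMap;
import java.util.Map;
import org.apache.parquet.io.api.Binary;
import org.apache.parquet.io.api.Converter;
import org.apache.parquet.io.api.GroupConverter;
import org.apache.parquet.io.api.PrimitiveConverter;
import org.apache.parquet.schema.GroupType;
import org.apache.parquet.schema.Type;

// Hypothetical group converter that assembles each record into a Map<String, Object>.
class MapGroupConverter extends GroupConverter {
    private final Converter[] converters;
    private final Map<String, Object> current = new HashMap<>();

    MapGroupConverter(GroupType schema) {
        converters = new Converter[schema.getFieldCount()];
        for (int i = 0; i < converters.length; i++) {
            Type field = schema.getType(i);
            if (!field.isPrimitive()) {
                // nested groups are out of scope for this sketch; see the examples below for real handling
                throw new UnsupportedOperationException("nested group: " + field.getName());
            }
            converters[i] = new StringLeafConverter(current, field.getName());
        }
    }

    @Override
    public Converter getConverter(int fieldIndex) {
        return converters[fieldIndex];
    }

    @Override
    public void start() {
        current.clear(); // a new record begins
    }

    @Override
    public void end() {
        // record complete; a parent or materializer reads it via currentRecord()
    }

    Map<String, Object> currentRecord() {
        return new HashMap<>(current);
    }
}

// Hypothetical leaf converter that stores values under their field name.
class StringLeafConverter extends PrimitiveConverter {
    private final Map<String, Object> target;
    private final String fieldName;

    StringLeafConverter(Map<String, Object> target, String fieldName) {
        this.target = target;
        this.fieldName = fieldName;
    }

    @Override
    public void addBinary(Binary value) {
        target.put(fieldName, value.toStringUsingUTF8());
    }

    @Override
    public void addInt(int value) {
        target.put(fieldName, value);
    }

    @Override
    public void addLong(long value) {
        target.put(fieldName, value);
    }
}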
Example 1: groupConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
private Converter groupConverter(OutputMutator mutator,
List<Field> arrowSchema, GroupType groupType, PathSegment colNextChild, final String nameForChild) {
Collection<SchemaPath> c = new ArrayList<>();
while (colNextChild != null) {
if (colNextChild.isNamed()) {
break;
}
colNextChild = colNextChild.getChild();
}
if (colNextChild != null) {
SchemaPath s = new SchemaPath(colNextChild.getNameSegment());
c.add(s);
}
if (arrowSchema != null) {
return groupConverterFromArrowSchema(nameForChild, groupType.getName(), groupType, c);
}
return defaultGroupConverter(mutator, groupType, nameForChild, c, null);
}
Developer: dremio, Project: dremio-oss, Lines: 23, Source: ParquetGroupConverter.java
Example 2: ParquetValueConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
public ParquetValueConverter(GroupType schema, ParentContainerUpdater updater)
{
super(updater);
ArrayList<String> fieldNames = new ArrayList<>();
for (Type type : schema.getFields()) {
fieldNames.add(type.getName());
}
this.currentMap = new InternalMap(fieldNames);
this.fieldConverters = new Converter[schema.getFieldCount()];
int i = 0;
for (Type field : schema.getFields()) {
InternalMapUpdater update = new InternalMapUpdater(currentMap, i);
fieldConverters[i++] = newFieldConverter(field, update);
}
}
Developer: CyberAgent, Project: embulk-input-parquet_hadoop, Lines: 18, Source: ParquetValueConverter.java
Example 3: newFieldConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
private Converter newFieldConverter(Type parquetType, ParentContainerUpdater updater)
{
if (parquetType.isRepetition(Type.Repetition.REPEATED) && parquetType.getOriginalType() != OriginalType.LIST) {
// A repeated field that is neither contained by a `LIST`- or `MAP`-annotated group nor
// annotated by `LIST` or `MAP` should be interpreted as a required list of required
// elements where the element type is the type of the field.
if (parquetType.isPrimitive()) {
return new RepeatedPrimitiveConverter(parquetType, updater);
}
else {
return new RepeatedGroupConverter(parquetType, updater);
}
}
else {
return newConverter(parquetType, updater);
}
}
Developer: CyberAgent, Project: embulk-input-parquet_hadoop, Lines: 18, Source: ParquetValueConverter.java
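The comment in Example 3 describes parquet's legacy list representation: a field declared repeated without a LIST (or MAP) annotation is read as a required list of required elements. The standalone snippet below uses parquet-mr's MessageTypeParser (a real class in org.apache.parquet.schema; the class name RepeatedVsListDemo and the schema strings are made up for illustration) to show which schema shape takes that branch:

import org.apache.parquet.schema.MessageType;
import org.apache.parquet.schema.MessageTypeParser;
import org.apache.parquet.schema.OriginalType;
import org.apache.parquet.schema.Type;

public class RepeatedVsListDemo {
    public static void main(String[] args) {
        // Legacy style: a bare repeated primitive with no LIST annotation.
        MessageType legacy = MessageTypeParser.parseMessageType(
            "message m { repeated int32 num; }");
        Type num = legacy.getType("num");
        // prints true -- exactly the case where Example 3 builds a RepeatedPrimitiveConverter
        System.out.println(num.isRepetition(Type.Repetition.REPEATED)
            && num.getOriginalType() != OriginalType.LIST);

        // Standard three-level list encoding: the LIST annotation sits on the outer group,
        // so Example 3 falls through to newConverter() for this field.
        MessageType modern = MessageTypeParser.parseMessageType(
            "message m { optional group num (LIST) { repeated group list { optional int32 element; } } }");
        System.out.println(modern.getType("num").getOriginalType()); // prints LIST
    }
}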
Example 4: buildFieldToConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
private Map<Integer, Converter> buildFieldToConverter(final GroupType schema) {
final Map<Integer, Converter> fieldToConverter = new HashMap<>(fieldCount);
int i = 0;
for (final Type field : schema.getFields()) {
final String[] newColumnPath = new String[columnPath.length + 1];
int j = 0;
for (final String part : columnPath) {
newColumnPath[j] = part;
j++;
}
newColumnPath[j] = field.getName();
if (field.isPrimitive()) {
fieldToConverter.put(i, new PrimitiveConverter(parquetColumnToObject, field.asPrimitiveType().getPrimitiveTypeName().javaType.getSimpleName(), newColumnPath, field.getOriginalType()));
} else {
fieldToConverter.put(i, new BypassGroupConverter(parquetColumnToObject, field.asGroupType(), newColumnPath));
}
i++;
}
return fieldToConverter;
}
Developer: gchq, Project: Gaffer, Lines: 21, Source: BypassGroupConverter.java
Example 5: SimpleGroupConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
SimpleGroupConverter(SimpleGroupConverter parent, int index, GroupType schema) {
this.parent = parent;
this.index = index;
converters = new Converter[schema.getFieldCount()];
for (int i = 0; i < converters.length; i++) {
final Type type = schema.getType(i);
if (type.isPrimitive()) {
converters[i] = new SimplePrimitiveConverter(this, i);
} else {
converters[i] = new SimpleGroupConverter(this, i, type.asGroupType());
}
}
}
Developer: apache, Project: parquet-mr, Lines: 17, Source: SimpleGroupConverter.java
Example 6: getConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
@Override
public Converter getConverter(int fieldIndex) {
// get the real converter from the delegate
Converter delegateConverter = checkNotNull(delegate.getConverter(fieldIndex), "delegate converter");
// determine the indexFieldPath for the converter proxy we're about to make, which is
// this converter's path + the requested fieldIndex
List<Integer> newIndexFieldPath = new ArrayList<Integer>(indexFieldPath.size() + 1);
newIndexFieldPath.addAll(indexFieldPath);
newIndexFieldPath.add(fieldIndex);
if (delegateConverter.isPrimitive()) {
PrimitiveColumnIO columnIO = getColumnIO(newIndexFieldPath);
ColumnPath columnPath = ColumnPath.get(columnIO.getColumnDescriptor().getPath());
ValueInspector[] valueInspectors = getValueInspectors(columnPath);
return new FilteringPrimitiveConverter(delegateConverter.asPrimitiveConverter(), valueInspectors);
} else {
return new FilteringGroupConverter(delegateConverter.asGroupConverter(), newIndexFieldPath, valueInspectorsByColumn, columnIOsByIndexFieldPath);
}
}
Developer: apache, Project: parquet-mr, Lines: 23, Source: FilteringGroupConverter.java
Example 7: newConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
private Converter newConverter(List<TProtocol> events, Type type, ThriftField field) {
switch (field.getType().getType()) {
case LIST:
return new ListConverter(events, type.asGroupType(), field);
case SET:
return new SetConverter(events, type.asGroupType(), field);
case MAP:
return new MapConverter(events, type.asGroupType(), field);
case STRUCT:
return new StructConverter(events, type.asGroupType(), field);
case STRING:
return new FieldStringConverter(events, field);
case ENUM:
return new FieldEnumConverter(events, field);
default:
return new FieldPrimitiveConverter(events, field);
}
}
Developer: apache, Project: parquet-mr, Lines: 19, Source: ThriftRecordConverter.java
Example 8: getConverterFromDescription
import org.apache.parquet.io.api.Converter; // import the required package/class
protected static Converter getConverterFromDescription(final Type type, final int index,
final HiveGroupConverter parent) {
if (type == null) {
return null;
}
if (type.isPrimitive()) {
return ETypeConverter.getNewConverter(type.asPrimitiveType().getPrimitiveTypeName().javaType,
index, parent);
} else {
if (type.asGroupType().getRepetition() == Repetition.REPEATED) {
return new ArrayWritableGroupConverter(type.asGroupType(), parent, index);
} else {
return new DataWritableGroupConverter(type.asGroupType(), parent, index);
}
}
}
Developer: apache, Project: parquet-mr, Lines: 17, Source: HiveGroupConverter.java
Example 9: DataWritableGroupConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
public DataWritableGroupConverter(final GroupType selectedGroupType,
final HiveGroupConverter parent, final int index, final GroupType containingGroupType) {
this.parent = parent;
this.index = index;
final int totalFieldCount = containingGroupType.getFieldCount();
final int selectedFieldCount = selectedGroupType.getFieldCount();
currentArr = new Object[totalFieldCount];
converters = new Converter[selectedFieldCount];
List<Type> selectedFields = selectedGroupType.getFields();
for (int i = 0; i < selectedFieldCount; i++) {
Type subtype = selectedFields.get(i);
if (containingGroupType.getFields().contains(subtype)) {
converters[i] = getConverterFromDescription(subtype,
containingGroupType.getFieldIndex(subtype.getName()), this);
} else {
throw new IllegalStateException("Group type [" + containingGroupType +
"] does not contain requested field: " + subtype);
}
}
}
Developer: apache, Project: parquet-mr, Lines: 23, Source: DataWritableGroupConverter.java
Example 10: newMessageConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
private Converter newMessageConverter(final Message.Builder parentBuilder, final Descriptors.FieldDescriptor fieldDescriptor, Type parquetType) {
boolean isRepeated = fieldDescriptor.isRepeated();
ParentValueContainer parent;
if (isRepeated) {
parent = new ParentValueContainer() {
@Override
public void add(Object value) {
parentBuilder.addRepeatedField(fieldDescriptor, value);
}
};
} else {
parent = new ParentValueContainer() {
@Override
public void add(Object value) {
parentBuilder.setField(fieldDescriptor, value);
}
};
}
return newScalarConverter(parent, parentBuilder, fieldDescriptor, parquetType);
}
Developer: apache, Project: parquet-mr, Lines: 25, Source: ProtoMessageConverter.java
Example 11: newScalarConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
private Converter newScalarConverter(ParentValueContainer pvc, Message.Builder parentBuilder, Descriptors.FieldDescriptor fieldDescriptor, Type parquetType) {
JavaType javaType = fieldDescriptor.getJavaType();
switch (javaType) {
case STRING: return new ProtoStringConverter(pvc);
case FLOAT: return new ProtoFloatConverter(pvc);
case DOUBLE: return new ProtoDoubleConverter(pvc);
case BOOLEAN: return new ProtoBooleanConverter(pvc);
case BYTE_STRING: return new ProtoBinaryConverter(pvc);
case ENUM: return new ProtoEnumConverter(pvc, fieldDescriptor);
case INT: return new ProtoIntConverter(pvc);
case LONG: return new ProtoLongConverter(pvc);
case MESSAGE: {
Message.Builder subBuilder = parentBuilder.newBuilderForField(fieldDescriptor);
return new ProtoMessageConverter(pvc, subBuilder, parquetType.asGroupType());
}
}
throw new UnsupportedOperationException(String.format("Cannot convert type: %s" +
" (Parquet type: %s) ", javaType, parquetType));
}
Developer: apache, Project: parquet-mr, Lines: 23, Source: ProtoMessageConverter.java
Example 12: AvroUnionConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
public AvroUnionConverter(ParentValueContainer parent, Type parquetSchema,
Schema avroSchema, GenericData model) {
this.parent = parent;
GroupType parquetGroup = parquetSchema.asGroupType();
this.memberConverters = new Converter[parquetGroup.getFieldCount()];
int parquetIndex = 0;
for (int index = 0; index < avroSchema.getTypes().size(); index++) {
Schema memberSchema = avroSchema.getTypes().get(index);
if (!memberSchema.getType().equals(Schema.Type.NULL)) {
Type memberType = parquetGroup.getType(parquetIndex);
memberConverters[parquetIndex] = newConverter(memberSchema, memberType, model, new ParentValueContainer() {
@Override
public void add(Object value) {
Preconditions.checkArgument(memberValue==null, "Union is resolving to more than one type");
memberValue = value;
}
});
parquetIndex++; // Note: for nulls the parquetIndex is not increased
}
}
}
Developer: apache, Project: parquet-mr, Lines: 23, Source: AvroIndexedRecordConverter.java
Example 13: AvroUnionConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
public AvroUnionConverter(ParentValueContainer parent, Type parquetSchema,
Schema avroSchema, GenericData model) {
super(parent);
GroupType parquetGroup = parquetSchema.asGroupType();
this.memberConverters = new Converter[parquetGroup.getFieldCount()];
int parquetIndex = 0;
for (int index = 0; index < avroSchema.getTypes().size(); index++) {
Schema memberSchema = avroSchema.getTypes().get(index);
if (!memberSchema.getType().equals(Schema.Type.NULL)) {
Type memberType = parquetGroup.getType(parquetIndex);
memberConverters[parquetIndex] = newConverter(memberSchema, memberType, model, new ParentValueContainer() {
@Override
public void add(Object value) {
Preconditions.checkArgument(
AvroUnionConverter.this.memberValue == null,
"Union is resolving to more than one type");
memberValue = value;
}
});
parquetIndex++; // Note: for nulls the parquetIndex is not increased
}
}
}
Developer: apache, Project: parquet-mr, Lines: 25, Source: AvroRecordConverter.java
Example 14: addChildConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
protected void addChildConverter(OutputMutator mutator,
List<Field> arrowSchema, Iterator<SchemaPath> colIterator, Type type, Function<String, String> childNameResolver) {
// Match the name of the field in the schema definition to the name of the field in the query.
String name = null;
SchemaPath col;
PathSegment colPath;
PathSegment colNextChild = null;
while (colIterator.hasNext()) {
col = colIterator.next();
colPath = col.getRootSegment();
colNextChild = colPath.getChild();
if (colPath.isNamed() && (!colPath.getNameSegment().getPath().equals("*"))) {
name = colPath.getNameSegment().getPath();
// We may have a field that does not exist in the schema
if (!name.equalsIgnoreCase(type.getName())) {
continue;
}
}
break;
}
if (name == null) {
name = type.getName();
}
final String nameForChild = childNameResolver.apply(name);
final Converter converter = type.isPrimitive() ?
getConverterForType(nameForChild, type.asPrimitiveType())
: groupConverter(mutator, arrowSchema, type.asGroupType(), colNextChild, nameForChild);
converters.add(converter);
}
Developer: dremio, Project: dremio-oss, Lines: 32, Source: ParquetGroupConverter.java
Example 15: groupConverterFromArrowSchema
import org.apache.parquet.io.api.Converter; // import the required package/class
Converter groupConverterFromArrowSchema(String nameForChild, String fieldName, GroupType groupType, Collection<SchemaPath> c) {
final Field arrowField = Schema.findField(arrowSchema, fieldName);
final ArrowTypeID arrowTypeType = arrowField.getType().getTypeID();
final List<Field> arrowChildren = arrowField.getChildren();
if (arrowTypeType == ArrowTypeID.Union) {
// if it's a union we will add the children directly to the parent
return new UnionGroupConverter(mutator, getWriterProvider(), groupType, c, options, arrowChildren, nameForChild, containsCorruptedDates, readInt96AsTimeStamp);
} else if (arrowTypeType == ArrowTypeID.List) {
// make sure the parquet schema matches the arrow schema and delegate handling the logical list to defaultGroupConverter()
Preconditions.checkState(groupType.getOriginalType() == OriginalType.LIST, "parquet schema doesn't match the arrow schema for LIST " + nameForChild);
}
return defaultGroupConverter(mutator, groupType, nameForChild, c, arrowChildren);
}
Developer: dremio, Project: dremio-oss, Lines: 15, Source: ParquetGroupConverter.java
Example 16: defaultGroupConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
Converter defaultGroupConverter(OutputMutator mutator, GroupType groupType, final String nameForChild,
Collection<SchemaPath> c, List<Field> arrowSchema) {
if (groupType.getOriginalType() == OriginalType.LIST && LogicalListL1Converter.isSupportedSchema(groupType)) {
return new LogicalListL1Converter(
nameForChild,
mutator,
getWriterProvider(),
groupType,
c,
options,
arrowSchema,
containsCorruptedDates,
readInt96AsTimeStamp);
}
final MapWriter map;
if (groupType.isRepetition(REPEATED)) {
if (arrowSchema != null) {
//TODO assert this should never occur at this level
// only parquet writer that writes arrowSchema doesn't write repeated fields except
// as part of a LOGICAL LIST, thus this scenario (repeated + arrow schema present) can
// only happen in LogicalList converter
arrowSchema = handleRepeatedField(arrowSchema, groupType);
}
map = list(nameForChild).map();
} else {
map = getWriterProvider().map(nameForChild);
}
return new StructGroupConverter(mutator, map, groupType, c, options, arrowSchema, containsCorruptedDates, readInt96AsTimeStamp);
}
Developer: dremio, Project: dremio-oss, Lines: 32, Source: ParquetGroupConverter.java
Example 17: start
import org.apache.parquet.io.api.Converter; // import the required package/class
@Override
public void start()
{
int i = 0;
for (Converter converter : fieldConverters) {
((HasParentContainerUpdater) converter).getUpdater().start();
currentMap.set(i, ValueFactory.newNil());
i += 1;
}
}
Developer: CyberAgent, Project: embulk-input-parquet_hadoop, Lines: 11, Source: ParquetValueConverter.java
Example 18: end
import org.apache.parquet.io.api.Converter; // import the required package/class
@Override
public void end()
{
for (Converter converter : fieldConverters) {
((HasParentContainerUpdater) converter).getUpdater().end();
}
getUpdater().set(currentMap.build());
}
Developer: CyberAgent, Project: embulk-input-parquet_hadoop, Lines: 9, Source: ParquetValueConverter.java
Example 19: newConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
private Converter newConverter(Type parquetType, ParentContainerUpdater updater)
{
if (parquetType.isPrimitive()) {
return newConverterForPrimitiveField(parquetType.asPrimitiveType(), updater);
}
else {
return newConverterForGroupField(parquetType.asGroupType(), updater);
}
}
Developer: CyberAgent, Project: embulk-input-parquet_hadoop, Lines: 10, Source: ParquetValueConverter.java
Example 20: buildFieldToConverter
import org.apache.parquet.io.api.Converter; // import the required package/class
private Map<Integer, Converter> buildFieldToConverter(final MessageType schema) {
final Map<Integer, Converter> fieldToConverter = new HashMap<>(fieldCount);
int i = 0;
for (final Type field : schema.getFields()) {
if (field.isPrimitive()) {
fieldToConverter.put(i, new PrimitiveConverter(parquetColumnToObject, field.asPrimitiveType().getPrimitiveTypeName().javaType.getSimpleName(), new String[]{field.getName()}, field.getOriginalType()));
} else {
fieldToConverter.put(i, new BypassGroupConverter(parquetColumnToObject, field.asGroupType(), new String[]{field.getName()}));
}
i++;
}
return fieldToConverter;
}
Developer: gchq, Project: Gaffer, Lines: 14, Source: GafferElementConverter.java
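To close the loop: converters like the ones shown above are handed to parquet-mr through a RecordMaterializer, whose getRootConverter() exposes the root GroupConverter and whose getCurrentRecord() is called after each record's end(). RecordMaterializer and MessageType are real parquet-mr types; the materializer below is only a sketch that reuses the hypothetical MapGroupConverter from the orientation example at the top of this page. In practice such a materializer is returned by a ReadSupport#prepareForRead implementation.

import java.util.Map;
import org.apache.parquet.io.api.GroupConverter;
import org.apache.parquet.io.api.RecordMaterializer;
import org.apache.parquet.schema.MessageType;

// Illustrative materializer exposing the MapGroupConverter sketched earlier.
class MapRecordMaterializer extends RecordMaterializer<Map<String, Object>> {
    private final MapGroupConverter root;

    MapRecordMaterializer(MessageType schema) {
        this.root = new MapGroupConverter(schema); // MessageType is a GroupType, so this compiles
    }

    @Override
    public Map<String, Object> getCurrentRecord() {
        // called by parquet-mr once per record, after the root converter's end()
        return root.currentRecord();
    }

    @Override
    public GroupConverter getRootConverter() {
        return root;
    }
}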
Note: the org.apache.parquet.io.api.Converter class examples in this article were collected from GitHub, MSDocs, and other source-code and documentation platforms. The code snippets are taken from open-source projects contributed by their respective authors; copyright remains with the original authors, and distribution and use are subject to each project's License. Do not reproduce without permission.