This article collects typical usage examples of the Java class avro.shaded.com.google.common.collect.Lists. If you are wondering what the Lists class is for and how to use it, the curated examples below should help.
The Lists class belongs to the avro.shaded.com.google.common.collect package. Six code examples are shown below, ordered by popularity.
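Before diving into the examples, here is a minimal, self-contained sketch of the three Lists helpers that appear below (newArrayList, partition, reverse). The class and variable names in the sketch are made up for illustration; only the Lists calls themselves come from the shaded Guava API.
import avro.shaded.com.google.common.collect.Lists;
import java.util.List;
public class ListsQuickStart {
    public static void main(String[] args) {
        // Mutable ArrayList created from varargs
        List<String> names = Lists.newArrayList("a", "b", "c", "d", "e");
        // Consecutive sublists of at most 2 elements each
        List<List<String>> pairs = Lists.partition(names, 2);
        // Reversed view of the original list (no copy is made)
        List<String> reversed = Lists.reverse(names);
        System.out.println(pairs);    // [[a, b], [c, d], [e]]
        System.out.println(reversed); // [e, d, c, b, a]
    }
}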
Example 1: divideIntoBatches
import avro.shaded.com.google.common.collect.Lists; // import the required package/class
/**
* Divide the part files into the batches to download and pre-sort at the same time.
*
* @param partFiles all part files to download.
*
* @return a map from each batch's download path to a reads source over the parts it should
* contain.
*
* @see #downloadBatchesAndPreSort(List)
*/
private Map<Path, ReadsDataSource> divideIntoBatches(final List<Path> partFiles) {
// partition the files into batches
final List<List<Path>> batches = Lists.partition(partFiles, numberOfParts);
final Map<Path, ReadsDataSource> toReturn = new LinkedHashMap<>(batches.size());
// create a temp file for each batch in a common temp folder
final File tempDir = IOUtil.createTempDir(this.toString(), ".batches");
int i = 0;
for (final List<Path> parts : batches) {
// create a temp file and store it in the temp parts
final Path tempFile =
IOUtils.getPath(new File(tempDir, "batch-" + i++ + ".bam").getAbsolutePath());
logger.debug("Batch file {} will contain {} parts: {}",
() -> tempFile.toUri().toString(),
() -> parts.size(),
() -> parts.stream().map(p -> p.toUri().toString())
.collect(Collectors.toList()));
toReturn.put(tempFile, new ReadsDataSource(parts, getSamReaderFactory()));
}
return toReturn;
}
Developer: magicDGS, Project: ReadTools, Lines: 33, Source: DistmapPartDownloader.java
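Worth noting for this example: Lists.partition(list, size) returns consecutive sublists of at most size elements each (the last one may be shorter), so numberOfParts here is the number of part files per batch rather than the number of batches. A minimal illustration with made-up values:
import avro.shaded.com.google.common.collect.Lists;
import java.util.Arrays;
import java.util.List;
public class PartitionSemanticsDemo {
    public static void main(String[] args) {
        List<String> partFiles = Arrays.asList("p1", "p2", "p3", "p4", "p5", "p6", "p7");
        // The second argument is the size of each sublist, not the number of sublists
        List<List<String>> batches = Lists.partition(partFiles, 3);
        System.out.println(batches); // [[p1, p2, p3], [p4, p5, p6], [p7]]
    }
}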
Example 2: toListOfString
import avro.shaded.com.google.common.collect.Lists; // import the required package/class
private static List<String> toListOfString(Map<Object, ?> map) {
List<Object> keysList = map.keySet().stream().collect(Collectors.toList());
List<String> keysListString = Lists.newArrayList();
keysList.forEach(key -> keysListString.add(String.valueOf(key)));
return keysListString;
}
Developer: atlascon, Project: avro-diff, Lines: 8, Source: AvroDiffMap.java
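As a design note, the intermediate keysList and the manual forEach can be collapsed into a single stream pipeline with the same result (String.valueOf also tolerates null keys). A sketch of that equivalent form:
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;
// Equivalent single-pass variant of toListOfString
private static List<String> toListOfString(Map<Object, ?> map) {
    return map.keySet().stream()
            .map(String::valueOf)
            .collect(Collectors.toList());
}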
Example 3: createConsumer
import avro.shaded.com.google.common.collect.Lists; // import the required package/class
private Consumer<String, byte[]> createConsumer() throws Exception {
Properties props = PropertiesHolder.getProperties(Constants.Properties.CONSUMER_CONFIG);
props.setProperty("client.id", this.topologyId + "_consumer");
props.setProperty("group.id", this.topologyId + "_grp");
TopicProvider dataTopicProvider = new DataInputTopicProvider();
TopicProvider controlTopicProvider = new ControlTopicProvider();
TopicProvider schemaTopicProvider = new SchemaTopicProvider();
this.controlTopics = controlTopicProvider.provideTopics();
this.dataTopics = dataTopicProvider.provideTopics();
List<String> topics = Lists.newArrayList();
topics.addAll(controlTopics);
topics.addAll(dataTopics);
if (DbusDatasourceType.ORACLE == GlobalCache.getDatasourceType()) {
this.schemaTopics = schemaTopicProvider.provideTopics();
topics.addAll(schemaTopics);
}
Consumer<String, byte[]> consumer = consumerProvider.consumer(props, topics);
Map<String, Object> data = zkNodeOperator.getData();
for (Map.Entry<String, Object> entry : data.entrySet()) {
TopicInfo b = TopicInfo.build((Map<String, Object>) entry.getValue());
this.pausedTopics.put(b.getTopic(), b);
}
consumerProvider.pause(consumer, this.pausedTopics.entrySet()
.stream()
.map(entry -> entry.getValue())
.collect(Collectors.toList()));
return consumer;
}
Developer: BriData, Project: DBus, Lines: 36, Source: AppenderConsumer.java
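consumerProvider, zkNodeOperator, and TopicInfo are project-specific helpers from DBus, but the pause step ultimately rests on the plain Kafka client API. A minimal sketch of that underlying call, using assign() and hypothetical topic names so it stays self-contained:
import java.util.Arrays;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
public class PauseSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.setProperty("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.setProperty("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
            TopicPartition ctrl = new TopicPartition("ctrl_topic", 0); // hypothetical topics
            TopicPartition data = new TopicPartition("data_topic", 0);
            consumer.assign(Arrays.asList(ctrl, data));
            // Keep consuming control messages but temporarily stop fetching data
            consumer.pause(Collections.singletonList(data));
            System.out.println(consumer.paused()); // [data_topic-0]
        }
    }
}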
Example 4: getPathSteps
import avro.shaded.com.google.common.collect.Lists; // import the required package/class
/**
* Get the steps from the hierarchical string path.
*
* @param path the dot-separated path, optionally starting with a dot
* @return a stack of path steps, with the first step on top
*/
public static Stack<String> getPathSteps(String path) {
if (path.startsWith(".")) {
path = path.substring(1);
}
Stack<String> pathSteps = new Stack<String>();
List<String> stepsList = Arrays.asList(path.split("\\."));
pathSteps.addAll(Lists.reverse(stepsList));
return pathSteps;
}
Developer: Talend, Project: components, Lines: 16, Source: TypeConverterUtils.java
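A quick usage example: because the split segments are pushed in reverse order, pop() returns the steps from left to right. The path value is made up, and the sketch assumes TypeConverterUtils (shown above) is on the classpath:
import java.util.Stack;
public class PathStepsDemo {
    public static void main(String[] args) {
        Stack<String> steps = TypeConverterUtils.getPathSteps(".record.address.city");
        System.out.println(steps.pop()); // record
        System.out.println(steps.pop()); // address
        System.out.println(steps.pop()); // city
    }
}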
Example 5: testGetDeltaFieldNamesForNewSchema
import avro.shaded.com.google.common.collect.Lists; // import the required package/class
@Test
public void testGetDeltaFieldNamesForNewSchema(){
Configuration conf = mock(Configuration.class);
when(conf.get(FieldAttributeBasedDeltaFieldsProvider.ATTRIBUTE_FIELD)).thenReturn("attributes_json");
when(conf.get(FieldAttributeBasedDeltaFieldsProvider.DELTA_PROP_NAME,
FieldAttributeBasedDeltaFieldsProvider.DEFAULT_DELTA_PROP_NAME))
.thenReturn(FieldAttributeBasedDeltaFieldsProvider.DEFAULT_DELTA_PROP_NAME);
AvroDeltaFieldNameProvider provider = new FieldAttributeBasedDeltaFieldsProvider(conf);
Schema original = new Schema.Parser().parse(FULL_SCHEMA_WITH_ATTRIBUTES);
GenericRecord record = mock(GenericRecord.class);
when(record.getSchema()).thenReturn(original);
List<String> fields = provider.getDeltaFieldNames(record);
Assert.assertEquals(fields, Lists.newArrayList("scn2", "scn"));
}
Developer: apache, Project: incubator-gobblin, Lines: 16, Source: FieldAttributeBasedDeltaFieldsProviderTest.java
Example 6: reloadConfig
import avro.shaded.com.google.common.collect.Lists; // import the required package/class
/**
* Reload the spout configuration from a reload control message.
*
* @param reloadJson the reload control message (JSON)
*/
private void reloadConfig(String reloadJson) {
close();
ZKHelper zkHelper = null;
DBHelper dbHelper = null;
try {
dsInfo = new DataSourceInfo();
zkHelper = new ZKHelper(topologyRoot, topologyID, zkServers);
zkHelper.loadDsNameAndOffset(dsInfo);
dbHelper = new DBHelper(zkServers);
dbHelper.loadDsInfo(dsInfo);
logger.info(String.format("Spout read datasource: %s", dsInfo.toString()));
//init consumer
dataTopicPartition = new TopicPartition(dsInfo.getDataTopic(), 0);
ctrlTopicPartition = new TopicPartition(dsInfo.getCtrlTopic(), 0);
List<TopicPartition> topics = Arrays.asList(dataTopicPartition, ctrlTopicPartition);
consumer = new KafkaConsumer(zkHelper.getConsumerProps());
consumer.assign(topics);
//skip offset
long oldOffset = consumer.position(dataTopicPartition);
logger.info(String.format("reloaded offset as: %d", oldOffset));
String offset = dsInfo.getDataTopicOffset();
if (offset.equalsIgnoreCase("none")) {
; // do nothing
} else if (offset.equalsIgnoreCase("begin")) {
consumer.seekToBeginning(Lists.newArrayList(dataTopicPartition));
logger.info(String.format("Offset seek to begin, changed as: %d", consumer.position(dataTopicPartition)));
} else if (offset.equalsIgnoreCase("end")) {
consumer.seekToEnd(Lists.newArrayList(dataTopicPartition));
logger.info(String.format("Offset seek to end, changed as: %d", consumer.position(dataTopicPartition)));
} else {
long nOffset = Long.parseLong(offset);
consumer.seek(dataTopicPartition, nOffset);
logger.info(String.format("Offset changed as: %d", consumer.position(dataTopicPartition)));
}
dsInfo.resetDataTopicOffset();
zkHelper.saveDsInfo(dsInfo);
zkHelper.saveReloadStatus(reloadJson, "dispatcher-spout", true);
} catch (Exception ex) {
logger.error("KafkaConsumerSpout reloadConfig():", ex);
collector.reportError(ex);
throw new RuntimeException(ex);
} finally {
if (dbHelper != null) {
dbHelper.close();
}
if (zkHelper != null) {
zkHelper.close();
}
}
}
Developer: BriData, Project: DBus, Lines: 66, Source: KafkaConsumerSpout.java
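Setting aside the ZK and DB helpers, the offset branch above reduces to the standard KafkaConsumer seek API. A condensed, self-contained sketch of just that logic, with a hypothetical topic name and offset setting:
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.TopicPartition;
public class SeekSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.setProperty("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.setProperty("value.deserializer",
                "org.apache.kafka.common.serialization.ByteArrayDeserializer");
        try (KafkaConsumer<String, byte[]> consumer = new KafkaConsumer<>(props)) {
            TopicPartition dataTopicPartition = new TopicPartition("data_topic", 0);
            consumer.assign(Collections.singletonList(dataTopicPartition));
            String offset = "begin"; // "none", "begin", "end", or a numeric value
            if (offset.equalsIgnoreCase("begin")) {
                consumer.seekToBeginning(Collections.singletonList(dataTopicPartition));
            } else if (offset.equalsIgnoreCase("end")) {
                consumer.seekToEnd(Collections.singletonList(dataTopicPartition));
            } else if (!offset.equalsIgnoreCase("none")) {
                consumer.seek(dataTopicPartition, Long.parseLong(offset));
            }
            // position() resolves the (lazy) seek and reports the resulting offset
            System.out.println(consumer.position(dataTopicPartition));
        }
    }
}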
Note: the avro.shaded.com.google.common.collect.Lists examples in this article were collected from source code and documentation hosted on GitHub, MSDocs, and similar platforms. The snippets were selected from open-source projects contributed by their respective authors; copyright remains with the original authors, and distribution and use should follow each project's license. Please do not reproduce this article without permission.