
Java DatasetWriter Class Code Examples


This article collects typical usage examples of the org.kitesdk.data.DatasetWriter class in Java. If you are wondering what DatasetWriter is for, how it is used, or simply want to see it in real code, the selected examples below should help.



The DatasetWriter class belongs to the org.kitesdk.data package. A total of 18 code examples of the class are shown below, ordered by popularity by default.
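Before the individual examples, here is a minimal sketch of the typical DatasetWriter life cycle that they all share: load (or create) a dataset, obtain a writer with newWriter(), write entities, and close the writer in a finally block. The dataset URI and the "username" field below are illustrative assumptions borrowed from the examples that follow, not a fixed contract.

import org.apache.avro.generic.GenericData.Record;
import org.apache.avro.generic.GenericRecordBuilder;
import org.kitesdk.data.Dataset;
import org.kitesdk.data.DatasetWriter;
import org.kitesdk.data.Datasets;

public class DatasetWriterSketch {
  public static void main(String[] args) throws Exception {
    // Load an existing dataset; this URI is a placeholder for illustration.
    Dataset<Record> users = Datasets.load("dataset:hdfs:/tmp/data/users", Record.class);

    DatasetWriter<Record> writer = null;
    try {
      writer = users.newWriter();
      GenericRecordBuilder builder =
          new GenericRecordBuilder(users.getDescriptor().getSchema());
      // Write one record; real jobs typically loop over many entities (see the examples below).
      writer.write(builder.set("username", "example-user").build());
    } finally {
      // close() flushes buffered data and releases the underlying resources.
      if (writer != null) {
        writer.close();
      }
    }
  }
}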

Example 1: run

import org.kitesdk.data.DatasetWriter; // import the required package/class
public void run(@DataIn(name="source.events", type=StandardEvent.class) View<StandardEvent> input,
                @DataOut(name="target.events", type=StandardEvent.class) View<StandardEvent> output) {

  DatasetReader<StandardEvent> reader = input.newReader();
  DatasetWriter<StandardEvent> writer = output.newWriter();

  try {
    while (reader.hasNext()) {

      writer.write(reader.next());
    }
  } finally {

    Closeables.closeQuietly(reader);
    Closeables.closeQuietly(writer);
  }
}
 
Developer: rbrush, Project: kite-apps, Lines: 18, Source: StandardEventsJob.java


Example 2: run

import org.kitesdk.data.DatasetWriter; // import the required package/class
public void run(@DataIn(name="source_users") View<GenericRecord> input,
                @DataOut(name="target_users") View<GenericRecord> output) {

  DatasetReader<GenericRecord> reader = input.newReader();
  DatasetWriter<GenericRecord> writer = output.newWriter();

  try {
    while (reader.hasNext()) {

      writer.write(reader.next());
    }
  } finally {

    Closeables.closeQuietly(reader);
    Closeables.closeQuietly(writer);
  }
}
 
Developer: rbrush, Project: kite-apps, Lines: 18, Source: ScheduledInputOutputJob.java


Example 3: run

import org.kitesdk.data.DatasetWriter; // import the required package/class
public void run(@DataOut(name="example_events", type=ExampleEvent.class) Signalable<ExampleEvent> view) {

    // Write some test data to the view.
    DatasetWriter<ExampleEvent> writer = view.newWriter();

    try {

      for (int i = 0; i < 10; ++i) {

        ExampleEvent event = ExampleEvent.newBuilder()
            .setUserId(i)
            .setSessionId(Integer.toString(i))
            .setTimestamp(getNominalTime().getMillis())
            .build();

        writer.write(event);
      }
    }
    finally {
      writer.close();
    }

    // Signal that our view is ready.
    view.signalReady();
  }
 
Developer: rbrush, Project: kite-apps, Lines: 26, Source: DataGeneratorJob.java


Example 4: run

import org.kitesdk.data.DatasetWriter; // import the required package/class
@Override
public int run(List<String> args) throws Exception {

  Preconditions.checkState(!Datasets.exists(uri),
      "events dataset already exists");

  DatasetDescriptor descriptor = new DatasetDescriptor.Builder()
      .schema(StandardEvent.class).build();

  View<StandardEvent> events = Datasets.create(uri, descriptor, StandardEvent.class);
  DatasetWriter<StandardEvent> writer = events.newWriter();
  try {
    while (System.currentTimeMillis() - baseTimestamp < 36000) {
      writer.write(generateRandomEvent());
    }
  } finally {
    writer.close();
  }

  System.out.println("Generated " + counter + " events");

  return 0;
}
 
Developer: kite-sdk, Project: kite-examples, Lines: 24, Source: CreateEvents.java


Example 5: testAppendWriteExceptionInvokesPolicy

import org.kitesdk.data.DatasetWriter; // import the required package/class
@Test
public void testAppendWriteExceptionInvokesPolicy()
    throws EventDeliveryException, NonRecoverableEventException {
  DatasetSink sink = sink(in, config);

  // run the sink
  sink.start();
  sink.process();

  // Mock an Event
  Event mockEvent = mock(Event.class);
  when(mockEvent.getBody()).thenReturn(new byte[] { 0x01 });

  // Mock a GenericRecord
  GenericRecord mockRecord = mock(GenericRecord.class);

  // Mock an EntityParser
  EntityParser<GenericRecord> mockParser = mock(EntityParser.class);
  when(mockParser.parse(eq(mockEvent), any(GenericRecord.class)))
      .thenReturn(mockRecord);
  sink.setParser(mockParser);

  // Mock a FailurePolicy
  FailurePolicy mockFailurePolicy = mock(FailurePolicy.class);
  sink.setFailurePolicy(mockFailurePolicy);

  // Mock a DatasetWriter
  DatasetWriter<GenericRecord> mockWriter = mock(DatasetWriter.class);
  doThrow(new DataFileWriter.AppendWriteException(new IOException()))
      .when(mockWriter).write(mockRecord);

  sink.setWriter(mockWriter);
  sink.write(mockEvent);

  // Verify that the event was sent to the failure policy
  verify(mockFailurePolicy).handle(eq(mockEvent), any(Throwable.class));

  sink.stop();
}
 
Developer: moueimei, Project: flume-release-1.7.0, Lines: 40, Source: TestDatasetSink.java


Example 6: newWriter

import org.kitesdk.data.DatasetWriter; // import the required package/class
private DatasetWriter<GenericRecord> newWriter(
    final UserGroupInformation login, final URI uri) {
  View<GenericRecord> view = KerberosUtil.runPrivileged(login,
      new PrivilegedExceptionAction<Dataset<GenericRecord>>() {
        @Override
        public Dataset<GenericRecord> run() {
          return Datasets.load(uri);
        }
      });

  DatasetDescriptor descriptor = view.getDataset().getDescriptor();
  String formatName = descriptor.getFormat().getName();
  Preconditions.checkArgument(allowedFormats().contains(formatName),
      "Unsupported format: " + formatName);

  Schema newSchema = descriptor.getSchema();
  if (targetSchema == null || !newSchema.equals(targetSchema)) {
    this.targetSchema = descriptor.getSchema();
    // target dataset schema has changed, invalidate all readers based on it
    readers.invalidateAll();
  }

  this.reuseDatum = !("parquet".equals(formatName));
  this.datasetName = view.getDataset().getName();

  return view.newWriter();
}
 
Developer: kite-sdk, Project: kite-examples-integration-tests, Lines: 28, Source: DatasetSink.java


Example 7: run

import org.kitesdk.data.DatasetWriter; // import the required package/class
@Override
public int run(String[] args) throws Exception {
  // Create a partition strategy that partitions records by their favoriteColor field (identity partitioning)
  PartitionStrategy partitionStrategy = new PartitionStrategy.Builder()
      .identity("favoriteColor", "favorite_color")
      .build();

  // Create a dataset of users with the Avro schema
  DatasetDescriptor descriptor = new DatasetDescriptor.Builder()
      .schemaUri("resource:user.avsc")
      .partitionStrategy(partitionStrategy)
      .build();
  Dataset<Record> users = Datasets.create(
      "dataset:hdfs:/tmp/data/users", descriptor, Record.class);

  // Get a writer for the dataset and write some users to it
  DatasetWriter<Record> writer = null;
  try {
    writer = users.newWriter();
    Random rand = new Random();
    GenericRecordBuilder builder = new GenericRecordBuilder(descriptor.getSchema());
    for (int i = 0; i < 100; i++) {
      Record record = builder.set("username", "user-" + i)
          .set("creationDate", System.currentTimeMillis())
          .set("favoriteColor", colors[rand.nextInt(colors.length)]).build();
      writer.write(record);
    }
  } finally {
    if (writer != null) {
      writer.close();
    }
  }

  return 0;
}
 
Developer: kite-sdk, Project: kite-examples, Lines: 36, Source: CreateUserDatasetGenericPartitioned.java


Example 8: run

import org.kitesdk.data.DatasetWriter; // import the required package/class
@Override
public int run(String[] args) throws Exception {
  // Create a dataset of users with the Avro schema
  DatasetDescriptor descriptor = new DatasetDescriptor.Builder()
      .schemaUri("resource:user.avsc")
      .build();
  Dataset<Record> users = Datasets.create(
      "dataset:hdfs:/tmp/data/users", descriptor, Record.class);

  // Get a writer for the dataset and write some users to it
  DatasetWriter<Record> writer = null;
  try {
    writer = users.newWriter();
    Random rand = new Random();
    GenericRecordBuilder builder = new GenericRecordBuilder(descriptor.getSchema());
    for (int i = 0; i < 100; i++) {
      Record record = builder.set("username", "user-" + i)
          .set("creationDate", System.currentTimeMillis())
          .set("favoriteColor", colors[rand.nextInt(colors.length)]).build();
      writer.write(record);
    }
  } finally {
    if (writer != null) {
      writer.close();
    }
  }

  return 0;
}
 
Developer: kite-sdk, Project: kite-examples, Lines: 30, Source: CreateUserDatasetGeneric.java


Example 9: run

import org.kitesdk.data.DatasetWriter; // import the required package/class
@Override
public int run(String[] args) throws Exception {
  DatasetDescriptor descriptor = new DatasetDescriptor.Builder()
      .schemaUri("resource:user.avsc")
      .format(Formats.PARQUET)
      .build();
  Dataset<Record> users = Datasets.create(
      "dataset:hdfs:/tmp/data/users", descriptor, Record.class);

  // Get a writer for the dataset and write some users to it
  DatasetWriter<Record> writer = null;
  try {
    writer = users.newWriter();
    Random rand = new Random();
    GenericRecordBuilder builder = new GenericRecordBuilder(descriptor.getSchema());
    for (int i = 0; i < 100; i++) {
      Record record = builder.set("username", "user-" + i)
          .set("creationDate", System.currentTimeMillis())
          .set("favoriteColor", colors[rand.nextInt(colors.length)]).build();
      writer.write(record);
    }
  } finally {
    if (writer != null) {
      writer.close();
    }
  }

  return 0;
}
 
Developer: kite-sdk, Project: kite-examples, Lines: 30, Source: CreateUserDatasetGenericParquet.java


Example 10: run

import org.kitesdk.data.DatasetWriter; // import the required package/class
@Override
public int run(String[] args) throws Exception {

  // Create a dataset of products with the Avro schema
  DatasetDescriptor descriptor = new DatasetDescriptor.Builder()
      .schema(Product.class)
      .build();
  Dataset<Product> products = Datasets.create(
      "dataset:hdfs:/tmp/data/products", descriptor, Product.class);

  // Get a writer for the dataset and write some products to it
  DatasetWriter<Product> writer = null;
  try {
    writer = products.newWriter();
    int i = 0;
    for (String name : names) {
      Product product = new Product();
      product.setName(name);
      product.setId(i++);
      writer.write(product);
    }
  } finally {
    if (writer != null) {
      writer.close();
    }
  }

  return 0;
}
 
Developer: kite-sdk, Project: kite-examples, Lines: 30, Source: CreateProductDatasetPojo.java


Example 11: run

import org.kitesdk.data.DatasetWriter; // import the required package/class
@Override
public int run(String[] args) throws Exception {
  // Create a dataset of users with the Avro schema
  DatasetDescriptor descriptor = new DatasetDescriptor.Builder()
      .schemaUri("resource:user.avsc")
      .build();
  Dataset<Record> users = Datasets.create("dataset:hive?dataset=users",
      descriptor, Record.class);

  // Get a writer for the dataset and write some users to it
  DatasetWriter<Record> writer = null;
  try {
    writer = users.newWriter();
    Random rand = new Random();
    GenericRecordBuilder builder = new GenericRecordBuilder(descriptor.getSchema());
    for (int i = 0; i < 100; i++) {
      Record record = builder.set("username", "user-" + i)
          .set("creationDate", System.currentTimeMillis())
          .set("favoriteColor", colors[rand.nextInt(colors.length)]).build();
      writer.write(record);
    }

  } finally {
    if (writer != null) {
      writer.close();
    }
  }

  return 0;
}
 
Developer: kite-sdk, Project: kite-examples, Lines: 31, Source: CreateHiveUserDatasetGeneric.java


Example 12: getWriter

import org.kitesdk.data.DatasetWriter; // import the required package/class
@VisibleForTesting
DatasetWriter<GenericRecord> getWriter() {
  return writer;
}
 
Developer: moueimei, Project: flume-release-1.7.0, Lines: 5, Source: DatasetSink.java


Example 13: setWriter

import org.kitesdk.data.DatasetWriter; // import the required package/class
@VisibleForTesting
void setWriter(DatasetWriter<GenericRecord> writer) {
  this.writer = writer;
}
 
Developer: moueimei, Project: flume-release-1.7.0, Lines: 5, Source: DatasetSink.java


Example 14: testRuntimeExceptionThrowsEventDeliveryException

import org.kitesdk.data.DatasetWriter; // import the required package/class
@Test
public void testRuntimeExceptionThrowsEventDeliveryException()
    throws EventDeliveryException, NonRecoverableEventException {
  DatasetSink sink = sink(in, config);

  // run the sink
  sink.start();
  sink.process();

  // Mock an Event
  Event mockEvent = mock(Event.class);
  when(mockEvent.getBody()).thenReturn(new byte[] { 0x01 });

  // Mock a GenericRecord
  GenericRecord mockRecord = mock(GenericRecord.class);

  // Mock an EntityParser
  EntityParser<GenericRecord> mockParser = mock(EntityParser.class);
  when(mockParser.parse(eq(mockEvent), any(GenericRecord.class)))
      .thenReturn(mockRecord);
  sink.setParser(mockParser);

  // Mock a FailurePolicy
  FailurePolicy mockFailurePolicy = mock(FailurePolicy.class);
  sink.setFailurePolicy(mockFailurePolicy);

  // Mock a DatasetWriter
  DatasetWriter<GenericRecord> mockWriter = mock(DatasetWriter.class);
  doThrow(new RuntimeException()).when(mockWriter).write(mockRecord);

  sink.setWriter(mockWriter);

  try {
    sink.write(mockEvent);
    Assert.fail("Should throw EventDeliveryException");
  } catch (EventDeliveryException ex) {

  }

  // Verify that the event was not sent to the failure policy
  verify(mockFailurePolicy, never()).handle(eq(mockEvent), any(Throwable.class));

  sink.stop();
}
 
Developer: moueimei, Project: flume-release-1.7.0, Lines: 45, Source: TestDatasetSink.java


Example 15: getOrNewWriter

import org.kitesdk.data.DatasetWriter; // import the required package/class
private DatasetWriter<GenericRecord> getOrNewWriter() {
  if (writer == null) {
    writer = dataset.newWriter();
  }
  return writer;
}
 
Developer: vybs, Project: sqoop-on-spark, Lines: 7, Source: KiteDatasetExecutor.java


Example 16: run

import org.kitesdk.data.DatasetWriter; // import the required package/class
public void run(@DataOut(name="kv-output", type= KeyValues.class) View<KeyValues> output) {


    DatasetWriter<KeyValues> writer = output.newWriter();

    try {

      JobContext context = getJobContext();

      KeyValues kv = KeyValues.newBuilder()
          .setJobsettings(context.getSettings())
          .setOutputsettings(context.getOutputSettings("kv-output"))
          .build();

      writer.write(kv);

    } finally {


      Closeables.closeQuietly(writer);
    }
  }
 
Developer: rbrush, Project: kite-apps, Lines: 23, Source: WriteConfigOutputJob.java


Example 17: run

import org.kitesdk.data.DatasetWriter; // import the required package/class
@Override
public int run(String[] args) throws Exception {
  // going to generate a lot of random log messages
  final Random rand = new Random();

  // data is written to the staging dataset
  Dataset<Record> staging = Datasets.load(
      "dataset:file:/tmp/data/logs_staging", Record.class);

  // this is going to build our simple log records
  GenericRecordBuilder builder = new GenericRecordBuilder(
      staging.getDescriptor().getSchema());

  // compute a starting timestamp 1 day ago (records are written 5 seconds apart below)
  final Calendar now = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
  final long yesterday = now.getTimeInMillis() - DAY_IN_MILLIS;

  DatasetWriter<Record> writer = null;
  try {
    writer = staging.newWriter();

    // generate 15,000 messages, each 5 seconds apart, starting 24 hours ago
    // this is a little less than 24 hours worth of messages
    for (int second : Ranges.closed(0, 15000).asSet(DiscreteDomains.integers())) {
      LOG.info("Generating log message " + second);

      builder.set("timestamp", yesterday + second * 5000);
      builder.set("component", "GenerateSimpleLogs");

      int level = rand.nextInt(LOG_LEVELS.length);
      builder.set("level", LOG_LEVELS[level]);
      builder.set("message", LOG_MESSAGES[level]);

      writer.write(builder.build());
    }

    if (writer instanceof Flushable) {
      ((Flushable) writer).flush();
    }
  } finally {
    if (writer != null) {
      writer.close();
    }
  }

  return 0;
}
 
Developer: kite-sdk, Project: kite-examples, Lines: 48, Source: GenerateSimpleLogs.java


Example 18: runJob

import org.kitesdk.data.DatasetWriter; // import the required package/class
@Override
public void runJob(Map params, JobContext jobContext, Instant nominalTime) {

  View output = (View) params.get("output");

  DatasetWriter<GenericRecord> writer = output.newWriter();

  // Simply write all of the input datasets to the output.
  for (Map.Entry<String,View> param: ((Map<String,View>)  params).entrySet()) {

    if (!param.getKey().equals("output")) {
      DatasetReader<GenericRecord> reader = param.getValue().newReader();

      try {
        while (reader.hasNext()) {

          writer.write(reader.next());
        }
      } finally {

        Closeables.closeQuietly(reader);

      }
    }
  }

  Closeables.closeQuietly(writer);
}
 
Developer: rbrush, Project: kite-apps, Lines: 29, Source: DynamicInputOutputJob.java



Note: The org.kitesdk.data.DatasetWriter class examples in this article were collected from source code hosted on GitHub and similar code and documentation platforms. The snippets were selected from open-source projects contributed by their respective developers; copyright of the source code remains with the original authors. Please consult the corresponding project's license before redistributing or using the code.

