
Java JobConfigurationLoad Class Code Examples


This article collects typical usage examples of the Java class com.google.api.services.bigquery.model.JobConfigurationLoad. If you have been wondering what JobConfigurationLoad is for, how it is used, or where to find real-world examples of it, the curated class examples below should help.



The JobConfigurationLoad class belongs to the com.google.api.services.bigquery.model package. Twelve code examples of the class are presented below, ordered by popularity by default.
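Before the examples, here is a minimal sketch of how JobConfigurationLoad typically fits into a load job submitted through the Bigquery client. It assumes an already-authorized Bigquery instance; the project ID, dataset, table, and gs:// URI are hypothetical placeholders rather than values from any of the projects below.

import com.google.api.services.bigquery.Bigquery;
import com.google.api.services.bigquery.model.Job;
import com.google.api.services.bigquery.model.JobConfiguration;
import com.google.api.services.bigquery.model.JobConfigurationLoad;
import com.google.api.services.bigquery.model.TableReference;
import java.io.IOException;
import java.util.Collections;

public class LoadJobSketch {

  /** Builds and submits a CSV load job; all identifiers are placeholders. */
  static Job submitCsvLoad(Bigquery bigquery) throws IOException {
    TableReference table = new TableReference()
        .setProjectId("my-project")   // hypothetical project
        .setDatasetId("my_dataset")   // hypothetical dataset
        .setTableId("my_table");      // hypothetical table

    JobConfigurationLoad load = new JobConfigurationLoad()
        .setDestinationTable(table)
        .setSourceUris(Collections.singletonList("gs://my-bucket/data.csv")) // hypothetical URI
        .setSourceFormat("CSV")
        .setSkipLeadingRows(1)                     // the file has a header row
        .setWriteDisposition("WRITE_APPEND")
        .setCreateDisposition("CREATE_IF_NEEDED")
        .setAutodetect(true);                      // let BigQuery infer the schema

    Job job = new Job().setConfiguration(new JobConfiguration().setLoad(load));
    return bigquery.jobs().insert("my-project", job).execute();
  }
}

As the examples below show, the same builder pattern extends to DATASTORE_BACKUP and NEWLINE_DELIMITED_JSON sources, explicit schemas, and time partitioning.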

Example 1: makeLoadJob

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
private Job makeLoadJob(JobReference jobRef, String sourceUri, String tableId) {
  TableReference tableReference = new TableReference()
      .setProjectId(jobRef.getProjectId())
      .setDatasetId(SNAPSHOTS_DATASET)
      .setTableId(tableId);
  return new Job()
      .setJobReference(jobRef)
      .setConfiguration(new JobConfiguration()
          .setLoad(new JobConfigurationLoad()
              .setWriteDisposition(WriteDisposition.WRITE_EMPTY.toString())
              .setSourceFormat(SourceFormat.DATASTORE_BACKUP.toString())
              .setSourceUris(ImmutableList.of(sourceUri))
              .setDestinationTable(tableReference)));
}
 
Developer ID: google, Project: nomulus, Lines of code: 15, Source: LoadSnapshotAction.java


Example 2: getExpectedJob

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
/**
 * Helper method to get the load request to BigQuery with numValues copies of jsonValue.
 */
private Job getExpectedJob() {
  // Configure a write job.
  JobConfigurationLoad loadConfig = new JobConfigurationLoad();
  loadConfig.setCreateDisposition("CREATE_IF_NEEDED");
  loadConfig.setWriteDisposition("WRITE_TRUNCATE");
  loadConfig.setSourceFormat("NEWLINE_DELIMITED_JSON");

  // Describe the resulting table you are importing to:
  loadConfig.setDestinationTable(getSampleTableRef());

  // Create and set the output schema.
  TableSchema schema = new TableSchema();
  schema.setFields(fields);
  loadConfig.setSchema(schema);

  // Create Job configuration.
  JobConfiguration jobConfig = new JobConfiguration();
  jobConfig.setLoad(loadConfig);

  // Set the output write job.
  Job expectedJob = new Job();
  expectedJob.setConfiguration(jobConfig);
  expectedJob.setJobReference(jobReference);
  return expectedJob;
}
 
Developer ID: GoogleCloudPlatform, Project: bigdata-interop, Lines of code: 29, Source: BigQueryRecordWriterTest.java


Example 3: startLoadJob

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
/**
 * {@inheritDoc}
 *
 * <p>Tries executing the RPC for at most {@code MAX_RPC_RETRIES} times until it succeeds.
 *
 * @throws IOException if it exceeds {@code MAX_RPC_RETRIES} attempts.
 */
@Override
public void startLoadJob(
    JobReference jobRef,
    JobConfigurationLoad loadConfig) throws InterruptedException, IOException {
  Job job = new Job()
      .setJobReference(jobRef)
      .setConfiguration(new JobConfiguration().setLoad(loadConfig));

  startJob(job, errorExtractor, client);
}
 
Developer ID: apache, Project: beam, Lines of code: 18, Source: BigQueryServicesImpl.java


Example 4: startLoadJob

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
@Override
public void startLoadJob(JobReference jobRef, JobConfigurationLoad loadConfig)
    throws InterruptedException, IOException {
  synchronized (allJobs) {
    verifyUniqueJobId(jobRef.getJobId());
    Job job = new Job();
    job.setJobReference(jobRef);
    job.setConfiguration(new JobConfiguration().setLoad(loadConfig));
    job.setKind(" bigquery#job");
    job.setStatus(new JobStatus().setState("PENDING"));

    // Copy the files to a new location for import, as the temporary files will be deleted by
    // the caller.
    if (loadConfig.getSourceUris().size() > 0) {
      ImmutableList.Builder<ResourceId> sourceFiles = ImmutableList.builder();
      ImmutableList.Builder<ResourceId> loadFiles = ImmutableList.builder();
      for (String filename : loadConfig.getSourceUris()) {
        sourceFiles.add(FileSystems.matchNewResource(filename, false /* isDirectory */));
        loadFiles.add(FileSystems.matchNewResource(
            filename + ThreadLocalRandom.current().nextInt(), false /* isDirectory */));
      }

      FileSystems.copy(sourceFiles.build(), loadFiles.build());
      filesForLoadJobs.put(jobRef.getProjectId(), jobRef.getJobId(), loadFiles.build());
    }

    allJobs.put(jobRef.getProjectId(), jobRef.getJobId(), new JobInfo(job));
  }
}
 
Developer ID: apache, Project: beam, Lines of code: 30, Source: FakeJobService.java


Example 5: runLoadJob

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
private JobStatus runLoadJob(JobReference jobRef, JobConfigurationLoad load)
    throws InterruptedException, IOException {
  TableReference destination = load.getDestinationTable();
  TableSchema schema = load.getSchema();
  checkArgument(schema != null, "No schema specified");
  List<ResourceId> sourceFiles = filesForLoadJobs.get(jobRef.getProjectId(), jobRef.getJobId());
  WriteDisposition writeDisposition = WriteDisposition.valueOf(load.getWriteDisposition());
  CreateDisposition createDisposition = CreateDisposition.valueOf(load.getCreateDisposition());
  checkArgument(load.getSourceFormat().equals("NEWLINE_DELIMITED_JSON"));
  Table existingTable = datasetService.getTable(destination);
  if (!validateDispositions(existingTable, createDisposition, writeDisposition)) {
    return new JobStatus().setState("FAILED").setErrorResult(new ErrorProto());
  }
  if (existingTable == null) {
    TableReference strippedDestination =
        destination
            .clone()
            .setTableId(BigQueryHelpers.stripPartitionDecorator(destination.getTableId()));
    existingTable =
        new Table()
            .setTableReference(strippedDestination)
            .setSchema(schema);
    if (load.getTimePartitioning() != null) {
      existingTable = existingTable.setTimePartitioning(load.getTimePartitioning());
    }
    datasetService.createTable(existingTable);
  }

  List<TableRow> rows = Lists.newArrayList();
  for (ResourceId filename : sourceFiles) {
    rows.addAll(readRows(filename.toString()));
  }
  datasetService.insertAll(destination, rows, null);
  FileSystems.delete(sourceFiles);
  return new JobStatus().setState("DONE");
}
 
Developer ID: apache, Project: beam, Lines of code: 37, Source: FakeJobService.java


Example 6: load

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
/**
 * Starts an asynchronous load job to populate the specified destination table with the given
 * source URIs and source format.  Returns a ListenableFuture that holds the same destination
 * table object on success.
 */
public ListenableFuture<DestinationTable> load(
    DestinationTable dest,
    SourceFormat sourceFormat,
    Iterable<String> sourceUris) throws Exception {
  Job job = new Job()
      .setConfiguration(new JobConfiguration()
          .setLoad(new JobConfigurationLoad()
              .setWriteDisposition(dest.getWriteDisposition().toString())
              .setSourceFormat(sourceFormat.toString())
              .setSourceUris(ImmutableList.copyOf(sourceUris))
              .setDestinationTable(dest.getTableReference())));
  return transform(runJobToCompletion(job, dest), this::updateTable, directExecutor());
}
 
Developer ID: google, Project: nomulus, Lines of code: 19, Source: BigqueryConnection.java


Example 7: loadJob

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
public static Job loadJob(
    Bigquery bigquery,
    String cloudStoragePath,
    TableReference table,
    TableSchema schema) throws IOException {

  JobConfigurationLoad load = new JobConfigurationLoad()
      .setDestinationTable(table)
      .setSchema(schema)
      .setSourceUris(Collections.singletonList(cloudStoragePath));

  return bigquery.jobs().insert(table.getProjectId(),
      new Job().setConfiguration(new JobConfiguration().setLoad(load)))
      .execute();
}
 
Developer ID: googlearchive, Project: bigquery-samples-python, Lines of code: 16, Source: LoadDataCSVSample.java


Example 8: jobConfiguration

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
@Override
protected JobConfiguration jobConfiguration(String projectId)
{
    JobConfigurationLoad cfg = new JobConfigurationLoad()
            .setSourceUris(sourceUris(params));

    if (params.has("schema")) {
        cfg.setSchema(tableSchema(params));
    }

    Optional<DatasetReference> defaultDataset = params.getOptional("dataset", String.class)
            .transform(Bq::datasetReference);

    String destinationTable = params.get("destination_table", String.class);
    cfg.setDestinationTable(tableReference(projectId, defaultDataset, destinationTable));

    params.getOptional("create_disposition", String.class).transform(cfg::setCreateDisposition);
    params.getOptional("write_disposition", String.class).transform(cfg::setWriteDisposition);

    params.getOptional("source_format", String.class).transform(cfg::setSourceFormat);
    params.getOptional("field_delimiter", String.class).transform(cfg::setFieldDelimiter);
    params.getOptional("skip_leading_rows", int.class).transform(cfg::setSkipLeadingRows);
    params.getOptional("encoding", String.class).transform(cfg::setEncoding);
    params.getOptional("quote", String.class).transform(cfg::setQuote);
    params.getOptional("max_bad_records", int.class).transform(cfg::setMaxBadRecords);
    params.getOptional("allow_quoted_newlines", boolean.class).transform(cfg::setAllowQuotedNewlines);
    params.getOptional("allow_jagged_rows", boolean.class).transform(cfg::setAllowJaggedRows);
    params.getOptional("ignore_unknown_values", boolean.class).transform(cfg::setIgnoreUnknownValues);
    params.getOptional("projection_fields", new TypeReference<List<String>>() {}).transform(cfg::setProjectionFields);
    params.getOptional("autodetect", boolean.class).transform(cfg::setAutodetect);
    params.getOptional("schema_update_options", new TypeReference<List<String>>() {}).transform(cfg::setSchemaUpdateOptions);

    return new JobConfiguration()
            .setLoad(cfg);
}
 
Developer ID: treasure-data, Project: digdag, Lines of code: 36, Source: BqLoadOperatorFactory.java


Example 9: load

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
private void load(
    JobService jobService,
    DatasetService datasetService,
    String jobIdPrefix,
    TableReference ref,
    TimePartitioning timePartitioning,
    @Nullable TableSchema schema,
    List<String> gcsUris,
    WriteDisposition writeDisposition,
    CreateDisposition createDisposition,
    @Nullable String tableDescription)
    throws InterruptedException, IOException {
  JobConfigurationLoad loadConfig =
      new JobConfigurationLoad()
          .setDestinationTable(ref)
          .setSchema(schema)
          .setSourceUris(gcsUris)
          .setWriteDisposition(writeDisposition.name())
          .setCreateDisposition(createDisposition.name())
          .setSourceFormat("NEWLINE_DELIMITED_JSON");
  if (timePartitioning != null) {
    loadConfig.setTimePartitioning(timePartitioning);
  }
  String projectId = ref.getProjectId();
  Job lastFailedLoadJob = null;
  for (int i = 0; i < BatchLoads.MAX_RETRY_JOBS; ++i) {
    String jobId = jobIdPrefix + "-" + i;
    JobReference jobRef = new JobReference().setProjectId(projectId).setJobId(jobId);
    jobService.startLoadJob(jobRef, loadConfig);
    Job loadJob = jobService.pollJob(jobRef, BatchLoads.LOAD_JOB_POLL_MAX_RETRIES);
    Status jobStatus = BigQueryHelpers.parseStatus(loadJob);
    switch (jobStatus) {
      case SUCCEEDED:
        if (tableDescription != null) {
          datasetService.patchTableDescription(
              ref.clone().setTableId(BigQueryHelpers.stripPartitionDecorator(ref.getTableId())),
              tableDescription);
        }
        return;
      case UNKNOWN:
        throw new RuntimeException(
            String.format(
                "UNKNOWN status of load job [%s]: %s.",
                jobId, BigQueryHelpers.jobToPrettyString(loadJob)));
      case FAILED:
        lastFailedLoadJob = loadJob;
        continue;
      default:
        throw new IllegalStateException(
            String.format(
                "Unexpected status [%s] of load job: %s.",
                jobStatus, BigQueryHelpers.jobToPrettyString(loadJob)));
    }
  }
  throw new RuntimeException(
      String.format(
          "Failed to create load job with id prefix %s, "
              + "reached max retries: %d, last failed load job: %s.",
          jobIdPrefix,
          BatchLoads.MAX_RETRY_JOBS,
          BigQueryHelpers.jobToPrettyString(lastFailedLoadJob)));
}
 
Developer ID: apache, Project: beam, Lines of code: 63, Source: WriteTables.java


Example 10: testSuccess_doPost

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
@Test
public void testSuccess_doPost() throws Exception {
  action.run();

  // Verify that bigqueryFactory was called in a way that would create the dataset if it didn't
  // already exist.
  verify(bigqueryFactory).create("Project-Id", "snapshots");

  // Capture the load jobs we inserted to do additional checking on them.
  ArgumentCaptor<Job> jobArgument = ArgumentCaptor.forClass(Job.class);
  verify(bigqueryJobs, times(3)).insert(eq("Project-Id"), jobArgument.capture());
  List<Job> jobs = jobArgument.getAllValues();
  assertThat(jobs).hasSize(3);

  // Check properties that should be common to all load jobs.
  for (Job job : jobs) {
    assertThat(job.getJobReference().getProjectId()).isEqualTo("Project-Id");
    JobConfigurationLoad config = job.getConfiguration().getLoad();
    assertThat(config.getSourceFormat()).isEqualTo("DATASTORE_BACKUP");
    assertThat(config.getDestinationTable().getProjectId()).isEqualTo("Project-Id");
    assertThat(config.getDestinationTable().getDatasetId()).isEqualTo("snapshots");
  }

  // Check the job IDs for each load job.
  assertThat(transform(jobs, job -> job.getJobReference().getJobId()))
      .containsExactly(
          "load-snapshot-id12345-one-1391096117045",
          "load-snapshot-id12345-two-1391096117045",
          "load-snapshot-id12345-three-1391096117045");

  // Check the source URI for each load job.
  assertThat(
          transform(
              jobs,
              job -> Iterables.getOnlyElement(job.getConfiguration().getLoad().getSourceUris())))
      .containsExactly(
          "gs://bucket/snapshot.one.backup_info",
          "gs://bucket/snapshot.two.backup_info",
          "gs://bucket/snapshot.three.backup_info");

  // Check the destination table ID for each load job.
  assertThat(
          transform(
              jobs, job -> job.getConfiguration().getLoad().getDestinationTable().getTableId()))
      .containsExactly("id12345_one", "id12345_two", "id12345_three");

  // Check that we executed the inserted jobs.
  verify(bigqueryJobsInsert, times(3)).execute();

  // Check that the poll tasks for each load job were enqueued.
  verify(bigqueryPollEnqueuer).enqueuePollTask(
      new JobReference()
          .setProjectId("Project-Id")
          .setJobId("load-snapshot-id12345-one-1391096117045"),
      UpdateSnapshotViewAction.createViewUpdateTask("snapshots", "id12345_one", "one"),
      QueueFactory.getQueue(UpdateSnapshotViewAction.QUEUE));
  verify(bigqueryPollEnqueuer).enqueuePollTask(
      new JobReference()
          .setProjectId("Project-Id")
          .setJobId("load-snapshot-id12345-two-1391096117045"),
      UpdateSnapshotViewAction.createViewUpdateTask("snapshots", "id12345_two", "two"),
      QueueFactory.getQueue(UpdateSnapshotViewAction.QUEUE));
  verify(bigqueryPollEnqueuer).enqueuePollTask(
      new JobReference()
          .setProjectId("Project-Id")
          .setJobId("load-snapshot-id12345-three-1391096117045"),
      UpdateSnapshotViewAction.createViewUpdateTask("snapshots", "id12345_three", "three"),
      QueueFactory.getQueue(UpdateSnapshotViewAction.QUEUE));
}
 
Developer ID: google, Project: nomulus, Lines of code: 70, Source: LoadSnapshotActionTest.java


Example 11: importFromGcs

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
/**
 * Imports data from GCS into BigQuery via a load job. Optionally polls for completion before
 * returning.
 *
 * @param projectId the project on whose behalf to perform the load.
 * @param tableRef the reference to the destination table.
 * @param schema the schema of the source data to populate the destination table by.
 * @param sourceFormat the file format of the source data.
 * @param writeDisposition the write disposition of the output table.
 * @param gcsPaths the location of the source data in GCS.
 * @param awaitCompletion if true, block and poll until job completes, otherwise return as soon as
 *     the job has been successfully dispatched.
 * @throws IOException
 * @throws InterruptedException if interrupted while waiting for job completion.
 */
public void importFromGcs(
    String projectId,
    TableReference tableRef,
    @Nullable TableSchema schema,
    BigQueryFileFormat sourceFormat,
    String writeDisposition,
    List<String> gcsPaths,
    boolean awaitCompletion)
    throws IOException, InterruptedException {
  LOG.info(
      "Importing into table '{}' from {} paths; path[0] is '{}'; awaitCompletion: {}",
      BigQueryStrings.toString(tableRef),
      gcsPaths.size(),
      gcsPaths.isEmpty() ? "(empty)" : gcsPaths.get(0),
      awaitCompletion);

  // Create load conf with minimal requirements.
  JobConfigurationLoad loadConfig = new JobConfigurationLoad();
  loadConfig.setSchema(schema);
  loadConfig.setSourceFormat(sourceFormat.getFormatIdentifier());
  loadConfig.setSourceUris(gcsPaths);
  loadConfig.setDestinationTable(tableRef);
  loadConfig.setWriteDisposition(writeDisposition);

  // Auto detect the schema if we're not given one, otherwise use the passed schema.
  if (schema == null) {
    LOG.info("No import schema provided, auto detecting schema.");
    loadConfig.setAutodetect(true);
  } else {
    LOG.info("Using provided import schema '{}'.", schema.toString());
  }

  JobConfiguration config = new JobConfiguration();
  config.setLoad(loadConfig);

  JobReference jobReference = createJobReference(projectId, "direct-bigqueryhelper-import");
  Job job = new Job();
  job.setConfiguration(config);
  job.setJobReference(jobReference);

  // Insert and run job.
  insertJobOrFetchDuplicate(projectId, job);

  if (awaitCompletion) {
    // Poll until job is complete.
    BigQueryUtils.waitForJobCompletion(
        getRawBigquery(), projectId, jobReference, NOP_PROGRESSABLE);
  }
}
 
Developer ID: GoogleCloudPlatform, Project: bigdata-interop, Lines of code: 65, Source: BigQueryHelper.java


Example 12: startLoadJob

import com.google.api.services.bigquery.model.JobConfigurationLoad; // import the required package/class
/**
 * Start a BigQuery load job.
 */
void startLoadJob(JobReference jobRef, JobConfigurationLoad loadConfig)
    throws InterruptedException, IOException;
 
Developer ID: apache, Project: beam, Lines of code: 6, Source: BigQueryServices.java



Note: The com.google.api.services.bigquery.model.JobConfigurationLoad examples in this article were collected from open-source projects hosted on GitHub, MSDocs, and similar platforms. The code fragments remain the copyright of their original authors; consult each project's license before using or redistributing them, and do not repost this compilation without permission.

