
Java ContentType Class Code Examples


This article collects typical usage examples of the Java class com.sforce.async.ContentType. If you are wondering what the ContentType class is for or how to use it, the curated examples below should help.



The ContentType class belongs to the com.sforce.async package. Eleven code examples of the class are shown below, ordered by popularity.

Example 1: createJob

import com.sforce.async.ContentType; // import the required package/class
private JobInfo createJob(String sobjectType, BulkConnection connection)
        throws AsyncApiException {
  JobInfo job = new JobInfo();
  job.setObject(sobjectType);
  job.setOperation(conf.queryAll ? OperationEnum.queryAll : OperationEnum.query);
  job.setContentType(ContentType.CSV);
  if (conf.usePKChunking) {
    String headerValue = CHUNK_SIZE + "=" + conf.chunkSize;
    if (!StringUtils.isEmpty(conf.startId)) {
      headerValue += "; " + START_ROW + "=" + conf.startId;
    }
    connection.addHeader(SFORCE_ENABLE_PKCHUNKING, headerValue);
  }
  job = connection.createJob(job);
  return job;
}
 
Developer: streamsets | Project: datacollector | Lines: 17 | Source: ForceSource.java
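The PK chunking branch in createJob concatenates a header value from constants (CHUNK_SIZE, START_ROW, SFORCE_ENABLE_PKCHUNKING) that the snippet does not define. A minimal self-contained sketch of how that header value might be assembled; the constant values are assumptions based on Salesforce's Sforce-Enable-PKChunking header, not taken from the ForceSource.java source:

```java
public class PkChunkingHeader {
    // Assumed values for the constants referenced in createJob above; the
    // actual definitions are not shown in the snippet.
    static final String SFORCE_ENABLE_PKCHUNKING = "Sforce-Enable-PKChunking";
    static final String CHUNK_SIZE = "chunkSize";
    static final String START_ROW = "startRow";

    // Builds the header value, e.g. "chunkSize=100000; startRow=001xx0000001"
    static String headerValue(int chunkSize, String startId) {
        String value = CHUNK_SIZE + "=" + chunkSize;
        if (startId != null && !startId.isEmpty()) {
            value += "; " + START_ROW + "=" + startId;
        }
        return value;
    }

    public static void main(String[] args) {
        System.out.println(headerValue(100000, null));
        System.out.println(headerValue(50000, "001xx000003DGb2"));
    }
}
```

With PK chunking enabled, Salesforce splits the query into batches of chunkSize records each, optionally starting from the given startRow ID.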


Example 2: createBulkJob

import com.sforce.async.ContentType; // import the required package/class
public String createBulkJob(String sobjectType, String upsertField,
    OperationEnum op) throws AsyncApiException
{
    Utils.log("[BULK] Creating Bulk Job:" + "\n\tObject:       [" + sobjectType + "]" + "\n\tUnique Field: ["
        + upsertField + "]" + "\n\tOperation:    [" + op + "]");

    // create a connection
    createBulkConnection();

    // create batch job
    JobInfo job = new JobInfo();
    job.setObject(sobjectType);
    job.setOperation(op);
    job.setConcurrencyMode(ConcurrencyMode.Serial);
    // JSON available in Spring '16
    job.setContentType(ContentType.JSON);
    if (upsertField != null)
    {
        job.setExternalIdFieldName(upsertField);
    }

    // create the job
    job = _bConn.createJob(job);

    Utils.log("Job created: " + job.getId());
    return job.getId();
}
 
Developer: forcedotcom | Project: scmt-server | Lines: 28 | Source: SalesforceService.java


Example 3: addBatchToJob

import com.sforce.async.ContentType; // import the required package/class
public void addBatchToJob(String jobId, List<Map<String, Object>> records)
    throws UnsupportedEncodingException, AsyncApiException
{
    Utils.log("[BULK] Adding [" + records.size() + "] records to job [" + jobId + "].");

    // convert the records into a byte stream
    ByteArrayInputStream jsonStream = new ByteArrayInputStream(JsonUtil.toJson(records).getBytes("UTF-8"));

    JobInfo job = new JobInfo();
    job.setId(jobId);
    job.setContentType(ContentType.JSON);

    // submit a batch to the job
    _bConn.createBatchFromStream(job, jsonStream);
}
 
Developer: forcedotcom | Project: scmt-server | Lines: 16 | Source: SalesforceService.java
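addBatchToJob serializes the records with a project-specific JsonUtil helper before wrapping them in a stream. A stdlib-only sketch of the stream-building step (the JSON payload here is hand-written for illustration; JsonUtil itself is not shown in the snippet). Using StandardCharsets.UTF_8 instead of the string "UTF-8" also avoids the checked UnsupportedEncodingException:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;

public class JsonBatchStream {
    // Wraps a JSON payload in the stream that createBatchFromStream expects.
    // StandardCharsets.UTF_8 avoids the checked UnsupportedEncodingException
    // thrown by getBytes("UTF-8").
    static ByteArrayInputStream toStream(String json) {
        return new ByteArrayInputStream(json.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) {
        ByteArrayInputStream is = toStream("[{\"Name\":\"Acme\"}]");
        System.out.println(is.available()); // byte length of the payload
    }
}
```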


Example 4: closeBulkJob

import com.sforce.async.ContentType; // import the required package/class
public void closeBulkJob(String jobId) throws AsyncApiException
{
    Utils.log("[BULK] Closing Bulk Job: [" + jobId + "]");

    JobInfo job = new JobInfo();
    job.setId(jobId);
    job.setState(JobStateEnum.Closed);
    job.setContentType(ContentType.JSON);
    // _bConn.updateJob(job, ContentType.JSON);

    // unclear if I can use this
    _bConn.closeJob(jobId);
}
 
Developer: forcedotcom | Project: scmt-server | Lines: 14 | Source: SalesforceService.java


Example 5: createNewJob

import com.sforce.async.ContentType; // import the required package/class
public boolean createNewJob(String jobId) throws AsyncApiException
{
    JobInfo job = _bConn.getJobStatus(jobId, ContentType.JSON);
    Utils.log("[BULK] Getting Bulk Job Status: [" + jobId + "]");
    Calendar cal = job.getCreatedDate();

    // Return true if the job was created longer ago than the allowed job lifetime
    return ((System.currentTimeMillis() - cal.getTime().getTime()) > SalesforceConstants.JOB_LIFE);
}
 
Developer: forcedotcom | Project: scmt-server | Lines: 10 | Source: SalesforceService.java
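createNewJob compares the job's age against SalesforceConstants.JOB_LIFE, a constant the snippet does not show. The same comparison in isolation; the 24-hour lifetime is an assumed placeholder value, not the project's real constant:

```java
public class JobAge {
    // Assumed placeholder; the real SalesforceConstants.JOB_LIFE value is not
    // shown in the snippet.
    static final long JOB_LIFE_MS = 24L * 60 * 60 * 1000;

    // True when the job has outlived the lifetime and a new one is needed
    static boolean needsNewJob(long createdMillis, long nowMillis) {
        return (nowMillis - createdMillis) > JOB_LIFE_MS;
    }

    public static void main(String[] args) {
        System.out.println(needsNewJob(0L, JOB_LIFE_MS + 1)); // true
        System.out.println(needsNewJob(0L, JOB_LIFE_MS));     // false
    }
}
```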


Example 6: createBulkJob

import com.sforce.async.ContentType; // import the required package/class
@Override
public JobInfo createBulkJob(String objectName) throws ResourceException {
       try {
		JobInfo job = new JobInfo();
		job.setObject(objectName);
		job.setOperation(OperationEnum.insert);
		job.setContentType(ContentType.XML);
		return this.bulkConnection.createJob(job);
	} catch (AsyncApiException e) {
		throw new ResourceException(e);
	}
}
 
Developer: kenweezy | Project: teiid | Lines: 13 | Source: SalesforceConnectionImpl.java


Example 7: syncQuery

import com.sforce.async.ContentType; // import the required package/class
public List<Map<String, String>> syncQuery(String objectType, String query)
        throws InterruptedException, AsyncApiException, IOException {

    // create the job
    JobInfo jobInfo = new JobInfo();
    jobInfo.setObject(objectType);
    if (queryAll) {
        jobInfo.setOperation(OperationEnum.queryAll);
    } else {
        jobInfo.setOperation(OperationEnum.query);
    }
    jobInfo.setContentType(ContentType.CSV);
    jobInfo = bulkConnection.createJob(jobInfo);

    // create the batch
    InputStream is = new ByteArrayInputStream(query.getBytes());
    BatchInfo batchInfo = bulkConnection.createBatchFromStream(jobInfo, is);

    // close the job
    JobInfo closeJob = new JobInfo();
    closeJob.setId(jobInfo.getId());
    closeJob.setState(JobStateEnum.Closed);
    bulkConnection.updateJob(closeJob);

    // get the execution status
    batchInfo = waitBatch(batchInfo);
    BatchStateEnum state = batchInfo.getState();

    // fetch the query results
    if (state == BatchStateEnum.Completed) {
        QueryResultList queryResultList =
                bulkConnection.getQueryResultList(
                        batchInfo.getJobId(),
                        batchInfo.getId());
        return getQueryResultMapList(batchInfo, queryResultList);
    } else {
        throw new AsyncApiException(batchInfo.getStateMessage(), AsyncExceptionCode.InvalidBatch);
    }
}
 
Developer: mikoto2000 | Project: embulk-input-salesforce_bulk | Lines: 40 | Source: SalesforceBulkWrapper.java


Example 8: createJob

import com.sforce.async.ContentType; // import the required package/class
private JobInfo createJob(String sobjectType, OperationEnum operation, String externalIdField)
    throws AsyncApiException {
  JobInfo job = new JobInfo();
  job.setObject(sobjectType);
  job.setOperation(operation);
  if (externalIdField != null) {
    job.setExternalIdFieldName(externalIdField);
  }
  job.setContentType(ContentType.CSV);
  job = bulkConnection.createJob(job);
  LOG.info("Created Bulk API job {}", job.getId());
  return job;
}
 
Developer: streamsets | Project: datacollector | Lines: 14 | Source: ForceBulkWriter.java


Example 9: setBulkOperation

import com.sforce.async.ContentType; // import the required package/class
private void setBulkOperation(String sObjectType, OutputAction userOperation, String externalIdFieldName,
        String contentTypeStr, String bulkFileName, int maxBytes, int maxRows) {
    this.sObjectType = sObjectType;
    switch (userOperation) {
    case INSERT:
        operation = OperationEnum.insert;
        break;
    case UPDATE:
        operation = OperationEnum.update;
        break;
    case UPSERT:
        operation = OperationEnum.upsert;
        break;
    case DELETE:
        operation = OperationEnum.delete;
        break;

    default:
        operation = OperationEnum.insert;
        break;
    }
    this.externalIdFieldName = externalIdFieldName;

    if ("csv".equals(contentTypeStr)) {
        contentType = ContentType.CSV;
    } else if ("xml".equals(contentTypeStr)) {
        contentType = ContentType.XML;
    }
    this.bulkFileName = bulkFileName;

    int sforceMaxBytes = 10 * 1024 * 1024;
    int sforceMaxRows = 10000;
    maxBytesPerBatch = (maxBytes > sforceMaxBytes) ? sforceMaxBytes : maxBytes;
    maxRowsPerBatch = (maxRows > sforceMaxRows) ? sforceMaxRows : maxRows;
}
 
Developer: Talend | Project: components | Lines: 36 | Source: SalesforceBulkRuntime.java
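The last lines of setBulkOperation clamp the user-supplied limits to the Bulk API's documented per-batch maximums (10 MB of data, 10,000 records). The same clamp in isolation:

```java
public class BatchLimits {
    // Salesforce Bulk API hard limits per batch (same values as the snippet)
    static final int SFORCE_MAX_BYTES = 10 * 1024 * 1024; // 10 MB
    static final int SFORCE_MAX_ROWS = 10000;             // 10,000 records

    // Caps a user-supplied limit at the hard limit
    static int clamp(int requested, int hardLimit) {
        return (requested > hardLimit) ? hardLimit : requested;
    }

    public static void main(String[] args) {
        System.out.println(clamp(50000, SFORCE_MAX_ROWS)); // 10000
        System.out.println(clamp(5000, SFORCE_MAX_ROWS));  // 5000
    }
}
```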


Example 10: doBulkQuery

import com.sforce.async.ContentType; // import the required package/class
/**
 * Creates and executes a job for a bulk query. The job must finish within 2 minutes on the Salesforce side.<br/>
 * According to the Salesforce documentation, two scenarios are possible here:
 * <ul>
 * <li>a simple bulk query. It should have status {@link BatchStateEnum#Completed}.</li>
 * <li>a primary key chunking bulk query. It should return the first batch info with status {@link BatchStateEnum#NotProcessed}.<br/>
 * The other batch infos should have status {@link BatchStateEnum#Completed}.</li>
 * </ul>
 *
 * @param moduleName - input module name.
 * @param queryStatement - to be executed.
 * @throws AsyncApiException
 * @throws InterruptedException
 * @throws ConnectionException
 */
public void doBulkQuery(String moduleName, String queryStatement)
        throws AsyncApiException, InterruptedException, ConnectionException {
    job = new JobInfo();
    job.setObject(moduleName);
    job.setOperation(OperationEnum.query);
    if (concurrencyMode != null) {
        job.setConcurrencyMode(concurrencyMode);
    }
    job.setContentType(ContentType.CSV);
    job = createJob(job);
    if (job.getId() == null) { // job creation failed
        throw new ComponentException(new DefaultErrorCode(HttpServletResponse.SC_INTERNAL_SERVER_ERROR, "failedBatch"),
                ExceptionContext.build().put("failedBatch", job));
    }

    ByteArrayInputStream bout = new ByteArrayInputStream(queryStatement.getBytes());
    BatchInfo info = createBatchFromStream(job, bout);
    int secToWait = 1;
    int tryCount = 0;
    while (true) {
        LOGGER.debug("Awaiting " + secToWait + " seconds for results ...\n" + info);
        Thread.sleep(secToWait * 1000);
        info = getBatchInfo(job.getId(), info.getId());

        if (info.getState() == BatchStateEnum.Completed
                || (BatchStateEnum.NotProcessed == info.getState() && 0 < chunkSize)) {
            break;
        } else if (info.getState() == BatchStateEnum.Failed) {
            throw new ComponentException(new DefaultErrorCode(HttpServletResponse.SC_BAD_REQUEST, "failedBatch"),
                    ExceptionContext.build().put("failedBatch", info));
        }
        tryCount++;
        if (tryCount % 3 == 0) { // after every 3 attempts, double the wait time
            secToWait = secToWait * 2;
        }
        // There is also a 2-minute limit on the time to process the query.
        // If the query takes more than 2 minutes to process, a QUERY_TIMEOUT error is returned.
        // https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/asynch_api_concepts_limits.htm
        int processingTime = (int) ((System.currentTimeMillis() - job.getCreatedDate().getTimeInMillis()) / 1000);
        if (processingTime > 120) {
            throw new ComponentException(new DefaultErrorCode(HttpServletResponse.SC_REQUEST_TIMEOUT, "failedBatch"),
                    ExceptionContext.build().put("failedBatch", info));
        }
    }

    retrieveResultsOfQuery(info);
}
 
Developer: Talend | Project: components | Lines: 63 | Source: SalesforceBulkRuntime.java
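The polling loop in doBulkQuery starts with a one-second wait and doubles it after every third attempt. The resulting backoff schedule can be computed separately (a sketch of the same arithmetic, not part of the original class):

```java
public class Backoff {
    // Seconds slept before attempt n (0-based), matching the loop above:
    // start at 1 second and double after every third completed attempt.
    static int secToWait(int attempt) {
        int sec = 1;
        for (int tryCount = 1; tryCount <= attempt; tryCount++) {
            if (tryCount % 3 == 0) {
                sec *= 2;
            }
        }
        return sec;
    }

    public static void main(String[] args) {
        StringBuilder schedule = new StringBuilder();
        for (int i = 0; i < 8; i++) {
            schedule.append(secToWait(i)).append(' ');
        }
        System.out.println(schedule.toString().trim()); // 1 1 1 2 2 2 4 4
    }
}
```

With this schedule the loop stays well under the 2-minute limit for the first several attempts, and the explicit processing-time check guards the tail.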


Example 11: createJob

import com.sforce.async.ContentType; // import the required package/class
/**
 * Create a new job using the Bulk API.
 * 
 * @param sobjectType
 *            The object type being loaded, such as "Account"
 * @param connection
 *            BulkConnection used to create the new job.
 * @param operation
 *            operation to be performed - insert/update/query/upsert
 * @return The JobInfo for the new job.
 * @throws AsyncApiException
 */
public JobInfo createJob(String sobjectType, BulkConnection bulkConnection, OperationEnum operation)
		throws AsyncApiException {
	JobInfo job = new JobInfo();
	job.setObject(sobjectType);
	job.setOperation(operation);
	job.setContentType(ContentType.CSV);
	job = bulkConnection.createJob(job);

	return job;
}
 
Developer: forcedotcom | Project: ApexUnit | Lines: 23 | Source: AsyncBulkApiHandler.java



Note: the com.sforce.async.ContentType examples in this article were collected from open-source projects on GitHub, MSDocs, and other code-hosting and documentation platforms. Copyright of each snippet remains with its original author; consult the corresponding project's license before reusing or redistributing.

