This article collects typical usage examples of the Java class com.sforce.async.OperationEnum. If you are wondering what OperationEnum is for, how to use it, or what working code looks like, the curated class examples below may help.
The OperationEnum class belongs to the com.sforce.async package. A total of 19 code examples are shown below, sorted by popularity by default. You can upvote the examples you like or find useful; your feedback helps the system recommend better Java code examples.
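As the examples below illustrate, OperationEnum is the value passed to JobInfo.setOperation() to tell a Bulk API job which operation to run. A minimal orientation sketch (the object name and content type here are illustrative, not taken from any of the projects below):

// OperationEnum values that appear in the examples below: insert, update, upsert, delete, query, queryAll
JobInfo job = new JobInfo();
job.setObject("Account");
job.setOperation(OperationEnum.insert);
job.setContentType(ContentType.CSV);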
Example 1: objectSpecificBulkCleanup
import com.sforce.async.OperationEnum; // import the required package/class
@Override
protected void objectSpecificBulkCleanup(DeskUtil du) throws Exception
{
    // create a new job
    this.jobId = du.getSalesforceService().createBulkJob(SalesforceConstants.OBJ_CASE, CaseFields.DeskId,
        OperationEnum.upsert);
    // get the upper boundary for the created/updated timestamp
    Case record = (Case) recList.get(recList.size() - 1);
    int lastTimestamp = (int) ((delta ? record.getUpdatedAt().getTime() : record.getCreatedAt().getTime()) / 1000);
    // close the bulk job
    du.getSalesforceService().closeBulkJob(jobId);
    // log the upper timestamp boundary
    dr.addError(String.format("Migrated all records created/updated before: [%d]", lastTimestamp));
}
Developer ID: forcedotcom, Project: scmt-server, Lines: 18, Source: DeskCaseMigration.java
Example 2: createJob
import com.sforce.async.OperationEnum; // import the required package/class
private JobInfo createJob(String sobjectType, BulkConnection connection)
        throws AsyncApiException {
    JobInfo job = new JobInfo();
    job.setObject(sobjectType);
    job.setOperation(conf.queryAll ? OperationEnum.queryAll : OperationEnum.query);
    job.setContentType(ContentType.CSV);
    if (conf.usePKChunking) {
        String headerValue = CHUNK_SIZE + "=" + conf.chunkSize;
        if (!StringUtils.isEmpty(conf.startId)) {
            headerValue += "; " + START_ROW + "=" + conf.startId;
        }
        connection.addHeader(SFORCE_ENABLE_PKCHUNKING, headerValue);
    }
    job = connection.createJob(job);
    return job;
}
Developer ID: streamsets, Project: datacollector, Lines: 17, Source: ForceSource.java
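The constants CHUNK_SIZE, START_ROW, and SFORCE_ENABLE_PKCHUNKING are defined elsewhere in ForceSource. A sketch of plausible definitions, assuming the standard Salesforce PK chunking header names (the constant values and the sample chunk size/start ID below are assumptions for illustration):

// assumed constant definitions mirroring the Salesforce Bulk API PK chunking header
private static final String SFORCE_ENABLE_PKCHUNKING = "Sforce-Enable-PKChunking";
private static final String CHUNK_SIZE = "chunkSize";
private static final String START_ROW = "startRow";
// with conf.chunkSize = 100000 and conf.startId = "001300000000000",
// the header sent would be: Sforce-Enable-PKChunking: chunkSize=100000; startRow=001300000000000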
Example 3: writeAndFlushRecord
import com.sforce.async.OperationEnum; // import the required package/class
private void writeAndFlushRecord(DataGenerator gen, Record record, OperationEnum op) throws IOException, DataGeneratorException {
    // Make a record with just the fields we need
    Record outRecord = context.createRecord(record.getHeader().getSourceId());
    LinkedHashMap<String, Field> map = new LinkedHashMap<>();
    for (Map.Entry<String, String> mapping : fieldMappings.entrySet()) {
        String sFieldName = mapping.getKey();
        String fieldPath = mapping.getValue();
        // If we're missing fields, skip them.
        if (!record.has(fieldPath)) {
            continue;
        }
        // We only need Id for deletes
        if (op == OperationEnum.delete && !("Id".equalsIgnoreCase(sFieldName))) {
            continue;
        }
        map.put(sFieldName, record.get(fieldPath));
    }
    outRecord.set(Field.createListMap(map));
    gen.write(outRecord);
    gen.flush();
}
Developer ID: streamsets, Project: datacollector, Lines: 27, Source: ForceBulkWriter.java
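The fieldMappings map read in the loop above is populated elsewhere in ForceBulkWriter; based on how it is consumed, it maps a Salesforce API field name to the record field path that supplies its value. A minimal sketch, with hypothetical field names chosen only for illustration:

// hypothetical mapping: Salesforce field name -> Data Collector record field path
Map<String, String> fieldMappings = new LinkedHashMap<>();
fieldMappings.put("Id", "/Id");
fieldMappings.put("FirstName", "/first_name");
fieldMappings.put("LastName", "/last_name");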
Example 4: createBulkJob
import com.sforce.async.OperationEnum; // import the required package/class
public String createBulkJob(String sobjectType, String upsertField,
        OperationEnum op) throws AsyncApiException
{
    Utils.log("[BULK] Creating Bulk Job:" + "\n\tObject: [" + sobjectType + "]" + "\n\tUnique Field: ["
        + upsertField + "]" + "\n\tOperation: [" + op + "]");
    // create a connection
    createBulkConnection();
    // create the batch job
    JobInfo job = new JobInfo();
    job.setObject(sobjectType);
    job.setOperation(op);
    job.setConcurrencyMode(ConcurrencyMode.Serial);
    // JSON available since Spring '16
    job.setContentType(ContentType.JSON);
    if (upsertField != null)
    {
        job.setExternalIdFieldName(upsertField);
    }
    // create the job
    job = _bConn.createJob(job);
    Utils.log("Job created: " + job.getId());
    return job.getId();
}
Developer ID: forcedotcom, Project: scmt-server, Lines: 28, Source: SalesforceService.java
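As the later examples from the same project show (Examples 12-17), callers pass an object name, an external ID field, and an operation. A sketch of a typical call, reusing the constants that appear in those examples:

// upsert Contacts keyed on the Desk migration external ID, then close the job once batches are loaded
String jobId = salesforceService.createBulkJob(SalesforceConstants.OBJ_CONTACT, CaseFields.DeskId,
    OperationEnum.upsert);
// ... submit batches ...
salesforceService.closeBulkJob(jobId);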
Example 5: createBulkJob
import com.sforce.async.OperationEnum; // import the required package/class
@Override
public JobInfo createBulkJob(String objectName) throws ResourceException {
    try {
        JobInfo job = new JobInfo();
        job.setObject(objectName);
        job.setOperation(OperationEnum.insert);
        job.setContentType(ContentType.XML);
        return this.bulkConnection.createJob(job);
    } catch (AsyncApiException e) {
        throw new ResourceException(e);
    }
}
Developer ID: kenweezy, Project: teiid, Lines: 13, Source: SalesforceConnectionImpl.java
Example 6: syncQuery
import com.sforce.async.OperationEnum; // import the required package/class
public List<Map<String, String>> syncQuery(String objectType, String query)
        throws InterruptedException, AsyncApiException, IOException {
    // create the job
    JobInfo jobInfo = new JobInfo();
    jobInfo.setObject(objectType);
    if (queryAll) {
        jobInfo.setOperation(OperationEnum.queryAll);
    } else {
        jobInfo.setOperation(OperationEnum.query);
    }
    jobInfo.setContentType(ContentType.CSV);
    jobInfo = bulkConnection.createJob(jobInfo);
    // create the batch
    InputStream is = new ByteArrayInputStream(query.getBytes());
    BatchInfo batchInfo = bulkConnection.createBatchFromStream(jobInfo, is);
    // close the job
    JobInfo closeJob = new JobInfo();
    closeJob.setId(jobInfo.getId());
    closeJob.setState(JobStateEnum.Closed);
    bulkConnection.updateJob(closeJob);
    // poll for the execution status
    batchInfo = waitBatch(batchInfo);
    BatchStateEnum state = batchInfo.getState();
    // fetch the execution results
    if (state == BatchStateEnum.Completed) {
        QueryResultList queryResultList =
                bulkConnection.getQueryResultList(
                        batchInfo.getJobId(),
                        batchInfo.getId());
        return getQueryResultMapList(batchInfo, queryResultList);
    } else {
        throw new AsyncApiException(batchInfo.getStateMessage(), AsyncExceptionCode.InvalidBatch);
    }
}
Developer ID: mikoto2000, Project: embulk-input-salesforce_bulk, Lines: 40, Source: SalesforceBulkWrapper.java
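A call into this wrapper might look like the following; the wrapper instance name, object name, and SOQL string are illustrative:

// run a synchronous bulk query and iterate over the CSV rows as maps
List<Map<String, String>> rows = wrapper.syncQuery("Account", "SELECT Id, Name FROM Account");
for (Map<String, String> row : rows) {
    System.out.println(row.get("Id") + " " + row.get("Name"));
}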
Example 7: processJob
import com.sforce.async.OperationEnum; // import the required package/class
private JobBatches processJob(
        String sObjectName, OperationEnum operation, List<Record> records, String externalIdField
) throws StageException {
    try {
        JobInfo job = createJob(sObjectName, operation, externalIdField);
        List<BatchInfo> batchInfoList = createBatchesFromRecordCollection(job, records, operation);
        closeJob(job.getId());
        return new JobBatches(job, batchInfoList, records);
    } catch (AsyncApiException | IOException e) {
        throw new StageException(Errors.FORCE_13,
                ForceUtils.getExceptionCode(e) + ", " + ForceUtils.getExceptionMessage(e)
        );
    }
}
Developer ID: streamsets, Project: datacollector, Lines: 16, Source: ForceBulkWriter.java
Example 8: createJob
import com.sforce.async.OperationEnum; // import the required package/class
private JobInfo createJob(String sobjectType, OperationEnum operation, String externalIdField)
        throws AsyncApiException {
    JobInfo job = new JobInfo();
    job.setObject(sobjectType);
    job.setOperation(operation);
    if (externalIdField != null) {
        job.setExternalIdFieldName(externalIdField);
    }
    job.setContentType(ContentType.CSV);
    job = bulkConnection.createJob(job);
    LOG.info("Created Bulk API job {}", job.getId());
    return job;
}
Developer ID: streamsets, Project: datacollector, Lines: 14, Source: ForceBulkWriter.java
Example 9: setBulkOperation
import com.sforce.async.OperationEnum; // import the required package/class
private void setBulkOperation(String sObjectType, OutputAction userOperation, String externalIdFieldName,
        String contentTypeStr, String bulkFileName, int maxBytes, int maxRows) {
    this.sObjectType = sObjectType;
    switch (userOperation) {
    case INSERT:
        operation = OperationEnum.insert;
        break;
    case UPDATE:
        operation = OperationEnum.update;
        break;
    case UPSERT:
        operation = OperationEnum.upsert;
        break;
    case DELETE:
        operation = OperationEnum.delete;
        break;
    default:
        operation = OperationEnum.insert;
        break;
    }
    this.externalIdFieldName = externalIdFieldName;
    if ("csv".equals(contentTypeStr)) {
        contentType = ContentType.CSV;
    } else if ("xml".equals(contentTypeStr)) {
        contentType = ContentType.XML;
    }
    this.bulkFileName = bulkFileName;
    int sforceMaxBytes = 10 * 1024 * 1024;
    int sforceMaxRows = 10000;
    maxBytesPerBatch = (maxBytes > sforceMaxBytes) ? sforceMaxBytes : maxBytes;
    maxRowsPerBatch = (maxRows > sforceMaxRows) ? sforceMaxRows : maxRows;
}
Developer ID: Talend, Project: components, Lines: 36, Source: SalesforceBulkRuntime.java
Example 10: createJob
import com.sforce.async.OperationEnum; // import the required package/class
/**
 * Create a new job using the Bulk API.
 *
 * @return The JobInfo for the new job.
 * @throws AsyncApiException
 * @throws ConnectionException
 */
private JobInfo createJob() throws AsyncApiException, ConnectionException {
    JobInfo job = new JobInfo();
    if (concurrencyMode != null) {
        job.setConcurrencyMode(concurrencyMode);
    }
    job.setObject(sObjectType);
    job.setOperation(operation);
    if (OperationEnum.upsert.equals(operation)) {
        job.setExternalIdFieldName(externalIdFieldName);
    }
    job.setContentType(contentType);
    job = createJob(job);
    return job;
}
Developer ID: Talend, Project: components, Lines: 22, Source: SalesforceBulkRuntime.java
Example 11: migrateDeskLabels
import com.sforce.async.OperationEnum; // import the required package/class
public DeployResponse migrateDeskLabels() throws Exception
{
    // declare page counter
    int page = 0;
    // declare deploy result
    DeployResponse dr = new DeployResponse();
    // declare the response objects at this scope so they can be checked in the do/while loop
    Response<ApiResponse<Label>> resp = null;
    ApiResponse<Label> apiResp = null;
    // declare the list that will be returned
    List<Label> recList = new ArrayList<>();
    // get a service
    LabelService service = getDeskClient().labels();
    // create the bulk job
    String jobId = getSalesforceService().createBulkJob(SalesforceConstants.OBJ_TOPIC, TopicFields.Name,
        OperationEnum.upsert);
    // loop through retrieving records
    do
    {
        // increment the page counter
        page++;
        // retrieve the records synchronously
        // TODO: replace "language" with actual language?
        resp = service.getLabels(DESK_PAGE_SIZE_LABEL, page).execute();
        // check for success
        if (resp.isSuccess())
        {
            // get the response body
            apiResp = resp.body();
            // add the list of records to the return list
            recList.addAll(apiResp.getEntriesAsList());
            Utils.log("Retrieved [" + recList.size() + "] records. Max is [" + DESK_PAGE_SIZE_CASE + "]");
            // every 10k records, pass them to createTopicsFromLabels() to bulk upsert them
            if (recList.size() >= SalesforceConstants.BULK_MAX_SIZE && !SalesforceConstants.READ_ONLY)
            {
                // create the topics
                dr.addDeployResponse(createTopicsFromLabels(jobId, recList));
                // clear the records that were bulk inserted
                recList = recList.subList(SalesforceConstants.BULK_MAX_SIZE, recList.size());
            }
        }
        else
        {
            Utils.log(resp.headers().toString());
            throw new Exception(
                String.format("Error (%d): %s\n%s", resp.code(), resp.message(), resp.errorBody().string()));
        }
    }
    // continue to loop while the request is successful and there are subsequent pages of results
    while (resp.isSuccess() && apiResp.hasNextPage());
    // process any records over the 10k chunk
    if (!recList.isEmpty() && !SalesforceConstants.READ_ONLY)
    {
        dr.addDeployResponse(createTopicsFromLabels(jobId, recList));
    }
    // close the bulk job
    getSalesforceService().closeBulkJob(jobId);
    // return the deploy response
    return dr;
}
Developer ID: forcedotcom, Project: scmt-server, Lines: 76, Source: DeskUtil.java
Example 12: createJob
import com.sforce.async.OperationEnum; // import the required package/class
@Override
protected String createJob(DeskUtil du) throws Exception {
    return du.getSalesforceService().createBulkJob(SalesforceConstants.OBJ_USER, UserFields.DeskId,
        OperationEnum.upsert);
}
Developer ID: forcedotcom, Project: scmt-server, Lines: 6, Source: DeskUserMigration.java
Example 13: createJob
import com.sforce.async.OperationEnum; // import the required package/class
@Override
protected String createJob(DeskUtil du) throws Exception
{
    return du.getSalesforceService().createBulkJob(SalesforceConstants.OBJ_CASE, CaseFields.DeskId,
        OperationEnum.upsert);
}
Developer ID: forcedotcom, Project: scmt-server, Lines: 7, Source: DeskCaseMigration.java
Example 14: createJob
import com.sforce.async.OperationEnum; // import the required package/class
@Override
protected String createJob(DeskUtil du) throws Exception
{
    return du.getSalesforceService().createBulkJob(SalesforceConstants.OBJ_CONTACT, CaseFields.DeskId,
        OperationEnum.upsert);
}
Developer ID: forcedotcom, Project: scmt-server, Lines: 7, Source: DeskContactMigration.java
Example 15: createJob
import com.sforce.async.OperationEnum; // import the required package/class
@Override
protected String createJob(DeskUtil du) throws Exception {
    return du.getSalesforceService().createBulkJob(SalesforceConstants.OBJ_ARTICLE, CaseFields.Id,
        OperationEnum.upsert);
}
Developer ID: forcedotcom, Project: scmt-server, Lines: 6, Source: DeskArticleMigration.java
Example 16: createJob
import com.sforce.async.OperationEnum; // import the required package/class
@Override
protected String createJob(DeskUtil du) throws Exception
{
    return du.getSalesforceService().createBulkJob(SalesforceConstants.OBJ_CASE_COMMENT, null,
        OperationEnum.insert);
}
Developer ID: forcedotcom, Project: scmt-server, Lines: 7, Source: DeskNoteMigration.java
Example 17: createJob
import com.sforce.async.OperationEnum; // import the required package/class
@Override
protected String createJob(DeskUtil du) throws Exception
{
    return du.getSalesforceService().createBulkJob(SalesforceConstants.OBJ_ACCOUNT, CaseFields.DeskId,
        OperationEnum.upsert);
}
Developer ID: forcedotcom, Project: scmt-server, Lines: 7, Source: DeskAccountMigration.java
Example 18: doBulkQuery
import com.sforce.async.OperationEnum; // import the required package/class
/**
 * Creates and executes a job for a bulk query. The job must finish within 2 minutes on the Salesforce side.<br/>
 * From the Salesforce documentation, two scenarios are possible here:
 * <ul>
 * <li>simple bulk query. It should have status {@link BatchStateEnum#Completed}.</li>
 * <li>primary key chunking bulk query. It should return the first batch info with status {@link BatchStateEnum#NotProcessed}.<br/>
 * The other batch infos should have status {@link BatchStateEnum#Completed}.</li>
 * </ul>
 *
 * @param moduleName - input module name.
 * @param queryStatement - query to be executed.
 * @throws AsyncApiException
 * @throws InterruptedException
 * @throws ConnectionException
 */
public void doBulkQuery(String moduleName, String queryStatement)
        throws AsyncApiException, InterruptedException, ConnectionException {
    job = new JobInfo();
    job.setObject(moduleName);
    job.setOperation(OperationEnum.query);
    if (concurrencyMode != null) {
        job.setConcurrencyMode(concurrencyMode);
    }
    job.setContentType(ContentType.CSV);
    job = createJob(job);
    if (job.getId() == null) { // job creation failed
        throw new ComponentException(new DefaultErrorCode(HttpServletResponse.SC_INTERNAL_SERVER_ERROR, "failedBatch"),
                ExceptionContext.build().put("failedBatch", job));
    }
    ByteArrayInputStream bout = new ByteArrayInputStream(queryStatement.getBytes());
    BatchInfo info = createBatchFromStream(job, bout);
    int secToWait = 1;
    int tryCount = 0;
    while (true) {
        LOGGER.debug("Awaiting " + secToWait + " seconds for results ...\n" + info);
        Thread.sleep(secToWait * 1000);
        info = getBatchInfo(job.getId(), info.getId());
        if (info.getState() == BatchStateEnum.Completed
                || (BatchStateEnum.NotProcessed == info.getState() && 0 < chunkSize)) {
            break;
        } else if (info.getState() == BatchStateEnum.Failed) {
            throw new ComponentException(new DefaultErrorCode(HttpServletResponse.SC_BAD_REQUEST, "failedBatch"),
                    ExceptionContext.build().put("failedBatch", info));
        }
        tryCount++;
        if (tryCount % 3 == 0) { // after 3 attempts to get the result, double the wait time
            secToWait = secToWait * 2;
        }
        // There is also a 2-minute limit on the time to process the query.
        // If the query takes more than 2 minutes to process, a QUERY_TIMEOUT error is returned.
        // https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/asynch_api_concepts_limits.htm
        int processingTime = (int) ((System.currentTimeMillis() - job.getCreatedDate().getTimeInMillis()) / 1000);
        if (processingTime > 120) {
            throw new ComponentException(new DefaultErrorCode(HttpServletResponse.SC_REQUEST_TIMEOUT, "failedBatch"),
                    ExceptionContext.build().put("failedBatch", info));
        }
    }
    retrieveResultsOfQuery(info);
}
Developer ID: Talend, Project: components, Lines: 63, Source: SalesforceBulkRuntime.java
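retrieveResultsOfQuery is defined elsewhere in SalesforceBulkRuntime. Judging from the pattern in Example 6, a minimal sketch of fetching the CSV result streams for a completed batch could look like this (the body and variable names are assumptions, not the Talend implementation):

// fetch the list of result IDs for the batch, then open each CSV result stream
QueryResultList resultList = bulkConnection.getQueryResultList(info.getJobId(), info.getId());
for (String resultId : resultList.getResult()) {
    InputStream results = bulkConnection.getQueryResultStream(info.getJobId(), info.getId(), resultId);
    // ... parse the CSV stream ...
}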
Example 19: createJob
import com.sforce.async.OperationEnum; // import the required package/class
/**
 * Create a new job using the Bulk API.
 *
 * @param sobjectType
 *            The object type being loaded, such as "Account"
 * @param bulkConnection
 *            BulkConnection used to create the new job.
 * @param operation
 *            operation to be performed - insert/update/query/upsert
 * @return The JobInfo for the new job.
 * @throws AsyncApiException
 */
public JobInfo createJob(String sobjectType, BulkConnection bulkConnection, OperationEnum operation)
        throws AsyncApiException {
    JobInfo job = new JobInfo();
    job.setObject(sobjectType);
    job.setOperation(operation);
    job.setContentType(ContentType.CSV);
    job = bulkConnection.createJob(job);
    return job;
}
Developer ID: forcedotcom, Project: ApexUnit, Lines: 23, Source: AsyncBulkApiHandler.java
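Putting the pieces together, a caller of this helper might load a CSV file as a single batch and then close the job, following the same pattern as Example 6 (the handler and connection variable names and the file path are illustrative):

// create an insert job for Account, submit one CSV batch, then close the job
JobInfo job = handler.createJob("Account", bulkConnection, OperationEnum.insert);
try (InputStream csv = new FileInputStream("accounts.csv")) {
    BatchInfo batch = bulkConnection.createBatchFromStream(job, csv);
}
JobInfo closeJob = new JobInfo();
closeJob.setId(job.getId());
closeJob.setState(JobStateEnum.Closed);
bulkConnection.updateJob(closeJob);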
Note: The com.sforce.async.OperationEnum examples in this article were collected from source code and documentation platforms such as GitHub and MSDocs. The code snippets were selected from open-source projects contributed by their respective developers; copyright of the source code remains with the original authors. For distribution and use, please refer to the License of the corresponding project; do not reproduce without permission.