
Java Context Class Code Examples


This article collects typical usage examples of the Java class Context from the com.datatorrent.api package. If you are wondering what the Context class is for, or looking for concrete examples of how to use it, the curated code samples below may help.



The Context class belongs to the com.datatorrent.api package. The sections below present 20 code examples of the Context class, sorted by popularity by default.
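Before the examples, it helps to see the pattern Context is built on: typed attribute keys with default values, looked up through `getValue`. The following is a minimal, self-contained sketch of that pattern; the class and attribute names here are illustrative simplifications, not the real com.datatorrent.api API.

```java
import java.util.HashMap;
import java.util.Map;

// Simplified sketch of the typed-attribute pattern behind Context:
// an Attribute<T> is a typed key carrying a default value, and a
// context maps attributes to values, falling back to the default.
public class AttributeSketch {
    static final class Attribute<T> {
        final String name;
        final T defaultValue;
        Attribute(String name, T defaultValue) {
            this.name = name;
            this.defaultValue = defaultValue;
        }
    }

    static final class SimpleContext {
        private final Map<Attribute<?>, Object> values = new HashMap<>();

        <T> void put(Attribute<T> key, T value) {
            values.put(key, value);
        }

        // getValue falls back to the attribute's default when unset,
        // mirroring how context.getValue(...) behaves in the examples below
        @SuppressWarnings("unchecked")
        <T> T getValue(Attribute<T> key) {
            Object v = values.get(key);
            return v != null ? (T) v : key.defaultValue;
        }
    }

    static final Attribute<Integer> APPLICATION_WINDOW_COUNT =
        new Attribute<>("APPLICATION_WINDOW_COUNT", 1);

    public static void main(String[] args) {
        SimpleContext ctx = new SimpleContext();
        System.out.println(ctx.getValue(APPLICATION_WINDOW_COUNT)); // default: 1
        ctx.put(APPLICATION_WINDOW_COUNT, 4);
        System.out.println(ctx.getValue(APPLICATION_WINDOW_COUNT)); // 4
    }
}
```

This is why the real examples can write `dag.setAttribute(op, Context.OperatorContext.APPLICATION_WINDOW_COUNT, n)` on one side and `context.getValue(...)` on the other: the attribute object itself carries the type and the default.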

Example 1: populateDAG

import com.datatorrent.api.Context; // import the required package/class
@Override
public void populateDAG(DAG dag, Configuration configuration) {
    LogLevelProperties props = new LogLevelProperties(configuration);

    //dag.setAttribute(Context.DAGContext.STREAMING_WINDOW_SIZE_MILLIS, props.getWindowMillis());

    // create the operator to receive data from NiFi
    WindowDataManager inManager = new WindowDataManager.NoopWindowDataManager();
    NiFiSinglePortInputOperator nifiInput = getNiFiInput(dag, props, inManager);

    // create the operator to count log levels over a window
    String attributName = props.getLogLevelAttribute();
    LogLevelWindowCount count = dag.addOperator("count", new LogLevelWindowCount(attributName));
    dag.setAttribute(count, Context.OperatorContext.APPLICATION_WINDOW_COUNT, props.getAppWindowCount());

    // create the operator to send data back to NiFi
    WindowDataManager outManager = new WindowDataManager.NoopWindowDataManager();
    NiFiSinglePortOutputOperator nifiOutput = getNiFiOutput(dag, props, outManager);

    // configure the dag to get nifi-in -> count -> nifi-out
    dag.addStream("nifi-in-count", nifiInput.outputPort, count.input);
    dag.addStream("count-nifi-out", count.output, nifiOutput.inputPort);
}
 
Author: bbende, Project: nifi-streaming-examples, Lines: 24, Source: LogLevelApplication.java


Example 2: populateDAG

import com.datatorrent.api.Context; // import the required package/class
@Override
public void populateDAG(DAG dag, Configuration conf)
{
  TestStatsListener sl = new TestStatsListener();
  sl.adjustRate = conf.getBoolean("dt.hdsbench.adjustRate", false);
  TestGenerator gen = dag.addOperator("Generator", new TestGenerator());
  dag.setAttribute(gen, OperatorContext.STATS_LISTENERS, Lists.newArrayList((StatsListener)sl));
  TestStoreOperator store = dag.addOperator("Store", new TestStoreOperator());
  dag.setAttribute(store, OperatorContext.STATS_LISTENERS, Lists.newArrayList((StatsListener)sl));
  FileAccessFSImpl hfa = new HFileImpl();
  hfa.setBasePath(this.getClass().getSimpleName());
  store.setFileStore(hfa);
  dag.setInputPortAttribute(store.input, PortContext.PARTITION_PARALLEL, true);
  dag.getOperatorMeta("Store").getAttributes().put(Context.OperatorContext.COUNTERS_AGGREGATOR,
      new HDHTWriter.BucketIOStatAggregator());
  dag.addStream("Events", gen.data, store.input).setLocality(Locality.THREAD_LOCAL);
}
 
Author: DataTorrent, Project: Megh, Lines: 18, Source: HDHTBenchmarkTest.java


Example 3: getAttrDescription

import com.datatorrent.api.Context; // import the required package/class
private static JSONObject getAttrDescription(Context context, Collection<Field> attributes) throws JSONException,
    IllegalAccessException
{
  JSONObject response = new JSONObject();
  JSONArray attrArray = new JSONArray();
  response.put("attributes", attrArray);
  for (Field attrField : attributes) {
    JSONObject attrJson = new JSONObject();
    attrJson.put("name", attrField.getName());
    ParameterizedType attrType = (ParameterizedType)attrField.getGenericType();

    Attribute<?> attr = (Attribute<?>)attrField.get(context);
    Type pType = attrType.getActualTypeArguments()[0];

    TypeDiscoverer discoverer = new TypeDiscoverer();
    discoverer.resolveTypeParameters(pType, attrJson);

    if (attr.defaultValue != null) {
      attrJson.put("default", attr.defaultValue);
    }
    attrArray.put(attrJson);
  }
  return response;
}
 
Author: apache, Project: apex-core, Lines: 25, Source: TypeDiscoverer.java


Example 4: setup

import com.datatorrent.api.Context; // import the required package/class
@Override
public void setup(Context.OperatorContext context)
{
  if (isBatchProcessing) {
    getConfigProperties().setProperty(QUEUE_BUFFER_KEY, String.valueOf(context.getValue(
        Context.DAGContext.STREAMING_WINDOW_SIZE_MILLIS)));
  }
  super.setup(context);
  windowTimeSec = (context.getValue(Context.OperatorContext.APPLICATION_WINDOW_COUNT)
    * context.getValue(Context.DAGContext.STREAMING_WINDOW_SIZE_MILLIS) * 1.0) / 1000.0;
  if (pojoClass != null && keyField != null && !keyField.isEmpty()) { // compare string content, not references
    try {
      keyMethod = generateGetterForKeyField();
    } catch (NoSuchFieldException e) {
      throw new RuntimeException("Field " + keyField + " is invalid: " + e);
    }
  }
}
 
Author: apache, Project: apex-malhar, Lines: 19, Source: POJOKafkaOutputOperator.java


Example 5: setup

import com.datatorrent.api.Context; // import the required package/class
@Override
public void setup(Context.OperatorContext context)
{
  try {
    schema = new FixedWidthSchema(jsonSchema);
    recordLength = 0;
    List<FixedWidthSchema.Field> fields = schema.getFields();
    for (int i = 0; i < fields.size(); i++) {
      recordLength += fields.get(i).getFieldLength();
    }
    createUnivocityParser();
  } catch (Exception e) {
    logger.error("Cannot set up parser, reason: {}", e.getMessage());
    throw e;
  }
}
 
Author: apache, Project: apex-malhar, Lines: 17, Source: FixedWidthParser.java


Example 6: populateDAG

import com.datatorrent.api.Context; // import the required package/class
@Override
public void populateDAG(DAG dag, Configuration configuration)
{
  AvroFileInputOperator avroFileInputOperator = dag.addOperator("AvroFileInputOperator", this.avroFileInputOperator);
  AvroToPojo avroToPojo = dag.addOperator("AvroGenericObjectToPojo", new AvroToPojo());

  dag.setOutputPortAttribute(avroToPojo.output, Context.PortContext.TUPLE_CLASS, pojoClass);

  dag.addStream("avroFileContainerToPojo", avroFileInputOperator.output, avroToPojo.data)
      .setLocality(DAG.Locality.CONTAINER_LOCAL);

  output.set(avroToPojo.output);
  errorPort.set(avroToPojo.errorPort);

  completedAvroFilesPort.set(avroFileInputOperator.completedFilesPort);
  avroErrorRecordsPort.set(avroFileInputOperator.errorRecordsPort);
}
 
Author: apache, Project: apex-malhar, Lines: 18, Source: AvroFileToPojoModule.java


Example 7: setup

import com.datatorrent.api.Context; // import the required package/class
@Before
public void setup()
{
  dag = StramTestSupport.createDAG(testMeta);
  dag.setAttribute(Context.DAGContext.STREAMING_WINDOW_SIZE_MILLIS, windowWidthMillis);
  dag.setAttribute(Context.DAGContext.HEARTBEAT_TIMEOUT_MILLIS, heartbeatTimeoutMillis);
  dag.setAttribute(com.datatorrent.api.Context.OperatorContext.STORAGE_AGENT, new StramTestSupport
      .MemoryStorageAgent());

  GenericTestOperator o1 = dag.addOperator("o1", GenericTestOperator.class);
  GenericTestOperator o2 = dag.addOperator("o2", GenericTestOperator.class);
  GenericTestOperator o3 = dag.addOperator("o3", GenericTestOperator.class);

  dag.addStream("o1.output1", o1.outport1, o3.inport1);
  dag.addStream("o2.output1", o2.outport1, o3.inport2);
  scm = new StreamingContainerManager(dag);
  PhysicalPlan plan = scm.getPhysicalPlan();
  o1p1 = plan.getOperators(dag.getMeta(o1)).get(0);
  o2p1 = plan.getOperators(dag.getMeta(o2)).get(0);
  o3p1 = plan.getOperators(dag.getMeta(o3)).get(0);
}
 
Author: apache, Project: apex-core, Lines: 22, Source: LatencyTest.java


Example 8: testMetricsAnnotatedMethod

import com.datatorrent.api.Context; // import the required package/class
@Test
public void testMetricsAnnotatedMethod() throws Exception
{
  CountDownLatch latch = new CountDownLatch(1);

  LogicalPlanConfiguration lpc = new LogicalPlanConfiguration(new Configuration());

  TestGeneratorInputOperator inputOperator = dag.addOperator("input", TestGeneratorInputOperator.class);

  OperatorWithMetricMethod o1 = dag.addOperator("o1", OperatorWithMetricMethod.class);
  MockAggregator aggregator = new MockAggregator(latch);
  dag.setOperatorAttribute(o1, Context.OperatorContext.METRICS_AGGREGATOR, aggregator);

  dag.setAttribute(Context.OperatorContext.STORAGE_AGENT, new StramTestSupport.MemoryStorageAgent());

  dag.addStream("TestTuples", inputOperator.outport, o1.inport1);

  lpc.prepareDAG(dag, null, "AutoMetricTest");

  StramLocalCluster lc = new StramLocalCluster(dag);
  lc.runAsync();
  latch.await();

  Assert.assertEquals("myMetric", 3, ((Integer)aggregator.result.get("myMetric")).intValue());
  lc.shutdown();
}
 
Author: apache, Project: apex-core, Lines: 27, Source: AutoMetricTest.java


Example 9: setup

import com.datatorrent.api.Context; // import the required package/class
@Override
public void setup(Context.OperatorContext context)
{
  super.setup(context);

  CacheStore primaryCache = new CacheStore();

  // configure expiry duration, cleanup interval, expiry strategy and size of the primary cache
  primaryCache.setEntryExpiryDurationInMillis(cacheExpirationInterval);
  primaryCache.setCacheCleanupInMillis(cacheCleanupInterval);
  primaryCache.setEntryExpiryStrategy(expiryType);
  primaryCache.setMaxCacheSize(cacheSize);

  cacheManager.setPrimary(primaryCache);
  cacheManager.setBackup(store);
}
 
Author: apache, Project: apex-malhar, Lines: 17, Source: AbstractEnricher.java


Example 10: setup

import com.datatorrent.api.Context; // import the required package/class
public void setup(PortContext context)
{
  jsonParser = new JSONParser();
  finder = new JsonKeyFinder();
  columnFields = new ArrayList<String>();
  columnFieldSetters = Lists.newArrayList();

  setPojoClass(context.getValue(Context.PortContext.TUPLE_CLASS));

  if (getFieldMappingString() == null) {
    setFieldInfos(createFieldInfoMap(generateFieldInfoInputs(getPojoClass())));
  } else {
    setFieldInfos(createFieldInfoMap(getFieldMappingString()));
  }
  initColumnFieldSetters(getFieldInfos());
  initializeActiveFieldSetters();

  ListIterator<FieldInfo> itr = fieldInfos.listIterator();
  while (itr.hasNext()) {
    columnFields.add(itr.next().getColumnName());
  }
  finder.setMatchKeyList(columnFields);
}
 
Author: apache, Project: apex-malhar, Lines: 24, Source: StreamingJsonParser.java


Example 11: populateDAG

import com.datatorrent.api.Context; // import the required package/class
@Override
public void populateDAG(DAG dag, Configuration conf)
{
  DummyOperator o1 = dag.addOperator("O1", new DummyOperator());
  o1.setOperatorProp(level1ModuleProp);

  // set various attributes on the operator for testing
  Attribute.AttributeMap attr = dag.getMeta(o1).getAttributes();
  attr.put(OperatorContext.MEMORY_MB, memory);
  attr.put(OperatorContext.APPLICATION_WINDOW_COUNT, 2);
  attr.put(OperatorContext.LOCALITY_HOST, "host1");
  attr.put(OperatorContext.PARTITIONER, new TestPartitioner());
  attr.put(OperatorContext.CHECKPOINT_WINDOW_COUNT, 120);
  attr.put(OperatorContext.STATELESS, true);
  attr.put(OperatorContext.SPIN_MILLIS, 20);

  dag.setInputPortAttribute(o1.in, Context.PortContext.BUFFER_MEMORY_MB, portMemory);
  mIn.set(o1.in);
  mOut.set(o1.out1);
}
 
Author: apache, Project: apex-core, Lines: 21, Source: TestModuleExpansion.java


Example 12: populateDAG

import com.datatorrent.api.Context; // import the required package/class
@Override
public void populateDAG(DAG dag, Configuration configuration)
{
  DummyInputGenerator input = dag.addOperator("Input", new DummyInputGenerator());
  FilterOperator filter = dag.addOperator("Filter", new FilterOperator());

  filter.setCondition("(({$}.getNum() % 10) == 0)");

  ConsoleOutputOperator trueConsole = dag.addOperator("TrueConsole", new ConsoleOutputOperator());
  trueConsole.setSilent(true);
  ConsoleOutputOperator falseConsole = dag.addOperator("FalseConsole", new ConsoleOutputOperator());
  falseConsole.setSilent(true);
  ConsoleOutputOperator errorConsole = dag.addOperator("ErrorConsole", new ConsoleOutputOperator());
  errorConsole.setSilent(true);

  dag.getMeta(filter).getMeta(filter.input).getAttributes().put(Context.PortContext.TUPLE_CLASS, DummyPOJO.class);

  dag.addStream("Connect", input.output, filter.input);

  dag.addStream("ConditionTrue", filter.truePort, trueConsole.input);
  dag.addStream("ConditionFalse", filter.falsePort, falseConsole.input);
  dag.addStream("ConditionError", filter.error, errorConsole.input);
}
 
Author: apache, Project: apex-malhar, Lines: 24, Source: FilterAppTest.java


Example 13: setup

import com.datatorrent.api.Context; // import the required package/class
@Override
@SuppressWarnings("unchecked")
public void setup(Context.OperatorContext context)
{
  this.timeIncrement = context.getValue(Context.OperatorContext.APPLICATION_WINDOW_COUNT) * context.getValue(Context.DAGContext.STREAMING_WINDOW_SIZE_MILLIS);
  validate();
  windowStateMap.setup(context);
  dataStorage.setup(context);
  if (retractionStorage != null) {
    retractionStorage.setup(context);
  }
  if (implicitWatermarkGenerator != null) {
    implicitWatermarkGenerator.setup(context);
  }
  for (Component component : components.values()) {
    component.setup(context);
  }
  if (this.windowOption instanceof WindowOption.GlobalWindow) {
    windowStateMap.put(Window.GlobalWindow.INSTANCE, new WindowState());
  }
}
 
Author: apache, Project: apex-malhar, Lines: 22, Source: AbstractWindowedOperator.java


Example 14: populateDAG

import com.datatorrent.api.Context; // import the required package/class
@Override
public void populateDAG(DAG dag, Configuration conf)
{
  // Add FSInputModule as input and PartFileWriter as output operators to dag.
  FSInputModule input = dag.addModule("HDFSInputModule", new FSInputModule());
  PartFileWriter output = dag.addOperator("PartFileCopy", new PartFileWriter());

  dag.setInputPortAttribute(output.input, Context.PortContext.PARTITION_PARALLEL, true);
  dag.setInputPortAttribute(output.blockMetadataInput, Context.PortContext.PARTITION_PARALLEL, true);

  // Create a stream for blockData, fileMetadata, blockMetadata from Input to PartFileWriter
  dag.addStream("BlocksMetaData", input.blocksMetadataOutput, output.blockMetadataInput).setLocality(DAG.Locality.CONTAINER_LOCAL);
  dag.addStream("BlocksData", input.messages, output.input).setLocality(DAG.Locality.CONTAINER_LOCAL);
  dag.addStream("FileMetaData", input.filesMetadataOutput, output.fileMetadataInput);
}
 
Author: DataTorrent, Project: app-templates, Lines: 16, Source: Application.java


Example 15: activate

import com.datatorrent.api.Context; // import the required package/class
@Override
public void activate(Context context)
{
  super.activate(context);
  Preconditions.checkArgument(getPojoClass() != null);
  getter = PojoUtils.createGetter(getPojoClass(),
          ((OrderedBucketManagerPOJOImpl)bucketManager).getKeyExpression(), Object.class);
}
 
Author: DataTorrent, Project: Megh, Lines: 9, Source: DedupUsingOrderedExpiryTest.java


Example 16: setup

import com.datatorrent.api.Context; // import the required package/class
@Override
public void setup(Context.OperatorContext context)
{
  super.setup(context);
  configuration = new Configuration();
  try {
    fs = getFSInstance();
  } catch (IOException e) {
    throw new RuntimeException("creating fs", e);
  }
}
 
Author: apache, Project: apex-malhar, Lines: 12, Source: AbstractFSBlockReader.java


Example 17: setup

import com.datatorrent.api.Context; // import the required package/class
@Override
public void setup(Context.OperatorContext context)
{
  super.setup(context);
  startingTime = System.currentTimeMillis();
  watermarkTime = System.currentTimeMillis() + 10000;
  i = 1;
}
 
Author: apache, Project: apex-malhar, Lines: 9, Source: KeyedWindowedMergeOperatorTestApplication.java


Example 18: setup

import com.datatorrent.api.Context; // import the required package/class
@Override
public void setup(OperatorContext context)
{
  long windowDurationMillis = context.getValue(OperatorContext.APPLICATION_WINDOW_COUNT) *
      context.getValue(Context.DAGContext.STREAMING_WINDOW_SIZE_MILLIS);
  maxEventsPerWindow = (long)(windowDurationMillis / 1000.0 * maxEventsPerSecond);
  logger.debug("max-events per-second {} per-window {}", maxEventsPerSecond, maxEventsPerWindow);

  try {
    eventloop = new DefaultEventLoop("EventLoop-" + context.getId());
    eventloop.start();
  } catch (IOException ex) {
    throw new RuntimeException(ex);
  }
}
 
Author: apache, Project: apex-malhar, Lines: 16, Source: AbstractFlumeInputOperator.java
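The rate computation in Example 18 combines two attributes: the window duration is APPLICATION_WINDOW_COUNT times STREAMING_WINDOW_SIZE_MILLIS, and the per-window cap is derived from a per-second rate. The standalone sketch below reproduces just that arithmetic with assumed attribute values (500 ms streaming windows, 4 windows per application window, 2500 events/s); the numbers are illustrative, not defaults from the source.

```java
public class WindowRateDemo {
    public static void main(String[] args) {
        // assumed values, mirroring the two attributes read in setup():
        long streamingWindowMillis = 500; // STREAMING_WINDOW_SIZE_MILLIS
        int appWindowCount = 4;           // APPLICATION_WINDOW_COUNT
        double maxEventsPerSecond = 2500.0;

        // window duration = app window count * streaming window size
        long windowDurationMillis = appWindowCount * streamingWindowMillis; // 2000 ms

        // per-window cap = duration in seconds * per-second rate
        long maxEventsPerWindow = (long) (windowDurationMillis / 1000.0 * maxEventsPerSecond);
        System.out.println(maxEventsPerWindow); // 5000
    }
}
```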


Example 19: createQueryResult

import com.datatorrent.api.Context; // import the required package/class
protected PubSubWebSocketAppDataResult createQueryResult(DAG dag, Configuration conf, AppDataSingleSchemaDimensionStoreHDHT store)
{
  PubSubWebSocketAppDataResult wsOut = new PubSubWebSocketAppDataResult();
  URI queryUri = getQueryUri(dag, conf);
  wsOut.setUri(queryUri);
  dag.addOperator("QueryResult", wsOut);
  // Set remaining dag options

  dag.setAttribute(store, Context.OperatorContext.COUNTERS_AGGREGATOR,
      new BasicCounters.LongAggregator<MutableLong>());
  
  return wsOut;
}
 
Author: yahoo, Project: streaming-benchmarks, Lines: 14, Source: ApplicationDimensionComputation.java


Example 20: addAttributeToArgs

import com.datatorrent.api.Context; // import the required package/class
public static void addAttributeToArgs(Attribute<String> attribute, Context context, List<CharSequence> vargs)
{
  String value = context.getValue(attribute);
  if (value != null) {
    vargs.add(String.format("-D%s=$'%s'", attribute.getLongName(),
        value.replace("\\", "\\\\\\\\").replaceAll("['\"$]", "\\\\$0")));
  }
}
 
Author: apache, Project: apex-core, Lines: 9, Source: StramClientUtils.java
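The escaping chain in Example 20 is dense, so here is a self-contained sketch that replicates just the two replace steps to show what they produce; the demo class and method names are mine, not part of the source.

```java
public class EscapeDemo {
    // Replicates the escaping from addAttributeToArgs above:
    // 1) each backslash becomes four backslashes,
    // 2) single quote, double quote and '$' each get a backslash prefix,
    // so the value can be embedded in a bash $'...'-quoted -D argument.
    static String escapeForShell(String value) {
        return value.replace("\\", "\\\\\\\\").replaceAll("['\"$]", "\\\\$0");
    }

    public static void main(String[] args) {
        System.out.println(escapeForShell("pa'ss$wd")); // pa\'ss\$wd
    }
}
```

Note that in `replaceAll`, the replacement string is itself interpreted (`$0` is the matched character and a literal backslash must be written as `\\\\` in source), which is why the escape sequences look doubled compared to `replace`.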



Note: The com.datatorrent.api.Context examples in this article were collected from open-source projects hosted on GitHub, MSDocs, and similar source-code and documentation platforms. Copyright of each snippet remains with its original authors; consult the corresponding project's license before distributing or reusing the code. Do not reproduce this article without permission.

