This article collects typical usage examples of the Java class com.google.cloud.dataflow.sdk.testing.TestPipeline. If you are unsure what TestPipeline is for or how to use it, the curated class examples below may help.
The TestPipeline class belongs to the com.google.cloud.dataflow.sdk.testing package. Six code examples of the TestPipeline class are shown below, ordered by popularity.
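The examples below all revolve around the same basic Dataflow SDK 1.x testing pattern: create a pipeline with TestPipeline.create(), build one or more PCollections, optionally register assertions with DataflowAssert, and call run(). As a minimal, self-contained sketch of that pattern (the class name and input values here are invented for illustration and are not taken from any of the projects below):

import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.coders.StringUtf8Coder;
import com.google.cloud.dataflow.sdk.testing.DataflowAssert;
import com.google.cloud.dataflow.sdk.testing.TestPipeline;
import com.google.cloud.dataflow.sdk.transforms.Create;
import com.google.cloud.dataflow.sdk.values.PCollection;
import java.util.Arrays;
import org.junit.Test;

public class TestPipelineSketchTest {

    @Test
    public void assertsOnPipelineContents() {
        // TestPipeline.create() builds a Pipeline backed by a local test runner.
        Pipeline p = TestPipeline.create();

        // Build an in-memory input; the values are purely illustrative.
        PCollection<String> words =
            p.apply(Create.of(Arrays.asList("alpha", "beta", "gamma")).withCoder(StringUtf8Coder.of()));

        // DataflowAssert registers assertions that are checked when the pipeline runs.
        DataflowAssert.that(words).containsInAnyOrder("beta", "gamma", "alpha");

        // Assertions only take effect once the pipeline is actually executed.
        p.run();
    }
}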
Example 1: test_reading
import com.google.cloud.dataflow.sdk.testing.TestPipeline; // import the required package/class
@Test
public void test_reading() throws Exception {
    final File file = new File(getClass().getResource("/sample.csv").toURI());
    assertThat(file.exists()).isTrue();

    final Pipeline pipeline = TestPipeline.create();
    final PCollection<String> output =
        pipeline.apply(Read.from(CsvWithHeaderFileSource.from(file.getAbsolutePath())));

    DataflowAssert
        .that(output)
        .containsInAnyOrder("a:1, b:2, c:3, ", "a:4, b:5, c:6, ");

    pipeline.run();
}
Developer: obradovicluka, Project: dataflow-playground, Lines: 18, Source: CsvWithHeaderFileSourceTest.java
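The sample.csv fixture itself is not part of the snippet. Judging purely from the assertion above, it presumably contains a header row a,b,c followed by the data rows 1,2,3 and 4,5,6, with CsvWithHeaderFileSource apparently emitting each data row as a list of header:value pairs. An inferred sketch of the file (not its actual contents) would be:

a,b,c
1,2,3
4,5,6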
Example 2: parseBooks_returnsNgramsCounts
import com.google.cloud.dataflow.sdk.testing.TestPipeline; // import the required package/class
@Test
public void parseBooks_returnsNgramsCounts() {
    // Arrange
    Pipeline p = TestPipeline.create();
    PCollection<String> input = p.apply(Create.of(testFile));

    // Act
    PCollection<KV<String, Integer>> output = LoadBooks.applyPipelineToParseBooks(input);

    // Assert
    DataflowAssert.that(output)
        .containsInAnyOrder(
            KV.of("despatch when art", 10),
            KV.of("despatch when came", 10),
            KV.of("despatch when published", 12),
            KV.of("despatch where was", 10),
            KV.of("despatch which made", 45),
            // There are two entries for "despatch which addressed".
            // Each entry has a different part of speech for "addressed".
            KV.of("despatch which addressed", 12 + 46),
            KV.of("despatch which admitted", 13),
            KV.of("despatch which allow", 14),
            KV.of("despatch which announced", 50),
            KV.of("despatch which answer", 32));
}
Developer: GoogleCloudPlatform, Project: cloud-bigtable-examples, Lines: 26, Source: LoadBooksTest.java
Example 3: testCSVToMapLineCombineFn
import com.google.cloud.dataflow.sdk.testing.TestPipeline; // import the required package/class

@Test
public void testCSVToMapLineCombineFn() {
    final String[] csv = new String[] {
        "000,Tokyo,120",
        "001,Osaka,100",
        "002,Kyoto,140"
    };
    final List<String> csvlist = Arrays.asList(csv);

    Pipeline p = TestPipeline.create();
    PCollection<String> maplines = p.apply(Create.of(csvlist)).setCoder(StringUtf8Coder.of());
    PCollectionView<Map<String, String>> mapview =
        maplines.apply(Combine.globally(new CSVToMapLineCombineFn()).asSingletonView());

    final String[] dummy = new String[] {
        "000",
        "001",
        "002"
    };
    List<String> dummylist = Arrays.asList(dummy);

    // AAA is a DoFn defined elsewhere in the project; DoFnTester exercises it outside a full pipeline.
    DoFnTester<String, String> fnTester = DoFnTester.of(new AAA(mapview));
    fnTester.setSideInputInGlobalWindow(mapview, csvlist);
    //dummylines.apply(ParDo.of(fnTester));

    // processBatch feeds the inputs through the DoFn and collects its outputs.
    List<String> results = fnTester.processBatch(dummylist);
    System.out.println(results);
    //p.apply()
}
Developer: topgate, Project: retail-demo, Lines: 30, Source: MainPipelineTest.java
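The AAA class passed to DoFnTester above is defined elsewhere in the retail-demo project and is not shown in this article. Purely as an assumption about its shape, a DoFn that consumes a side-input map view in the Dataflow SDK 1.x typically looks roughly like the following sketch (the class name is reused from the snippet, but the body is invented):

import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.dataflow.sdk.values.PCollectionView;
import java.util.Map;

// Hypothetical sketch only; the real AAA class in the project may differ.
class AAA extends DoFn<String, String> {

    private final PCollectionView<Map<String, String>> mapview;

    AAA(PCollectionView<Map<String, String>> mapview) {
        this.mapview = mapview;
    }

    @Override
    public void processElement(ProcessContext c) {
        // c.sideInput(view) retrieves the materialized side-input value for the current window.
        Map<String, String> map = c.sideInput(mapview);
        c.output(c.element() + ":" + map.get(c.element()));
    }
}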
Example 4: testCountWords
import com.google.cloud.dataflow.sdk.testing.TestPipeline; // import the required package/class
/** Example test that tests a PTransform by using an in-memory input and inspecting the output. */
@Test
@Category(RunnableOnService.class)
public void testCountWords() throws Exception {
    Pipeline p = TestPipeline.create();

    PCollection<String> input = p.apply(Create.of(WORDS).withCoder(StringUtf8Coder.of()));

    PCollection<String> output = input.apply(new CountWords())
        .apply(ParDo.of(new FormatAsTextFn()));

    DataflowAssert.that(output).containsInAnyOrder(COUNTS_ARRAY);
    p.run();
}
Developer: sinmetal, Project: iron-hippo, Lines: 15, Source: WordCountTest.java
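WORDS and COUNTS_ARRAY are fixtures defined elsewhere in WordCountTest and are not included in the snippet. As a purely hypothetical illustration of the shape the test expects (an array of input lines, and the corresponding strings that FormatAsTextFn presumably renders as "word: count"), they might look like:

// Hypothetical fixture values, invented for illustration; the real test data may differ.
static final String[] WORDS = new String[] {"hi there", "hi", "bob hi"};
static final String[] COUNTS_ARRAY = new String[] {"hi: 3", "there: 1", "bob: 1"};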
Example 5: testUserExceptionUnwrapping
import com.google.cloud.dataflow.sdk.testing.TestPipeline; // import the required package/class

// TestNG-style annotation: the test passes only if a UserException propagates out of runPipeline.
@Test(expectedExceptions = UserException.class)
public void testUserExceptionUnwrapping() {
    Pipeline p = TestPipeline.create(); // use regular pipeline as Spark doesn't return user exceptions yet
    PCollection<String> pstrings = p.apply(Create.of(Lists.newArrayList("Some", "Values")));
    pstrings.apply(DataflowUtils.throwException(new UserException("fail")));
    DataflowCommandLineProgram.runPipeline(p);
}
Developer: broadinstitute, Project: gatk-dataflow, Lines: 8, Source: DataflowCommandLineProgramUnitTest.java
Example 6: setUp
import com.google.cloud.dataflow.sdk.testing.TestPipeline; // import the required package/class

@Before
public void setUp() {
    // "pipeline" is a field of the test class, so every test starts with a fresh TestPipeline.
    pipeline = TestPipeline.create();
}
Developer: GoogleCloudPlatform, Project: policyscanner, Lines: 5, Source: FilterOutMatchingStateTest.java
Note: the com.google.cloud.dataflow.sdk.testing.TestPipeline examples in this article were collected from GitHub, MSDocs, and similar source-code and documentation platforms. The snippets are taken from open-source projects contributed by their respective developers, and copyright remains with the original authors; for distribution and use, refer to each project's License. Do not reproduce without permission.