
Java ParseException Class Code Examples


This article collects typical usage examples of the Java class org.apache.hadoop.hive.ql.parse.ParseException. If you are unsure what ParseException is for, or how and where to use it, the curated examples below should help.



The ParseException class belongs to the org.apache.hadoop.hive.ql.parse package. Twenty code examples of the class are shown below, sorted by popularity.
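Hive's ParseException is a checked exception thrown by ParseDriver.parse(), and most of the examples below simply catch it and wrap it in a domain-specific exception. As a self-contained sketch of that catch-and-wrap pattern, the following uses the JDK's own java.text.ParseException as a stand-in (compiling against Hive's class would require the full hive-exec dependency; the class and method names here are illustrative, not from Hive):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;

public class ParseAndWrap {

    // Hypothetical domain exception, mirroring RewriteException in Example 1 below
    static class RewriteException extends RuntimeException {
        RewriteException(String msg, Throwable cause) {
            super(msg, cause);
        }
    }

    // Catch the low-level checked ParseException and rethrow it
    // as an unchecked domain exception, preserving the cause chain.
    static Date parseOrWrap(String input) {
        try {
            return new SimpleDateFormat("yyyy-MM-dd").parse(input);
        } catch (ParseException e) {
            throw new RewriteException("Could not parse input: " + input, e);
        }
    }

    public static void main(String[] args) {
        System.out.println(parseOrWrap("2024-01-15"));
        try {
            parseOrWrap("not-a-date");
        } catch (RuntimeException e) {
            System.out.println("caught, cause = " + e.getCause().getClass().getSimpleName());
        }
    }
}
```

Wrapping rather than swallowing keeps the original parse position and message available via getCause(), which is exactly what Example 1's HiveASTRewriter does with Hive's ParseException.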

Example 1: rewrite

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
public String rewrite(String sourceQry) throws RewriteException {
    String result = sourceQry;
    ASTNode tree = null;
    try {
        ParseDriver pd = new ParseDriver();
        tree = pd.parse(sourceQry, queryContext, true);
        tree = ParseUtils.findRootNonNullToken(tree);
        this.rwCtx = new RewriteContext(sourceQry, tree, queryContext.getTokenRewriteStream());
        rewrite(tree);
        result = toSQL();
    } catch (ParseException e) {
        LOG.error("Could not parse the query {}", sourceQry, e);
        throw new RewriteException("Could not parse query", e);
    }
    return result;
}
 
Author: apache, Project: incubator-atlas, Lines of code: 17, Source: HiveASTRewriter.java


Example 2: getMockedCubeContext

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
/**
 * Gets the mocked cube context.
 *
 * @param ast the ast
 * @return the mocked cube context
 * @throws ParseException    the parse exception
 * @throws LensException  the lens exception
 */
private CubeQueryContext getMockedCubeContext(ASTNode ast) throws ParseException, LensException {
  CubeQueryContext context = Mockito.mock(CubeQueryContext.class);
  if (ast.getToken().getType() == HiveParser.TOK_QUERY) {
    if (((ASTNode) ast.getChild(0)).getToken().getType() == HiveParser.KW_CUBE) {
      // remove cube child from AST
      for (int i = 0; i < ast.getChildCount() - 1; i++) {
        ast.setChild(i, ast.getChild(i + 1));
      }
      ast.deleteChild(ast.getChildCount() - 1);
    }
  }
  StringBuilder builder = new StringBuilder();
  HQLParser.toInfixString(ast, builder);
  Mockito.when(context.toHQL()).thenReturn(builder.toString());
  Mockito.when(context.toAST(Matchers.any(Context.class))).thenReturn(ast);
  return context;
}
 
Author: apache, Project: lens, Lines of code: 26, Source: TestRewriting.java


Example 3: testQueryWithExprMeasureAndDimExprWithChainRefInFilter

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testQueryWithExprMeasureAndDimExprWithChainRefInFilter() throws ParseException,
    LensException {

  /* In this query a dimension attribute referenced through join chain name is used in filter. If the
  source column for such a dimension attribute and the  queried measure are not present in the same derived cube,
  then query shall be disallowed.

  cubestate.name is a dimension attribute used in where clause(filter) and referenced through join chain name. It is
  queryable through chain source column cityid. cityid and roundedmsr1( expression over msr1) are not present in the
  same derived cube, hence query shall be disallowed with appropriate exception. */

  testFieldsCannotBeQueriedTogetherError(
      "select sum(roundedmsr1) from basecube where cubestatename = 'foo' and " + TWO_DAYS_RANGE,
      Arrays.asList("cubestate.name", "d_time", "msr1"));
}
 
Author: apache, Project: lens, Lines of code: 17, Source: FieldsCannotBeQueriedTogetherTest.java


Example 4: testQueryWithReferencedDimAttributeAndMeasure

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testQueryWithReferencedDimAttributeAndMeasure() throws ParseException,
    LensException {

  /* In this query a referenced dimension attribute is used in select statement. If the source column for such a
  dimension attribute and the  queried measure are not present in the same derived cube, then query shall be
  disallowed.

  cityStateCapital is a referenced dimension attribute used in select statement. It is queryable through chain source
  column cityid. cityid and msr1 are not present in the same derived cube, hence query shall be disallowed with
  appropriate exception. */

  testFieldsCannotBeQueriedTogetherError(
      "select citystatecapital, SUM(msr1) from basecube where " + TWO_DAYS_RANGE,
      Arrays.asList("citystatecapital", "d_time", "msr1"));
}
 
Author: apache, Project: lens, Lines of code: 17, Source: FieldsCannotBeQueriedTogetherTest.java


Example 5: testQueryWtihTimeDimAndReplaceTimeDimSwitchTrue

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testQueryWtihTimeDimAndReplaceTimeDimSwitchTrue() throws ParseException, LensException {

  /* If a time dimension and measure are not present in the same derived cube, then query shall be disallowed.

  The testQuery in this test uses d_time in time range func. d_time is a time dimension in basecube.
  d_time is present as a dimension in derived cube where as msr4 is not present in the same derived cube, hence
  the query shall be disallowed.

  The testQuery in this test uses its own queryConf which has CubeQueryConfUtil.REPLACE_TIMEDIM_WITH_PART_COL
  set to true. */

  Configuration queryConf = new Configuration(conf);
  queryConf.setBoolean(CubeQueryConfUtil.REPLACE_TIMEDIM_WITH_PART_COL, true);

  testFieldsCannotBeQueriedTogetherError("select msr4 from basecube where " + TWO_DAYS_RANGE,
      Arrays.asList("d_time", "msr4"), queryConf);
}
 
Author: apache, Project: lens, Lines of code: 19, Source: FieldsCannotBeQueriedTogetherTest.java


Example 6: testQueryWtihTimeDimAndReplaceTimeDimSwitchFalse

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testQueryWtihTimeDimAndReplaceTimeDimSwitchFalse() throws ParseException, LensException {

  /* If a time dimension and measure are not present in the same derived cube, then query shall be disallowed.

  The testQuery in this test uses d_time in time range func. d_time is a time dimension in basecube.
  d_time is present as a dimension in derived cube where as msr4 is not present in the same derived cube, hence
  the query shall be disallowed.

  The testQuery in this test uses its own queryConf which has CubeQueryConfUtil.REPLACE_TIMEDIM_WITH_PART_COL
  set to false */

  Configuration queryConf = new Configuration(conf);
  queryConf.setBoolean(CubeQueryConfUtil.REPLACE_TIMEDIM_WITH_PART_COL, false);

  testFieldsCannotBeQueriedTogetherError("select msr4 from basecube where " + TWO_DAYS_RANGE,
      Arrays.asList("d_time", "msr4"), queryConf);
}
 
Author: apache, Project: lens, Lines of code: 19, Source: FieldsCannotBeQueriedTogetherTest.java


Example 7: testFieldsCannotBeQueriedTogetherError

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
private void testFieldsCannotBeQueriedTogetherError(final String testQuery, final List<String> conflictingFields,
    final Configuration queryConf)
  throws ParseException, LensException {

  try {

    String hqlQuery = rewrite(testQuery, queryConf);
    fail("Expected Query Rewrite to fail with FieldsCannotBeQueriedTogetherException, however it didn't happen. "
        + "Query got re-written to:" + hqlQuery);
  } catch(FieldsCannotBeQueriedTogetherException actualException) {

    SortedSet<String> expectedFields = new TreeSet<>(conflictingFields);

    FieldsCannotBeQueriedTogetherException expectedException =
        new FieldsCannotBeQueriedTogetherException(new ConflictingFields(expectedFields));
    assertEquals(actualException, expectedException);
  }
}
 
Author: apache, Project: lens, Lines of code: 19, Source: FieldsCannotBeQueriedTogetherTest.java
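The helper above uses the classic try/fail/catch testing idiom: call the code under test, fail explicitly if no exception is thrown, otherwise inspect the caught exception. A minimal JDK-only sketch of the same idiom, with no JUnit/TestNG dependency (the parsePositive helper is hypothetical, purely for illustration):

```java
import java.util.Objects;

public class ExpectExceptionIdiom {

    // Hypothetical code under test: rejects non-positive numbers.
    static int parsePositive(String s) {
        int v = Integer.parseInt(s);
        if (v <= 0) {
            throw new IllegalArgumentException("not positive: " + v);
        }
        return v;
    }

    // Same shape as testFieldsCannotBeQueriedTogetherError: fail loudly
    // when the expected exception does NOT occur, then verify its details.
    static void assertThrowsWithMessage(String input, String expectedMessage) {
        try {
            int v = parsePositive(input);
            throw new AssertionError("Expected IllegalArgumentException, but got value " + v);
        } catch (IllegalArgumentException actual) {
            if (!Objects.equals(actual.getMessage(), expectedMessage)) {
                throw new AssertionError("Unexpected message: " + actual.getMessage());
            }
        }
    }

    public static void main(String[] args) {
        assertThrowsWithMessage("-3", "not positive: -3");
        System.out.println("ok");
    }
}
```

Note that the fail() call must sit inside the try block, immediately after the call under test; placing it after the catch would never detect the missing exception.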


Example 8: testCastStatement

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testCastStatement() throws ParseException, LensException {
  String castSelect = "cast(( a  +  b ) as tinyint), cast(( a  +  b ) as smallint), cast(( a  +  b ) as int),"
    + " cast(( a  +  b ) as bigint), cast(( a  +  b ) as float), cast(( a  +  b ) as double),"
    + " cast(( a  +  b ) as boolean), cast( a  as date), cast( b  as datetime), cast( a  as timestamp),"
    + " cast(( a  +  b ) as string), cast(( a  +  b ) as binary), cast(( a  +  b ) as decimal(3,6)),"
    + " cast(( a  +  b ) as decimal(5)), cast(( a  +  b ) as varchar(10)), cast(( a  +  b ) as char(20)),"
    + " cast( '17.29'  as decimal(4,2))";
  String query = "select " + castSelect + " from table limit 1";
  ASTNode tree = HQLParser.parseHQL(query, conf);
  System.out.println(tree.dump());
  ASTNode selectAST = HQLParser.findNodeByPath(tree, TOK_INSERT, TOK_SELECT);
  String genQuery = HQLParser.getString(selectAST);
  Assert.assertEquals(genQuery, castSelect);
}
 
Author: apache, Project: lens, Lines of code: 17, Source: TestHQLParser.java


Example 9: testExpressionHavingRefcol

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testExpressionHavingRefcol() throws ParseException, LensException {
  String colsSelected = " union_join_ctx_cityid, union_join_ctx_cityname_msr1_expr, "
      + "union_join_ctx_cityname_msr2_expr ";
  String whereCond = "(" + TWO_MONTHS_RANGE_UPTO_DAYS + ")";
  String rewrittenQuery = rewrite("select " + colsSelected + " from basecube where " + whereCond, conf);
  assertTrue(rewrittenQuery.contains("UNION ALL"));
  String expectedInnerSelect1 = "SELECT (basecube.union_join_ctx_cityid) as `alias0`, sum(case  "
      + "when ((cubecityjoinunionctx.name) = 'blr') then (basecube.union_join_ctx_msr1) else 0 end) "
      + "as `alias1`, 0 as `alias2` FROM TestQueryRewrite.c1_union_join_ctx_fact1 basecube ";
  String expectedInnerSelect2 = "SELECT (basecube.union_join_ctx_cityid) as `alias0`, "
      + "sum(case  when ((cubecityjoinunionctx.name) = 'blr') then (basecube.union_join_ctx_msr1) else 0 end) "
      + "as `alias1`, 0 as `alias2` FROM TestQueryRewrite.c1_union_join_ctx_fact2 basecube";
  String expectedInnerSelect3 = "SELECT (basecube.union_join_ctx_cityid) as `alias0`, 0 as `alias1`, "
      + "sum(case  when ((cubecityjoinunionctx.name) = 'blr') then (basecube.union_join_ctx_msr2) else 0 end) "
      + "as `alias2` FROM TestQueryRewrite.c1_union_join_ctx_fact3 basecube";
  String outerSelect = "SELECT (basecube.alias0) as `union_join_ctx_cityid`, sum((basecube.alias1)) "
      + "as `union_join_ctx_cityname_msr1_expr`, sum((basecube.alias2)) as `union_join_ctx_cityname_msr2_expr` FROM";
  String outerGroupBy = "GROUP BY (basecube.alias0)";
  compareContains(expectedInnerSelect1, rewrittenQuery);
  compareContains(expectedInnerSelect2, rewrittenQuery);
  compareContains(expectedInnerSelect3, rewrittenQuery);
  compareContains(outerSelect, rewrittenQuery);
  compareContains(outerGroupBy, rewrittenQuery);
}
 
Author: apache, Project: lens, Lines of code: 26, Source: TestUnionAndJoinCandidates.java


Example 10: testDuplicateMeasureProjectionInJoinCandidate

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testDuplicateMeasureProjectionInJoinCandidate() throws ParseException, LensException {
  // union_join_ctx_msr2 is common between two storage candidates and it should be answered from one
  // and the other fact will have it replaced with 0
  String colsSelected = " union_join_ctx_notnullcityid, sum(union_join_ctx_msr22) , "
      + "sum(union_join_ctx_msr2), sum(union_join_ctx_msr4) ";
  String whereCond =  "(" + TWO_MONTHS_RANGE_UPTO_DAYS + ")";
  String rewrittenQuery = rewrite("select " + colsSelected + " from basecube where " + whereCond, conf);
  assertTrue(rewrittenQuery.contains("UNION ALL"));
  String expectedInnerSelect1 = "SELECT case  when (basecube.union_join_ctx_cityid) is null then 0 "
      + "else (basecube.union_join_ctx_cityid) end as `alias0`, 0 as `alias1`, "
      + "sum((basecube.union_join_ctx_msr2)) as `alias2`, sum((basecube.union_join_ctx_msr4)) "
      + "as `alias3` FROM TestQueryRewrite.c1_union_join_ctx_fact4 basecube";
  String expectedInnerSelect2 = "SELECT case  when (basecube.union_join_ctx_cityid) is null then 0 else "
      + "(basecube.union_join_ctx_cityid) end as `alias0`, sum((basecube.union_join_ctx_msr22)) as `alias1`, "
      + "0 as `alias2`, 0 as `alias3` FROM TestQueryRewrite.c1_union_join_ctx_fact3 basecube";
  String outerSelect = "SELECT (basecube.alias0) as `union_join_ctx_notnullcityid`, sum((basecube.alias1)) "
      + "as `sum(union_join_ctx_msr22)`, sum((basecube.alias2)) as `sum(union_join_ctx_msr2)`, "
      + "sum((basecube.alias3)) as `sum(union_join_ctx_msr4)` FROM";
  String outerGroupBy = "GROUP BY (basecube.alias0)";
  compareContains(expectedInnerSelect1, rewrittenQuery);
  compareContains(expectedInnerSelect2, rewrittenQuery);
  compareContains(outerSelect, rewrittenQuery);
  compareContains(outerGroupBy, rewrittenQuery);
}
 
Author: apache, Project: lens, Lines of code: 26, Source: TestUnionAndJoinCandidates.java


Example 11: testChainsWithMultipleStorage

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testChainsWithMultipleStorage() throws ParseException, HiveException, LensException {
  Configuration conf = new Configuration(hconf);
  conf.unset(CubeQueryConfUtil.DRIVER_SUPPORTED_STORAGES); // supports all storages
  String dimOnlyQuery = "select testDim2.name, testDim2.cityStateCapital FROM testDim2 where " + TWO_DAYS_RANGE;
  CubeQueryRewriter driver = new CubeQueryRewriter(conf, hconf);
  CubeQueryContext rewrittenQuery = driver.rewrite(dimOnlyQuery);
  rewrittenQuery.toHQL();
  Dimension citydim = CubeMetastoreClient.getInstance(hconf).getDimension("citydim");
  Set<String> cdimTables = new HashSet<>();
  for (CandidateDim cdim : rewrittenQuery.getCandidateDims().get(citydim)) {
    cdimTables.add(cdim.getName());
  }
  Assert.assertTrue(cdimTables.contains("citytable"));
  Assert.assertTrue(cdimTables.contains("citytable2"));
  Assert.assertFalse(cdimTables.contains("citytable3"));
  Assert.assertFalse(cdimTables.contains("citytable4"));
}
 
Author: apache, Project: lens, Lines of code: 19, Source: TestJoinResolver.java


Example 12: testAggregateResolverOff

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testAggregateResolverOff() throws ParseException, LensException {
  Configuration conf2 = getConfWithStorages("C1,C2");
  conf2.setBoolean(CubeQueryConfUtil.DISABLE_AGGREGATE_RESOLVER, true);

  // Test if raw fact is selected for query with no aggregate function on a
  // measure, with aggregate resolver disabled
  String query = "SELECT cityid, testCube.msr2 FROM testCube WHERE " + TWO_DAYS_RANGE;
  CubeQueryContext cubeql = rewriteCtx(query, conf2);
  String hQL = cubeql.toHQL();
  Assert.assertEquals(1, cubeql.getCandidates().size());
  Candidate candidate = cubeql.getCandidates().iterator().next();
  Assert.assertTrue(candidate instanceof StorageCandidate);
  Assert.assertEquals("c1_testFact2_raw".toLowerCase(),
      ((StorageCandidate) candidate).getStorageTable().toLowerCase());
  String expectedQL =
    getExpectedQuery(cubeName, "SELECT testcube.cityid as `cityid`, testCube.msr2 as `msr2` from ", null, null,
      getWhereForHourly2days("c1_testfact2_raw"));
  compareQueries(hQL, expectedQL);
  conf2.set(CubeQueryConfUtil.DRIVER_SUPPORTED_STORAGES, "C2");
  aggregateFactSelectionTests(conf2);
  conf2.set(CubeQueryConfUtil.DRIVER_SUPPORTED_STORAGES, "C1,C2");
  rawFactSelectionTests(conf2);
}
 
Author: apache, Project: lens, Lines of code: 25, Source: TestAggregateResolver.java


Example 13: testAbsoluteValidity

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testAbsoluteValidity() throws ParseException, HiveException, LensException {
  CubeQueryContext ctx =
    rewriteCtx("select msr12 from basecube where " + TWO_DAYS_RANGE + " or " + TWO_DAYS_RANGE_BEFORE_4_DAYS,
      getConf());
  List<CandidateTablePruneCause> causes = findPruningMessagesForStorage("c3_testfact_deprecated",
    ctx.getStoragePruningMsgs());
  assertTrue(causes.isEmpty());

  causes = findPruningMessagesForStorage("c4_testfact_deprecated",
    ctx.getStoragePruningMsgs());
  assertTrue(causes.isEmpty());

  // testfact_deprecated's validity should be in between of both ranges. So both ranges should be in the invalid list
  // That would prove that parsing of properties has gone through successfully

  causes = findPruningMessagesForStorage("c1_testfact_deprecated",
    ctx.getStoragePruningMsgs());
  assertEquals(causes.size(), 1);
  assertEquals(causes.get(0).getCause(), TIME_RANGE_NOT_ANSWERABLE);

  causes = findPruningMessagesForStorage("c2_testfact_deprecated",
    ctx.getStoragePruningMsgs());
  assertEquals(causes.size(), 1);
  assertEquals(causes.get(0).getCause(), TIME_RANGE_NOT_ANSWERABLE);
}
 
Author: apache, Project: lens, Lines of code: 27, Source: TestTimeRangeResolver.java


Example 14: testSelectExprPromotionToGroupByWithSpacesInDimensionAliasAndWithAsKeywordBwColAndAlias

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testSelectExprPromotionToGroupByWithSpacesInDimensionAliasAndWithAsKeywordBwColAndAlias()
  throws ParseException, LensException, HiveException {

  String inputQuery = "select name as `Alias With  Spaces`, SUM(msr2) as `TestMeasure` from testCube join citydim"
    + " on testCube.cityid = citydim.id where " + LAST_HOUR_TIME_RANGE;

  String expectedRewrittenQuery = "SELECT (citydim.name) as `Alias With  Spaces`, sum((testcube.msr2)) "
    + "as `TestMeasure` FROM TestQueryRewrite.c2_testfact testcube inner JOIN TestQueryRewrite.c2_citytable citydim "
    + "ON ((testcube.cityid) = (citydim.id)) WHERE ((testcube.dt) = '"
    + getDateUptoHours(getDateWithOffset(HOURLY, -1)) + "') GROUP BY (citydim.name)";

  String actualRewrittenQuery = rewrite(inputQuery, getConfWithStorages("C2"));

  assertEquals(actualRewrittenQuery, expectedRewrittenQuery);
}
 
Author: apache, Project: lens, Lines of code: 17, Source: TestCubeRewriter.java


Example 15: testSelectExprPromotionToGroupByWithSpacesInDimensionAliasAndWithoutAsKeywordBwColAndAlias

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testSelectExprPromotionToGroupByWithSpacesInDimensionAliasAndWithoutAsKeywordBwColAndAlias()
  throws ParseException, LensException, HiveException {

  String inputQuery = "select name `Alias With  Spaces`, SUM(msr2) as `TestMeasure` from testCube join citydim"
    + " on testCube.cityid = citydim.id where " + LAST_HOUR_TIME_RANGE;

  String expectedRewrittenQuery = "SELECT (citydim.name) as `Alias With  Spaces`, sum((testcube.msr2)) "
    + "as `TestMeasure` FROM TestQueryRewrite.c2_testfact testcube inner JOIN TestQueryRewrite.c2_citytable citydim "
    + "ON ((testcube.cityid) = (citydim.id)) WHERE ((testcube.dt) = '"
    + getDateUptoHours(getDateWithOffset(HOURLY, -1)) + "') GROUP BY (citydim.name)";

  String actualRewrittenQuery = rewrite(inputQuery, getConfWithStorages("C2"));

  assertEquals(actualRewrittenQuery, expectedRewrittenQuery);
}
 
Author: apache, Project: lens, Lines of code: 17, Source: TestCubeRewriter.java


Example 16: testCubeWhereQueryWithMeasureWithDataCompletenessAndFailIfPartialDataFlagSet

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testCubeWhereQueryWithMeasureWithDataCompletenessAndFailIfPartialDataFlagSet() throws ParseException,
    LensException {
  /*In this query a measure is used for which dataCompletenessTag is set and the flag FAIL_QUERY_ON_PARTIAL_DATA is
  set. The partitions for the queried range are present but some of them have incomplete data. So, the query
  throws NO_CANDIDATE_FACT_AVAILABLE Exception*/
  Configuration conf = getConf();
  conf.setStrings(CubeQueryConfUtil.COMPLETENESS_CHECK_PART_COL, "dt");
  conf.setBoolean(CubeQueryConfUtil.FAIL_QUERY_ON_PARTIAL_DATA, true);

  LensException e = getLensExceptionInRewrite("select SUM(msr9) from basecube where "
      + TWO_DAYS_RANGE, conf);
  assertEquals(e.getErrorCode(), LensCubeErrorCode.NO_CANDIDATE_FACT_AVAILABLE.getLensErrorInfo().getErrorCode());
  NoCandidateFactAvailableException ne = (NoCandidateFactAvailableException) e;
  PruneCauses.BriefAndDetailedError pruneCauses = ne.getJsonMessage();
  /*Since the Flag FAIL_QUERY_ON_PARTIAL_DATA is set, and the queried fact has incomplete data, hence, we expect the
  prune cause to be INCOMPLETE_PARTITION. The below check is to validate this.*/
  for(String part: INCOMPLETE_PARTITION.errorFormat.split("%s")) {
    assertTrue(pruneCauses.getBrief().contains(part), pruneCauses.getBrief());
  }
}
 
Author: apache, Project: lens, Lines of code: 22, Source: TestCubeRewriter.java


Example 17: parseTest

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void parseTest()
    throws CommandNeedRetryException, SemanticException, ParseException {
  String hiveQl = "select t1.c2 from (select t2.column2 c2, t3.column3 from db1.table2 t2 join db2.table3 t3 on t2.x = t3.y) t1";
  HiveViewDependency hiveViewDependency = new HiveViewDependency();
  String[] result = hiveViewDependency.getViewDependency(hiveQl);
  String[] expectedResult = new String[]{"db1.table2", "db2.table3"};
  Assert.assertEquals(expectedResult, result);
}
 
Author: thomas-young-2013, Project: wherehowsX, Lines of code: 10, Source: HiveViewDependencyParserTest.java


Example 18: testQueryWithDimensionAndMeasure

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testQueryWithDimensionAndMeasure() throws ParseException, LensException {

  /* If all the queried dimensions are present in a derived cube, and one of the queried measure is not present in
  the same derived cube, then query shall be disallowed.

  dim2 and msr1 are not present in the same derived cube, hence query shall be disallowed with appropriate
  exception. */

  testFieldsCannotBeQueriedTogetherError("select dim2, SUM(msr1) from basecube where " + TWO_DAYS_RANGE,
      Arrays.asList("dim2", "d_time", "msr1"));
}
 
Author: apache, Project: lens, Lines of code: 13, Source: FieldsCannotBeQueriedTogetherTest.java


Example 19: testQueryWithDimensionAndMeasureInExpression

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testQueryWithDimensionAndMeasureInExpression() throws ParseException, LensException {

  /* If all the queried dimensions are present in a derived cube, and one of the queried measure is not present in
  the same derived cube, then query shall be disallowed.

  dim2 and roundedmsr1 (expression over msr1) are not present in the same derived cube, hence query shall be
  disallowed with appropriate exception. */

  testFieldsCannotBeQueriedTogetherError("select dim2, sum(roundedmsr1) from basecube where " + TWO_DAYS_RANGE,
      Arrays.asList("dim2", "d_time", "msr1"));
}
 
Author: apache, Project: lens, Lines of code: 13, Source: FieldsCannotBeQueriedTogetherTest.java


Example 20: testQueryWithDimensionInExpressionAndMeasure

import org.apache.hadoop.hive.ql.parse.ParseException; // import the required package/class
@Test
public void testQueryWithDimensionInExpressionAndMeasure() throws ParseException, LensException {

  /* If all the queried dimensions are present in a derived cube, and one of the queried measure is not present in
  the same derived cube, then query shall be disallowed.

  substrexprdim2 (an expression over dim2) and msr1 are not present in the same derived cube, hence query shall be
  disallowed with appropriate exception. */

  testFieldsCannotBeQueriedTogetherError("select substrexprdim2, SUM(msr1) from basecube where " + TWO_DAYS_RANGE,
      Arrays.asList("dim2", "d_time", "dim2chain.name", "msr1"));
}
 
Author: apache, Project: lens, Lines of code: 13, Source: FieldsCannotBeQueriedTogetherTest.java



Note: the org.apache.hadoop.hive.ql.parse.ParseException examples in this article were collected from open-source projects on GitHub and documentation platforms such as MSDocs. Copyright of the code snippets remains with their original authors; consult each project's license before using or redistributing the code, and do not republish without permission.

