
Java EtlLogger Class Code Examples


This article collects typical usage examples of the Java class com.infobright.logging.EtlLogger. If you are wondering what the EtlLogger class is for, how to use it, or what it looks like in real code, the curated class examples below may help.



The EtlLogger class belongs to the com.infobright.logging package. A total of 17 EtlLogger code examples are shown below, sorted by popularity by default.
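Before the individual examples, here is a minimal sketch of the pattern they all share: an EtlLogger implementation is created once and handed to the Infobright loader's constructor, which then uses it for its own diagnostic output. The constructor signature, DataFormat.TXT_VARIABLE, setTimeout(30), start(), and the default agent port 5555 are taken from the examples on this page; the table name, charset choice, and the package names for DataFormat and InfobrightNamedPipeLoader are assumptions for illustration only.

import java.nio.charset.Charset;
import java.sql.Connection;

import com.infobright.logging.EtlLogger;           // confirmed by the examples below
import com.infobright.etl.model.DataFormat;        // package name assumed from the infobright-core layout
import com.infobright.io.InfobrightNamedPipeLoader; // package name assumed from the infobright-core layout

public class EtlLoggerSketch {

  /**
   * Hypothetical helper mirroring Examples 7 and 13-16: the caller supplies
   * the EtlLogger implementation (ConsoleEtlLogger, KettleEtlLogger, or a custom one).
   */
  static InfobrightNamedPipeLoader openLoader(Connection conn, EtlLogger logger) throws Exception {
    InfobrightNamedPipeLoader loader = new InfobrightNamedPipeLoader(
        "my_schema.my_table",        // hypothetical table name
        conn,
        logger,                      // the loader logs its progress through this EtlLogger
        DataFormat.TXT_VARIABLE,     // default format for ICE (see Example 13)
        Charset.forName("UTF-8"),    // illustrative charset
        5555);                       // default remote agent port (see Example 7's javadoc)
    loader.setTimeout(30);           // as in Examples 14-16
    loader.start();
    return loader;
  }
}

In the Kettle-based examples further down (Examples 13-16), the same wiring appears with a KettleEtlLogger in place of a console logger.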

Example 1: main

import com.infobright.logging.EtlLogger; // import the required package/class
/**
 * @param args
 */
public static void main(String[] args) throws IOException, UsageException {
  CLArgs clargs = new CLArgs(args);
  int port2 = clargs.getPort();
  ConsoleEtlLogger.Level logLevel = clargs.getLogLevel();
  EtlLogger logger2 = new ConsoleEtlLogger(logLevel);
  new Agent(port2, logger2).execute();
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 11, Source: Agent.java


Example 2: Agent

import com.infobright.logging.EtlLogger; // import the required package/class
private Agent(int port, EtlLogger logger) {
  this.port = port;
  this.logger = logger;
  this.workers = new HashSet<AgentThread>();
  new Reaper().start();
  logger.info("Infobright remote load agent started on port " + port);
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 8, Source: Agent.java


Example 3: readColumnTypes

import com.infobright.logging.EtlLogger; // import the required package/class
/**
 * Reads the abstract column types from the SQL metadata.
 * 
 * @param md The metadata from the JDBC result set
 * @param charset the character set to use to encode String values
 *   for CHAR, VARCHAR column types
 * @param logger the logger to use
 * @param checkValues whether to check strings for length and
 * throw an exception immediately. Useful if implementing an error
 * path for rejected records. 
 *
 * @return list of column types from the table in the database
 * @throws SQLException
 */
public static List<AbstractColumnType> readColumnTypes(ResultSetMetaData md, Charset charset, EtlLogger logger, boolean checkValues) throws SQLException {
  List<AbstractColumnType> columns = new ArrayList<AbstractColumnType>();
  for (int i = 1; i <= md.getColumnCount(); i++) {
    // In theory, could find out the character set encoding for each
    // column from the database, and pass it here, instead of relying on
    // the character set parameter being passed in. However, the character
    // encoding is not available from the standard JDBC/SQL metadata.
    AbstractColumnType colType = AbstractColumnType.getInstance(md.getColumnName(i), md.getColumnType(i),
        md.getColumnTypeName(i), md.getPrecision(i), md.getScale(i), charset, logger);
    colType.setCheckValues(checkValues);
    columns.add(colType);
  }
  return columns;
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 29, Source: BrighthouseRecord.java


Example 4: TeradataBinaryRecord

import com.infobright.logging.EtlLogger; // import the required package/class
TeradataBinaryRecord(List<AbstractColumnType> columns, Charset charset, EtlLogger logger) {
  super(columns, charset);
  this.logger = logger;
  nullind = new NullIndicator(columns.size()); 
  byteBuffer = ByteBuffer.allocate(BUFFER_SIZE);
  byteBuffer.order(ByteOrder.LITTLE_ENDIAN);
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 8, Source: TeradataBinaryRecord.java


Example 5: AgentThread

import com.infobright.logging.EtlLogger; // import the required package/class
public AgentThread(Socket socket, long id, EtlLogger logger) throws IOException {
  this.socket = socket;
  this.in = new DataInputStream(socket.getInputStream());
  this.out = new DataOutputStream(socket.getOutputStream());
  this.logger = logger;
  this.id = id;
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 8, Source: AgentThread.java


Example 6: NamedPipeOutputStream

import com.infobright.logging.EtlLogger; // import the required package/class
/**
 * Creates and returns an OutputStream which can write to the named pipe.
 * @param pipeName
 * @param proxy The client proxy to use (or null)
 * @param logger 
 * @throws IOException 
 */
public NamedPipeOutputStream(String pipeName, ClientProxy proxy, EtlLogger logger) throws IOException {
  this.logger = logger;
  if (this.logger != null) {
    this.logger.debug(String.format("creating named pipe client \"%s\"", pipeName));
  }
  namedPipe = new NamedPipeFactory(proxy).createClient(pipeName); 
  namedPipe.connect();
  if (logger != null) {
    logger.debug("NamedPipeFactory.createClient(name) returned " + namedPipe);
  }
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 19, Source: NamedPipeOutputStream.java


Example 7: InfobrightNamedPipeLoader

import com.infobright.logging.EtlLogger; // import the required package/class
/**
 * Constructs a new loader instance.  No database action taken yet.
 *
 * @param tableName
 * @param connection
 * @param logger
 * @param dataFormat
 * @param charset
 * @param agentPort  the port number on the remote machine where the
 *                   agent is running (defaults to 5555)
 * @throws Exception
 */
public InfobrightNamedPipeLoader(String tableName, Connection connection,
    EtlLogger logger, DataFormat dataFormat, Charset charset, int agentPort)
        throws Exception {

  this.tableName = tableName;
  this.connection = connection;
  this.dataFormat = dataFormat;
  this.logger = logger;
  this.charset = charset;

  String hostName = _getHostName(connection);
  boolean isLocal = _isLocal(hostName);
  
  // Use LOAD DATA LOCAL INFILE if (1) connection is remote; (2) local client
  // is Linux/Unix; and (3) the IB release supports it. In this case all pipe
  // operations are done locally using Unix semantics.
  boolean useLocalInfile =
      !isLocal && new OSType().isUnix() && new IBVersionUtil(connection).isSupportsLocalInfile();
  
  if (isLocal || useLocalInfile) {
    proxy = null;
    factory = new NamedPipeFactory();
  } else {
    proxy = new ClientProxy(hostName, agentPort);
    factory = new NamedPipeFactory(proxy);
  }

  strategy = factory.getStrategy(logger);
  id = LoaderInstanceTracker.register(this);
  
  // the named pipe name will be the prefix with a date/time stamp appended.
  // since multiple loaders may be started in the same millisecond, we append the instance # to the 
  // end of the name to ensure uniqueness.
  pipeName = String.format("%s_%tH_%<tM_%<tS_%<tL-%d", this.pipeNamePrefix, new Date(), id);
  sql = dataFormat.getLoadSQL(getEscapedPipeName(pipeName), tableName, useLocalInfile);
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 50, Source: InfobrightNamedPipeLoader.java


Example 8: getStrategy

import com.infobright.logging.EtlLogger; // import the required package/class
public PipeCallStrategy getStrategy(EtlLogger logger) {
  if (osType.isUnix()) {
    return new UnixPipeCallStrategy(proxy, logger);
  } else if (osType.isWindows()) {
    return new WindowsPipeCallStrategy(proxy, logger);
  } else {
    throw new UnsupportedOperationException("Unsupported platform");
  }
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 10, Source: NamedPipeFactory.java


Example 9: getInstance

import com.infobright.logging.EtlLogger; // import the required package/class
/**
 * Factory method that creates an instance of the column type
 * appropriate for the JDBC column type supplied.  When new
 * column types are added, they must also be added to this method.
 * 
 * @param columnName
 * @param columnType @see java.sql.Types
 * @param columnTypeName 
 * @param precision
 * @param scale
 * @param charset
 * @param logger
 * @return a column type
 */
public static AbstractColumnType getInstance(String columnName, int columnType,
    String columnTypeName, int precision, int scale, Charset charset,
    EtlLogger logger) {
  if (logger != null) {
    String logMsg;
    if (scale == 0) {
      logMsg = String.format("Column: %s %s(%d)", columnName, columnTypeName, precision);
    } else {
      logMsg = String.format("Column: %s %s(%d,%d)", columnName, columnTypeName, precision, scale);
    }
    logger.info(logMsg);
  }
  AbstractColumnType col = null;
  if (columnType == Types.VARCHAR) {
    col = new VarcharType(precision, charset);
  } else if (columnType == Types.SMALLINT) {
    col = new SmallintType();
  } else if (columnType == Types.INTEGER) {
    // MEDIUMINT is MySQL-specific and does not have its own sql Type
    if ("MEDIUMINT".equalsIgnoreCase(columnTypeName)) {
      col = new MediumintType();
    } else {
      col = new IntegerType();
    }
  } else if (columnType == Types.TINYINT || columnType == Types.BOOLEAN) {
    col = new TinyintType();
  } else if (columnType == Types.BIGINT) {
    col = new BigintType();
  } else if (columnType == Types.FLOAT || columnType == Types.REAL) {
    col = new FloatType();
  } else if (columnType == Types.DOUBLE) {
    col = new DoubleType();
  } else if (columnType == Types.CHAR) {
    col = new CharType(precision, charset);
  } else if (columnType == Types.TIMESTAMP) {
    // TIMESTAMP, DATETIME are treated the same
    col = new DatetimeType();
  } else if (columnType == Types.DATE) {
    if (precision == 4) {
      col = new YearType(); // show up as precision 4
    } else { /* precision == 10 */
      col = new DateIntType();
    }
  } else if (columnType == Types.BINARY) {
    col = new BinaryType(precision);
  } else if (columnType == Types.VARBINARY) {
    col = new VarbinaryType(precision);
  } else if (columnType == Types.LONGVARCHAR) {
    col = new TextType(precision, charset);
  } else if (columnType == Types.DECIMAL) {
    col = new DecimalType(precision, scale);
  } else if (columnType == Types.TIME) {
    col = new TimeType();
  } else {
    throw new RuntimeException("Unsupported type (" + columnTypeName + "," + columnType
        + ") for column " + columnName);
  }
  col.setColumnName(columnName);
  return col;
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 75, Source: AbstractColumnType.java


Example 10: WindowsPipeCallStrategy

import com.infobright.logging.EtlLogger; // import the required package/class
WindowsPipeCallStrategy(ClientProxy proxy, EtlLogger logger) {
  this.proxy = proxy;
  this.logger = logger;
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 5, Source: WindowsPipeCallStrategy.java


Example 11: UnixPipeCallStrategy

import com.infobright.logging.EtlLogger; // import the required package/class
public UnixPipeCallStrategy(ClientProxy proxy, EtlLogger logger) {
  this.proxy = proxy;
  this.logger = logger;
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 5, Source: UnixPipeCallStrategy.java


Example 12: getEtlLogger

import com.infobright.logging.EtlLogger; // import the required package/class
public EtlLogger getEtlLogger() {
  return logger;
}
 
Developer: EidosMedia, Project: infobright-core, Lines: 4, Source: InfobrightNamedPipeLoader.java


Example 13: databaseSetup

import com.infobright.logging.EtlLogger; // import the required package/class
void databaseSetup(InfobrightLoaderMeta meta, InfobrightLoader step) throws KettleException {
  
  db = new Database(meta.getDatabaseMeta());
  db.connect();

  // FIXME: This will fail if the first row of the table contains a value that
  // cannot be read by Java. For example, a DATE field that contains the value
  // '0000-00-00'. In this case, the Kettle error message will misleadingly say
  // that the table doesn't exist. There doesn't seem to be any workaround.
  // See Pentaho JIRA: PDI-2117.
  //
  requiredRowMeta = meta.getRequiredFields(step); 
  requiredFields = requiredRowMeta.getFieldNames();
  
  try {
    // once the loader is built, this db connection cannot be used by this thread anymore.
    // the loader is using it and any other uses of the connection will block.
    if (meta.getInfobrightProductType() == null) {
      meta.setDataFormat(DataFormat.TXT_VARIABLE); // default for ICE
    }
    DataFormat dataFormat = DataFormat.valueForDisplayName(meta.getInfobrightProductType());

    Connection conn = db.getConnection();
    String tableName = meta.getDatabaseMeta().getQuotedSchemaTableCombination(step.environmentSubstitute(meta.getSchemaName()), step.environmentSubstitute(meta.getTablename()));
    EtlLogger logger = new KettleEtlLogger(step);
    loader = new InfobrightNamedPipeLoader(tableName, conn, logger, dataFormat);
    record = loader.createRecord(false); // TODO set to true to support error path
    loader.start();
    
  } catch (Exception e) {
    db.disconnect();
    db = null;
    if (loader != null) {
      try {
        loader.killQuery();
      } catch (SQLException e1) {
        throw new KettleDatabaseException(e1);
      }
    }
    throw new KettleDatabaseException(e);
  }
}
 
Developer: icholy, Project: geokettle-2.0, Lines: 43, Source: InfobrightLoaderData.java


Example 14: databaseSetup

import com.infobright.logging.EtlLogger; // import the required package/class
void databaseSetup(InfobrightLoaderMeta meta, InfobrightLoader step) throws KettleException {
  
  db = new Database(step, meta.getDatabaseMeta());
  db.connect();

  // FIXME: This will fail if the first row of the table contains a value that
  // cannot be read by Java. For example, a DATE field that contains the value
  // '0000-00-00'. In this case, the Kettle error message will misleadingly say
  // that the table doesn't exist. There doesn't seem to be any workaround.
  // See Pentaho JIRA: PDI-2117.
  //
  requiredRowMeta = meta.getRequiredFields(step); 
  requiredFields = requiredRowMeta.getFieldNames();
  
  try {
    // once the loader is built, this db connection cannot be used by this thread anymore.
    // the loader is using it and any other uses of the connection will block.
    if (meta.getInfobrightProductType() == null) {
      meta.setDataFormat(DataFormat.TXT_VARIABLE); // default for ICE
    }
    DataFormat dataFormat = DataFormat.valueForDisplayName(meta.getInfobrightProductType());
    int agentPort = meta.getAgentPort();
    Charset charset = meta.getCharset();
    Connection conn = db.getConnection();
    String tableName = meta.getDatabaseMeta().getQuotedSchemaTableCombination(step.environmentSubstitute(meta.getSchemaName()), step.environmentSubstitute(meta.getTablename()));
    EtlLogger logger = new KettleEtlLogger(step);
    loader = new InfobrightNamedPipeLoader(tableName, conn, logger, dataFormat, charset, agentPort);
    loader.setTimeout(30);
    String debugFile = meta.getDebugFile();
    if (debugFile != null) {
      OutputStream debugOutputStream = new FileOutputStream(debugFile);
      loader.setDebugOutputStream(debugOutputStream);
    }
    record = loader.createRecord(false); // TODO set to true to support error path
    loader.start();
    
  } catch (Exception e) {
    db.disconnect();
    db = null;
    if (loader != null) {
      try {
        loader.killQuery();
      } catch (SQLException e1) {
        throw new KettleDatabaseException(e1);
      }
    }
    throw new KettleDatabaseException(e);
  }
}
 
Developer: yintaoxue, Project: read-open-source-code, Lines: 50, Source: InfobrightLoaderData.java


Example 15: databaseSetup

import com.infobright.logging.EtlLogger; // import the required package/class
void databaseSetup(InfobrightLoaderMeta meta, InfobrightLoader step) throws KettleException {
  
  db = new Database(step, meta.getDatabaseMeta());
  db.connect();

  // FIXME: This will fail if the first row of the table contains a value that
  // cannot be read by Java. For example, a DATE field that contains the value
  // '0000-00-00'. In this case, the Kettle error message will misleadingly say
  // that the table doesn't exist. There doesn't seem to be any workaround.
  // See Pentaho JIRA: PDI-2117.
  //
  requiredRowMeta = meta.getRequiredFields(step); 
  requiredFields = requiredRowMeta.getFieldNames();
  
  try {
    // once the loader is built, this db connection cannot be used by this thread anymore.
    // the loader is using it and any other uses of the connection will block.
    if (meta.getInfobrightProductType() == null) {
      meta.setDataFormat(DataFormat.TXT_VARIABLE); // default for ICE
    }
    DataFormat dataFormat = DataFormat.valueForDisplayName(meta.getInfobrightProductType());
    int agentPort = meta.getAgentPort();
    Charset charset = meta.getCharset();
    Connection conn = db.getConnection();
    String tableName = meta.getDatabaseMeta().getQuotedSchemaTableCombination(step.environmentSubstitute(meta.getSchemaName()), step.environmentSubstitute(meta.getTableName()));
    EtlLogger logger = new KettleEtlLogger(step);
    loader = new InfobrightNamedPipeLoader(tableName, conn, logger, dataFormat, charset, agentPort);
    loader.setTimeout(30);
    String debugFile = meta.getDebugFile();
    if (debugFile != null) {
      OutputStream debugOutputStream = new FileOutputStream(debugFile);
      loader.setDebugOutputStream(debugOutputStream);
    }
    record = loader.createRecord(false); // TODO set to true to support error path
    loader.start();
    
  } catch (Exception e) {
    db.disconnect();
    db = null;
    if (loader != null) {
      try {
        loader.killQuery();
      } catch (SQLException e1) {
        throw new KettleDatabaseException(e1);
      }
    }
    throw new KettleDatabaseException(e);
  }
}
 
Developer: bsspirit, Project: kettle-4.4.0-stable, Lines: 50, Source: InfobrightLoaderData.java


Example 16: databaseSetup

import com.infobright.logging.EtlLogger; // import the required package/class
void databaseSetup( InfobrightLoaderMeta meta, InfobrightLoader step ) throws KettleException {

    db = new Database( step, meta.getDatabaseMeta() );
    db.connect();

    // FIXME: This will fail if the first row of the table contains a value that
    // cannot be read by Java. For example, a DATE field that contains the value
    // '0000-00-00'. In this case, the Kettle error message will misleadingly say
    // that the table doesn't exist. There doesn't seem to be any workaround.
    // See Pentaho JIRA: PDI-2117.
    //
    requiredRowMeta = meta.getRequiredFields( step );
    requiredFields = requiredRowMeta.getFieldNames();

    try {
      // once the loader is built, this db connection cannot be used by this thread anymore.
      // the loader is using it and any other uses of the connection will block.
      if ( meta.getInfobrightProductType() == null ) {
        meta.setDataFormat( DataFormat.TXT_VARIABLE ); // default for ICE
      }
      DataFormat dataFormat = DataFormat.valueForDisplayName( meta.getInfobrightProductType() );
      int agentPort = meta.getAgentPort();
      Charset charset = meta.getCharset();
      Connection conn = db.getConnection();
      String tableName =
        meta
          .getDatabaseMeta().getQuotedSchemaTableCombination(
            step.environmentSubstitute( meta.getSchemaName() ),
            step.environmentSubstitute( meta.getTableName() ) );
      EtlLogger logger = new KettleEtlLogger( step );
      loader = new InfobrightNamedPipeLoader( tableName, conn, logger, dataFormat, charset, agentPort );
      loader.setTimeout( 30 );
      String debugFile = meta.getDebugFile();
      if ( debugFile != null ) {
        OutputStream debugOutputStream = new FileOutputStream( debugFile );
        loader.setDebugOutputStream( debugOutputStream );
      }
      record = loader.createRecord( false ); // TODO set to true to support error path
      loader.start();

    } catch ( Exception e ) {
      db.disconnect();
      db = null;
      if ( loader != null ) {
        try {
          loader.killQuery();
        } catch ( SQLException e1 ) {
          throw new KettleDatabaseException( e1 );
        }
      }
      throw new KettleDatabaseException( e );
    }
  }
 
Developer: pentaho, Project: pentaho-kettle, Lines: 54, Source: InfobrightLoaderData.java


Example 17: createRecord

import com.infobright.logging.EtlLogger; // import the required package/class
BrighthouseRecord createRecord(List<AbstractColumnType> columns, Charset charset, EtlLogger logger); 
Developer: EidosMedia, Project: infobright-core, Lines: 2, Source: BrighthouseRecordFactory.java



Note: The com.infobright.logging.EtlLogger class examples in this article were compiled from source-code and documentation platforms such as GitHub and MSDocs. The snippets were selected from open-source projects contributed by their respective developers, and copyright remains with the original authors. Please consult each project's license before redistributing or reusing the code; do not reproduce without permission.

