Java EsOutputFormat Class Code Examples


This article collects typical usage examples of the Java class org.elasticsearch.hadoop.mr.EsOutputFormat. If you are wondering what EsOutputFormat is for, how to use it, or what real-world usage looks like, the curated class examples below may help.



The EsOutputFormat class belongs to the org.elasticsearch.hadoop.mr package. The 16 code examples below show how it is used in practice, ordered by popularity by default.
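
All of the jobs below assume the es-hadoop connection settings are already present in the Hadoop Configuration or JobConf before the job is submitted (a few examples set them inline). As a reference point, here is a minimal sketch of those settings; the node address, index name, and the EsJobSettings helper class are illustrative placeholders, not taken from any of the quoted projects.

import org.apache.hadoop.conf.Configuration;

public class EsJobSettings {

    /** Applies the minimum es-hadoop settings an EsOutputFormat job needs; all values are placeholders. */
    public static Configuration withEsSettings(Configuration conf) {
        conf.set("es.nodes", "localhost:9200");   // Elasticsearch node(s) to connect to
        conf.set("es.resource", "my-index/doc");  // target index/type to write documents to
        conf.set("es.input.json", "true");        // only when the emitted values are raw JSON strings
        return conf;
    }
}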

Example 1: Run

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
public static void Run(String input, Configuration conf) 
            throws IOException, ClassNotFoundException, InterruptedException {
        Job job = Job.getInstance(conf);
//        job.setJobName(Hdfs2es.class.getName());
        job.setJarByClass(Hdfs2es.class);
        
        job.setMapOutputKeyClass(Text.class);
        job.setMapOutputValueClass(Text.class);
        
        job.setMapperClass(MapTask.class);
        job.setInputFormatClass(SequenceFileInputFormat.class);
        job.setOutputFormatClass(EsOutputFormat.class);
        
        job.setNumReduceTasks(0);
        
        job.setOutputKeyClass(NullWritable.class);
        job.setOutputValueClass(Text.class);
        
        FileInputFormat.addInputPath(job, new Path(input));
        
        
        job.setSpeculativeExecution(false);
        job.waitForCompletion(true);
    }
 
Developer: chaopengio, Project: elasticsearch-mapreduce, Lines: 25, Source: Hdfs2es.java
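
MapTask itself is not shown in the quoted project; the following is only a plausible sketch, assuming the SequenceFile holds <Text, Text> pairs whose values are already complete JSON documents and that the Elasticsearch settings are present in the Configuration passed to Run.

import java.io.IOException;

import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper for the job above: reads <Text, Text> records from the
// SequenceFile and forwards the value (assumed to be a JSON document) to
// EsOutputFormat with a NullWritable key.
public class MapTask extends Mapper<Text, Text, NullWritable, Text> {
    @Override
    protected void map(Text key, Text value, Context context)
            throws IOException, InterruptedException {
        context.write(NullWritable.get(), value);
    }
}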


Example 2: sinkConfInit

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
@Override
public void sinkConfInit(FlowProcess<JobConf> flowProcess, Tap<JobConf, RecordReader, OutputCollector> tap, JobConf conf) {

    conf.setOutputFormat(EsOutputFormat.class);
    // define an output dir to prevent Cascading from setting up a TempHfs and overriding the OutputFormat
    Settings set = loadSettings(conf, false);

    Log log = LogFactory.getLog(EsTap.class);
    InitializationUtils.setValueWriterIfNotSet(set, CascadingValueWriter.class, log);
    InitializationUtils.setValueReaderIfNotSet(set, JdkValueReader.class, log);
    InitializationUtils.setBytesConverterIfNeeded(set, CascadingLocalBytesConverter.class, log);
    InitializationUtils.setFieldExtractorIfNotSet(set, CascadingFieldExtractor.class, log);

    // NB: we need to set this property even though it is not being used - and since a URI causes problems, use only the resource/file
    //conf.set("mapred.output.dir", set.getTargetUri() + "/" + set.getTargetResource());
    HadoopCfgUtils.setFileOutputFormatDir(conf, set.getResourceWrite());
    HadoopCfgUtils.setOutputCommitterClass(conf, EsOutputFormat.EsOldAPIOutputCommitter.class.getName());

    if (log.isTraceEnabled()) {
        log.trace("Initialized (sink) configuration " + HadoopCfgUtils.asProperties(conf));
    }
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 23, Source: EsHadoopScheme.java


Example 3: init

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
private void init(TableDesc tableDesc, boolean read) {
    Configuration cfg = getConf();
    // NB: we can't just merge the table properties in, we need to save them per input/output otherwise clashes occur which confuse Hive

    Settings settings = HadoopSettingsManager.loadFrom(cfg);
    //settings.setProperty((read ? HiveConstants.INPUT_TBL_PROPERTIES : HiveConstants.OUTPUT_TBL_PROPERTIES), IOUtils.propsToString(tableDesc.getProperties()));
    if (read) {
        // no generic setting
    }
    else {
        // replace the default committer when using the old API
        HadoopCfgUtils.setOutputCommitterClass(cfg, EsOutputFormat.EsOutputCommitter.class.getName());
    }

    Assert.hasText(tableDesc.getProperties().getProperty(TABLE_LOCATION), String.format(
            "no table location [%s] declared by Hive resulting in abnormal execution;", TABLE_LOCATION));
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 18, Source: EsStorageHandler.java


Example 4: testBasicMultiSave

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
@Test
public void testBasicMultiSave() throws Exception {
    JobConf conf = createJobConf();
    conf.set(ConfigurationOptions.ES_RESOURCE, "oldapi/multi-save");

    MultiOutputFormat.addOutputFormat(conf, EsOutputFormat.class);
    MultiOutputFormat.addOutputFormat(conf, PrintStreamOutputFormat.class);
    //MultiOutputFormat.addOutputFormat(conf, TextOutputFormat.class);

    PrintStreamOutputFormat.stream(conf, Stream.OUT);
    //conf.set("mapred.output.dir", "foo/bar");
    //FileOutputFormat.setOutputPath(conf, new Path("foo/bar"));

    conf.setClass("mapred.output.format.class", MultiOutputFormat.class, OutputFormat.class);
    runJob(conf);
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 17, Source: AbstractMROldApiSaveTest.java


Example 5: run

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
@Override
public int run(String[] args) throws Exception{
	Configuration conf = super.getConf();
	optParser(args);
			
	conf.set("es.nodes", this.servers);
	conf.set("prefix",this.prefix);
	conf.set("es.resource", this.index + "/{"+this.prefix+"SiteName}");
	conf.set("es.mapping.id",this.prefix+"Id");
	
	Job job = Job.getInstance(conf,"Description");
	job.setJarByClass(EsFeeder.class);
	job.setMapperClass(datacentermr.EsFeederMapper.class);
	job.setSpeculativeExecution(false);
	
	job.setOutputFormatClass(EsOutputFormat.class);
	job.setOutputKeyClass(NullWritable.class);
	job.setMapOutputValueClass(MapWritable.class);
	
	job.setNumReduceTasks(0);
	FileInputFormat.addInputPath(job, new Path(this.input));
	
	System.exit(job.waitForCompletion(true) ? 0 : 1);
	return 0;
	}
 
Developer: jucaf, Project: datacentermr, Lines: 26, Source: EsFeeder.java
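
EsFeederMapper is referenced above but not included in the quoted snippet. The sketch below only illustrates the kind of MapWritable the job expects: a document carrying a "<prefix>SiteName" field (consumed by the {<prefix>SiteName} pattern in es.resource) and a "<prefix>Id" field (used as the document id via es.mapping.id). The tab-separated column layout is an assumption, not part of the original project.

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.MapWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper: builds one MapWritable per input line. The "<prefix>SiteName"
// field drives the dynamic index/type routing and "<prefix>Id" becomes the document id.
public class EsFeederMapper extends Mapper<LongWritable, Text, NullWritable, MapWritable> {

    private String prefix;

    @Override
    protected void setup(Context context) {
        prefix = context.getConfiguration().get("prefix", "");
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] cols = value.toString().split("\t");   // assumed tab-separated layout
        if (cols.length < 2) {
            return;                                      // skip malformed lines
        }
        MapWritable doc = new MapWritable();
        doc.put(new Text(prefix + "Id"), new Text(cols[0]));
        doc.put(new Text(prefix + "SiteName"), new Text(cols[1]));
        context.write(NullWritable.get(), doc);
    }
}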


Example 6: testBasicMultiSave

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
@Test
public void testBasicMultiSave() throws Exception {
    JobConf conf = createJobConf();
    conf.set(ConfigurationOptions.ES_RESOURCE, "oldapi-multi-save/data");

    MultiOutputFormat.addOutputFormat(conf, EsOutputFormat.class);
    MultiOutputFormat.addOutputFormat(conf, PrintStreamOutputFormat.class);
    //MultiOutputFormat.addOutputFormat(conf, TextOutputFormat.class);

    PrintStreamOutputFormat.stream(conf, Stream.OUT);
    //conf.set("mapred.output.dir", "foo/bar");
    //FileOutputFormat.setOutputPath(conf, new Path("foo/bar"));

    conf.setClass("mapred.output.format.class", MultiOutputFormat.class, OutputFormat.class);
    runJob(conf);
}
 
Developer: elastic, Project: elasticsearch-hadoop, Lines: 17, Source: AbstractMROldApiSaveTest.java


Example 7: configs

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
@Parameters
public static Collection<Object[]> configs() throws IOException {
    JobConf conf = HdpBootstrap.hadoopConfig();

    conf.setInputFormat(SplittableTextInputFormat.class);
    conf.setOutputFormat(EsOutputFormat.class);
    conf.setReducerClass(IdentityReducer.class);
    HadoopCfgUtils.setGenericOptions(conf);
    conf.setNumMapTasks(2);
    conf.setInt("actual.splits", 2);
    conf.setNumReduceTasks(0);


    JobConf standard = new JobConf(conf);
    standard.setMapperClass(TabMapper.class);
    standard.setMapOutputValueClass(LinkedMapWritable.class);
    standard.set(ConfigurationOptions.ES_INPUT_JSON, "false");
    FileInputFormat.setInputPaths(standard, new Path(TestUtils.gibberishDat(conf)));

    JobConf json = new JobConf(conf);
    json.setMapperClass(IdentityMapper.class);
    json.setMapOutputValueClass(Text.class);
    json.set(ConfigurationOptions.ES_INPUT_JSON, "true");
    FileInputFormat.setInputPaths(json, new Path(TestUtils.gibberishJson(conf)));

    return Arrays.asList(new Object[][] { { standard, "" }, { json, "json-" } });
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 28, Source: AbstractExtraMRTests.java


Example 8: configs

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
@Parameters
public static Collection<Object[]> configs() throws IOException {
    Configuration conf = HdpBootstrap.hadoopConfig();
    HadoopCfgUtils.setGenericOptions(conf);

    Job job = new Job(conf);
    job.setInputFormatClass(TextInputFormat.class);
    job.setOutputFormatClass(EsOutputFormat.class);
    job.setMapOutputValueClass(LinkedMapWritable.class);
    job.setMapperClass(TabMapper.class);
    job.setNumReduceTasks(0);


    Job standard = new Job(job.getConfiguration());
    File fl = new File(TestUtils.sampleArtistsDat());
    long splitSize = fl.length() / 3;
    TextInputFormat.setMaxInputSplitSize(standard, splitSize);
    TextInputFormat.setMinInputSplitSize(standard, 50);

    standard.setMapperClass(TabMapper.class);
    standard.setMapOutputValueClass(LinkedMapWritable.class);
    TextInputFormat.addInputPath(standard, new Path(TestUtils.sampleArtistsDat(conf)));

    Job json = new Job(job.getConfiguration());
    json.setMapperClass(Mapper.class);
    json.setMapOutputValueClass(Text.class);
    json.getConfiguration().set(ConfigurationOptions.ES_INPUT_JSON, "true");
    TextInputFormat.addInputPath(json, new Path(TestUtils.sampleArtistsJson(conf)));

    return Arrays.asList(new Object[][] {
            { standard, "" },
            { json, "json-" } });
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 34, Source: AbstractMRNewApiSaveTest.java


Example 9: testBasicMultiSave

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
@Test
public void testBasicMultiSave() throws Exception {
    Configuration conf = createConf();
    conf.set(ConfigurationOptions.ES_RESOURCE, "mrnewapi/multi-save");

    MultiOutputFormat.addOutputFormat(conf, EsOutputFormat.class);
    MultiOutputFormat.addOutputFormat(conf, PrintStreamOutputFormat.class);
    //MultiOutputFormat.addOutputFormat(conf, TextOutputFormat.class);

    PrintStreamOutputFormat.stream(conf, Stream.OUT);
    //conf.set("mapred.output.dir", "foo/bar");

    conf.setClass("mapreduce.outputformat.class", MultiOutputFormat.class, OutputFormat.class);
    runJob(conf);
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 16, Source: AbstractMRNewApiSaveTest.java


Example 10: configs

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
@Parameters
public static Collection<Object[]> configs() {
    JobConf conf = HdpBootstrap.hadoopConfig();

    conf.setInputFormat(SplittableTextInputFormat.class);
    conf.setOutputFormat(EsOutputFormat.class);
    conf.setReducerClass(IdentityReducer.class);
    HadoopCfgUtils.setGenericOptions(conf);
    conf.setNumMapTasks(2);
    conf.setInt("actual.splits", 2);
    conf.setNumReduceTasks(0);


    JobConf standard = new JobConf(conf);
    standard.setMapperClass(TabMapper.class);
    standard.setMapOutputValueClass(LinkedMapWritable.class);
    standard.set(ConfigurationOptions.ES_INPUT_JSON, "false");
    FileInputFormat.setInputPaths(standard, new Path(TestUtils.sampleArtistsDat(conf)));

    JobConf json = new JobConf(conf);
    json.setMapperClass(IdentityMapper.class);
    json.setMapOutputValueClass(Text.class);
    json.set(ConfigurationOptions.ES_INPUT_JSON, "true");
    FileInputFormat.setInputPaths(json, new Path(TestUtils.sampleArtistsJson(conf)));

    return Arrays.asList(new Object[][] {
            { standard, "" },
            { json, "json-" }
    });
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 31, Source: AbstractMROldApiSaveTest.java


Example 11: runMrJob

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
/**
 * Run the MR job; Elasticsearch builds the index automatically.
 * @throws IOException
 */
public static void runMrJob () throws IOException {
    JobConf conf = new JobConf();
    conf.set("es.nodes", "192.168.13.134:9200");// Elasticsearch node address
    conf.set("es.resource", "docindex/attachment");// target index/type
    conf.set("es.mapping.id", "file");// field used as the document id
    conf.set("es.input.json", "yes");// input values are already JSON
    conf.setOutputFormat(EsOutputFormat.class);// write output to Elasticsearch
    conf.setMapOutputValueClass(Text.class);// map output value type
    conf.setMapperClass(EsMapper.class);// mapper class

    JobClient.runJob(conf);// run the job: the JSON documents are written to Elasticsearch, which indexes them automatically
}
 
Developer: hackty, Project: hadooptools, Lines: 17, Source: ElasticsearchMr.java
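
EsMapper is likewise not part of the quoted snippet. Since this example uses the old mapred API with es.input.json enabled, a minimal pass-through mapper (the old-API counterpart of the MapTask sketch after Example 1) might look as follows; this is an assumed sketch, not code from the hadooptools project.

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

// Hypothetical old-API mapper: each input line is assumed to be a full JSON document
// and is passed through unchanged; EsOutputFormat indexes it into Elasticsearch.
public class EsMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, NullWritable, Text> {

    @Override
    public void map(LongWritable key, Text value,
                    OutputCollector<NullWritable, Text> output, Reporter reporter)
            throws IOException {
        output.collect(NullWritable.get(), value);
    }
}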


Example 12: ESEntityExtractor

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
public ESEntityExtractor(Class<T> t) {
    super();
    this.deepJobConfig = new ESDeepJobConfig(t);
    this.inputFormat = new EsInputFormat<>();
    this.outputFormat = new EsOutputFormat();

}
 
Developer: Stratio, Project: deep-spark, Lines: 8, Source: ESEntityExtractor.java


Example 13: testBasicMultiSave

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
@Test
public void testBasicMultiSave() throws Exception {
    Configuration conf = createConf();
    conf.set(ConfigurationOptions.ES_RESOURCE, "mrnewapi-multi-save/data");

    MultiOutputFormat.addOutputFormat(conf, EsOutputFormat.class);
    MultiOutputFormat.addOutputFormat(conf, PrintStreamOutputFormat.class);
    //MultiOutputFormat.addOutputFormat(conf, TextOutputFormat.class);

    PrintStreamOutputFormat.stream(conf, Stream.OUT);
    //conf.set("mapred.output.dir", "foo/bar");

    conf.setClass("mapreduce.outputformat.class", MultiOutputFormat.class, OutputFormat.class);
    runJob(conf);
}
 
Developer: elastic, Project: elasticsearch-hadoop, Lines: 16, Source: AbstractMRNewApiSaveTest.java


Example 14: getOutputCommitter

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
@Override
public OutputCommitter getOutputCommitter(TaskAttemptContext context) throws IOException, InterruptedException {
    return new EsOutputFormat.EsOutputCommitter();
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 5, Source: PrintStreamOutputFormat.java


Example 15: getOutputFormat

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
@SuppressWarnings("unchecked")
@Override
public OutputFormat<Object, Map<Writable, Writable>> getOutputFormat() throws IOException {
    return new EsOutputFormat();
}
 
Developer: xushjie1987, Project: es-hadoop-v2.2.0, Lines: 6, Source: EsStorage.java


Example 16: ESCellExtractor

import org.elasticsearch.hadoop.mr.EsOutputFormat; // import the required package/class
public ESCellExtractor(Class<Cells> cellsClass) {
    super();
    this.deepJobConfig = new ESDeepJobConfig(cellsClass);
    this.inputFormat = new EsInputFormat<>();
    this.outputFormat = new EsOutputFormat();
}
 
Developer: Stratio, Project: deep-spark, Lines: 7, Source: ESCellExtractor.java



Note: The org.elasticsearch.hadoop.mr.EsOutputFormat examples in this article were collected from open-source projects hosted on GitHub, MSDocs, and similar source-code and documentation platforms. The copyright of each snippet remains with its original author; consult the corresponding project's license before using or redistributing the code. Do not republish without permission.

