
Java JvmEnv Class Code Examples


This article collects typical usage examples of the Java class JvmEnv from org.apache.hadoop.mapred.JvmManager. If you are wondering what JvmEnv does, or how to use it in practice, the curated code examples below should help.



The JvmEnv class belongs to the org.apache.hadoop.mapred.JvmManager package. Five code examples using the class are shown below, sorted by popularity by default.

Example 1: launchTaskJVM

import org.apache.hadoop.mapred.JvmManager.JvmEnv; // import the required package/class
/**
 * Launch a new JVM for the task.
 * 
 * This method launches the new JVM for the task by executing the
 * JVM command using the {@link Shell.ShellCommandExecutor}.
 */
void launchTaskJVM(TaskController.TaskControllerContext context) 
                                    throws IOException {
  initializeTask(context);

  JvmEnv env = context.env;
  List<String> wrappedCommand = 
    TaskLog.captureOutAndError(env.setup, env.vargs, env.stdout, env.stderr,
        env.logSize, true);
  ShellCommandExecutor shexec = 
      new ShellCommandExecutor(wrappedCommand.toArray(new String[0]), 
                                env.workDir, env.env);
  // set the ShellCommandExecutor for later use.
  context.shExec = shexec;
  shexec.execute();
}
 
Author: rekhajoshm, Project: mapreduce-fork, Lines: 22, Source: DefaultTaskController.java
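In examples 1 and 2, TaskLog.captureOutAndError wraps the raw JVM command so that the child's stdout and stderr are appended to the task's log files. A minimal standalone sketch of that kind of wrapping is below; the bash redirection form and file names are illustrative assumptions, not the real TaskLog implementation.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of a captureOutAndError-style wrapper: run the task command under
// bash with stdout/stderr appended to the task's log files. The exact
// redirection syntax here is an assumption for illustration only.
public class CaptureSketch {
  static List<String> wrap(List<String> cmd, String stdout, String stderr) {
    StringBuilder joined = new StringBuilder();
    for (String part : cmd) {
      if (joined.length() > 0) joined.append(' ');
      joined.append(part);
    }
    List<String> wrapped = new ArrayList<>();
    wrapped.add("bash");
    wrapped.add("-c");
    // exec replaces the shell with the task command; >> appends to the logs
    wrapped.add("exec " + joined + " 1>> " + stdout + " 2>> " + stderr);
    return wrapped;
  }

  public static void main(String[] args) {
    List<String> w = wrap(List.of("java", "-Xmx200m", "ChildMain"),
                          "logs/stdout", "logs/stderr");
    System.out.println(w);
  }
}
```

The wrapped list is what gets handed to ShellCommandExecutor as `wrappedCommand.toArray(new String[0])` in the examples above.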


Example 2: launchTaskJVM

import org.apache.hadoop.mapred.JvmManager.JvmEnv; // import the required package/class
/**
 * Launch a new JVM for the task.
 * 
 * This method launches the new JVM for the task by executing the
 * JVM command using the {@link Shell.ShellCommandExecutor}.
 */
void launchTaskJVM(TaskController.TaskControllerContext context) 
                                    throws IOException {
  JvmEnv env = context.env;
  List<String> wrappedCommand = 
    TaskLog.captureOutAndError(env.setup, env.vargs, env.stdout, env.stderr,
        env.logSize, true);
  ShellCommandExecutor shexec = 
      new ShellCommandExecutor(wrappedCommand.toArray(new String[0]), 
                                env.workDir, env.env);
  // set the ShellCommandExecutor for later use.
  context.shExec = shexec;
  shexec.execute();
}
 
Author: rhli, Project: hadoop-EAR, Lines: 20, Source: DefaultTaskController.java


Example 3: launchTaskJVM

import org.apache.hadoop.mapred.JvmManager.JvmEnv; // import the required package/class
/**
 * Launch a task JVM that will run as the owner of the job.
 * 
 * This method launches a task JVM by executing a setuid
 * executable that will switch to the user and run the
 * task.
 */
@Override
void launchTaskJVM(TaskController.TaskControllerContext context) 
                                      throws IOException {
  JvmEnv env = context.env;
  // get the JVM command line.
  String cmdLine = 
    TaskLog.buildCommandLine(env.setup, env.vargs, env.stdout, env.stderr,
        env.logSize, true);

  StringBuffer sb = new StringBuffer();
  //Export all environment variables before the child command, since the
  //setuid/setgid binaries would not otherwise receive any environment
  //variables beginning with LD_*.
  for(Entry<String, String> entry : env.env.entrySet()) {
    sb.append("export ");
    sb.append(entry.getKey());
    sb.append("=");
    sb.append(entry.getValue());
    sb.append("\n");
  }
  sb.append(cmdLine);
  // write the command to a file in the
  // task specific cache directory
  writeCommand(sb.toString(), getTaskCacheDirectory(context));
  
  // Call the taskcontroller with the right parameters.
  List<String> launchTaskJVMArgs = buildLaunchTaskArgs(context);
  ShellCommandExecutor shExec =  buildTaskControllerExecutor(
                                  TaskCommands.LAUNCH_TASK_JVM, 
                                  env.conf.getUser(),
                                  launchTaskJVMArgs, env.workDir, env.env);
  context.shExec = shExec;
  try {
    shExec.execute();
  } catch (Exception e) {
    LOG.warn("Exception thrown while launching task JVM : " + 
        StringUtils.stringifyException(e));
    LOG.warn("Exit code from task is : " + shExec.getExitCode());
    LOG.warn("Output from task-controller is : " + shExec.getOutput());
    throw new IOException(e);
  }
  if(LOG.isDebugEnabled()) {
    LOG.debug("output after executing task jvm = " + shExec.getOutput()); 
  }
}
 
Author: rhli, Project: hadoop-EAR, Lines: 53, Source: LinuxTaskController.java
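The export loop in example 3 (and again in example 4) builds a small shell script: one "export KEY=VALUE" line per environment variable, followed by the JVM command line. It can be sketched standalone in plain Java; the variable names in the demo are hypothetical.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Standalone sketch of the export loop: write every environment variable
// as an "export KEY=VALUE" line ahead of the JVM command line, because the
// setuid task-controller binary does not pass LD_* variables through.
public class ExportScriptSketch {
  static String buildLaunchScript(Map<String, String> env, String cmdLine) {
    StringBuilder sb = new StringBuilder();
    for (Map.Entry<String, String> entry : env.entrySet()) {
      sb.append("export ")
        .append(entry.getKey())
        .append('=')
        .append(entry.getValue())
        .append('\n');
    }
    sb.append(cmdLine);
    return sb.toString();
  }

  public static void main(String[] args) {
    // LinkedHashMap keeps the export lines in insertion order.
    Map<String, String> env = new LinkedHashMap<>();
    env.put("HADOOP_HOME", "/opt/hadoop");
    env.put("LD_LIBRARY_PATH", "/opt/hadoop/lib/native");
    System.out.println(buildLaunchScript(env, "java ChildMain"));
  }
}
```

In the real code the resulting script is written to the task's cache directory via writeCommand and then launched through the task-controller executable.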


Example 4: launchTaskJVM

import org.apache.hadoop.mapred.JvmManager.JvmEnv; // import the required package/class
/**
 * Launch a task JVM that will run as the owner of the job.
 * 
 * This method launches a task JVM by executing a setuid executable that will
 * switch to the user and run the task. Also does initialization of the first
 * task in the same setuid process launch.
 */
@Override
void launchTaskJVM(TaskController.TaskControllerContext context) 
                                      throws IOException {
  JvmEnv env = context.env;
  // get the JVM command line.
  String cmdLine = 
    TaskLog.buildCommandLine(env.setup, env.vargs, env.stdout, env.stderr,
        env.logSize, true);

  StringBuffer sb = new StringBuffer();
  //Export all environment variables before the child command, since the
  //setuid/setgid binaries would not otherwise receive any environment
  //variables beginning with LD_*.
  for(Entry<String, String> entry : env.env.entrySet()) {
    sb.append("export ");
    sb.append(entry.getKey());
    sb.append("=");
    sb.append(entry.getValue());
    sb.append("\n");
  }
  sb.append(cmdLine);
  // write the command to a file in the
  // task specific cache directory
  writeCommand(sb.toString(), getTaskCacheDirectory(context, 
      context.env.workDir));
  
  // Call the taskcontroller with the right parameters.
  List<String> launchTaskJVMArgs = buildLaunchTaskArgs(context, 
      context.env.workDir);
  ShellCommandExecutor shExec =  buildTaskControllerExecutor(
                                  TaskControllerCommands.LAUNCH_TASK_JVM, 
                                  env.conf.getUser(),
                                  launchTaskJVMArgs, env.workDir, env.env);
  context.shExec = shExec;
  try {
    shExec.execute();
  } catch (Exception e) {
    int exitCode = shExec.getExitCode();
    LOG.warn("Exit code from task is : " + exitCode);
    // 143 (SIGTERM) and 137 (SIGKILL) exit codes means the task was
    // terminated/killed forcefully. In all other cases, log the
    // task-controller output
    if (exitCode != 143 && exitCode != 137) {
      LOG.warn("Exception thrown while launching task JVM : "
          + StringUtils.stringifyException(e));
      LOG.info("Output from LinuxTaskController's launchTaskJVM follows:");
      logOutput(shExec.getOutput());
    }
    throw new IOException(e);
  }
  if (LOG.isDebugEnabled()) {
    LOG.info("Output from LinuxTaskController's launchTaskJVM follows:");
    logOutput(shExec.getOutput());
  }
}
 
Author: rekhajoshm, Project: mapreduce-fork, Lines: 63, Source: LinuxTaskController.java


Example 5: initializeTask

import org.apache.hadoop.mapred.JvmManager.JvmEnv; // import the required package/class
private void initializeTask() throws IOException {
  tip.setJobConf(localizedJobConf);

  // ////////// The central method being tested
  tip.localizeTask(task);
  // //////////

  // check the functionality of localizeTask
  for (String dir : trackerFConf.getStrings(MRConfig.LOCAL_DIR)) {
    File attemptDir =
        new File(dir, TaskTracker.getLocalTaskDir(task.getUser(), jobId
            .toString(), taskId.toString(), task.isTaskCleanupTask()));
    assertTrue("attempt-dir " + attemptDir + " in localDir " + dir
        + " is not created!!", attemptDir.exists());
  }

  attemptWorkDir =
      lDirAlloc.getLocalPathToRead(TaskTracker.getTaskWorkDir(
          task.getUser(), task.getJobID().toString(), task.getTaskID()
              .toString(), task.isTaskCleanupTask()), trackerFConf);
  assertTrue("attempt work dir for " + taskId.toString()
      + " is not created in any of the configured dirs!!",
      attemptWorkDir != null);

  TaskRunner runner = task.createRunner(tracker, tip);
  tip.setTaskRunner(runner);

  // /////// Few more methods being tested
  runner.setupChildTaskConfiguration(lDirAlloc);
  TaskRunner.createChildTmpDir(new File(attemptWorkDir.toUri().getPath()),
      localizedJobConf);
  attemptLogFiles = runner.prepareLogFiles(task.getTaskID(),
      task.isTaskCleanupTask());

  // Make sure the task-conf file is created
  Path localTaskFile =
      lDirAlloc.getLocalPathToRead(TaskTracker.getTaskConfFile(task
          .getUser(), task.getJobID().toString(), task.getTaskID()
          .toString(), task.isTaskCleanupTask()), trackerFConf);
  assertTrue("Task conf file " + localTaskFile.toString()
      + " is not created!!", new File(localTaskFile.toUri().getPath())
      .exists());

  // /////// One more method being tested. This happens in child space.
  localizedTaskConf = new JobConf(localTaskFile);
  TaskRunner.setupChildMapredLocalDirs(task, localizedTaskConf);
  // ///////

  // Initialize task via TaskController
  TaskControllerContext taskContext =
      new TaskController.TaskControllerContext();
  taskContext.env =
      new JvmEnv(null, null, null, null, -1, new File(localizedJobConf
          .get(TaskTracker.JOB_LOCAL_DIR)), null, localizedJobConf);
  taskContext.task = task;
  // /////////// The method being tested
  taskController.initializeTask(taskContext);
  // ///////////
}
 
Author: rekhajoshm, Project: mapreduce-fork, Lines: 60, Source: TestTaskTrackerLocalization.java
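Taken together, the five examples read eight JvmEnv fields: setup, vargs, stdout, stderr, logSize, workDir, env, and conf (see the constructor call in example 5). The plain-Java stand-in below makes that shape explicit; it is NOT the real Hadoop class, and the field types are inferred from usage (conf is typed as Object here because JobConf is Hadoop-specific).

```java
import java.io.File;
import java.util.List;
import java.util.Map;

// Hypothetical stand-in for the JvmEnv fields the examples above access.
// Types are inferred from usage, not copied from Hadoop source.
public class JvmEnvShape {
  List<String> setup;       // commands run before the JVM starts
  List<String> vargs;       // the JVM command line and its arguments
  File stdout;              // task stdout log file
  File stderr;              // task stderr log file
  long logSize;             // log size cap; -1 in example 5
  File workDir;             // working directory for the child JVM
  Map<String, String> env;  // environment variables for the child
  Object conf;              // the localized JobConf in the real class

  JvmEnvShape(List<String> setup, List<String> vargs, File stdout, File stderr,
              long logSize, File workDir, Map<String, String> env, Object conf) {
    this.setup = setup; this.vargs = vargs; this.stdout = stdout;
    this.stderr = stderr; this.logSize = logSize; this.workDir = workDir;
    this.env = env; this.conf = conf;
  }

  public static void main(String[] args) {
    // Mirrors the construction in example 5, with nulls for unused fields.
    JvmEnvShape shape = new JvmEnvShape(null, null, null, null, -1,
        new File("/tmp/job-local-dir"), null, null);
    System.out.println(shape.workDir + " logSize=" + shape.logSize);
  }
}
```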



Note: the org.apache.hadoop.mapred.JvmManager.JvmEnv class examples in this article were collected from open-source projects hosted on GitHub, MSDocs, and similar source/documentation platforms. Copyright in the code snippets remains with their original authors; consult the corresponding project's License before redistributing or using them. Do not reproduce without permission.

