
Java / Hadoop / Spark error

婷婷同学_ 2018-12-06 17:43:33
While working on a project, I set Thread.currentThread().setContextClassLoader(classLoaderContext); when the project runs, it fails inside this Hadoop helper:

    public FSDataInputStream getFSDataInputStream(String srcFilePath) {
        System.setProperty("HADOOP_HOME", "/usr/local/hadoop");
        FSDataInputStream dataInputStream = null;
        OutputStream output = null;
        String fileName = String.format("data-%s", System.currentTimeMillis());
        String destFilePath = TEMP_FILE_PATH + fileName;
        final String path = srcFilePath;
        Configuration config = new Configuration();
        // config.set("fs.default.name", "hdfs://master:9000");
        config.set("fs.hdfs.impl", org.apache.hadoop.hdfs.DistributedFileSystem.class.getName());
        FileSystem fileSystem = null;
        try {
            fileSystem = FileSystem.get(URI.create(path), config);
            if (!fileSystem.exists(new Path(path))) {
                logger.error("File does not exist");
                return null;
            }
            dataInputStream = fileSystem.open(new Path(path));
            return dataInputStream;
        } catch (IOException ex) {
            logger.error(ex.getStackTrace().toString());
            return null;
        }
    }

The error is thrown at the FileSystem.get(URI.create(path), config) call. The error output is:

    [DEBUG] 2016-07-30 13:11:26,209(466) --> [main] org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:320): Failed to detect a valid hadoop home directory
    java.io.IOException: HADOOP_HOME or hadoop.home.dir are not set.
        at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:302)
        at org.apache.hadoop.util.Shell.<clinit>(Shell.java:327)
        at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:79)
        at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:116)
        at org.apache.hadoop.security.Groups.<init>(Groups.java:93)
        at org.apache.hadoop.security.Groups.<init>(Groups.java:73)
        at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:293)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:283)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:260)
        at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:789)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:774)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:647)
        at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2753)
        at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2745)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2611)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
        at com.itcchina.zs57s.splitter.tools.HadoopTools.getFSDataInputStream(HadoopTools.java:74)
        at com.itcchina.zs57s.splitter.handler.HadoopWorkItem.start(HadoopWorkItem.java:37)
        at ItcWorkFlowClassify.ItcWorkItem.runWorker(ItcWorkItem.java:297)
        at ItcWorkFlowClassify.ItcWorkFlow.start(ItcWorkFlow.java:71)
        at ItcWorkFlowClassify.ItcWorkItem.runWorker(ItcWorkItem.java:276)
        at Main.main(Main.java:25)
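A note for readers hitting the same stack trace: System.setProperty("HADOOP_HOME", ...) in the snippet above cannot work, because Hadoop checks HADOOP_HOME only as an environment variable; the system property it looks for is hadoop.home.dir, and the lookup happens once, in the static initializer of org.apache.hadoop.util.Shell. A minimal pure-Java sketch of that class-loading ordering (ShellLike is a hypothetical stand-in, not a real Hadoop class):

```java
// Sketch of the class-loading pitfall behind this error: static
// initializers run exactly once, the first time the class is used, so
// the property must already be set by then.
public class InitOrderDemo {
    static class ShellLike {
        // Cached once, at class-load time, like Hadoop Shell's static init.
        static final String HOME = System.getProperty("hadoop.home.dir", "<unset>");
    }

    public static void main(String[] args) {
        // Set the property BEFORE ShellLike is first touched...
        System.setProperty("hadoop.home.dir", "/usr/local/hadoop");
        System.out.println(ShellLike.HOME); // prints /usr/local/hadoop

        // ...because changing it afterwards has no effect on the cached value.
        System.setProperty("hadoop.home.dir", "/other/path");
        System.out.println(ShellLike.HOME); // still /usr/local/hadoop
    }
}
```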

1 Answer

慕村9548890

1884 contributions, 4+ upvotes

I found the problem myself: I was passing the wrong arguments to Runtime.getRuntime().exec(). I had been supplying the second parameter (the envp environment array), which replaces the child process's environment instead of letting it inherit the parent's, so the child could not see the environment variables. Passing null for envp (or using the single-argument overload) makes the child inherit the parent's environment.
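The exec(String, String[] envp) semantics the answer relies on can be checked directly: a null envp means "inherit the parent's environment", while a non-null array (even an empty one) replaces it entirely. A small sketch, assuming a POSIX sh is available and HOME is set in the parent environment:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class EnvpDemo {
    // Runs `sh -c 'echo $HOME'` with the given envp and returns the child's stdout.
    static String childHome(String[] envp) throws IOException, InterruptedException {
        Process p = Runtime.getRuntime().exec(new String[]{"sh", "-c", "echo $HOME"}, envp);
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line = r.readLine();
            p.waitFor();
            return line == null ? "" : line;
        }
    }

    public static void main(String[] args) throws Exception {
        // envp == null: the child inherits the parent's environment, HOME is visible.
        System.out.println("inherited: " + childHome(null));
        // envp == new String[0]: the child gets an EMPTY environment, HOME is gone.
        System.out.println("empty env: " + childHome(new String[0]));
    }
}
```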

    ExecutorService executorService = Executors.newCachedThreadPool();
    executorService.execute(new Runnable() {
        public void run() {
            switch (taskCommandType) {
                case start:
                    try {
                        // envp is null, so the child inherits the parent's environment
                        currentProcess = Runtime.getRuntime().exec(
                                String.format("java -jar %s %s %s",
                                        itcWorkFlow4jFilePath, m_mainWorkFilePath, fileList[0]),
                                null);
                        // Read stderr line by line (DataInputStream.readLine() is deprecated)
                        BufferedReader reader = new BufferedReader(
                                new InputStreamReader(currentProcess.getErrorStream()));
                        String tempStr;
                        StringBuilder errorMessage = new StringBuilder();
                        while ((tempStr = reader.readLine()) != null) {
                            errorMessage.append(tempStr);
                        }
                        reader.close();
                        logger.error(errorMessage.toString());
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                    break;
                case stop:
                    currentProcess.destroy();
                    break;
                case list:
                    break;
            }
        }
    });
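As a side note (not from the original answer): ProcessBuilder is the more modern replacement for Runtime.exec. It inherits the parent environment by default, lets you add or override individual variables via environment(), and can merge stderr into stdout so a single reader loop suffices. A minimal sketch, assuming a POSIX sh:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ProcessBuilderDemo {
    static String run() throws Exception {
        // The child inherits the parent's environment unless we change it here.
        ProcessBuilder pb = new ProcessBuilder("sh", "-c", "echo out; echo err 1>&2");
        pb.environment().put("DEMO_VAR", "demo"); // add one variable, keep the rest
        pb.redirectErrorStream(true);             // merge stderr into stdout
        Process p = pb.start();
        StringBuilder output = new StringBuilder();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                output.append(line).append('\n');
            }
        }
        p.waitFor();
        return output.toString(); // contains both "out" and "err"
    }

    public static void main(String[] args) throws Exception {
        System.out.print(run());
    }
}
```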

Replied 2018-12-16
