dongguozhong
[unknown user] Caused by: java.io.IOException: Cannot run program "Rscript": java.io.IOException: error=13, Permission denied
at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
I get this error. Does it mean R cannot be invoked?
bsspirit
[unknown user] What operation were you running when this error appeared? rmr2? rhdfs?
1. First confirm that plain Java programs run correctly.
2. If it is rmr2, check whether /usr/bin/R and /usr/bin/Rscript are installed in the local environment, and whether thrift is installed.
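The two failure modes in this thread, error=13 (Permission denied) and error=2 (No such file or directory), can be told apart with a quick check on each node. A minimal sketch; the paths /usr/bin/R and /usr/bin/Rscript are the ones assumed in this thread:

```shell
# Check whether an interpreter path is usable by Hadoop streaming.
# error=2  (No such file or directory) -> the file is missing;
# error=13 (Permission denied)         -> it exists but is not executable.
check_path() {
  if [ ! -e "$1" ]; then
    echo "$1: MISSING (streaming fails with error=2)"
  elif [ ! -x "$1" ]; then
    echo "$1: NOT EXECUTABLE (streaming fails with error=13)"
  else
    echo "$1: ok"
  fi
}

# Paths assumed from this thread; adjust for your install:
for p in /usr/bin/R /usr/bin/Rscript; do
  check_path "$p"
done
```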
dongguozhong
[unknown user] 1. The Java programs run correctly.
2. Everything earlier in your tutorial works, but when I run the wordcount example I get:
packageJobJar: [/tmp/RtmpSbSXJC/rmr-local-env2c932a90e626, /tmp/RtmpSbSXJC/rmr-global-env2c937efce0c2, /tmp/RtmpSbSXJC/rmr-streaming-map2c9359dcb07d, /tmp/RtmpSbSXJC/rmr-streaming-reduce2c93f412d85, /tmp/RtmpSbSXJC/rmr-streaming-combine2c9312b57f78, /home/hadoop_tmp/hadoop-unjar3036903685334201444/] [] /tmp/streamjob2684192384705100214.jar tmpDir=null
13/04/28 10:07:05 INFO util.NativeCodeLoader: Loaded the native-hadoop library
13/04/28 10:07:05 WARN snappy.LoadSnappy: Snappy native library not loaded
13/04/28 10:07:05 INFO mapred.FileInputFormat: Total input paths to process : 1
13/04/28 10:07:08 INFO streaming.StreamJob: getLocalDirs(): [/home/hadoop_tmp/mapred/local]
13/04/28 10:07:08 INFO streaming.StreamJob: Running job: job_201304272135_0004
13/04/28 10:07:08 INFO streaming.StreamJob: To kill this job, run:
13/04/28 10:07:08 INFO streaming.StreamJob: /hadoop/conan/hadoop-1.0.4/libexec/../bin/hadoop job -Dmapred.job.tracker=172.16.19.241:9001 -kill job_201304272135_0004
13/04/28 10:07:08 INFO streaming.StreamJob: Tracking URL: http://namenode.local:50030/jobdetails.jsp?jobid=job_201304272135_0004
13/04/28 10:07:09 INFO streaming.StreamJob: map 0% reduce 0%
13/04/28 10:08:42 INFO streaming.StreamJob: map 100% reduce 100%
13/04/28 10:08:42 INFO streaming.StreamJob: To kill this job, run:
13/04/28 10:08:42 INFO streaming.StreamJob: /hadoop/conan/hadoop-1.0.4/libexec/../bin/hadoop job -Dmapred.job.tracker=172.16.19.241:9001 -kill job_201304272135_0004
13/04/28 10:08:42 INFO streaming.StreamJob: Tracking URL: http://namenode.local:50030/jobdetails.jsp?jobid=job_201304272135_0004
13/04/28 10:08:42 ERROR streaming.StreamJob: Job not successful. Error: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201304272135_0004_m_000000
13/04/28 10:08:42 INFO streaming.StreamJob: killJob...
Streaming Command Failed!
Error in mr(map = map, reduce = reduce, combine = combine, vectorized.reduce, :
hadoop streaming failed with error code 1
Deleted hdfs://172.16.19.241:9000/tmp/RtmpSbSXJC/file2c9328dd6066
/usr/bin/R and /usr/bin/Rscript are both installed on my namenode, but thrift was not covered in your tutorial, so I never installed it. How do I install it?
dongguozhong
Also, does the example in this article require the rhbase installation from step 6?
bsspirit
[unknown user] This example does not need rhbase. For the rhbase part, see
http://cos.name/2013/04/rhadoop4-rhbase/
bsspirit
[unknown user] Thrift is not needed for this step.
Can you print out the full tracking logs and find the exact exception?
http://namenode.local:50030/jobdetails.jsp?jobid=job_201304272135_0004
The line below only says the map task failed too many times; it does not reveal the actual problem.
ERROR streaming.StreamJob: Job not successful. Error: # of failed Map Tasks exceeded allowed limit. FailedCount: 1
dongguozhong
java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:372)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1121)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
... 14 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
... 17 more
Caused by: java.lang.RuntimeException: configuration exception
at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:230)
at org.apache.hadoop.streaming.PipeMapper.configure(PipeMapper.java:66)
... 22 more
Caused by: java.io.IOException: Cannot run program "Rscript": java.io.IOException: error=2, No such file or directory
at java.lang.ProcessBuilder.start(ProcessBuilder.java:460)
at org.apache.hadoop.streaming.PipeMapRed.configure(PipeMapRed.java:214)
... 23 more
Caused by: java.io.IOException: java.io.IOException: error=2, No such file or directory
at java.lang.UNIXProcess.&lt;init&gt;(UNIXProcess.java:148)
at java.lang.ProcessImpl.start(ProcessImpl.java:65)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:453)
Does this mean the R environment on my datanodes is not installed properly?
Does this mean the R environment on my datanodes is not installed properly? Do the datanodes all need the same configuration as the namenode?
dongguozhong
[unknown user] Problem solved. It was the datanodes. Thank you for your patient answers.
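For anyone hitting the same error=2, a quick way to confirm Rscript is present on every worker node is a loop over the slaves file. A sketch only: the slaves-file path and passwordless ssh are assumptions, and the remote-execution command is a parameter so the loop can be dry-run locally:

```shell
# Loop over a host list and report whether Rscript is on each node's PATH.
# RUNNER is the remote-execution command (normally ssh); making it a
# parameter lets the loop be exercised without a real cluster.
check_hosts() {
  runner=$1
  hostfile=$2
  while read -r host; do
    if $runner "$host" 'command -v Rscript >/dev/null 2>&1'; then
      echo "$host: Rscript found"
    else
      echo "$host: Rscript MISSING"
    fi
  done < "$hostfile"
}

# Typical usage (slaves-file path is an assumption):
# check_hosts ssh /hadoop/conan/hadoop-1.0.4/conf/slaves
```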
vetri
Hi,
I am using rmr2 and rhdfs, R 2.15.2, and CDH4,
and I have also set the Hadoop, HADOOP_CMD and HADOOP_STREAMING environment variables.
Please guide me urgently. I get:
Streaming Command Failed!
Error in rmr(map=map,reduce=reduce,combine=combine.....:
hadoop streaming failed with error code 1
bsspirit
[unknown user] :-)
bsspirit
[unknown user] Could you paste the error log here?
seven-hxl1234
After installing rJava, the package does not contain JRI. Did I pick the wrong download mirror? Doesn't rJava ship with JRI? rJava itself works fine, but I want to call R from Java. Please help.
seven-hxl1234
[unknown user] First, thank you for replying to my email. The problem is solved, and I'd like to share the fix: it was a JDK environment issue. I had installed and configured JDK 1.6 but never removed OpenJDK. Even though R CMD javareconf reported JDK 1.6, installing rJava still failed. Once I uninstalled OpenJDK it worked, and rJava now includes JRI.
bsspirit
[unknown user] Glad it is solved. The article above already notes that you must use the official JDK:
===================
Use the official Oracle (Sun) JDK downloaded from the official site; the OpenJDK bundled with the operating system has all kinds of incompatibilities. Choose a 1.6.x JDK; JDK 1.7 also has various incompatibility issues.
http://www.oracle.com/technetwork/java/javase/downloads/index.html
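As a setup sketch, switching R over to the Oracle JDK typically looks like the following; the JDK install path is an assumption and varies by system, and this is a one-time configuration rather than a tested recipe:

```shell
# Point the environment at the Oracle JDK instead of OpenJDK
# (the install path below is only an example):
export JAVA_HOME=/usr/java/jdk1.6.0_45
export PATH=$JAVA_HOME/bin:$PATH

# Rebuild R's Java configuration, then reinstall rJava against the new JDK:
R CMD javareconf
R -e 'install.packages("rJava")'
```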
mchotcat-qmeng
Why does the hdfs.cat() function output garbled text? I am on a Mac.
bsspirit
[unknown user] I have never used a Mac; all my experiments are on Linux.
Could you check whether Hadoop itself is the cause?
CarrieZhang
The same problem as ada.
When I call mapreduce, it shows the following message:
> out = mapreduce(input = a.dfs,map = function(k,v) keyval(v, 1),reduce = function(k,vv) keyval(k, length(vv)))
packageJobJar: [/tmp/Rtmp1MOG1F/rmr-local-env1ce26f1e7f40, /tmp/Rtmp1MOG1F/rmr-global-env1ce26fa484c1, /tmp/Rtmp1MOG1F/rmr-streaming-map1ce259675df6, /tmp/Rtmp1MOG1F/rmr-streaming-reduce1ce2520244d, /app/hadoop/tmp/hadoop-unjar6189303437379135382/] [] /tmp/streamjob98282612706550507.jar tmpDir=null
13/05/26 16:35:44 INFO mapred.FileInputFormat: Total input paths to process : 1
13/05/26 16:35:44 INFO streaming.StreamJob: getLocalDirs(): [/app/hadoop/tmp/mapred/local]
13/05/26 16:35:44 INFO streaming.StreamJob: Running job: job_201305261306_0006
13/05/26 16:35:44 INFO streaming.StreamJob: To kill this job, run:
13/05/26 16:35:44 INFO streaming.StreamJob: /usr/lib/hadoop/hadoop/libexec/../bin/hadoop job -Dmapred.job.tracker=localhost:54311 -kill job_201305261306_0006
13/05/26 16:35:44 INFO streaming.StreamJob: Tracking URL: http://localhost:50030/jobdetails.jsp?jobid=job_201305261306_0006
13/05/26 16:35:45 INFO streaming.StreamJob: map 0% reduce 0%
13/05/26 16:36:23 INFO streaming.StreamJob: map 100% reduce 100%
13/05/26 16:36:23 INFO streaming.StreamJob: To kill this job, run:
13/05/26 16:36:23 INFO streaming.StreamJob: /usr/lib/hadoop/hadoop/libexec/../bin/hadoop job -Dmapred.job.tracker=localhost:54311 -kill job_201305261306_0006
13/05/26 16:36:23 INFO streaming.StreamJob: Tracking URL: http://localhost:50030/jobdetails.jsp?jobid=job_201305261306_0006
13/05/26 16:36:23 ERROR streaming.StreamJob: Job not successful. Error: # of failed Map Tasks exceeded allowed limit. FailedCount: 1. LastFailedTask: task_201305261306_0006_m_000000
13/05/26 16:36:23 INFO streaming.StreamJob: killJob...
Streaming Command Failed!
Error in mr(map = map, reduce = reduce, combine = combine, vectorized.reduce, :
hadoop streaming failed with error code 1
Deleted hdfs://localhost:54310/tmp/Rtmp1MOG1F/file1ce21a895897
CarrieZhang
[unknown user] One question: do we need to set any streaming variables? It seems that by default any nonzero return value from mapreduce is treated as an error?
bsspirit
[unknown user] This problem seems to come up a lot.
I would ask everyone to first confirm that a MapReduce program written in Java runs correctly.
Hadoop's official examples:
http://hadoop.apache.org/docs/r1.0.4/mapred_tutorial.html
http://wiki.apache.org/hadoop/WordCount
To run the example, the command syntax is:
bin/hadoop jar hadoop-*-examples.jar wordcount [-m &lt;#maps&gt;] [-r &lt;#reducers&gt;] &lt;in-dir&gt; &lt;out-dir&gt;
===================================
Once the Java example runs successfully, the environment that R depends on should be free of errors. Debugging through a wrapper layer that hides the underlying error is quite painful.
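Concretely, the Java-level sanity check could be run like this; the Hadoop home directory, the jar name, and the input file are assumptions taken from the paths seen earlier in this thread:

```shell
# Smoke-test the cluster with the bundled Java wordcount before debugging
# the R layer. If this fails too, the problem is in Hadoop, not in rmr2.
cd /hadoop/conan/hadoop-1.0.4

# Stage a small input file on HDFS:
bin/hadoop fs -mkdir /tmp/wc-in
bin/hadoop fs -put README.txt /tmp/wc-in

# Run the bundled Java example and inspect the output:
bin/hadoop jar hadoop-examples-1.0.4.jar wordcount /tmp/wc-in /tmp/wc-out
bin/hadoop fs -cat '/tmp/wc-out/part-*'
```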