
Help with a Hadoop Streaming C++ MapReduce job

HeartJ · 2016-07-06 13:31:20 +08:00 · 4660 clicks

I've recently been writing C++ MapReduce programs with Hadoop Streaming. A simple word-lookup demo ran fine, but the MR program I actually need for work fails with the error below:

    map 100% reduce 26%
    INFO mapreduce.Job: Task Id : attempt_1467765498879_0091_r_000000_0, Status : FAILED
    Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 134
        at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:322)
        at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:535)
        at org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)
        at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
        at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
        at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1656)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
    map 100% reduce 100%

    INFO mapreduce.Job: Job job_1467765498879_0091 failed with state FAILED due to: Task failed task_1467765498879_0091_r_000000 Job failed as tasks failed. failedMaps:0 failedReduces:1

    INFO mapreduce.Job: Counters: 39
      File System Counters
        FILE: Number of bytes read=0
        FILE: Number of bytes written=3181370
        FILE: Number of read operations=0
        FILE: Number of large read operations=0
        FILE: Number of write operations=0
        HDFS: Number of bytes read=1202484763
        HDFS: Number of bytes written=0
        HDFS: Number of read operations=27
        HDFS: Number of large read operations=0
        HDFS: Number of write operations=0
      Job Counters
        Failed reduce tasks=4
        Killed map tasks=1
        Launched map tasks=10
        Launched reduce tasks=4
        Data-local map tasks=9
        Rack-local map tasks=1
        Total time spent by all maps in occupied slots (ms)=223454
        Total time spent by all reduces in occupied slots (ms)=19038
        Total time spent by all map tasks (ms)=111727
        Total time spent by all reduce tasks (ms)=19038
        Total vcore-milliseconds taken by all map tasks=111727
        Total vcore-milliseconds taken by all reduce tasks=19038
        Total megabyte-milliseconds taken by all map tasks=228816896
        Total megabyte-milliseconds taken by all reduce tasks=19494912
      Map-Reduce Framework
        Map input records=18446925
        Map output records=75252
        Map output bytes=2037977
        Map output materialized bytes=2188535
        Input split bytes=954
        Combine input records=0
        Spilled Records=75252
        Failed Shuffles=0
        Merged Map outputs=0
        GC time elapsed (ms)=613
        CPU time spent (ms)=80580
        Physical memory (bytes) snapshot=2821369856
        Virtual memory (bytes) snapshot=25988837376
        Total committed heap usage (bytes)=2422210560
      File Input Format Counters
        Bytes Read=1202483809
    16/07/06 13:31:06 ERROR streaming.StreamJob: Job not successful!
    Streaming Command Failed!

The errors I've hit while debugging are "subprocess failed with code 134" and "subprocess failed with code 139".

Does anyone know what causes these? Any pointers would be appreciated, thanks.
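
For context on those two exit codes: 134 is 128 + SIGABRT(6) and 139 is 128 + SIGSEGV(11), so the C++ child process itself is aborting or segfaulting (for example via an uncaught exception or an out-of-bounds access) rather than Hadoop failing. Below is a minimal sketch of a defensive streaming reducer; the key<TAB>count input format and the summing logic are assumptions for illustration, since the original job isn't shown.

    // reducer.cpp - defensive Hadoop Streaming reducer (illustrative sketch).
    // Assumes input lines of the form "key<TAB>count", grouped by key.
    #include <cstdint>
    #include <iostream>
    #include <stdexcept>
    #include <string>

    int main() {
        std::ios::sync_with_stdio(false);
        std::string line, cur_key;
        int64_t sum = 0;
        bool have_key = false;

        while (std::getline(std::cin, line)) {
            // Skip malformed records instead of indexing past the end of
            // the string (a typical way to end up with SIGSEGV / code 139).
            const auto tab = line.find('\t');
            if (tab == std::string::npos) continue;

            const std::string key = line.substr(0, tab);
            int64_t value = 0;
            try {
                value = std::stoll(line.substr(tab + 1));
            } catch (const std::exception&) {
                // An uncaught exception calls std::terminate -> abort, which
                // Hadoop reports as "subprocess failed with code 134".
                continue;
            }

            if (have_key && key != cur_key) {
                std::cout << cur_key << '\t' << sum << '\n';
                sum = 0;
            }
            cur_key = key;
            have_key = true;
            sum += value;
        }
        if (have_key) std::cout << cur_key << '\t' << sum << '\n';
        return 0;  // a non-zero exit status also fails the streaming task
    }

Feeding the reducer a sorted sample of the map output locally (e.g. cat sample.txt | sort | ./reducer) usually reproduces this kind of crash outside the cluster, where a debugger or core dump is easier to get at.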

5 replies · last reply 2016-08-06 00:23:13 +08:00
#1 · yaoyuan7571 · 2016-07-06 15:08:41 +08:00
You could check whether a bug in the reduce program, or bad input data, is making it crash: reduce the number of reducers, add some debug logging to the code, then look at the logs in the JobTracker web UI to pin down where it crashes. I'd recommend developing in Java.
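
A note on the logging suggestion above: anything a streaming task writes to stderr ends up in that task attempt's stderr log, and Hadoop Streaming additionally interprets stderr lines of the form reporter:counter:<group>,<counter>,<amount> and reporter:status:<message> as counter and status updates visible in the job web UI. A small sketch of helpers along those lines (the counter and status names here are made up for illustration):

    // Debug-output helpers for a C++ Hadoop Streaming task (illustrative sketch).
    #include <iostream>
    #include <string>

    // Plain stderr lines land in the task attempt's stderr log.
    void log_debug(const std::string& msg) {
        std::cerr << msg << std::endl;
    }

    // Hadoop Streaming turns this stderr line into a counter increment.
    void bump_counter(const std::string& group, const std::string& name,
                      long amount = 1) {
        std::cerr << "reporter:counter:" << group << ',' << name << ','
                  << amount << std::endl;
    }

    // ...and this one into a task status message.
    void set_status(const std::string& msg) {
        std::cerr << "reporter:status:" << msg << std::endl;
    }

    int main() {
        std::string line;
        long records = 0, malformed = 0;
        while (std::getline(std::cin, line)) {
            ++records;
            if (line.find('\t') == std::string::npos) {
                ++malformed;
                bump_counter("Reduce", "MALFORMED_RECORDS");  // made-up counter
                continue;
            }
            // ... real reduce logic would go here ...
        }
        log_debug("records=" + std::to_string(records) +
                  " malformed=" + std::to_string(malformed));
        return 0;
    }

For the other two suggestions: the reducer count can be lowered while debugging with -D mapreduce.job.reduces=1 on the streaming command line, and since this is a YARN cluster (YarnChild in the stack trace), the per-attempt logs can also be pulled after the job ends with yarn logs -applicationId application_1467765498879_0091, provided log aggregation is enabled.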
#2 · HeartJ (OP) · 2016-07-06 16:15:55 +08:00
@yaoyuan7571 Thanks. It really should be written in Java; the streaming approach does have its inconveniences.
#3 · ooonme · 2016-07-07 09:01:54 +08:00
Why not just use Spark?
#4 · cljnnn · 2016-07-28 10:58:12 +08:00
@ooonme Does Spark support C++ now?
#5 · ooonme · 2016-08-06 00:23:13 +08:00 via iPhone
@cljnnn Why use C++ in the JVM world? It used to be Java calling C, now it's C calling Java, and then native C again 😜. Personally I only use Scala.